
Support more datatypes #1

@StrongChris

Description


Currently, only F32 is supported.

I would like to support all datatypes, especially uint16, f16, bf16, complex64, and complex128.

uint16 isn't currently supported by PyTorch. I've added a note in support of its inclusion in pytorch/pytorch#58734.
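For context, here is a minimal Python sketch (not part of this project's code; the F16/BF16/U16/C64/C128 names are placeholder labels, not this repository's dtype identifiers) showing how one might probe which of the requested dtypes the installed PyTorch build actually exposes. torch.uint16 is looked up defensively, since, as noted above, it is not available in all PyTorch versions.

```python
import torch

# Placeholder labels mapped to the torch dtypes requested in this issue.
REQUESTED_DTYPES = {
    "F16": torch.float16,
    "BF16": torch.bfloat16,
    "C64": torch.complex64,
    "C128": torch.complex128,
    # uint16 is missing from some PyTorch builds (see pytorch/pytorch#58734),
    # so look it up defensively instead of referencing torch.uint16 directly.
    "U16": getattr(torch, "uint16", None),
}

for name, dtype in REQUESTED_DTYPES.items():
    status = "available" if dtype is not None else "not available in this torch build"
    print(f"{name}: {status}")
```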

Labels: enhancement (New feature or request)
