Module.to(ScalarType) has restrictions in PyTorch which aren't restricted in TorchSharp #1180

@shaltielshmid

Description

When calling Module.to(ScalarType) on a module, PyTorch requires that the target dtype be a floating-point or complex type. See here.

Also, when it is given a floating-point type, PyTorch converts only the parameters that are already floating-point to the new type. See for example here.
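A quick Python sketch of the PyTorch behavior described above (the buffer name `counts` is just for illustration):

```python
import torch
import torch.nn as nn

m = nn.Linear(2, 2)
# register an integer buffer to show that non-floating tensors are left alone
m.register_buffer("counts", torch.zeros(3, dtype=torch.int64))

# floating-point target: only floating-point tensors are converted
m.to(torch.float16)
print(m.weight.dtype)  # torch.float16
print(m.counts.dtype)  # torch.int64 (unchanged)

# a non-floating, non-complex target is rejected outright
try:
    m.to(torch.int32)
except TypeError as e:
    print("rejected:", e)
```

These are the two restrictions TorchSharp would need to replicate: the dtype check up front, and the per-tensor floating-point test during conversion.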

This would be relatively trivial to add. Would TorchSharp want to adopt the same restrictions?
