When calling `Module.to(ScalarType)` on a module, PyTorch restricts the target dtype to floating-point or complex types. See here.
Also, when given a floating-point type, PyTorch only converts parameters that are already floating point to the new dtype. See for example here.
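For illustration, here is a small Python sketch of the PyTorch behaviour described above. The toy module and its member names (`weight`, `counts`) are made up for the example; the calls are plain `torch.nn.Module.to`:

```python
import torch
import torch.nn as nn

class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        # Floating-point parameter and an integer buffer for contrast.
        self.weight = nn.Parameter(torch.zeros(2, 2))
        self.register_buffer("counts", torch.zeros(2, dtype=torch.int64))

m = Toy()

# Rejected: the target dtype must be floating point or complex.
try:
    m.to(torch.int32)
except TypeError as e:
    print(e)  # message says only floating point or complex dtypes are accepted

# Accepted: only tensors that are already floating point are converted.
m.to(torch.float16)
print(m.weight.dtype)  # torch.float16
print(m.counts.dtype)  # torch.int64 -- the integer buffer is left untouched
```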
This would be relatively trivial to add; would TorchSharp want to adopt the same restrictions?