Given that the API provides a vector length, "preparation" (allocation of buffers, caches, precomputations: everything that the `DifferentiationInterface.prepare_...` methods do) just needs an element type.
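For illustration, here is what such a preparation call looks like (a sketch assuming the DifferentiationInterface 0.6-style `prepare_gradient` signature; the backend choice and the `zeros` placeholder are just for the example):

```julia
using DifferentiationInterface  # re-exports ADTypes backends like AutoForwardDiff
import ForwardDiff

f(x) = sum(abs2, x)

# Preparation only needs a placeholder of the right length and element
# type, not the actual evaluation point.
prep = prepare_gradient(f, AutoForwardDiff(), zeros(Float64, 5))
```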
I am proposing that `ADgradient` always performs this preparation step, with the default type `T = Float64`. The user may call the function with other types, in which case the layer should just silently convert as needed, without warning.
Cf. JuliaDiff/DifferentiationInterface.jl#859, but we can do this in this package if that is not supported.
The `x = ...` argument should be retired; a sketch of the resulting construction follows.
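A minimal sketch of the proposed behavior, again assuming the DifferentiationInterface 0.6-style `prepare_gradient`/`gradient` calls; the wrapper type `PreparedGradient` and its constructor are hypothetical illustrations, not the actual `ADgradient` implementation:

```julia
using DifferentiationInterface
import ForwardDiff

# Hypothetical wrapper, prepared once for element type T.
struct PreparedGradient{F,B,P,T<:Real}
    f::F
    backend::B
    prep::P   # buffers/caches from prepare_gradient, tied to T
end

# Construction needs only the dimension `n` and an element type
# (default Float64); no `x = ...` argument.
function PreparedGradient(f, backend, n::Integer; T::Type{<:Real} = Float64)
    prep = prepare_gradient(f, backend, zeros(T, n))
    PreparedGradient{typeof(f),typeof(backend),typeof(prep),T}(f, backend, prep)
end

# Calls with other element types silently convert, without warning.
function (g::PreparedGradient{F,B,P,T})(x::AbstractVector) where {F,B,P,T}
    xc = eltype(x) === T ? x : convert(Vector{T}, x)
    gradient(g.f, g.prep, g.backend, xc)
end

# Usage: prepared once for Float64; an Int vector converts silently.
g = PreparedGradient(x -> sum(abs2, x), AutoForwardDiff(), 3)
g([1.0, 2.0, 3.0])   # == [2.0, 4.0, 6.0]
g([1, 2, 3])         # converted internally, same result
```

The point of the design is that preparation happens exactly once, at construction time, keyed only on dimension and element type; callers with unusual input types pay a conversion cost rather than triggering a re-preparation or a warning.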