Thanks for the heads up. We tried to stay away from the more "exotic" call-signatures in torch, but I think there's no reason we couldn't accept a PR implementing this gradient.
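For context, the gradient in question is the one with respect to the third argument of the three-argument form `torch.add(x, value, y)`, i.e. `x + value * y`. A minimal sketch of what the per-argument gradients would be, assuming torch-autograd's usual convention of one closure per differentiable argument taking the incoming gradient, the forward result, and the original inputs (the table and names below are illustrative, not the library's actual registration API):

```lua
-- Hypothetical per-argument gradients for z = torch.add(x, value, y) = x + value * y.
-- 'g' is the gradient flowing back into z; 'ans' is the forward result.
local addWithScaleGrads = {
   -- d z / d x: the incoming gradient passes through unchanged
   [1] = function(g, ans, x, value, y) return g end,
   -- argument 2 ('value') is a plain Lua number, so no tensor gradient is defined for it
   -- d z / d y: scale the incoming gradient by 'value'
   [3] = function(g, ans, x, value, y) return torch.mul(g, value) end,
}
```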
Example code:
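(The original snippet is not preserved in this copy of the issue; the following is a minimal sketch of the kind of code that reproduces the error, assuming the three-argument `torch.add(a, value, b)` form is used inside a function differentiated with torch-autograd. The variable names are made up.)

```lua
require 'torch'
local autograd = require 'autograd'

-- Loss that uses the three-argument form torch.add(x, value, w) = x + value * w
local function f(params)
   local y = torch.add(params.x, 2, params.w)
   return torch.sum(y)
end

local df = autograd(f)

local params = { x = torch.randn(3), w = torch.randn(3) }
local grads, loss = df(params)   -- this last line raises the error below
```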
The last line throws an error:
.../DirectTape.lua:165: missing gradient for argument 3 in function torch.add
Not a big deal, as this use of `torch.add` can be decomposed into a mul followed by an add. Just thought I'd mention it.
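For reference, the decomposition workaround mentioned above, with illustrative names, would look something like:

```lua
-- Instead of the unsupported three-argument form:
--   local y = torch.add(x, 2, w)          -- x + 2 * w
-- decompose it into a mul followed by an add, both of which have gradients:
local y = torch.add(x, torch.mul(w, 2))
```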