Add capability to pass additional grad_kwargs for LayerGradientXActivation #1286
Conversation
Add capability to pass additional grad_kwargs for LayerGradientXActivation (pytorch#1286)

Summary: Pull Request resolved: pytorch#1286

`torch.autograd.grad` accepts several additional arguments, such as `retain_graph`, `create_graph`, `allow_unused`, and `materialize_grads`. This change enables users of `LayerGradientXActivation` and `compute_layer_gradients_and_eval` to pass those arguments through when they need to. It should also resolve a related open issue.

Differential Revision: D57756842
This pull request was exported from Phabricator. Differential Revision: D57756842

This pull request has been merged in 03340ec.