Switch from register_full_backward_hooks to tensor hooks (#979)
Open
Description
🚀 Feature
Replace register_full_backward_hook with a forward hook, and read gradient values via tensor hooks registered on the relevant tensors.
Motivation
First of all, thank you for the great package. Unfortunately, I cannot use it when my model contains in-place nonlinear submodules (e.g., nn.ReLU(inplace=True)), because register_full_backward_hook does not work with modules that modify their inputs in place. So I am suggesting going with the solution described here: pytorch/pytorch#61519
Pitch
The idea is simple: replace every register_full_backward_hook call with register_forward_hook, and inside the forward hook call register_hook on the output (or input) tensors whose gradients you need. The tensor hooks then fire during the backward pass and give you the gradient values, without the restrictions of module backward hooks. A minimal sketch is shown below. I hope this makes sense and helps you enhance your module.
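A minimal sketch of the suggested approach, assuming plain PyTorch. The model, the `grads` dictionary, and the hook names are illustrative only and are not part of the original issue or of Captum's actual implementation:

```python
import torch
import torch.nn as nn

# Illustrative storage for captured gradients (hypothetical, not Captum API).
grads = {}

def make_forward_hook(name):
    def forward_hook(module, inputs, output):
        # Instead of module.register_full_backward_hook, attach a tensor hook
        # to the output produced in this forward pass. Tensor hooks also work
        # when the module operates in place.
        if output.requires_grad:
            def save_grad(grad):
                grads[name] = grad.detach()
            output.register_hook(save_grad)
    return forward_hook

model = nn.Sequential(
    nn.Linear(4, 4),
    nn.ReLU(inplace=True),  # in-place nonlinearity that breaks full backward hooks
    nn.Linear(4, 1),
)

# Register forward hooks on the submodules whose gradients we want.
for name, module in model.named_modules():
    if isinstance(module, nn.ReLU):
        module.register_forward_hook(make_forward_hook(name))

x = torch.randn(2, 4, requires_grad=True)
model(x).sum().backward()
print(list(grads.keys()))  # gradients captured via tensor hooks
```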