
A solution for supporting in-place nonlinear submodules #914

Open
@arash1902


🚀 Feature

Replace register_full_backward_hook with a forward hook and obtain gradient values directly from the tensors.

Motivation

First of all, thank you for the great package. Unfortunately, I cannot use it when my model contains in-place nonlinear submodules. So, I am suggesting going with the solution described here: pytorch/pytorch#61519

Pitch

The idea is simple: replace all calls to register_full_backward_hook with register_forward_hook, and then call register_hook on the output (or input) tensors wherever gradient values are needed. I hope this makes sense and helps enhance the module. A minimal sketch of the pattern is below.
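Here is a minimal sketch, assuming a toy model with an in-place ReLU. The `forward_hook` function and the `grads` dict are purely illustrative names, not part of Captum's API:

```python
import torch
import torch.nn as nn

# Illustrative only; not Captum internals.
grads = {}

def forward_hook(module, inputs, output):
    # Instead of module.register_full_backward_hook (which raises an error for
    # modules that modify their input in place, e.g. nn.ReLU(inplace=True)),
    # attach a hook to the output tensor so its gradient is captured in backward.
    def save_grad(grad):
        grads[module] = grad
    output.register_hook(save_grad)

model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(inplace=True))
model[1].register_forward_hook(forward_hook)

x = torch.randn(2, 4, requires_grad=True)
model(x).sum().backward()
print(grads[model[1]].shape)  # gradient w.r.t. the in-place ReLU's output
```

This avoids the module-level backward hook entirely, which is the kind of workaround the linked PyTorch issue points to.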
