
2nd order gradients for activations #1099

Closed
@veqtor

Description


Describe the feature and the current behavior/state.
Currently the activation functions in tf-addons are missing 2nd-order gradients, which makes it impossible to use them for training GANs that need various forms of gradient penalties (WGAN-GP, StyleGAN 1/2, etc.).
I suggest adding 2nd-order gradients for these functions.
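For context, here is a minimal sketch of a WGAN-GP-style gradient penalty that needs these second-order gradients. The critic architecture, input shapes, and the choice of tfa.activations.mish are illustrative assumptions, not part of this issue:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Hypothetical critic that uses a tf-addons activation.
critic = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation=tfa.activations.mish),
    tf.keras.layers.Dense(1),
])

real = tf.random.normal([8, 32])
fake = tf.random.normal([8, 32])

with tf.GradientTape() as outer_tape:
    # Interpolate between real and fake samples (WGAN-GP style).
    eps = tf.random.uniform([8, 1])
    interpolated = eps * real + (1.0 - eps) * fake

    with tf.GradientTape() as inner_tape:
        inner_tape.watch(interpolated)
        scores = critic(interpolated)

    # First-order gradients of the critic w.r.t. its inputs.
    grads = inner_tape.gradient(scores, interpolated)
    gradient_penalty = tf.reduce_mean((tf.norm(grads, axis=-1) - 1.0) ** 2)

# Differentiating the penalty w.r.t. the critic weights requires
# second-order gradients of every op in the critic, including the
# activation. With no second-order gradient registered for the
# custom op, this call is expected to fail.
penalty_grads = outer_tape.gradient(gradient_penalty, critic.trainable_variables)
```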

Relevant information

  • Are you willing to contribute it (yes/no):
    No
  • Are you willing to maintain it going forward? (yes/no):
    No
  • Is there a relevant academic paper? (if so, where):
    Differs for every activation function
  • Is there already an implementation in another framework? (if so, where):
    Unknown
  • Was it part of tf.contrib? (if so, where):
    No

Which API type would this fall under (layer, metric, optimizer, etc.)?
activations
Who will benefit from this feature?
Anyone doing research and/or training GANs using activation functions in tf-addons
Any other info.

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working), custom-ops, help wanted (Needs help as a contribution)
