
Convergence criteria for optimizers #390

Open
@mohit-surana

Description


Many optimization frameworks provide an optional argument, tol or tolerance, that stops optimization early when the function value, gradient, or parameters change by less than the tolerance from one iteration to the next. It is typically used in conjunction with a max_iter parameter, which puts a hard upper bound on the number of iterations the optimizer runs for.

The optimizers in the autograd package do not currently support this.

  1. Are there plans to implement this functionality?
  2. If not, would it be a good idea for me to go ahead and implement it (something like the sketch below), or do you feel the added computational cost isn't worth it?
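
For concreteness, here is a minimal sketch of the kind of stopping rule I mean. It is plain gradient descent built on autograd's grad, not one of the package's built-in optimizers; the function name minimize and the parameter-change criterion are just illustrative assumptions.

```python
import autograd.numpy as np
from autograd import grad

def minimize(f, x0, step_size=0.1, max_iter=1000, tol=1e-6):
    """Minimize f from x0; stop early once the parameter update
    falls below tol (hypothetical convergence criterion)."""
    g = grad(f)
    x = x0
    for i in range(max_iter):              # hard cap on iterations
        x_new = x - step_size * g(x)       # plain gradient step
        if np.linalg.norm(x_new - x) < tol:  # parameters barely moved
            return x_new, i + 1
        x = x_new
    return x, max_iter                     # hit the cap without converging

# Usage: a simple quadratic converges well before max_iter.
x_opt, n_iters = minimize(lambda x: np.sum((x - 3.0) ** 2), np.zeros(2))
print(x_opt, n_iters)
```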

Thanks, and great work on the package. I just used it for an assignment and it is really powerful! More navigable documentation would be great, though.
