Many optimization frameworks provide an optional argument, `tol` or `tolerance`, that stops optimization early if the function value, gradient, or parameters do not change much between iterations. It is used in conjunction with a `max_iter` parameter that sets a hard upper bound on the number of iterations the optimizer runs for.
The optimizers in the autograd package currently do not support this.
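For context, here is a minimal sketch of the kind of stopping behaviour being proposed. It is written as a standalone loop on top of `autograd.grad` rather than as a patch to autograd's own optimizers; the function name and defaults are illustrative, not part of the library.

```python
# Illustrative sketch of tol + max_iter stopping, not autograd's actual API.
import autograd.numpy as np
from autograd import grad

def gradient_descent(fun, x0, step_size=0.01, max_iter=1000, tol=1e-6):
    """Plain gradient descent that stops early when the parameter update
    is smaller than `tol`, or after `max_iter` iterations."""
    gradient = grad(fun)
    x = np.asarray(x0, dtype=float)
    for i in range(max_iter):
        g = gradient(x)
        x_new = x - step_size * g
        # Stop if the parameters barely changed this iteration.
        if np.linalg.norm(x_new - x) < tol:
            return x_new, i + 1
        x = x_new
    return x, max_iter

# Example: minimizing a simple quadratic converges well before max_iter.
x_opt, n_iters = gradient_descent(lambda x: np.sum((x - 3.0) ** 2), x0=np.zeros(2))
```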
- Are there plans to implement this functionality?
- If not, would it be a good idea for me to go ahead and implement it, or do you feel the added computational cost isn't worth it?
Thanks and great work on the package. Just used this for an assignment and it is really powerful!
More navigable documentation would be great, though.