There is no reference to max_tokens in the documentation, and ModelSettings does not accept a max_tokens parameter. This is a problem especially when using Anthropic models, since they do not assume a default max tokens value and require one to be passed explicitly.
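For illustration, here is a minimal sketch of the kind of usage this issue is asking for, assuming the `agents` package's `Agent` and `ModelSettings` classes; the `max_tokens` field shown here is the proposed addition, not something the SDK accepted at the time:

```python
# Sketch only: max_tokens is the missing setting this issue requests.
from agents import Agent, ModelSettings

agent = Agent(
    name="Assistant",
    instructions="Answer briefly.",
    # Anthropic models require an explicit output-token budget, so
    # ModelSettings needs a way to carry one through to the request.
    model_settings=ModelSettings(max_tokens=1024),
)
```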
s44002 commented on Mar 12, 2025
I am fixing the issue.
Fixes openai#71: Added support for max_tokens in ModelSettings
rm-openai commented on Mar 12, 2025
Apologies, I didn't see this issue/PR in time and implemented it myself via #105
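For context, the change in #105 would presumably look something like adding an optional field to the settings dataclass; the sketch below is an assumption about the shape of that change, not the actual diff:

```python
# Hypothetical sketch of extending ModelSettings with max_tokens.
from dataclasses import dataclass

@dataclass
class ModelSettings:
    temperature: float | None = None
    top_p: float | None = None
    # New optional cap on output tokens, forwarded to the provider's request.
    max_tokens: int | None = None
```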
s44002 commented on Mar 13, 2025
No worries, getting that fixed was the whole point.