
max_tokens is not an accepted parameter · Issue #71 · openai/openai-agents-python

Closed

Description

@s44002

The documentation makes no reference to max_tokens, and ModelSettings does not accept a max_tokens parameter.

This becomes a problem especially when using Anthropic models, as they don't assume a default max tokens value and require one to be passed explicitly.
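For illustration, a minimal sketch of what a ModelSettings with a max_tokens field could look like. This is a hypothetical stand-in, not the library's actual definition; the other field names are assumptions:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of a settings container with the proposed field.
# Only max_tokens is the subject of this issue; temperature and top_p
# are illustrative assumptions.
@dataclass
class ModelSettings:
    temperature: Optional[float] = None
    top_p: Optional[float] = None
    max_tokens: Optional[int] = None  # cap on tokens generated per response


# Anthropic-style APIs require an explicit cap, so a caller would
# pass it through the settings:
settings = ModelSettings(max_tokens=1024)
print(settings.max_tokens)  # → 1024
```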

Activity

s44002 (Author) commented on Mar 12, 2025

I am fixing the issue

added a commit that references this issue on Mar 12, 2025

Fixes openai#71: Added support for max_tokens in ModelSettings (commit a1b4dbc)
rm-openai (Collaborator) commented on Mar 12, 2025

Apologies, I didn't see this issue/PR in time and implemented it myself via #105

s44002 (Author) commented on Mar 13, 2025

No worries, getting that fixed was the whole point.


Metadata

Labels: bug (Something isn't working)


Participants: @s44002, @rm-openai

