What happened?
Hi @krrishdholakia, @ishaan-jaff!
The same issue that was solved here is happening again with the recently released o3 and o4-mini models.
Please fix it as soon as possible!
Thanks!
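
A minimal reproduction sketch, assuming the Python SDK with an OpenAI key configured (the proxy hits the same code path when a client request includes `max_tokens`):

```python
# Reproduction sketch: sending max_tokens to o4-mini through LiteLLM
# triggers the 400 shown in the log output below.
import litellm

response = litellm.completion(
    model="openai/o4-mini",
    messages=[{"role": "user", "content": "Hello"}],
    max_tokens=256,  # rejected by o3 / o4-mini, which expect max_completion_tokens
)
```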
Relevant log output
```json
{
  "error": {
    "message": "litellm.BadRequestError: OpenAIException - Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.. Received Model Group=openai/o4-mini\nAvailable Model Group Fallbacks=None",
    "type": "invalid_request_error",
    "param": "max_tokens",
    "code": "400"
  }
}
```
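
As the error message suggests, a possible client-side workaround until the parameter mapping is fixed is to send `max_completion_tokens` directly. This is a sketch under the assumption that LiteLLM forwards the parameter unchanged for o-series models:

```python
# Workaround sketch (assumption: max_completion_tokens is passed through
# to the OpenAI API instead of the unsupported max_tokens).
import litellm

response = litellm.completion(
    model="openai/o4-mini",
    messages=[{"role": "user", "content": "Hello"}],
    max_completion_tokens=256,
)
```

Alternatively, setting `litellm.drop_params = True` should drop the unsupported `max_tokens` parameter entirely, at the cost of losing the output-length cap.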
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.66.0-stable
Twitter / LinkedIn details
No response