Description
What happened?
The `map_openai_params` method in `litellm/llms/bedrock/chat/converse_transformation.py` unconditionally injects `tool_choice` into `optional_params`, even when it is not required. This breaks Mistral models on AWS Bedrock when they are called through the Converse API.
Affected Models:
- `mistral.mistral-large-2402-v1:0`
- `mistral.mistral-small-2402-v1:0`
Problematic Code:
```python
if (
    litellm.utils.supports_tool_choice(
        model=model, custom_llm_provider=self.custom_llm_provider
    )
    and not is_thinking_enabled
):
    optional_params["tool_choice"] = ToolChoiceValuesBlock(
        tool=SpecificToolChoiceBlock(
            name=schema_name if schema_name != "" else "json_tool_call"
        )
    )
```
Issue:
This code block injects `tool_choice` even when it is not needed. When using the Converse API with Mistral models, this results in the following error:
```
litellm.UnsupportedParamsError: bedrock does not support parameters: ['response_format'], for model=mistral.mistral-small-2402-v1:0. To drop these, set `litellm.drop_params=True` or for proxy:

litellm_settings:
  drop_params: true
```
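For reference, the error can be reproduced with a structured-output call like the following. This is a minimal sketch: the `bedrock/...` model string and the OpenAI-style `response_format` payload are my reconstruction from the CURL request below, and AWS credentials are assumed to be configured in the environment.

```python
import litellm

# Minimal repro sketch -- assumes AWS credentials are configured in the
# environment. The schema mirrors the toolConfig in the CURL request below.
litellm.completion(
    model="bedrock/mistral.mistral-large-2402-v1:0",
    messages=[
        {"role": "system", "content": "You are a helpful math tutor."},
        {"role": "user", "content": "Solve 8x + 31 = 2"},
    ],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "math_response",
            "schema": {
                "type": "object",
                "properties": {
                    "steps": {
                        "type": "array",
                        "items": {
                            "type": "object",
                            "properties": {
                                "explanation": {"type": "string"},
                                "output": {"type": "string"},
                            },
                            "required": ["explanation", "output"],
                        },
                    },
                    "final_answer": {"type": "string"},
                },
                "required": ["steps", "final_answer"],
            },
        },
    },
)
# Raises litellm.UnsupportedParamsError as shown above.
```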
Workaround:
Removing the code block above allows all AWS Bedrock-supported models to work correctly with function calling and structured output via the Converse API.
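If outright removal is too broad, a narrower guard might also work. The sketch below is untested and not a proposed patch; `json_mode` is a hypothetical flag standing in for "the caller actually passed `response_format`", and the remaining names are taken from the block quoted above:

```python
# Untested sketch: inject the forced tool_choice only when the caller
# actually asked for structured output, instead of on every request.
# `json_mode` is hypothetical shorthand for that condition.
if (
    json_mode
    and litellm.utils.supports_tool_choice(
        model=model, custom_llm_provider=self.custom_llm_provider
    )
    and not is_thinking_enabled
):
    optional_params["tool_choice"] = ToolChoiceValuesBlock(
        tool=SpecificToolChoiceBlock(
            name=schema_name if schema_name != "" else "json_tool_call"
        )
    )
```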
Dependency:
This bug is related to and potentially dependent on the fix for Issue #11748.
CURL
```bash
curl -X POST \
  https://bedrock-runtime.us-east-1.amazonaws.com/model/mistral.mistral-large-2402-v1%3A0/converse \
  -H 'Content-Type: ap****on' \
  -H 'X-Amz-Date: 20****9Z' \
  -H 'Authorization: AW****cc' \
  -H 'Content-Length: *****' \
  -d '{
    "messages": [{"role": "user", "content": [{"text": "Solve 8x + 31 = 2"}]}],
    "additionalModelRequestFields": {},
    "system": [{"text": "You are a helpful math tutor."}],
    "inferenceConfig": {},
    "toolConfig": {
      "tools": [{
        "toolSpec": {
          "inputSchema": {
            "json": {
              "type": "object",
              "properties": {
                "steps": {
                  "type": "array",
                  "items": {
                    "type": "object",
                    "properties": {
                      "explanation": {"type": "string"},
                      "output": {"type": "string"}
                    },
                    "required": ["explanation", "output"]
                  }
                },
                "final_answer": {"type": "string"}
              },
              "required": ["steps", "final_answer"]
            }
          },
          "name": "math_response",
          "description": "math_response"
        }
      }],
      "toolChoice": {"tool": {"name": "math_response"}}
    }
  }'
```
Relevant log output
```
litellm.UnsupportedParamsError: bedrock does not support parameters: ['response_format'], for model=mistral.mistral-small-2402-v1:0. To drop these, set `litellm.drop_params=True` or for proxy:

litellm_settings:
  drop_params: true
```
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.72.2