Fixing watsonx error: 'model_id' or 'model' cannot be specified in the request body for models in a deployment space #11854

Merged

Conversation

cbjuan (Contributor) commented Jun 18, 2025

Title

Fixing watsonx error: 'model_id' or 'model' cannot be specified in the request body for models in a deployment space.

Relevant issues

Fixes #11837

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory; adding at least 1 test is a hard requirement (see details)
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible; it only solves 1 specific problem

Type

🐛 Bug Fix

Changes

This PR sets the "model" field in the JSON payload to None for the watsonx provider when the model is part of a custom deployment, fixing issue #11837. As explained in the issue, the error occurs for the watsonx provider but not for watsonx_text.
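For illustration, here is a minimal sketch of the payload logic. The helper name build_watsonx_payload is hypothetical, not litellm's actual internal function; the idea is that None-valued fields are dropped before the body is serialized, so deployment models end up with no body-level model identifier.

```python
# Minimal sketch of the fix, with hypothetical names (not litellm's actual
# internals). For watsonx deployment-space models, the deployment ID travels
# in the endpoint URL, so the JSON body must not carry 'model'/'model_id'.

def build_watsonx_payload(model: str, custom_llm_provider: str, messages: list) -> dict:
    payload: dict = {"messages": messages, "model": model}
    if custom_llm_provider == "watsonx" and model.startswith("deployment/"):
        # Deployment models reject a body-level model identifier.
        payload["model"] = None
    # Drop None-valued fields so 'model' is absent from the serialized request.
    return {k: v for k, v in payload.items() if v is not None}


if __name__ == "__main__":
    body = build_watsonx_payload(
        model="deployment/my-deployment-id",
        custom_llm_provider="watsonx",
        messages=[{"role": "user", "content": "Hello"}],
    )
    assert "model" not in body
    print(body)
```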

Screenshot with integration tests passing for watsonx
[screenshot]

Screenshot with unit tests passing
[screenshot]

Error before the changes

[screenshot]

Result after the changes

[screenshot]

Commit: Fixing watsonx error: 'model_id' or 'model' cannot be specified in the request body for models in a deployment space

vercel bot commented Jun 18, 2025

litellm preview deployment: ✅ Ready (updated Jun 23, 2025, 4:01pm UTC)

```python
# watsonx: Deployment models do not support 'model_id' in their payload
# https://github.com/BerriAI/litellm/issues/11837
"model": None
if custom_llm_provider == "watsonx" and model.startswith("deployment/")
```
Reviewer comment (Contributor):

This is the wrong place for this change; it should be inside watsonx/chat/transformation or watsonx/completion/transformation.
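For context, a rough sketch of what hosting the logic in a transformation hook could look like. The class and method below are illustrative stand-ins, not litellm's actual transformation API; the real watsonx transformation classes live under litellm/llms/watsonx/ and their base classes and signatures may differ.

```python
# Illustrative sketch only: shows the shape of a per-provider request
# transformation that strips the model identifier for deployment models.

class WatsonXChatTransformation:
    def transform_request(
        self, model: str, messages: list, optional_params: dict
    ) -> dict:
        payload = {"messages": messages, **optional_params}
        if model.startswith("deployment/"):
            # Deployment-space models must not name a model in the body;
            # the deployment ID is already part of the request URL.
            payload.pop("model", None)
        else:
            payload["model"] = model
        return payload
```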

```diff
@@ -218,3 +219,4 @@ def test_watsonx_deployment_space_id_embedding(monkeypatch, watsonx_embedding_ca
     assert mock_post.call_count == 1
     json_data = json.loads(mock_post.call_args.kwargs["data"])
     assert my_fake_space_id not in json_data
+    assert json_data.get("model") is None
```
Reviewer comment (Contributor):

Please add a unit test for your change inside test_litellm/.
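As a shape for such a test, here is a hedged pytest sketch written against the hypothetical build_watsonx_payload helper from the earlier sketch; the PR's real test instead patches litellm's HTTP client and inspects the serialized request body, as in the diff quoted above.

```python
# Hedged pytest sketch mirroring the asserted behavior: for a watsonx
# deployment model the serialized body must not contain "model".
# Assumes the hypothetical build_watsonx_payload helper defined earlier.
import json


def test_watsonx_deployment_omits_model_field():
    body = build_watsonx_payload(
        model="deployment/my-deployment-id",
        custom_llm_provider="watsonx",
        messages=[{"role": "user", "content": "hi"}],
    )
    json_data = json.loads(json.dumps(body))
    # Absent key: .get() returns None, matching the PR's assertion style.
    assert json_data.get("model") is None


def test_watsonx_regular_model_keeps_model_field():
    body = build_watsonx_payload(
        model="ibm/granite-13b-chat-v2",
        custom_llm_provider="watsonx",
        messages=[{"role": "user", "content": "hi"}],
    )
    # Non-deployment models still pass the model ID in the body.
    assert body["model"] == "ibm/granite-13b-chat-v2"
```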

cbjuan added 2 commits June 22, 2025 16:32

Revert "Fixing watsonx error: 'model_id' or 'model' cannot be specified in the request body for models in a deployment space"

This reverts commit 9d16a30.
cbjuan (Contributor, Author) commented Jun 23, 2025

Thanks for the feedback, @krrishdholakia. I've implemented the requested changes.

[screenshot]

krrishdholakia merged commit 962fd67 into BerriAI:main on Jun 23, 2025. 6 checks passed.
cbjuan deleted the fix-watsonx-chat-deployment-model-issue branch on June 24, 2025.
Successfully merging this pull request may close these issues.

[Bug]: watsonx custom deployment fails with "model_id cannot be specified" error