
feat: Make gemini accept the openai parameter parallel_tool_calls #11125


Merged
merged 2 commits into from
May 26, 2025

Conversation

aholmberg
Contributor

When mapping the parameter, allow True because parallel tool calling is Gemini's intrinsic behavior. Allow False as well, but reject it if multiple tools are present, because Gemini has no actual equivalent.


Title

Relevant issues

fixes #9686

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement; see details)

  • I have added a screenshot of my new test passing locally

  • My PR passes all unit tests on make test-unit

  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🆕 New Feature
🐛 Bug Fix

Changes

When mapping the parameter, allow True because parallel tool calling is Gemini's intrinsic behavior. Allow False as well, but reject it if multiple tools are present, because Gemini has no actual equivalent.
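The rule described above can be sketched as follows. The function name, signature, and the ValueError are illustrative stand-ins for this example, not LiteLLM's actual internals:

```python
# Illustrative sketch of the mapping rule described in this PR. The
# function name, signature, and ValueError are assumptions for this
# example, not LiteLLM's actual internals.
def map_parallel_tool_calls(value: bool, non_default_params: dict) -> dict:
    """Map OpenAI's parallel_tool_calls onto Gemini's behavior.

    Gemini always calls tools in parallel, so True needs no mapping.
    False can only be honored when at most one tool is defined.
    """
    if value is False:
        tools = non_default_params.get("tools", non_default_params.get("functions"))
        num_function_declarations = len(tools) if isinstance(tools, list) else 0
        if num_function_declarations > 1:
            raise ValueError(
                "parallel_tool_calls=False is not supported by Gemini "
                "when multiple tools are defined"
            )
    # True (or False with at most one tool) matches Gemini's default
    # behavior, so no extra request field is injected.
    return {}
```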


@aholmberg
Contributor Author

@krrishdholakia I think this CI failure is unrelated to the change. Can you corroborate?

@krrishdholakia
Contributor

Hey @aholmberg, if you rebase with main, it should be fixed.

Contributor

Move this test inside tests/litellm so it can run on the GitHub Action (llm_translation tests are mixed with real API calls, so they don't run on contributor PRs).

if value is False:
    tools = non_default_params.get("tools", non_default_params.get("functions"))
    num_function_declarations = len(tools) if isinstance(tools, list) else 0
    if num_function_declarations > 1:
Contributor

Add support for the user setting either completion(..., drop_params=True) or litellm.drop_params = True.

If so, I assume we would just not set this flag (parallel_tool_calls=False).
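For context, drop_params is LiteLLM's existing opt-in for silently discarding OpenAI parameters a provider cannot honor. A minimal, generic sketch of that pattern (not LiteLLM's implementation; the supported-params set is illustrative):

```python
# Generic sketch of the drop_params pattern the reviewer refers to:
# when enabled, unsupported parameters are dropped instead of raising.
# The supported-params set passed in is illustrative, not Gemini's
# real capability list.
def apply_params(params: dict, supported: set, drop_params: bool = False) -> dict:
    unsupported = set(params) - supported
    if unsupported and not drop_params:
        raise ValueError(f"unsupported params: {sorted(unsupported)}")
    # Keep only the parameters the provider can honor.
    return {k: v for k, v in params.items() if k in supported}
```

With drop_params enabled, parallel_tool_calls=False would simply be omitted from the Gemini request rather than rejected.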

Contributor Author

I'm not sure what is meant by this comment. The way I was thinking about it, parallel_tool_calls=True is the default. If it's set to True, we don't need to inject anything. This new logic just bails if it's being set to False while more than one tool is present (there's no way I know of to emulate that in Gemini).

aholmberg added 2 commits May 26, 2025 08:44
When mapping, allow the parameter: True because that is the
intrinsic behavior of Gemini. Allow False, but reject if there
are multiple tools because there's no actual equivalent in Gemini.

fixes BerriAI#9686

ref: issues/9686
@aholmberg
Contributor Author

Rebased and moved the test.

@krrishdholakia krrishdholakia merged commit c93a78c into BerriAI:main May 26, 2025
5 of 7 checks passed
Successfully merging this pull request may close these issues.

[Feature]: Support parallel_tool_calls with Gemini models