
[Bug]: LiteLLM doesn't support 'web_search_options' for Perplexity's Sonar Pro model #11976

Closed
@franchukpetro

Description


What happened?

I've been using the Perplexity API through the LiteLLM proxy for a few months now. Specifically, I was calling the Sonar Pro model as follows:

import httpx

# OPENAI_API_KEY, OPENAI_BASE_URL, model, messages, temperature,
# search_context_size, request_timeout and kwargs are defined elsewhere;
# OPENAI_BASE_URL points at the LiteLLM proxy.
headers = {
    "Authorization": f"Bearer {OPENAI_API_KEY}",
    "Content-Type": "application/json",
}

payload = {
    "model": model,
    "messages": messages,
    "temperature": temperature,
    "web_search_options": {"search_context_size": search_context_size},
    **kwargs,
}

async with httpx.AsyncClient() as client:
    response = await client.post(
        OPENAI_BASE_URL + "chat/completions",
        headers=headers,
        json=payload,
        timeout=request_timeout,  # seconds
    )

    if response.status_code != 200:
        raise Exception(
            f"Perplexity API request failed with status {response.status_code}: {response.text}"
        )

    res = response.json()

Recently I updated LiteLLM to the latest stable release and started getting this error (full log in the corresponding section below):

perplexity does not support parameters: ['web_search_options'], for model=sonar-pro

I had been using web_search_options for at least the last two months with an older LiteLLM version, and everything worked perfectly fine. After the update it started failing, forcing me to drop this param, after which the API works again.

According to the Perplexity API docs, this parameter is still valid, and I haven't seen any announcement of its deprecation, so I tend to believe the issue is on the LiteLLM side.
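For reference, here is a minimal sketch of the same request sent directly to the Perplexity API, bypassing the LiteLLM proxy, to check whether the parameter is still accepted upstream. The endpoint and the "low"/"medium"/"high" values for search_context_size follow Perplexity's public docs; the PPLX_API_KEY environment variable is my placeholder:

import os
import httpx

# Direct call to Perplexity (no LiteLLM proxy in between). Assumes a
# Perplexity API key in the PPLX_API_KEY environment variable.
response = httpx.post(
    "https://api.perplexity.ai/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['PPLX_API_KEY']}"},
    json={
        "model": "sonar-pro",
        "messages": [{"role": "user", "content": "ping"}],
        # Documented values: "low", "medium", "high"
        "web_search_options": {"search_context_size": "medium"},
    },
    timeout=60,
)
print(response.status_code, response.text)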

Would appreciate any help here!

Relevant log output

LiteLLM Router: INFO: router.py:1131 - litellm.acompletion(model=perplexity/sonar-pro) Exception litellm.UnsupportedParamsError: perplexity does not support parameters: ['web_search_options'], for model=sonar-pro. To drop these, set 'litellm.drop_params=True' or for proxy:

'litellm_settings:
drop_params: true'

If you want to use these params dynamically send allowed_openai_params=['web_search_options'] in your request.
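The error message points to two workarounds: setting drop_params in the proxy config (the YAML shown above), or sending allowed_openai_params per request. Below is a minimal sketch of the per-request option, assuming the proxy reads allowed_openai_params from the JSON body as the message suggests; OPENAI_BASE_URL and OPENAI_API_KEY are the proxy settings from the snippet above:

import httpx

# Per-request workaround from the error message: include
# allowed_openai_params in the body so the proxy lets
# web_search_options through for this one call.
payload = {
    "model": "sonar-pro",
    "messages": [{"role": "user", "content": "ping"}],
    "web_search_options": {"search_context_size": "medium"},
    "allowed_openai_params": ["web_search_options"],
}
response = httpx.post(
    OPENAI_BASE_URL + "chat/completions",
    headers={"Authorization": f"Bearer {OPENAI_API_KEY}"},
    json=payload,
    timeout=60,
)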

Are you an ML Ops Team?

No

What LiteLLM version are you on?

v1.72.6.post1

Twitter / LinkedIn details

No response
