Add new model provider Novita AI #7582


Merged

Conversation

@jasonhp (Contributor) commented Jan 6, 2025

Title

Add new model provider Novita AI

Type

🆕 New Feature
📖 Documentation

Changes

  • Added new model provider Novita AI.
  • Added related docs.
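
For context, a provider wired in this way is called through litellm's standard completion interface. A minimal usage sketch; the NOVITA_API_KEY env var name and the novita/deepseek/deepseek-r1 model string follow litellm's provider conventions and the docs commits referenced below, but are assumptions here:

```python
import os

import litellm

# Assumed env var, following litellm's <PROVIDER>_API_KEY convention.
os.environ["NOVITA_API_KEY"] = "sk-..."

# The "novita/" prefix routes the request to the new provider; the
# deepseek-r1 model id mirrors the PR's docs examples but is an assumption.
response = litellm.completion(
    model="novita/deepseek/deepseek-r1",
    messages=[{"role": "user", "content": "Tell me a joke."}],
)
print(response.choices[0].message.content)
```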

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

[two screenshots attached]

vercel bot commented Jan 6, 2025

The latest updates on your projects:

litellm: ✅ Ready (updated Mar 25, 2025 00:47 UTC)

@jasonhp (Contributor, Author) commented Jan 21, 2025

@ishaan-jaff Hi! I believe this is ready for review; it would be great to have it merged soon if everything looks good to you. Please let me know if there's anything I should adjust.

    model: str,
    drop_params: bool,
) -> dict:
    mapped_openai_params = super().map_openai_params(
Contributor:

This looks like redundant code; just inherit and pass it through.
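
A minimal sketch of the reviewer's point, assuming the snippet above comes from a NovitaConfig that subclasses an OpenAI-compatible base config (the class names are assumptions, and the base class here is a stand-in stub):

```python
# Stand-in stub for litellm's OpenAI-compatible base config.
class OpenAIGPTConfig:
    def map_openai_params(
        self,
        non_default_params: dict,
        optional_params: dict,
        model: str,
        drop_params: bool,
    ) -> dict:
        # Base implementation maps supported OpenAI params for the provider.
        optional_params.update(non_default_params)
        return optional_params


# Redundant: the override only forwards its arguments to the parent.
class NovitaConfigRedundant(OpenAIGPTConfig):
    def map_openai_params(
        self,
        non_default_params: dict,
        optional_params: dict,
        model: str,
        drop_params: bool,
    ) -> dict:
        return super().map_openai_params(
            non_default_params, optional_params, model, drop_params
        )


# Reviewer's suggestion: drop the override and inherit the method as-is.
class NovitaConfig(OpenAIGPTConfig):
    pass
```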

Contributor Author:
Fixed

litellm/main.py Outdated
@@ -2973,6 +2973,49 @@ def completion( # type: ignore # noqa: PLR0915
)
return response
response = model_response
elif custom_llm_provider == "novita":
Contributor:

There is no need for this if block; just register the provider in litellm.openai_compatible_providers in __init__, and it will be routed into the OpenAI block.
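
A sketch of the registration being suggested; the openai_compatible_providers name comes from the comment above, while the neighboring entry is an assumption shown only for context:

```python
# litellm/__init__.py (sketch): providers on this list are routed through
# the shared OpenAI-compatible code path instead of a custom if block.
openai_compatible_providers: list = [
    "openrouter",  # assumed existing entry, for context
    "novita",      # the new provider added by this PR
]
```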

Contributor Author:
We need to add a custom header in this if block. I'm not sure where else to add it.

Contributor:

You should add it in validate_environment in the config; that's where headers can be set.
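
A hedged sketch of what that could look like; the method signature is modeled on litellm-style provider configs, and the custom header name is purely hypothetical:

```python
from typing import Optional


class NovitaConfig:  # the real class would subclass the OpenAI-compatible base
    def validate_environment(
        self,
        headers: dict,
        model: str,
        api_key: Optional[str] = None,
    ) -> dict:
        # Set auth and any provider-specific headers here, instead of in a
        # special-case if block inside completion().
        headers["Authorization"] = f"Bearer {api_key}"
        # Hypothetical custom header name, for illustration only.
        headers["X-Novita-Source"] = "litellm"
        return headers
```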

@krrishdholakia (Contributor) commented Mar 14, 2025:

In this case, it would be better to have it use base_llm_http_handler.py:

response = base_llm_http_handler.completion(

This is where we are moving all LLM calls to; this way, all your custom logic exists only within your transformation.py file.
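
A rough sketch of the pattern being described: one shared HTTP handler performs the call, and everything provider-specific lives in the provider's transformation config. Apart from base_llm_http_handler.completion and transformation.py, which are quoted above, all names, the endpoint URL, and the header are assumptions:

```python
class BaseLLMHTTPHandler:
    """Stand-in for the shared handler in base_llm_http_handler.py."""

    def completion(self, model: str, messages: list, provider_config) -> dict:
        # The provider's transformation config builds the request...
        url, headers, payload = provider_config.transform_request(model, messages)
        # ...and one shared code path would POST it and parse the response
        # (stubbed out here).
        return {"url": url, "headers": headers, "payload": payload}


class NovitaConfig:
    """Would live in the provider's transformation.py."""

    def transform_request(self, model: str, messages: list):
        headers = {"X-Novita-Source": "litellm"}  # hypothetical header
        url = "https://api.novita.ai/v3/openai/chat/completions"  # assumed URL
        return url, headers, {"model": model, "messages": messages}


base_llm_http_handler = BaseLLMHTTPHandler()
response = base_llm_http_handler.completion(
    model="deepseek/deepseek-r1",
    messages=[{"role": "user", "content": "hi"}],
    provider_config=NovitaConfig(),
)
```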

Contributor Author:
Fixed

litellm/main.py Outdated
@@ -3923,6 +3965,7 @@ async def atext_completion(
or custom_llm_provider == "huggingface"
or custom_llm_provider == "ollama"
or custom_llm_provider == "vertex_ai"
or custom_llm_provider == "novita"
Contributor:

No need for this; see the line below. Just register it as an openai_compatible_provider.

Contributor Author:
Fixed

litellm/utils.py Outdated
@@ -2879,6 +2882,7 @@ def get_optional_params( # noqa: PLR0915
and custom_llm_provider != "bedrock"
and custom_llm_provider != "ollama_chat"
and custom_llm_provider != "openrouter"
and custom_llm_provider != "novita"
Contributor:

No need for this; just register it as an OpenAI-compatible provider.

Contributor Author:
Fixed

@@ -4650,6 +4650,46 @@ def test_humanloop_completion(monkeypatch):
messages=[{"role": "user", "content": "Tell me a joke."}],
)

def test_completion_novita_ai():
Contributor:

This test will fail in prod. Please make it a mock test; see the hosted_vllm/ example.
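
A minimal sketch of the kind of mock test being asked for, loosely modeled on how litellm tests patch the HTTP client; the patch target, model string, and assertions are assumptions:

```python
from unittest.mock import patch

import litellm
from litellm.llms.custom_httpx.http_handler import HTTPHandler


def test_completion_novita_ai_mock():
    client = HTTPHandler()
    # Patch the client's post method so no real Novita AI request is sent.
    with patch.object(client, "post") as mock_post:
        try:
            litellm.completion(
                model="novita/deepseek/deepseek-r1",  # assumed model string
                messages=[{"role": "user", "content": "Tell me a joke."}],
                client=client,
            )
        except Exception:
            pass  # the mocked response has no real payload to parse
        mock_post.assert_called_once()
        # Check the request was routed toward the provider's endpoint.
        assert "novita" in str(mock_post.call_args).lower()
```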

@krrishdholakia (Contributor) commented:

Hi @jasonhp, thank you for your PR.

  1. Please resolve the merge conflicts.
  2. Please follow the updated contribution guidelines: https://docs.litellm.ai/docs/extras/contributing_code

I would expect to see:

  1. Your e2e mock test added here -
  2. Any additional unit tests added in the corresponding test_ file, in https://github.com/BerriAI/litellm/tree/main/tests/litellm
  3. A screenshot of your PR passing all tests from make test-unit

@jasonhp (Contributor, Author) commented Mar 17, 2025

@krrishdholakia Hello, thank you for reviewing. I have fixed the problems you mentioned, fixed all the unit tests, and added e2e tests. I have run make test-unit successfully.

[screenshot: make test-unit passing]

@krrishdholakia changed the base branch from main to litellm_contributor_prs_03_24_2025_p1 on March 24, 2025 at 16:23
@jasonhp (Contributor, Author) commented Mar 25, 2025

@krrishdholakia Hello! All conflicts have been resolved.

@krrishdholakia merged commit 7a92a03 into BerriAI:litellm_contributor_prs_03_24_2025_p1 on Mar 25, 2025
2 checks passed
@krrishdholakia (Contributor):
Yup - thank you for your work on this

@jasonhp (Contributor, Author) commented Mar 27, 2025

@krrishdholakia Hello! When will the litellm_contributor_prs_03_24_2025_p1 branch be merged into main and released in the production package?

@viktor2077:
@krrishdholakia Hi Krish, can you help merge these changes to the main branch?

krrishdholakia added a commit that referenced this pull request May 13, 2025
* Add new model provider Novita AI (#7582)

* feat: add new model provider Novita AI

* feat: use deepseek r1 model for examples in Novita AI docs

* fix: fix tests

* fix: fix tests for novita

* fix: fix novita transformation

* ci: fix ci yaml

* fix: fix novita transformation and test (#10056)

---------

Co-authored-by: Jason <[email protected]>