Releases: BerriAI/litellm

v1.70.2.dev5

22 May 21:26
197c608

What's Changed

New Contributors

Full Changelog: v1.70.2-nightly...v1.70.2.dev5

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.2.dev5

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
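
Once the container is up, the proxy serves an OpenAI-compatible API on port 4000 — the same /chat/completions route exercised in the load test below. A minimal sanity check, assuming a model named gpt-3.5-turbo is configured on the proxy and sk-1234 stands in for your actual key (both are placeholders, not part of this release):

# placeholder key and model name for illustration; substitute your own
# LITELLM_MASTER_KEY (or a virtual key) and a model configured on the proxy
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hello"}]}'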

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 490.0 | 556.7359308349169 | 5.626256137716844 | 0.0 | 1684 | 0 | 437.79858300001706 | 2137.070654000013 |
| Aggregated | Failed ❌ | 490.0 | 556.7359308349169 | 5.626256137716844 | 0.0 | 1684 | 0 | 437.79858300001706 | 2137.070654000013 |

v1.70.2-nightly

21 May 01:03

Full Changelog: v1.70.1.dev2...v1.70.2-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.2-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 480.0 | 551.6473700786378 | 5.693462012007638 | 0.0 | 1704 | 0 | 435.9962309999901 | 1522.4978100000044 |
| Aggregated | Failed ❌ | 480.0 | 551.6473700786378 | 5.693462012007638 | 0.0 | 1704 | 0 | 435.9962309999901 | 1522.4978100000044 |

What's Changed

New Contributors

Full Changelog: v1.70.1-stable...v1.70.2-nightly

v1.70.1.dev8

20 May 22:12

What's Changed

New Contributors

Full Changelog: v1.70.1-stable...v1.70.1.dev8

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev8

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 490.0 | 562.6894453097084 | 5.610163428556303 | 0.0033413719050365115 | 1679 | 1 | 195.35745899997892 | 1568.1852209999647 |
| Aggregated | Failed ❌ | 490.0 | 562.6894453097084 | 5.610163428556303 | 0.0033413719050365115 | 1679 | 1 | 195.35745899997892 | 1568.1852209999647 |

v1.70.1.dev6

20 May 19:53

Full Changelog: v1.70.1.dev4...v1.70.1.dev6

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev6

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 520.0 | 614.117503553192 | 5.499725111392901 | 0.003343297940056475 | 1645 | 1 | 470.59824999996636 | 40452.63952500005 |
| Aggregated | Failed ❌ | 520.0 | 614.117503553192 | 5.499725111392901 | 0.003343297940056475 | 1645 | 1 | 470.59824999996636 | 40452.63952500005 |

v1.70.1.dev4

20 May 19:39

Full Changelog: v1.70.1.dev2...v1.70.1.dev4

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev4

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 500.0 | 594.1888488711803 | 5.578781212015596 | 0.006685178204931811 | 1669 | 2 | 190.37631700001612 | 40183.55002499999 |
| Aggregated | Failed ❌ | 500.0 | 594.1888488711803 | 5.578781212015596 | 0.006685178204931811 | 1669 | 2 | 190.37631700001612 | 40183.55002499999 |

v1.70.1.dev2

20 May 17:40

Full Changelog: v1.67.0-stable.patch2...v1.70.1.dev2

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev2

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 530.0 | 603.0679202423511 | 5.573003547206886 | 0.0 | 1667 | 0 | 488.8812669999538 | 2023.6071630000083 |
| Aggregated | Failed ❌ | 530.0 | 603.0679202423511 | 5.573003547206886 | 0.0 | 1667 | 0 | 488.8812669999538 | 2023.6071630000083 |

v1.70.1.dev11

20 May 22:22

Full Changelog: v1.70.1.dev8...v1.70.1.dev11

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev11

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 510.0 | 588.5182217067622 | 5.584563874858318 | 0.0 | 1671 | 0 | 469.68324099998426 | 1859.8325959999897 |
| Aggregated | Failed ❌ | 510.0 | 588.5182217067622 | 5.584563874858318 | 0.0 | 1671 | 0 | 469.68324099998426 | 1859.8325959999897 |

v1.67.0-stable.patch2

20 May 05:50
af73a8e

What's Changed

New Contributors

Full Changelog: v1.70.1-stable...v1.67.0-stable.patch2

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.67.0-stable.patch2

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 261.5294727693982 | 6.120234747528359 | 0.0 | 1830 | 0 | 217.36453699998037 | 1204.8032490000082 |
| Aggregated | Passed ✅ | 250.0 | 261.5294727693982 | 6.120234747528359 | 0.0 | 1830 | 0 | 217.36453699998037 | 1204.8032490000082 |

v1.70.1-stable

17 May 16:12

What's Changed

New Contributors

v1.70.0-nightly

17 May 05:39

What's Changed

New Contributors

Full Changelog: v1.69.3-nightly...v1.70.0-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.0-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 190.0 | 206.64782881083894 | 6.289821899591554 | 0.0 | 1882 | 0 | 171.67403099995227 | 1154.6766310000294 |
| Aggregated | Passed ✅ | 190.0 | 206.64782881083894 | 6.289821899591554 | 0.0 | 1882 | 0 | 171.67403099995227 | 1154.6766310000294 |