Batches - support batch retrieve with target model Query Param + Anthropic - completion bridge, yield content_block_stop chunk #12228

Merged: 6 commits merged into main on Jul 2, 2025

Conversation

@krrishdholakia (Contributor) commented on Jul 2, 2025

  • fix(batches_endpoints/endpoints.py): support passing target model names for batch list as a query param

Fixes an issue where Cloud Run rejected calls because a GET request cannot carry a request body.

  • test(test_openai_batches_endpoints.py): add unit test

  • docs(managed_batches.md): update docs

  • feat(spend_tracking_utils.py): support STORE_PROMPTS_IN_SPEND_LOGS env var

Ensures the prompt is stored in spend logs when the variable is enabled.

  • fix(streaming_iterator.py): fix the Anthropic - completion streaming iterator to yield a content block stop chunk

Ensures Claude Code renders messages correctly.

  • test: skip local test
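The first fix above can be sketched from the client side: the target model moves out of the request body and into the URL's query string, since Cloud Run rejects GET requests that carry a body. The endpoint path and the `target_model_names` parameter name are assumptions inferred from the commit message, not confirmed API details.

```python
from urllib.parse import urlencode

# Hypothetical sketch: encode the target model as a query parameter on the
# batch-list GET call instead of sending it in a JSON body.
base_url = "http://localhost:4000/v1/batches"
query = urlencode({"target_model_names": "my-batch-model"})
url = f"{base_url}?{query}"
```

The server can then read the parameter from the query string, so the GET request needs no body at all.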
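The STORE_PROMPTS_IN_SPEND_LOGS feature amounts to a boolean toggle read from the environment. A minimal sketch, with a hypothetical helper name and truthiness convention (the real parsing in spend_tracking_utils.py may differ):

```python
import os

def store_prompts_in_spend_logs() -> bool:
    # Hypothetical helper: reads the STORE_PROMPTS_IN_SPEND_LOGS env var and
    # treats the string "true" (case-insensitive) as enabled; anything else,
    # including an unset variable, disables prompt storage.
    return os.getenv("STORE_PROMPTS_IN_SPEND_LOGS", "false").lower() == "true"
```

When the helper returns True, the proxy includes the request prompt in the spend-log record; otherwise the prompt is omitted.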
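The streaming-iterator fix can be illustrated with a toy generator in the Anthropic streaming event shape: after the text deltas, a terminating `content_block_stop` event must be emitted so clients such as Claude Code know the block is finished. This is a simplified sketch of the event sequence, not the actual streaming_iterator.py implementation.

```python
from typing import Iterator, List

def stream_anthropic_blocks(text_chunks: List[str]) -> Iterator[dict]:
    # Open the content block, stream each text delta, then close the block.
    # Without the final content_block_stop chunk, a client waiting for the
    # block to close never renders the completed message.
    yield {"type": "content_block_start", "index": 0,
           "content_block": {"type": "text", "text": ""}}
    for chunk in text_chunks:
        yield {"type": "content_block_delta", "index": 0,
               "delta": {"type": "text_delta", "text": chunk}}
    yield {"type": "content_block_stop", "index": 0}
```

Before the fix, the bridge ended the stream after the last delta; the added `content_block_stop` closes the block explicitly.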


vercel bot commented Jul 2, 2025: litellm preview deployment ✅ Ready (updated Jul 2, 2025 5:14am UTC)

@krrishdholakia krrishdholakia merged commit 22d28f5 into main Jul 2, 2025
8 of 10 checks passed