
Add litellm call id passing to Aim guardrails on pre and post-hooks calls #10021


Merged

Conversation

hxmichael
Contributor

Title

Add litellm call id passing to Aim guardrails on pre and post-hooks calls

Pre-Submission checklist

  • I have added testing in the tests/litellm/ directory (adding at least one test is a hard requirement; see details)
  • I have added a screenshot of my new test passing locally

(Unfortunately, we didn't find where the tests for litellm/proxy/common_request_processing.py live; we'd appreciate some pointers.)

  • My PR passes all unit tests (make test-unit): https://docs.litellm.ai/docs/extras/contributing_code
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🆕 New Feature

Thank you in advance for the review!
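
To illustrate the intent, here is a rough sketch of forwarding the call id in both hooks. This is not the actual AimGuardrail implementation in litellm; the class name, the /detect endpoint, and the x-litellm-call-id header are illustrative assumptions. The point is only that the pre- and post-call hooks both pass along the call id that the proxy stores on the request data.

```python
from typing import Optional

import httpx


class AimGuardrailSketch:
    """Illustrative only: class, endpoint, and header names are assumptions,
    not the actual litellm AimGuardrail code."""

    def __init__(self, api_base: str, api_key: str):
        self.api_base = api_base
        self.api_key = api_key

    async def _call_aim(self, payload: dict, litellm_call_id: Optional[str]) -> dict:
        headers = {"Authorization": f"Bearer {self.api_key}"}
        if litellm_call_id:
            # Forward the proxy's call id so the guardrail service can
            # correlate the pre- and post-call checks for one request.
            headers["x-litellm-call-id"] = litellm_call_id  # assumed header name
        async with httpx.AsyncClient() as client:
            resp = await client.post(f"{self.api_base}/detect", json=payload, headers=headers)
            resp.raise_for_status()
            return resp.json()

    async def pre_call_hook(self, data: dict) -> dict:
        # data["litellm_call_id"] is what common_processing_pre_call_logic sets.
        await self._call_aim({"messages": data.get("messages", [])}, data.get("litellm_call_id"))
        return data

    async def post_call_hook(self, data: dict, response: dict) -> dict:
        await self._call_aim({"response": response}, data.get("litellm_call_id"))
        return response
```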


vercel bot commented Apr 15, 2025

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Name: litellm | Status: ✅ Ready | Updated (UTC): Apr 16, 2025, 9:05am

@@ -152,16 +152,16 @@ async def common_processing_pre_call_logic(
):
self.data["model"] = litellm.model_alias_map[self.data["model"]]

self.data["litellm_call_id"] = request.headers.get(
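
A hedged reading of this hunk (the assignment is cut off above; the header name and fallback below are assumptions, not the verbatim source):

```python
import uuid


def resolve_litellm_call_id(headers: dict) -> str:
    # Hypothetical helper mirroring the change: prefer a caller-supplied call id
    # header so the Aim guardrail hooks can correlate pre/post checks, otherwise
    # mint a fresh id.
    incoming = headers.get("x-litellm-call-id")  # assumed header name
    return incoming if incoming else str(uuid.uuid4())
```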
Contributor

if order is important, this should be a tested flow - otherwise there can be regressions

Contributor Author

You're totally right! Created a new test file test/litellm/proxy/test_common_request_processing.py with the test that checks this specific behavior.
[Screenshot: new test passing locally, 2025-04-16]
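
A hedged sketch of what such a test could look like; the helper is inlined and hypothetical rather than the real code in litellm/proxy/common_request_processing.py:

```python
import uuid


def resolve_litellm_call_id(headers: dict) -> str:
    # Inlined copy of the hypothetical helper sketched earlier, so this test
    # file runs standalone.
    incoming = headers.get("x-litellm-call-id")  # assumed header name
    return incoming if incoming else str(uuid.uuid4())


def test_call_id_taken_from_header():
    # A caller-supplied call id must be preserved so downstream guardrail hooks
    # see the same id the client sent.
    assert resolve_litellm_call_id({"x-litellm-call-id": "abc-123"}) == "abc-123"


def test_call_id_generated_when_header_missing():
    # Without the header, a fresh, valid UUID should be generated.
    assert uuid.UUID(resolve_litellm_call_id({}))
```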

@krrishdholakia
Contributor

Hey @hxmichael, if a test file doesn't exist, please create one with the relevant imports: https://github.com/BerriAI/litellm/tree/main/tests/litellm/proxy

@krrishdholakia krrishdholakia merged commit e19d059 into BerriAI:main Apr 16, 2025
5 checks passed