What happened?
I am using proxy mode with the DataDogLLMObs integration. When I send requests that include image files through LiteLLM, I get the following error and no telemetry data is sent to DataDog. Is this the intended behavior?
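Roughly how I am calling the proxy (a sketch, not my exact setup: the model alias, proxy port, key, and image file below are placeholders; the proxy config has callbacks: ["datadog_llm_obs"] under litellm_settings, with DD_API_KEY and DD_SITE set):

# Reproduction sketch: send a vision-style request through the LiteLLM proxy.
# Assumptions: proxy listens on localhost:4000, a model alias "gpt-4o" is configured,
# and litellm_settings.callbacks includes "datadog_llm_obs".
import base64

from openai import OpenAI

client = OpenAI(base_url="http://localhost:4000/v1", api_key="sk-1234")  # placeholder proxy key

# Any small image seems to trigger it; inline it as a base64 data URL.
with open("example.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/png;base64,{image_b64}"},
                },
            ],
        }
    ],
)
print(response.choices[0].message.content)
# The completion itself succeeds; the DataDogLLMObs batch flush then fails with the 400 below.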
Relevant log output
litellm-1 | 03:40:41 - LiteLLM:ERROR: datadog_llm_obs.py:120 - DataDogLLMObs: Error sending batch - Client error '400 Bad Request' for url 'https://api.datadoghq.com/api/intake/llm-obs/v1/trace/spans'
litellm-1 | For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
litellm-1 | Traceback (most recent call last):
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/integrations/datadog/datadog_llm_obs.py", line 100, in async_send_batch
litellm-1 | response = await self.async_client.post(
litellm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | ...<6 lines>...
litellm-1 | )
litellm-1 | ^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/logging_utils.py", line 135, in async_wrapper
litellm-1 | result = await func(*args, **kwargs)
litellm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/http_handler.py", line 257, in post
litellm-1 | raise e
litellm-1 | File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/http_handler.py", line 213, in post
litellm-1 | response.raise_for_status()
litellm-1 | ~~~~~~~~~~~~~~~~~~~~~~~~~^^
litellm-1 | File "/usr/lib/python3.13/site-packages/httpx/_models.py", line 761, in raise_for_status
litellm-1 | raise HTTPStatusError(message, request=request, response=self)
litellm-1 | httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://api.datadoghq.com/api/intake/llm-obs/v1/trace/spans'
litellm-1 | For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
Are you an ML Ops Team?
No
What LiteLLM version are you on?
v1.65.0
Twitter / LinkedIn details
No response