Describe the bug
Open Interpreter throws an error whenever the model tries to view a file while using DeepSeek Reasoner.
Reproduce
- Start Open Interpreter with DeepSeek Reasoner (deepseek-reasoner)
- Ask it to view any file
- It throws the error below (see the sketch after these steps for what I believe is happening under the hood)
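From the error text, the request seems to fail because the conversation history echoes DeepSeek's reasoning_content field back to the API. A minimal sketch of what I believe reproduces the same 400 with litellm directly, outside Open Interpreter (the reasoning_content attribute access is my assumption based on the error message):

```python
import litellm  # requires DEEPSEEK_API_KEY in the environment

# First turn: deepseek-reasoner returns both content and reasoning_content.
first = litellm.completion(
    model="deepseek/deepseek-reasoner",
    messages=[{"role": "user", "content": "Say hello."}],
)
assistant = first.choices[0].message

# Second turn: echoing reasoning_content back in the history is what (as far
# as I can tell) triggers the 400 quoted in the traceback below.
litellm.completion(
    model="deepseek/deepseek-reasoner",
    messages=[
        {"role": "user", "content": "Say hello."},
        {
            "role": "assistant",
            "content": assistant.content,
            # Offending field; DeepSeek asks for it to be removed.
            "reasoning_content": getattr(assistant, "reasoning_content", ""),
        },
        {"role": "user", "content": "Now summarize that greeting."},
    ],
)
```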
Expected behavior
It should read the file and review it without raising an error.
Screenshots
No response
Open Interpreter version
1.0.0
Python version
3.13
Operating System name and version
Windows 11
Additional context
Open Interpreter 1.0.0
Copyright (C) 2024 Open Interpreter Team
Licensed under GNU AGPL v3.0
A modern command-line assistant.
Usage: i [prompt]
or: interpreter [options]
Documentation: docs.openinterpreter.com
Run 'interpreter --help' for all options
> can you review svc.py and just let me know if there are any major flaws?
I'll review the svc.py file for any major flaws. Let me first examine the file contents.
────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────
⚆ │ view C:/Users/cross/Downloads/svc/svc.py
────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────
.. Traceback (most recent call last):
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\llms\custom_httpx\llm_http_handler.py", line 170, in _make_common_sync_call
response = sync_httpx_client.post(
url=api_base,
...<8 lines>...
logging_obj=logging_obj,
)
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\llms\custom_httpx\http_handler.py", line 754, in post
raise e
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\llms\custom_httpx\http_handler.py", line 736, in post
response.raise_for_status()
~~~~~~~~~~~~~~~~~~~~~~~~~^^
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\httpx\_models.py", line 763, in raise_for_status
raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://api.deepseek.com/beta/chat/completions'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\main.py", line 1550, in completion
raise e
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\main.py", line 1524, in completion
response = base_llm_http_handler.completion(
model=model,
...<14 lines>...
provider_config=provider_config,
)
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\llms\custom_httpx\llm_http_handler.py", line 430, in completion
completion_stream, headers = self.make_sync_call(
~~~~~~~~~~~~~~~~~~~^
provider_config=provider_config,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<17 lines>...
optional_params=optional_params,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\llms\custom_httpx\llm_http_handler.py", line 520, in make_sync_call
response = self._make_common_sync_call(
sync_httpx_client=sync_httpx_client,
...<8 lines>...
logging_obj=logging_obj,
)
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\llms\custom_httpx\llm_http_handler.py", line 195, in _make_common_sync_call
raise self._handle_error(e=e, provider_config=provider_config)
~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\llms\custom_httpx\llm_http_handler.py", line 2375, in _handle_error
raise provider_config.get_error_class(
...<3 lines>...
)
litellm.llms.openai.common_utils.OpenAIError: {"error":{"message":"The reasoning_content is an intermediate result for display purposes only and will not be included in the context for inference. Please remove the reasoning_content from your message to reduce network traffic.","type":"invalid_request_error","param":null,"code":"invalid_request_error"}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "C:\Users\cross\miniconda3\envs\sep\Scripts\interpreter.exe\__main__.py", line 7, in <module>
sys.exit(main())
~~~~^^
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\interpreter\cli.py", line 301, in main
asyncio.run(async_main(args))
~~~~~~~~~~~^^^^^^^^^^^^^^^^^^
File "C:\Users\cross\miniconda3\envs\sep\Lib\asyncio\runners.py", line 195, in run
return runner.run(main)
~~~~~~~~~~^^^^^^
File "C:\Users\cross\miniconda3\envs\sep\Lib\asyncio\runners.py", line 118, in run
return self._loop.run_until_complete(task)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
File "C:\Users\cross\miniconda3\envs\sep\Lib\asyncio\base_events.py", line 725, in run_until_complete
return future.result()
~~~~~~~~~~~~~^^
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\interpreter\cli.py", line 225, in async_main
async for _ in global_interpreter.async_respond():
pass
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\interpreter\interpreter.py", line 740, in async_respond
raw_response = litellm.completion(**params)
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\utils.py", line 1303, in wrapper
raise e
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\utils.py", line 1178, in wrapper
result = original_function(*args, **kwargs)
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\main.py", line 3311, in completion
raise exception_type(
~~~~~~~~~~~~~~^
model=model,
^^^^^^^^^^^^
...<3 lines>...
extra_kwargs=kwargs,
^^^^^^^^^^^^^^^^^^^^
)
^
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py", line 2271, in exception_type
raise e
File "C:\Users\cross\miniconda3\envs\sep\Lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py", line 369, in exception_type
raise BadRequestError(
...<6 lines>...
)
litellm.exceptions.BadRequestError: litellm.BadRequestError: DeepseekException - {"error":{"message":"The reasoning_content is an intermediate result for display purposes only and will not be included in the context for inference. Please remove the reasoning_content from your message to reduce network traffic.","type":"invalid_request_error","param":null,"code":"invalid_request_error"}}
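A possible workaround until the history handling is fixed: strip reasoning_content from prior assistant turns before they are passed back to litellm.completion(). This is only a sketch against litellm's public API; strip_reasoning_content is a hypothetical helper, not part of Open Interpreter.

```python
import litellm

def strip_reasoning_content(messages):
    """Drop DeepSeek's reasoning_content from earlier turns.

    DeepSeek rejects requests that echo reasoning_content back, so remove the
    field from each message dict before re-sending the conversation history.
    """
    cleaned = []
    for msg in messages:
        msg = dict(msg)                     # copy so the caller's history is untouched
        msg.pop("reasoning_content", None)  # no-op if the key is absent
        cleaned.append(msg)
    return cleaned

# Hypothetical usage right before the completion call:
history = [
    {"role": "user", "content": "can you review svc.py?"},
    {"role": "assistant", "content": "Sure.", "reasoning_content": "internal reasoning"},
    {"role": "user", "content": "yes please"},
]
response = litellm.completion(
    model="deepseek/deepseek-reasoner",
    messages=strip_reasoning_content(history),
)
```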