choose_agent failing on OpenAI models that don't support temperature #1427

Open
@pambram

Description

Describe the bug
When SMART_LLM is set to an o-series model such as "o3", the whole research run fails in choose_agent(), because these models do not support the temperature parameter.

To Reproduce
Steps to reproduce the behavior:

  1. Create a JSON configuration file that sets SMART_LLM to an o3 model. Example:

{
  "FAST_LLM": "openai:gpt-4.1",
  "SMART_LLM": "openai:o3-2025-04-16",
  "STRATEGIC_LLM": "openai:o4-mini",
  "RETRIEVER": "tavily, exa",
  "TOTAL_WORDS": 15000,
  "FAST_TOKEN_LIMIT": 20000,
  "SMART_TOKEN_LIMIT": 80000,
  "STRATEGIC_TOKEN_LIMIT": 80000,
  "MAX_SUBTOPICS": 5,
  "MAX_SEARCH_RESULTS_PER_QUERY": 10,
  "DEEP_RESEARCH_BREADTH": 5,
  "DEEP_RESEARCH_DEPTH": 4,
  "DEEP_RESEARCH_CONCURRENCY": 20
}

  2. Run a deep-mode research task

Expected behavior
The research run completes without errors.

Desktop (please complete the following information):

  • OS: macOS

Additional context
Additionally, choosing an agent is a lightweight task that would probably be better handled by the FAST_LLM rather than the SMART_LLM.
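The routing suggestion above could be sketched as follows (the helper name is hypothetical and the config keys are taken from the example configuration in this issue):

```python
def pick_agent_llm(config: dict) -> str:
    # Agent selection is a small classification step, so prefer the
    # cheaper FAST_LLM when it is configured, falling back to SMART_LLM.
    return config.get("FAST_LLM") or config["SMART_LLM"]
```

For the configuration shown earlier, this would route choose_agent() to "openai:gpt-4.1" instead of the o3 model.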
