Not planned
Description
I'm struggling to find a way to declare a function that has an optional parameter. Ideally, I'd like something like the following. Let me know the right way to do this, if you don't mind!
@function_tool
def get_latest_elasticsearch_version(major_version: int | None = None) -> str:
    """Returns the latest GA version of Elasticsearch in "X.Y.Z" format.

    Args:
        major_version: Major version to filter by (e.g. 7, 8). Defaults to latest.
    """
    --snip--
Activity
rm-openai commented on Mar 12, 2025
When you use @function_tool, it enforces "strict mode" (https://platform.openai.com/docs/guides/function-calling?api-mode=chat#strict-mode), which means all parameters are required.
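Concretely, strict mode turns the `int | None` parameter into a nullable-but-required property. Below is a hand-written sketch of that schema shape (illustrative only — the SDK derives the real schema from the signature, and the exact output may differ):

```python
import json

# Hand-written sketch of the strict-mode schema for the example tool.
# In strict mode every property is listed in "required", so an optional
# Python parameter becomes a nullable type the model must explicitly pass.
strict_schema = {
    "type": "object",
    "properties": {
        "major_version": {
            # int | None becomes a nullable integer rather than an omittable key
            "anyOf": [{"type": "integer"}, {"type": "null"}],
            "description": "Major version to filter by (e.g. 7, 8). Defaults to latest",
        }
    },
    # required even though the Python default is None
    "required": ["major_version"],
    "additionalProperties": False,
}

print(json.dumps(strict_schema["required"]))  # → ["major_version"]
```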
If you specifically don't want strict mode, you can manually create a FunctionTool. Note that this is not recommended - strict mode makes JSON much more reliable. But here's an example of how you might do this, still leveraging the function_schema helper:
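The SDK code for that example isn't reproduced in this thread. As a rough stand-in, here is a minimal sketch, assuming FunctionTool exposes name, description, params_json_schema, on_invoke_tool, and strict_json_schema fields; the dataclass below is illustrative, not the real SDK class, and the version strings are made up:

```python
import json
from dataclasses import dataclass
from typing import Any, Awaitable, Callable

# Stand-in mirroring the fields a FunctionTool is assumed to carry;
# this is an illustrative sketch, not the SDK class.
@dataclass
class FunctionToolSketch:
    name: str
    description: str
    params_json_schema: dict[str, Any]
    on_invoke_tool: Callable[[Any, str], Awaitable[str]]
    strict_json_schema: bool = True

async def run_tool(ctx: Any, args_json: str) -> str:
    args = json.loads(args_json)
    major = args.get("major_version")  # may be absent in non-strict mode
    # Made-up version strings, purely for illustration.
    return "8.x.y" if major in (None, 8) else "7.x.y"

tool = FunctionToolSketch(
    name="get_latest_elasticsearch_version",
    description='Returns the latest GA version of Elasticsearch in "X.Y.Z" format.',
    params_json_schema={
        "type": "object",
        "properties": {"major_version": {"type": "integer"}},
        # note: no "required" entry, so the model may omit the parameter
    },
    on_invoke_tool=run_tool,
    strict_json_schema=False,  # opt out of strict mode
)
print(tool.strict_json_schema)  # → False
```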
codefromthecrypt commented on Mar 12, 2025
@rm-openai thanks for the workaround!
Since other genai frameworks support optional parameters (by supplying defaults), is it possible to convert this into a feature request? Even though we can work around it for now, it would be more elegant to express a function naturally, and that would make it easier to port code. WDYT?
rm-openai commented on Mar 12, 2025
Yes, makes sense. I think the most sensible way to do this is by adding a strict_mode: bool = True to function_tool(). Will try to get to it soon, and will leave this open in case someone else is interested.

Jai0401 commented on Mar 12, 2025
Hey @rm-openai, I'd love to work on implementing this feature. Would you be open to a PR adding strict_mode: bool = True to @function_tool()? If so, I can start working on it and align with any specific guidelines you have in mind.

rm-openai commented on Mar 12, 2025
Sure thing. Should be straightforward - add strict_mode as a param, document why it's not a good idea to set it to False, thread it through to function_schema/FunctionTool, and add tests. Thanks for taking this on!

codefromthecrypt commented on Mar 12, 2025
Heh, cool. Yeah, I had a WIP, but go for it @Jai0401 - I'll help you review.
Edge cases are multiple optional parameters of different types, e.g. x: int = 42, x: str = "hello", and of course the motivating optional one, x: int | None = None.
Also independently test the things you need to change to support this (e.g. in function_schema.py).
Have fun!
rm-openai commented on Mar 12, 2025
@codefromthecrypt - I realized I misread the original schema. This should actually work out of the box with the current SDK. Did you actually run into an issue?
codefromthecrypt commented on Mar 13, 2025
@rm-openai yep. My original example in the description creates the following API call. @Jai0401 maybe you can add a unit test about the generated schema in your PR, though the tests you have prove the same point, I think.
So in the current code, a follow-up to this fails, as it sees major_version as a required field. I guess it is expecting the LLM to literally pass null instead of leaving the field out.
I'm expecting something more like other frameworks: when a parameter is optional, either the wrapped type is used without being required (ideal IMHO), or at worst it stays defined as nullable but isn't marked required (e.g. Pydantic AI's interpretation of the same signature and docstring).
Make sense?
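To make the contrast concrete, here are hand-written sketches of the two schema readings (neither is generated by a library; both are purely illustrative):

```python
# Strict-mode reading of `major_version: int | None = None`:
# nullable, but the model must still pass the key (possibly as null).
strict_style = {
    "properties": {
        "major_version": {"anyOf": [{"type": "integer"}, {"type": "null"}]}
    },
    "required": ["major_version"],
}

# Optional-style reading (as in some other frameworks): plain integer,
# simply not required, so the model may omit the key entirely.
optional_style = {
    "properties": {"major_version": {"type": "integer"}},
    "required": [],
}

print("major_version" in strict_style["required"],
      "major_version" in optional_style["required"])  # → True False
```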
rm-openai commented on Mar 13, 2025
Is there any reason you'd prefer not to use strict mode? The LLM can indeed pass null, as you noted. And strict mode basically guarantees valid JSON, which is a big deal.
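The reliability argument can be made concrete with a toy structural check (a sketch, not how the API enforces it): with every key required and additionalProperties false, the parser never has to guess which keys might be missing.

```python
import json

def args_match_strict_schema(args_json: str, schema: dict) -> bool:
    # Toy check: under strict mode, the model's arguments must contain
    # exactly the declared keys - no more, no fewer.
    args = json.loads(args_json)
    return set(args) == set(schema["required"]) == set(schema["properties"])

schema = {
    "properties": {
        "major_version": {"anyOf": [{"type": "integer"}, {"type": "null"}]}
    },
    "required": ["major_version"],
    "additionalProperties": False,
}

print(args_match_strict_schema('{"major_version": null}', schema))  # → True
print(args_match_strict_schema('{}', schema))                       # → False
```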
codefromthecrypt commented on Mar 13, 2025
The problem is that the LLM isn't passing null. Maybe I worded it incorrectly. This only works if I pass a version in my question. If I don't, the model bails out and asks me to supply one.