
multiple prompts in a batch is not currently supported #1270

Closed · opened by @pseudotensor

Description

Is this planned? Supporting the full OpenAI behavior seems like a good idea, and since batching is already handled well by vLLM internally, I would guess it should be relatively easy to add.
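For reference, this is the OpenAI completions behavior being requested: a single request carrying a list of prompts. A minimal sketch using the pre-1.0 `openai` Python client pointed at a vLLM server; the base URL, port, and model name below are assumptions for illustration:

```python
# Sketch of the requested behavior (openai<1.0 client style).
# base_url, port, and model name are assumptions, not vLLM defaults.
import openai

openai.api_base = "http://localhost:8000/v1"  # assumed vLLM server address
openai.api_key = "EMPTY"  # vLLM's server does not validate the key

# The OpenAI API accepts a list of prompts in one request;
# vLLM's server currently rejects this with HTTP 400.
completion = openai.Completion.create(
    model="facebook/opt-125m",  # assumed model name
    prompt=["Hello, my name is", "The capital of France is"],
    max_tokens=16,
)
for choice in completion.choices:
    print(choice.index, choice.text)
```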

The guard in the OpenAI-compatible completions endpoint:

```python
elif isinstance(first_element, (str, list)):
    # TODO: handles multiple prompt case in list[list[int]]
    if len(request.prompt) > 1:
        return create_error_response(
            HTTPStatus.BAD_REQUEST,
            "multiple prompts in a batch is not currently supported")
```
