Add schema support #23

@spinagon

Description

I added supports_schema = True to llm_openrouter.py, and it works with Gemini, OpenAI, and Mistral right away.
Other models just work as they normally do.
It's possible to query whether each model+provider combination supports structured output, but that would take a separate API call for each model.
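A minimal sketch of the change and the per-model check described above. The class name is a simplified stand-in (the real class in llm_openrouter.py subclasses llm's Model), and the `supported_parameters` / `structured_outputs` field names in the model-listing entry are assumptions about OpenRouter's models API, not confirmed by this issue:

```python
def supports_structured_output(model_entry: dict) -> bool:
    # Given one entry from OpenRouter's model listing, report whether it
    # advertises structured-output support. Field names are assumptions.
    return "structured_outputs" in model_entry.get("supported_parameters", [])


class OpenRouterChat:
    # Simplified stand-in for the plugin's model class; the actual change
    # is this single class attribute on the model in llm_openrouter.py.
    supports_schema = True
```

With the attribute set, llm passes schemas through for every OpenRouter model; models whose upstream provider ignores schemas just behave as they normally do, which matches the behaviour described above.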

Metadata

    Labels

    enhancement (New feature or request)
