provide basic chat interface #1535

Open
@sozercan

Description

Is your feature request related to a problem? Please describe.

Describe the solution you'd like

Provide a basic chat TUI (similar to Ollama's) that helps test models directly with LocalAI instead of having to curl the endpoint; a rough sketch of such a loop is shown after the list below.

  • This should start the LocalAI server endpoint in the background, rather than requiring the user to run a separate server and client.

  • This should be behind an optional, opt-in flag.
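As a rough sketch of the kind of chat loop this could provide, the snippet below reads prompts from stdin and posts the running conversation to LocalAI's OpenAI-compatible `/v1/chat/completions` endpoint. The base URL, model name, and the idea of the server already listening locally are assumptions for illustration only, not the proposed implementation.

```go
// Minimal chat REPL sketch against an OpenAI-compatible chat completions
// endpoint. Assumes a LocalAI server is already listening on localhost:8080.
package main

import (
	"bufio"
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string    `json:"model"`
	Messages []message `json:"messages"`
}

type chatResponse struct {
	Choices []struct {
		Message message `json:"message"`
	} `json:"choices"`
}

func main() {
	baseURL := "http://localhost:8080" // assumed default LocalAI address
	model := "my-model"                // placeholder model name

	var history []message
	scanner := bufio.NewScanner(os.Stdin)
	for {
		fmt.Print(">>> ")
		if !scanner.Scan() {
			return // Ctrl-D / EOF exits the loop
		}
		history = append(history, message{Role: "user", Content: scanner.Text()})

		// Send the full conversation so the model keeps context between turns.
		body, _ := json.Marshal(chatRequest{Model: model, Messages: history})
		resp, err := http.Post(baseURL+"/v1/chat/completions", "application/json", bytes.NewReader(body))
		if err != nil {
			fmt.Fprintln(os.Stderr, "request failed:", err)
			continue
		}
		var out chatResponse
		json.NewDecoder(resp.Body).Decode(&out)
		resp.Body.Close()

		if len(out.Choices) > 0 {
			reply := out.Choices[0].Message
			fmt.Println(reply.Content)
			history = append(history, reply)
		}
	}
}
```

In the actual feature the HTTP round trip would presumably be replaced by calling into the already-running in-process server, per the first bullet above.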

Describe alternatives you've considered

Additional context

Labels

enhancement (New feature or request), roadmap, up for grabs (Tickets that no-one is currently working on)
