Is your feature request related to a problem? Please describe.
Describe the solution you'd like
Provide a basic chat TUI (similar to Ollama's) for testing models directly through LocalAI, instead of having to curl the endpoint (rough sketch below).

- It should start the LocalAI server endpoint in the background, so you don't have to run a separate server process and a client.
- It should be opt-in, behind an optional flag.
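For illustration, a minimal sketch of what the chat loop could look like, talking to LocalAI's OpenAI-compatible `/v1/chat/completions` endpoint. The port and model name are assumptions, and a real TUI would also handle streaming, errors, and terminal rendering:

```go
// A minimal REPL-style chat loop against LocalAI's OpenAI-compatible API.
package main

import (
	"bufio"
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string    `json:"model"`
	Messages []message `json:"messages"`
}

type chatResponse struct {
	Choices []struct {
		Message message `json:"message"`
	} `json:"choices"`
}

func main() {
	endpoint := "http://localhost:8080/v1/chat/completions" // assumed default LocalAI address
	model := "my-model"                                     // placeholder model name

	var history []message // keep the conversation so the model has context
	scanner := bufio.NewScanner(os.Stdin)
	for {
		fmt.Print(">>> ")
		if !scanner.Scan() {
			break // Ctrl-D / EOF exits the loop
		}
		history = append(history, message{Role: "user", Content: scanner.Text()})

		body, _ := json.Marshal(chatRequest{Model: model, Messages: history})
		resp, err := http.Post(endpoint, "application/json", bytes.NewReader(body))
		if err != nil {
			fmt.Fprintln(os.Stderr, "request failed:", err)
			continue
		}
		var out chatResponse
		json.NewDecoder(resp.Body).Decode(&out)
		resp.Body.Close()

		if len(out.Choices) > 0 {
			reply := out.Choices[0].Message
			history = append(history, reply) // carry the assistant turn forward
			fmt.Println(reply.Content)
		}
	}
}
```

In the proposed feature, the same process would also spin up the LocalAI server in the background before entering this loop, rather than requiring it to be started separately.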
Describe alternatives you've considered
Additional context