docker-compose file error #1284
Comments
@202252197 hi, have you tried launching Wren AI using the release artifact? Please check the official installation method here: https://docs.getwren.ai/oss/installation#using-wren-ai-launcher
I used the link provided above to install Wren AI and got the same bug! It occurs when you try to use your own local model instead of the default GPT API.
I guess that you didn't create the .env file with all the parameters. Use this file as an example: https://github.com/Canner/WrenAI/blob/main/docker/.env.example
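To make that step concrete: the usual workflow is to copy the template and then fill in your own values. This is a sketch, assuming you are working in the `docker/` directory of a WrenAI checkout:

```shell
# Assumed workflow: create a local .env from the repo's template so
# docker-compose picks it up, then edit it with your own values.
cp .env.example .env
```

After copying, open `.env` and set the variables it lists before running docker-compose again.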
I used your .env file and it worked. However, there is a problem with the YAML config file; it doesn't connect to the local model. I'm trying to use Deepseek:14b locally with Ollama, and I followed the instructions provided here: https://docs.getwren.ai/oss/installation/custom_llm

My YAML config (section markers only):

```yaml
type: llm
type: embedder
type: engine
type: engine
type: engine
type: document_store
type: pipeline
settings:
```
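For context, the config file the docs describe is a multi-document YAML, with one `type:` section per document separated by `---`. The sketch below only illustrates that layout; the provider names, model ids, and endpoints in it are assumptions, not the authoritative values — copy the real template from the custom-LLM docs linked above.

```yaml
# Hypothetical layout sketch — all field values below are illustrative
# assumptions; take the real template from the Wren AI custom-LLM docs.
type: llm
provider: litellm_llm                      # assumed provider name
models:
  - model: ollama_chat/deepseek-r1:14b     # assumed model id format
    api_base: http://host.docker.internal:11434  # Ollama reachable from inside Docker
---
type: embedder
provider: litellm_llm                      # assumed provider name
models:
  - model: ollama/nomic-embed-text         # assumed embedding model
    api_base: http://host.docker.internal:11434
---
type: engine
# engine section(s) per the template
---
type: document_store
# document store section per the template
---
type: pipeline
# pipeline section per the template

settings:
  # global settings per the template
```

The decoding error in this thread typically means one of these documents is missing a required field, so diff your file section-by-section against the template.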
Hi @Ahmed-ao, here is my config.yaml for the Ollama model and embedder. I think you can base your version on it and modify it as needed.
```
ERROR An error occurred: 1 error(s) decoding:
```