Open
Description
Local AI inference engines, such as Ollama or Docker Model Runner, expose OpenAI-compatible endpoints.
For Ollama: http://localhost:11434
For Docker Model Runner: http://localhost:12434/engines
This makes them accessible from my application via the dependency spring-ai-starter-model-openai.
They don't require an API key.
However, when I don't add the setting spring.ai.openai.api-key to my application,
I get this exception on startup:
java.lang.IllegalArgumentException: OpenAI API key must be set. Use the connection property: spring.ai.openai.api-key or spring.ai.openai.chat.api-key property.
For the moment I work around this by adding the following setting: spring.ai.openai.api-key=dummy
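For reference, a minimal sketch of the workaround in application.properties, assuming a local Ollama instance on its default port (the key value is a placeholder: the local engine ignores it, but the starter's startup check requires the property to be set):

```properties
# Point the OpenAI-compatible client at the local engine
# (Ollama serves /v1/chat/completions on this host/port)
spring.ai.openai.base-url=http://localhost:11434
# Placeholder only -- Ollama does not validate it, but Spring AI
# fails fast at startup if the property is missing
spring.ai.openai.api-key=dummy
```

For Docker Model Runner the base-url would instead be http://localhost:12434/engines, as noted above.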