l3dlp-sandbox/LocalAI
Forked from mudler/LocalAI
About
🤖 Self-hosted, community-driven, local OpenAI-compatible API. Drop-in replacement for OpenAI running LLMs on consumer-grade hardware. No GPU required. LocalAI is a RESTful API to run ggml-compatible models: llama.cpp, alpaca.cpp, gpt4all.cpp, rwkv.cpp, whisper.cpp, vicuna, koala, gpt4all-j, cerebras and many others!
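As a quick illustration of the drop-in OpenAI compatibility described above, the sketch below sends a chat completion request to a locally running LocalAI server. The listen address (localhost:8080), the endpoint path, and the model name "ggml-gpt4all-j" are assumptions for a typical setup, not details taken from this page; adjust them to your installation.

```go
// Minimal sketch: call a local LocalAI instance through its OpenAI-compatible
// chat completions endpoint. Assumes the server listens on localhost:8080 and
// a model named "ggml-gpt4all-j" is available in the models directory.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Same request shape the OpenAI API uses, pointed at the local server.
	payload := map[string]any{
		"model": "ggml-gpt4all-j", // assumed model name
		"messages": []map[string]string{
			{"role": "user", "content": "How are you?"},
		},
		"temperature": 0.7,
	}

	body, err := json.Marshal(payload)
	if err != nil {
		panic(err)
	}

	resp, err := http.Post(
		"http://localhost:8080/v1/chat/completions", // assumed default address
		"application/json",
		bytes.NewReader(body),
	)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out)) // raw JSON response from the local server
}
```

Because the request and response formats mirror the OpenAI API, existing OpenAI client libraries can generally be pointed at the local endpoint instead of api.openai.com.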
Resources
- License
- Security policy
Releases
No releases published
Packages
No packages published
Languages
- Go 88.1%
- Python 3.1%
- JavaScript 2.9%
- HTML 2.8%
- Shell 1.0%
- Makefile 0.9%
- Other 1.2%