Documentation - API Reference - Changelog - Bug reports - Discord
Cortex.cpp is currently in active development.
Cortex is a Local AI API Platform that is used to run and customize LLMs.
Key Features:
- Straightforward CLI (inspired by Ollama)
- Full C++ implementation, packageable into Desktop and Mobile apps
- Pull from Hugging Face or Cortex Built-in Models
- Models stored in universal file formats (vs blobs)
- Swappable Engines (default: llamacpp; future: ONNXRuntime, TensorRT-LLM)
- Cortex can be deployed as a standalone API server, or integrated into apps like Jan.ai
Cortex's roadmap is to implement the full OpenAI API including Tools, Runs, Multi-modal and Realtime APIs.
Cortex has a Local Installer that packages all required dependencies, so no internet connection is required during installation.
Cortex also has a Network Installer, which downloads the necessary dependencies from the internet during installation.
Windows:
cortex-windows-local-installer.exe
MacOS (Silicon/Intel):
cortex-mac-local-installer.pkg
Linux:
Download the installer and run the following command in the terminal:
sudo apt install ./cortex-local-installer.deb
The binary will be installed in the /usr/bin/ directory.
After installation, you can run Cortex.cpp from the command line by typing cortex --help.
# Pull a built-in model
cortex pull llama3.2
# Pull a GGUF model from Hugging Face
cortex pull bartowski/Meta-Llama-3.1-8B-Instruct-GGUF
# Run a model
cortex run llama3.2
# Show running models
cortex models ps
# Stop a running model
cortex models stop llama3.2
# Stop the Cortex server
cortex stop
Refer to our Quickstart and CLI documentation for more details.
Cortex.cpp includes a REST API accessible at localhost:39281.
Refer to our API documentation for more details.
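For illustration, a minimal chat request might look like the sketch below, assuming an OpenAI-compatible /v1/chat/completions route on that port and that llama3.2 has already been pulled and started:

curl http://localhost:39281/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'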
Cortex.cpp allows users to pull models from multiple Model Hubs, offering flexibility and extensive model access.
Currently Cortex supports pulling from:
- Hugging Face: GGUF models, e.g. author/Model-GGUF
- Cortex Built-in Models
Once downloaded, the model .gguf and model.yml files are stored in ~\cortexcpp\models.
Note: You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 14B models, and 32 GB to run the 32B models.
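If you are unsure how much memory is free on your machine before picking a model size, a quick check on Linux is:

free -h

(On MacOS, sysctl hw.memsize reports the total installed RAM in bytes.)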
| Model / Engine | llama.cpp | Command |
|---|---|---|
| phi-3.5 | ✅ | cortex run phi3.5 |
| llama3.2 | ✅ | cortex run llama3.2 |
| llama3.1 | ✅ | cortex run llama3.1 |
| codestral | ✅ | cortex run codestral |
| gemma2 | ✅ | cortex run gemma2 |
| mistral | ✅ | cortex run mistral |
| ministral | ✅ | cortex run ministral |
| qwen2.5 | ✅ | cortex run qwen2.5 |
| openhermes-2.5 | ✅ | cortex run openhermes-2.5 |
| tinyllama | ✅ | cortex run tinyllama |
View all Cortex Built-in Models.
Cortex supports multiple quantizations for each model.
❯ cortex-nightly pull llama3.2
Downloaded models:
llama3.2:3b-gguf-q2-k
Available to download:
1. llama3.2:3b-gguf-q3-kl
2. llama3.2:3b-gguf-q3-km
3. llama3.2:3b-gguf-q3-ks
4. llama3.2:3b-gguf-q4-km (default)
5. llama3.2:3b-gguf-q4-ks
6. llama3.2:3b-gguf-q5-km
7. llama3.2:3b-gguf-q5-ks
8. llama3.2:3b-gguf-q6-k
9. llama3.2:3b-gguf-q8-0
Select a model (1-9):
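Option 4, for example, downloads the default q4-km build. Assuming the model:tag form shown in the listing above is accepted by the run command (an assumption based on that listing format), a specific quantization can then be started directly:

cortex run llama3.2:3b-gguf-q4-km

(Use cortex-nightly run instead if you are on the nightly build, as in the example above.)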
Cortex.cpp is also available with a Network Installer, which is smaller but requires an internet connection during installation to download the necessary dependencies.
Windows:
cortex-windows-network-installer.exe
MacOS (Universal):
cortex-mac-network-installer.pkg
Cortex releases two preview versions for advanced users to try new features early (we appreciate your feedback!):
- Beta (early preview)
  - CLI command: cortex-beta
- Nightly (released every night)
  - CLI command: cortex-nightly
  - Nightly automatically pulls the latest changes from the upstream llama.cpp repo, creates a PR, and runs tests.
  - If all tests pass, the PR is automatically merged into our repo with the latest llama.cpp version.
Local Installer:
| Version | Windows | MacOS | Linux |
|---|---|---|---|
| Beta (Preview) | cortex-beta-windows-local-installer.exe | cortex-beta-mac-local-installer.pkg | cortex-beta-linux-local-installer.deb |
| Nightly (Experimental) | cortex-nightly-windows-local-installer.exe | cortex-nightly-mac-local-installer.pkg | cortex-nightly-linux-local-installer.deb |
Network Installer:
| Version | Windows | MacOS | Linux |
|---|---|---|---|
| Beta (Preview) | cortex-beta-windows-network-installer.exe | cortex-beta-mac-network-installer.pkg | cortex-beta-linux-network-installer.deb |
| Nightly (Experimental) | cortex-nightly-windows-network-installer.exe | cortex-nightly-mac-network-installer.pkg | cortex-nightly-linux-network-installer.deb |
To build Cortex.cpp from source on Windows:
- Clone the Cortex.cpp repository here.
- Navigate to the engine folder.
- Configure vcpkg:
cd vcpkg
./bootstrap-vcpkg.bat
vcpkg install
- Build Cortex.cpp inside the engine/build folder:
mkdir build
cd build
cmake .. -DBUILD_SHARED_LIBS=OFF -DCMAKE_TOOLCHAIN_FILE=path_to_vcpkg_folder_in_cortex_repo/vcpkg/scripts/buildsystems/vcpkg.cmake -DVCPKG_TARGET_TRIPLET=x64-windows-static
cmake --build . --config Release
- Verify that Cortex.cpp is installed correctly by getting help information:
cortex -h

To build on MacOS:
- Clone the Cortex.cpp repository here.
- Navigate to the engine folder.
- Configure vcpkg:
cd vcpkg
./bootstrap-vcpkg.sh
vcpkg install
- Build Cortex.cpp inside the engine/build folder:
mkdir build
cd build
cmake .. -DCMAKE_TOOLCHAIN_FILE=path_to_vcpkg_folder_in_cortex_repo/vcpkg/scripts/buildsystems/vcpkg.cmake
make -j4
- Verify that Cortex.cpp is installed correctly by getting help information:
cortex -h

To build on Linux:
- Clone the Cortex.cpp repository here.
- Navigate to the engine folder.
- Configure vcpkg:
cd vcpkg
./bootstrap-vcpkg.sh
vcpkg install
- Build Cortex.cpp inside the engine/build folder:
mkdir build
cd build
cmake .. -DCMAKE_TOOLCHAIN_FILE=path_to_vcpkg_folder_in_cortex_repo/vcpkg/scripts/buildsystems/vcpkg.cmake
make -j4
- Verify that Cortex.cpp is installed correctly by getting help information:
cortex -h

To uninstall Cortex.cpp on Windows:
- Open the Windows Control Panel.
- Navigate to Add or Remove Programs.
- Search for cortexcpp and double click to uninstall. (For beta and nightly builds, search for cortexcpp-beta and cortexcpp-nightly respectively.)

To uninstall on MacOS, run the uninstaller script:
sudo sh cortex-uninstall.sh
The uninstaller script comes with the binary and is added to the /usr/local/bin/ directory. It is named cortex-uninstall.sh for stable builds, cortex-beta-uninstall.sh for beta builds, and cortex-nightly-uninstall.sh for nightly builds.

To uninstall on Linux:
sudo apt remove cortexcpp

- For support, please file a GitHub ticket.
- For questions, join our Discord here.
- For long-form inquiries, please email [email protected].

