# AI Terminal

A Tauri + Angular terminal application with integrated AI capabilities.

## Features

- Natural language command interpretation
- Integrated AI assistant
- Command history and auto-completion
- Cross-platform support (macOS, Windows, Linux)
- Modern UI built with Tauri and Angular
## Prerequisites

- Node.js 18+
- Rust and Cargo
- For AI features: Ollama
  - macOS: `brew install ollama`
  - Windows: Download from [ollama.ai](https://ollama.ai)
  - Linux: `curl -fsSL https://ollama.com/install.sh | sh`
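Before building, it can help to confirm the toolchain above is actually reachable. A minimal sketch (the loop is just a convenience; it checks the three tools listed above):

```shell
# Check that each required tool is installed and on PATH
for tool in node cargo ollama; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found ($("$tool" --version 2>/dev/null | head -n 1))"
  else
    echo "$tool: missing"
  fi
done
```

Any line reporting `missing` points at a prerequisite you still need to install.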
## Getting Started

- Clone the repository:

  ```bash
  git clone https://github.com/your-username/ai-terminal.git
  cd ai-terminal
  ```

- Install dependencies and run the project:

  **Windows (PowerShell/Command Prompt):**

  ```powershell
  cd ai-terminal
  npm install
  npm run tauri dev
  ```

  **macOS/Linux:**

  ```bash
  cd ai-terminal
  npm install
  npm run tauri dev
  ```
## Windows Installation from Source

For Windows users, you can build and install AI Terminal from source:

- Prerequisites:
  - Install Visual Studio Build Tools or Visual Studio with C++ development tools
  - Install Node.js
  - Install Rust

- Build and Install:

  ```powershell
  git clone https://github.com/your-username/ai-terminal.git
  cd ai-terminal\ai-terminal
  npm install
  npm run tauri build
  ```

- Install the built package:
  - Navigate to `src-tauri\target\release\bundle\msi\`
  - Run the generated `.msi` installer
## macOS Installation (Homebrew)

You can install AI Terminal using Homebrew:

```bash
brew tap AiTerminalFoundation/ai-terminal
brew install --cask ai-terminal
```

After installation, you can launch the application from Spotlight or run it from the terminal:
```bash
ai-terminal
```

## Ollama Setup

### Linux

- Install Ollama

  Open your terminal and run:

  ```bash
  curl -fsSL https://ollama.com/install.sh | sh
  ```

- Download the Model

  Run the following command:

  ```bash
  ollama pull macsdeve/BetterBash3
  ```

### Windows

- Download Ollama
  - Visit the Ollama download page.
  - Download the Windows installer.
- Install Ollama
  - Run the downloaded installer.
  - Follow the installation prompts.
  - Ollama will be added to your PATH automatically.
- Download the Model

  Open Command Prompt or PowerShell and execute:

  ```powershell
  ollama pull macsdeve/BetterBash3
  ```

> Note for Windows Terminal users: AI Terminal now fully supports Windows Terminal, Command Prompt, PowerShell, and Git Bash environments.
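To confirm the pull succeeded, you can list the locally available models. A small sketch that degrades gracefully when Ollama is not on PATH:

```shell
# List downloaded models; fall back to a notice if ollama is unavailable
if command -v ollama >/dev/null 2>&1; then
  ollama list
else
  echo "ollama not found on PATH - install it first"
fi
```

The model name `macsdeve/BetterBash3` should appear in the output once the download completes.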
## AI Providers

AI Terminal now supports multiple AI providers, including your local AI running on `localhost:8000`.

- Start your local AI server on `localhost:8000`
- Configure AI Terminal:
  - Run `setup-localhost-ai.bat` for guided setup, or
  - Use the built-in commands (see below)

Built-in Configuration Commands:

```
# Quick setup for localhost:8000
/localai your-model-name

# Manual configuration
/provider localai
/host http://localhost:8000
/model your-model-name
/params temp=0.7 tokens=2048
```

- `/help` - Show all available commands
- `/provider [ollama|localai|openai]` - Switch AI providers
- `/localai [model]` - Quick setup for localhost:8000
- `/host [url]` - Change API endpoint
- `/model [name]` - Switch model
- `/models` - List available models
- `/params temp=X tokens=Y` - Set temperature and max tokens
Supported providers:

- Ollama (default) - Local Ollama installation
- LocalAI - OpenAI-compatible local AI (localhost:8000)
- OpenAI - OpenAI API compatible services
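Before pointing AI Terminal at a local server, you can check that something OpenAI-compatible is actually answering on port 8000. The `/v1/models` path is an assumption based on the OpenAI-style API that such servers typically expose:

```shell
# Probe the local endpoint; print a fallback notice if nothing is listening
curl -sf http://localhost:8000/v1/models || echo "no AI server responding on localhost:8000"
```

If you see the fallback notice, start your server before running `/localai`.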
Example usage:

```
# Setup for localhost:8000
/localai gpt-3.5-turbo

# Ask your local AI
How do I list files in Windows?

# Switch back to Ollama
/provider ollama
```

## Ollama Setup (macOS)

- Download Ollama
  - Visit the Ollama download page.
  - Click Download for macOS.
- Install Ollama
  - Open the downloaded `.zip` file from your `Downloads` folder.
  - Drag `Ollama.app` into your `Applications` folder.
  - Open `Ollama.app` and follow any prompts.
- Download the Model

  Open Terminal and execute:

  ```bash
  ollama pull macsdeve/BetterBash3
  ```

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.