- Simplicity 🌱: strong focus on keeping the project easy to use and the codebase simple.
- Privacy 🛡️: use local LLMs through Ollama for your private data.
- Flexibility 🤸: flexible features like:
  - KPI-directed: prompt the LLMs to prioritize your KPIs in the dataset.
  - Preview what you are passing: the data importer wizard lets you preview and filter the dataset by columns and/or rows.
  - Dynamic filters: the LLM infers the most relevant filters for your dataset dynamically.
  - Custom LLM: use your favourite 3rd-party LLM (or a local one through Ollama).
- Reusability 🔄: each dashboard generates a reusable "viz_spec" JSON file in the "llm_responses" folder, accessible for future use through the "Import Previous Viz Specs" feature. Each plot includes a "Code" tab to reproduce it anywhere.
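As a sketch of how a saved spec could be reloaded programmatically, the snippet below reads a JSON file from the `llm_responses` folder. The folder name comes from the description above; the exact file name (`viz_spec.json`) and the loader function are hypothetical, since real files may carry timestamps or other suffixes.

```python
import json
from pathlib import Path


def load_viz_spec(responses_dir: str = "llm_responses",
                  name: str = "viz_spec.json") -> dict:
    """Load a previously generated viz_spec JSON file.

    The "llm_responses" folder is described in the README; the
    file name here is a placeholder and may differ in practice.
    """
    spec_path = Path(responses_dir) / name
    with spec_path.open() as f:
        return json.load(f)
```

The returned dictionary can then be fed back into the dashboard (or inspected manually) instead of re-querying the LLM.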
*Demo video: `ai-dashboard-builder.mp4`*
- Start the application with Docker Compose:

  ```bash
  docker-compose up --build
  ```

- Access the dashboard at http://localhost:8050
- Start both Ollama and the dashboard:

  ```bash
  docker-compose -f docker-compose.all-in-one.yml up --build
  ```

- Access the dashboard at http://localhost:8050
- Start Ollama separately (if using local models).
- Run the application:

  ```bash
  python src/app.py
  ```
To run the application in development mode:

```bash
pip install uv  # if you don't have it already
uv run ai_dashboard_builder --dev
```
In the project root folder, you can create a `.env` file and set the API keys for the LLMs you want to use, or pass them through the webapp.

- `OLLAMA_HOST`: Ollama server address (default: `host.docker.internal`)
- `OPENAI_API_KEY`: OpenAI API key (for GPT models)
- `ANTHROPIC_API_KEY`: Anthropic API key (for Claude models)
- `GROQ_API_KEY`: Groq API key (for Mixtral/LLaMA models)
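A minimal `.env` illustrating these variables (the key values are placeholders, not real credentials; set only the providers you actually use):

```bash
# Ollama server address (default shown; adjust if Ollama runs elsewhere)
OLLAMA_HOST=host.docker.internal
# Third-party provider keys -- placeholder values, replace with your own
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GROQ_API_KEY=your-groq-key
```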
We welcome contributions! Please see our CONTRIBUTING.md guide for details on how to get started.
This project is licensed under a variant of the MIT License - see the LICENSE.md file for details.