DeepWiki is my own implementation of DeepWiki: it automatically creates beautiful, interactive wikis for any GitHub, GitLab, or Bitbucket repository! Just enter a repo name, and DeepWiki will:
- Analyze the code structure
- Generate comprehensive documentation
- Create visual diagrams to explain how everything works
- Organize it all into an easy-to-navigate wiki
- Instant Documentation: Turn any GitHub, GitLab or BitBucket repo into a wiki in seconds
- Private Repository Support: Securely access private repositories with personal access tokens
- Smart Analysis: AI-powered understanding of code structure and relationships
- Beautiful Diagrams: Automatic Mermaid diagrams to visualize architecture and data flow
- Easy Navigation: Simple, intuitive interface to explore the wiki
- Ask Feature: Chat with your repository using RAG-powered AI to get accurate answers
- DeepResearch: Multi-turn research process that thoroughly investigates complex topics
- Multiple Model Providers: Support for Google Gemini, OpenAI, OpenRouter, and local Ollama models
# Clone the repository
git clone https://github.com/AsyncFuncAI/deepwiki-open.git
cd deepwiki-open
# Create a .env file with your API keys
echo "GOOGLE_API_KEY=your_google_api_key" > .env
echo "OPENAI_API_KEY=your_openai_api_key" >> .env
# Optional: Add OpenRouter API key if you want to use OpenRouter models
echo "OPENROUTER_API_KEY=your_openrouter_api_key" >> .env
# Run with Docker Compose
docker-compose up
💡 Where to get these keys:
- Get a Google API key from Google AI Studio
- Get an OpenAI API key from OpenAI Platform
Create a `.env` file in the project root with these keys:
GOOGLE_API_KEY=your_google_api_key
OPENAI_API_KEY=your_openai_api_key
# Optional: Add this if you want to use OpenRouter models
OPENROUTER_API_KEY=your_openrouter_api_key
# Install Python dependencies
pip install -r api/requirements.txt
# Start the API server
python -m api.main
# Install JavaScript dependencies
npm install
# or
yarn install
# Start the web app
npm run dev
# or
yarn dev
- Open http://localhost:3000 in your browser
- Enter a GitHub, GitLab, or Bitbucket repository (like https://github.com/openai/codex, https://github.com/microsoft/autogen, https://gitlab.com/gitlab-org/gitlab, or https://bitbucket.org/redradish/atlassian_app_versions)
- For private repositories, click "+ Add access tokens" and enter your GitHub or GitLab personal access token
- Click "Generate Wiki" and watch the magic happen!
DeepWiki uses AI to:
- Clone and analyze the GitHub, GitLab, or Bitbucket repository (including private repos with token authentication)
- Create embeddings of the code for smart retrieval
- Generate documentation with context-aware AI (using Google Gemini, OpenAI, OpenRouter, or local Ollama models)
- Create visual diagrams to explain code relationships
- Organize everything into a structured wiki
- Enable intelligent Q&A with the repository through the Ask feature
- Provide in-depth research capabilities with DeepResearch
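The sketch below illustrates the retrieval half of that pipeline in simplified form. It is not DeepWiki's actual code; the chunking, model name, and helper functions are assumptions for illustration only:

```python
import os
import numpy as np
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def embed(texts: list[str]) -> np.ndarray:
    """Embed a batch of code/document chunks with OpenAI embeddings."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

def top_k(query: str, chunks: list[str], chunk_vecs: np.ndarray, k: int = 5) -> list[str]:
    """Return the k chunks most similar to the query (cosine similarity)."""
    q = embed([query])[0]
    sims = chunk_vecs @ q / (np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q))
    return [chunks[i] for i in np.argsort(sims)[::-1][:k]]

# In the real pipeline the chunks would come from walking the cloned repository.
chunks = ["def add(a, b): return a + b", "class WikiPage: ..."]
vectors = embed(chunks)
context = "\n\n".join(top_k("How is the wiki organized?", chunks, vectors))
# `context` is then handed to the selected provider (Gemini, OpenAI,
# OpenRouter, or Ollama) to generate grounded documentation and answers.
```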
graph TD
A[User inputs GitHub/GitLab/Bitbucket repo] --> AA{Private repo?}
AA -->|Yes| AB[Add access token]
AA -->|No| B[Clone Repository]
AB --> B
B --> C[Analyze Code Structure]
C --> D[Create Code Embeddings]
D --> M{Select Model Provider}
M -->|Google Gemini| E1[Generate with Gemini]
M -->|OpenAI| E2[Generate with OpenAI]
M -->|OpenRouter| E3[Generate with OpenRouter]
M -->|Local Ollama| E4[Generate with Ollama]
E1 --> E[Generate Documentation]
E2 --> E
E3 --> E
E4 --> E
D --> F[Create Visual Diagrams]
E --> G[Organize as Wiki]
F --> G
G --> H[Interactive DeepWiki]
classDef process stroke-width:2px;
classDef data stroke-width:2px;
classDef result stroke-width:2px;
classDef decision stroke-width:2px;
class A,D data;
class AA,M decision;
class B,C,E,F,G,AB,E1,E2,E3,E4 process;
class H result;
deepwiki/
├── api/ # Backend API server
│ ├── main.py # API entry point
│ ├── api.py # FastAPI implementation
│ ├── rag.py # Retrieval Augmented Generation
│ ├── data_pipeline.py # Data processing utilities
│ └── requirements.txt # Python dependencies
│
├── src/ # Frontend Next.js app
│ ├── app/ # Next.js app directory
│ │ └── page.tsx # Main application page
│ └── components/ # React components
│ └── Mermaid.tsx # Mermaid diagram renderer
│
├── public/ # Static assets
├── package.json # JavaScript dependencies
└── .env # Environment variables (create this)
| Variable | Description | Required | Note |
|---|---|---|---|
| `GOOGLE_API_KEY` | Google Gemini API key for AI generation | Yes | |
| `OPENAI_API_KEY` | OpenAI API key for embeddings | Yes | |
| `OPENROUTER_API_KEY` | OpenRouter API key for alternative models | No | Required only if you want to use OpenRouter models |
| `PORT` | Port for the API server (default: 8001) | No | If you host the API and frontend on the same machine, make sure to change the port in `NEXT_PUBLIC_SERVER_BASE_URL` accordingly |
| `NEXT_PUBLIC_SERVER_BASE_URL` | Base URL for the API server (default: http://localhost:8001) | No | |
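If you are wiring DeepWiki into your own scripts, these values are plain environment variables; a minimal sketch of how they might be consumed (the defaults and fail-fast behaviour here are assumptions, not the backend's actual config handling):

```python
import os

# Required keys: fail fast if they are missing.
google_api_key = os.environ["GOOGLE_API_KEY"]
openai_api_key = os.environ["OPENAI_API_KEY"]

# Optional settings fall back to defaults.
openrouter_api_key = os.getenv("OPENROUTER_API_KEY")  # only needed for OpenRouter models
port = int(os.getenv("PORT", "8001"))                 # API server port
server_base_url = os.getenv("NEXT_PUBLIC_SERVER_BASE_URL", f"http://localhost:{port}")
```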
You can use Docker to run DeepWiki:
# Pull the image from GitHub Container Registry
docker pull ghcr.io/asyncfuncai/deepwiki-open:latest
# Run the container with environment variables
docker run -p 8001:8001 -p 3000:3000 \
-e GOOGLE_API_KEY=your_google_api_key \
-e OPENAI_API_KEY=your_openai_api_key \
-e OPENROUTER_API_KEY=your_openrouter_api_key \
-v ~/.adalflow:/root/.adalflow \
ghcr.io/asyncfuncai/deepwiki-open:latest
Or use the provided `docker-compose.yml` file:
# Edit the .env file with your API keys first
docker-compose up
You can also mount a .env file to the container:
# Create a .env file with your API keys
echo "GOOGLE_API_KEY=your_google_api_key" > .env
echo "OPENAI_API_KEY=your_openai_api_key" >> .env
echo "OPENROUTER_API_KEY=your_openrouter_api_key" >> .env
# Run the container with the .env file mounted
docker run -p 8001:8001 -p 3000:3000 \
-v $(pwd)/.env:/app/.env \
-v ~/.adalflow:/root/.adalflow \
ghcr.io/asyncfuncai/deepwiki-open:latest
If you want to build the Docker image locally:
# Clone the repository
git clone https://github.com/AsyncFuncAI/deepwiki-open.git
cd deepwiki-open
# Build the Docker image
docker build -t deepwiki-open .
# Run the container
docker run -p 8001:8001 -p 3000:3000 \
-e GOOGLE_API_KEY=your_google_api_key \
-e OPENAI_API_KEY=your_openai_api_key \
-e OPENROUTER_API_KEY=your_openrouter_api_key \
deepwiki-open
The API server provides:
- Repository cloning and indexing
- RAG (Retrieval Augmented Generation)
- Streaming chat completions
For more details, see the API README.
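As a rough sketch of consuming a streaming response from the API server, something like the following works against a streaming endpoint. The URL path and payload shape below are assumptions, so check the API README for the actual route and schema:

```python
import requests

# Hypothetical endpoint and payload; see the API README for the real contract.
url = "http://localhost:8001/chat/completions/stream"
payload = {
    "repo_url": "https://github.com/AsyncFuncAI/deepwiki-open",
    "messages": [{"role": "user", "content": "How does the RAG pipeline work?"}],
}

with requests.post(url, json=payload, stream=True, timeout=300) as resp:
    resp.raise_for_status()
    for chunk in resp.iter_content(chunk_size=None, decode_unicode=True):
        print(chunk, end="", flush=True)  # tokens arrive incrementally as they are generated
```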
DeepWiki now supports OpenRouter as a model provider, giving you access to hundreds of AI models through a single API:
- Multiple Model Options: Access models from OpenAI, Anthropic, Google, Meta, Mistral, and more
- Simple Configuration: Just add your OpenRouter API key and select the model you want to use
- Cost Efficiency: Choose models that fit your budget and performance needs
- Easy Switching: Toggle between different models without changing your code
- Get an API Key: Sign up at OpenRouter and get your API key
- Add to Environment: Add `OPENROUTER_API_KEY=your_key` to your `.env` file
- Enable in UI: Check the "Use OpenRouter API" option on the homepage
- Select Model: Choose from popular models like GPT-4o, Claude 3.5 Sonnet, Gemini 2.0, and more
OpenRouter is particularly useful if you want to:
- Try different models without signing up for multiple services
- Access models that might be restricted in your region
- Compare performance across different model providers
- Optimize for cost vs. performance based on your needs
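Under the hood, OpenRouter exposes an OpenAI-compatible API, so any model it hosts can be called with the standard OpenAI client by pointing it at OpenRouter's base URL. A minimal standalone sketch (independent of DeepWiki's own integration):

```python
import os
from openai import OpenAI

# OpenRouter speaks the OpenAI API; only the base URL and key change.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

resp = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",  # swap in any model OpenRouter offers
    messages=[{"role": "user", "content": "Summarize this repository's architecture."}],
)
print(resp.choices[0].message.content)
```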
The Ask feature allows you to chat with your repository using Retrieval Augmented Generation (RAG):
- Context-Aware Responses: Get accurate answers based on the actual code in your repository
- RAG-Powered: The system retrieves relevant code snippets to provide grounded responses
- Real-Time Streaming: See responses as they're generated for a more interactive experience
- Conversation History: The system maintains context between questions for more coherent interactions
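Conceptually, each Ask turn boils down to retrieving relevant chunks and answering against them while carrying the conversation history forward. A simplified sketch, not the actual implementation (the model name and retrieval helper are placeholders):

```python
from openai import OpenAI

client = OpenAI()

def ask(question: str, history: list[dict], retrieve) -> str:
    """One Ask turn: retrieve relevant code, then answer grounded in it.

    `retrieve` is any vector-store lookup returning relevant snippets,
    e.g. the `top_k` helper sketched earlier."""
    context = "\n\n".join(retrieve(question))
    messages = (
        [{"role": "system",
          "content": f"Answer using only this repository context:\n{context}"}]
        + history                                    # prior turns keep answers coherent
        + [{"role": "user", "content": question}]
    )
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    answer = resp.choices[0].message.content
    history += [{"role": "user", "content": question},
                {"role": "assistant", "content": answer}]
    return answer
```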
DeepResearch takes repository analysis to the next level with a multi-turn research process:
- In-Depth Investigation: Thoroughly explores complex topics through multiple research iterations
- Structured Process: Follows a clear research plan with updates and a comprehensive conclusion
- Automatic Continuation: The AI automatically continues research until reaching a conclusion (up to 5 iterations)
- Research Stages:
- Research Plan: Outlines the approach and initial findings
- Research Updates: Builds on previous iterations with new insights
- Final Conclusion: Provides a comprehensive answer based on all iterations
To use DeepResearch, simply toggle the "Deep Research" switch in the Ask interface before submitting your question.
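The research loop itself can be pictured roughly as below; this is an illustrative sketch rather than DeepWiki's real control flow, and the stop condition is a deliberate simplification:

```python
MAX_ITERATIONS = 5  # mirrors the "up to 5 iterations" behaviour described above

def deep_research(question: str, ask_fn) -> str:
    """Illustrative multi-turn research loop, not DeepWiki's actual code.

    `ask_fn(prompt, history)` is any RAG-backed chat call, for instance the
    `ask` helper above with its retrieval argument already bound."""
    history = []
    findings = ask_fn(f"Create a research plan for: {question}", history)
    for _ in range(MAX_ITERATIONS - 1):
        findings = ask_fn(
            "Continue the research with new insights. "
            "If the topic is fully covered, give a final conclusion.",
            history,
        )
        if "final conclusion" in findings.lower():  # crude stop check for this sketch
            break
    return findings
```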
Screenshots: the main interface of DeepWiki, private repository access with personal access tokens, and DeepResearch conducting multi-turn investigations for complex topics.
Demo video: watch DeepWiki in action!
- "Missing environment variables": Make sure your
.env
file is in the project root and contains the required API keys - "API key not valid": Check that you've copied the full key correctly with no extra spaces
- "OpenRouter API error": Verify your OpenRouter API key is valid and has sufficient credits
- "Cannot connect to API server": Make sure the API server is running on port 8001
- "CORS error": The API is configured to allow all origins, but if you're having issues, try running both frontend and backend on the same machine
- "Error generating wiki": For very large repositories, try a smaller one first
- "Invalid repository format": Make sure you're using a valid GitHub, GitLab or Bitbucket URL format
- "Could not fetch repository structure": For private repositories, ensure you've entered a valid personal access token with appropriate permissions
- "Diagram rendering error": The app will automatically try to fix broken diagrams
- Restart both servers: Sometimes a simple restart fixes most issues
- Check console logs: Open browser developer tools to see any JavaScript errors
- Check API logs: Look at the terminal where the API is running for Python errors
Contributions are welcome! Feel free to:
- Open issues for bugs or feature requests
- Submit pull requests to improve the code
- Share your feedback and ideas
This project is licensed under the MIT License - see the LICENSE file for details.