Welcome to this comprehensive collection of notebooks, tutorials, and projects focused on Large Language Models (LLMs), Generative AI, and Transformer architectures. This repository serves as a central hub for practitioners, researchers, and enthusiasts who want to explore and implement cutting-edge natural language processing and generative AI technologies.
- Transformer Architectures: Deep dives into the architecture that revolutionized NLP
- LLM Applications: Practical implementations of language models across various domains
- GenAI Tools & Technologies: Tutorials on the latest frameworks and tools
- Educational Resources: Curated courses and learning paths
- Implementation Guides: Step-by-step guides for building GenAI applications
| Tool | Description |
|---|---|
| LangChain | Framework for developing applications powered by language models |
| OpenAI | Implementation guides for GPT models and APIs |
| Hugging Face | Resources for the transformers library and model hub |
| LlamaIndex | Data framework for LLM applications to connect to external data |
| ChromaDB | Vector database for embedding storage and semantic search |
| DeciAI | Tools for AI model optimization and inference acceleration |
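To illustrate what a vector database such as ChromaDB does under the hood, here is a minimal in-memory sketch: documents are turned into vectors and queries are matched by cosine similarity. The bag-of-words "embedding" and the `ToyVectorStore` class are simplifications for illustration only; a real system would use a learned embedding model and a proper vector store.

```python
from math import sqrt


def embed(text: str) -> dict:
    """Toy embedding: bag-of-words token counts (a real system uses a model)."""
    vec = {}
    for tok in text.lower().split():
        vec[tok] = vec.get(tok, 0) + 1
    return vec


def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class ToyVectorStore:
    """Minimal stand-in for a vector database: add documents, query by similarity."""

    def __init__(self):
        self.docs = {}  # id -> (text, embedding)

    def add(self, doc_id: str, text: str):
        self.docs[doc_id] = (text, embed(text))

    def query(self, text: str, n_results: int = 1):
        q = embed(text)
        ranked = sorted(
            self.docs.items(),
            key=lambda kv: cosine(q, kv[1][1]),
            reverse=True,
        )
        return [(doc_id, doc) for doc_id, (doc, _) in ranked[:n_results]]


store = ToyVectorStore()
store.add("d1", "transformers power modern language models")
store.add("d2", "vector databases store embeddings for search")
print(store.query("how do embeddings enable semantic search", n_results=1))
```

The notebooks in this repository show the same add-then-query pattern with real embeddings and the actual ChromaDB client.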
The repository includes notebooks on various topics:
- Fine-tuning strategies for open-source LLMs (e.g., with Lamini)
- Building conversational agents with memory
- Creating multimodal applications
- Implementing retrieval-augmented generation (RAG)
- Optimizing transformer models for production
- Building Natural Language to SQL applications (e.g., with LlamaIndex)
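The retrieval-augmented generation (RAG) pattern mentioned above can be sketched in a few lines: retrieve the most relevant passages for a query, then assemble them into an augmented prompt for the LLM. The keyword-overlap retriever and the prompt format below are illustrative simplifications; a real pipeline would use dense embeddings, a vector store, and an actual model call.

```python
def retrieve(query: str, corpus: dict, k: int = 2) -> list:
    """Rank documents by naive keyword overlap with the query.
    A real RAG pipeline would use dense embeddings and a vector store."""
    q_tokens = set(query.lower().split())
    scored = sorted(
        corpus.items(),
        key=lambda kv: len(q_tokens & set(kv[1].lower().split())),
        reverse=True,
    )
    return [text for _, text in scored[:k]]


def build_prompt(query: str, passages: list) -> str:
    """Assemble the augmented prompt that would be sent to an LLM."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}"
    )


corpus = {
    "doc1": "RAG combines retrieval with generation to ground LLM answers",
    "doc2": "Transformers use self-attention over token sequences",
}
query = "What does RAG combine?"
prompt = build_prompt(query, retrieve(query, corpus, k=1))
print(prompt)
```

The notebooks replace each of these pieces with production components (embedding models, ChromaDB or LlamaIndex retrievers, and an LLM API), but the control flow stays the same.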
The repository contains a curated list of courses, tutorials, and learning resources for GenAI and LLMs, from beginner to advanced levels. These include:
- Interactive tutorials for different transformer architectures
- Hands-on projects to build practical applications
- Guidelines for prompt engineering
- Best practices for LLM fine-tuning
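One prompt-engineering technique covered in these resources is few-shot prompting: showing the model a handful of worked examples before the new input. The helper below is a hypothetical sketch of that structure; the function name and format are illustrative and not tied to any specific API.

```python
def few_shot_prompt(instruction: str, examples: list, query: str) -> str:
    """Build a few-shot prompt: instruction, worked examples, then the new input.
    The Input/Output format here is one common convention, not a standard."""
    lines = [instruction, ""]
    for inp, out in examples:
        lines.append(f"Input: {inp}")
        lines.append(f"Output: {out}")
        lines.append("")
    lines.append(f"Input: {query}")
    lines.append("Output:")  # the model completes from here
    return "\n".join(lines)


prompt = few_shot_prompt(
    "Classify the sentiment of each review as positive or negative.",
    [
        ("Great movie, loved it!", "positive"),
        ("Waste of time.", "negative"),
    ],
    "The plot was engaging from start to finish.",
)
print(prompt)
```

Ending the prompt at `Output:` cues the model to continue with just the label, which makes the response easy to parse.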
Contributions to this repository are welcome! Whether you want to:
- Add new notebooks or projects
- Improve existing content
- Fix bugs or issues
- Share educational resources
Please see our CONTRIBUTING.md for guidelines on how to contribute.
Planned enhancements include:
- Expand transformer architecture examples
- Add more domain-specific LLM applications
- Create comprehensive evaluation frameworks
- Develop benchmarking suites for different models
- Add deployment guides for various platforms
This repository is licensed under the Apache License 2.0 - see the LICENSE file for details.
For questions, suggestions, or collaborations, please open an issue on GitHub.
If you find this repository helpful, please consider giving it a star ⭐