Stars
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models for text, vision, audio, and multimodal tasks, for both inference and training.
🦜🔗 The platform for reliable agents.
Robust Speech Recognition via Large-Scale Weak Supervision
A high-throughput and memory-efficient inference and serving engine for LLMs
The simplest, fastest repository for training/finetuning medium-sized GPTs.
LlamaIndex is the leading framework for building LLM-powered agents over your data.
High-Resolution Image Synthesis with Latent Diffusion Models
We write your reusable computer vision tools. 💜
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (V…
Official Code for DragGAN (SIGGRAPH 2023)
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
DSPy: The framework for programming—not prompting—language models
Free and Open Source Enterprise Resource Planning (ERP)
Industry-leading face manipulation platform
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
A Django content management system focused on flexibility and user experience
Train transformer language models with reinforcement learning.
An open source implementation of CLIP.
20+ high-performance LLMs with recipes to pretrain, finetune and deploy at scale.
Databricks’ Dolly, a large language model trained on the Databricks Machine Learning Platform
Large Language Model Text Generation Inference
Modin: Scale your Pandas workflows by changing a single line of code
🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
PyTorch3D is FAIR's library of reusable components for deep learning with 3D data
Accessible large language models via k-bit quantization for PyTorch.
🚀🎬 ShortGPT - Experimental AI framework for YouTube Shorts / TikTok channel automation
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.