Stars
Fine-tuning & Reinforcement Learning for LLMs. 🦥 Train OpenAI gpt-oss, DeepSeek-R1, Qwen3, Gemma 3, TTS 2x faster with 70% less VRAM.
A library for mechanistic interpretability of GPT-style language models
Open Source Application for Advanced LLM + Diffusion Engineering: interact, train, fine-tune, and evaluate large language models on your own computer.
An awesome repository and comprehensive survey on the interpretability of LLM attention heads.
Guideline-following Large Language Model for Information Extraction
[ACL 2024] An Easy-to-use Knowledge Editing Framework for LLMs.
Making large AI models cheaper, faster, and more accessible
Fast, general, and tested differentiable structured prediction in PyTorch
[IJCAI 2024] Generate different roles for GPTs to form a collaborative entity for complex tasks.
Hackable and optimized Transformers building blocks, supporting a composable construction.
Official Codes for "Publicly Shareable Clinical Large Language Model Built on Synthetic Clinical Notes"
Code for the paper Domain Adaptation with Conditional Distribution Matching and Generalized Label Shift
DSPy: The framework for programming—not prompting—language models
Awesome resources for in-context learning and prompt engineering: mastery of LLMs such as ChatGPT, GPT-3, and FlanT5, with up-to-date and cutting-edge updates. - Professor Yu Liu
Deep-Learning Model Exploration and Development for NLP
Fast and memory-efficient exact attention
Instruct-tune LLaMA on consumer hardware
Experiments for understanding disentanglement in VAE latent representations
🤖 Assemble, configure, and deploy autonomous AI Agents in your browser.
Evals is a framework for evaluating LLMs and LLM systems, and an open-source registry of benchmarks.
Curated ChatGPT prompts to help you use ChatGPT and other LLM tools more effectively.
Beyond the Imitation Game collaborative benchmark for measuring and extrapolating the capabilities of language models
Toolkit for creating, sharing and using natural language prompts.
A comprehensive list of PyTorch-related content on GitHub, such as different models, implementations, helper libraries, tutorials, etc.
Convert Machine Learning Code Between Frameworks
Code to reproduce the experiments in the paper "Transformer Based Multi-Source Domain Adaptation" (EMNLP 2020)