Browse the free open source Python LLM Inference Tools for Mac projects below. Use the toggles on the left to filter them by OS, license, programming language, and project status.
Run Local LLMs on Any Device. Open-source
A high-throughput and memory-efficient inference and serving engine
Everything you need to build state-of-the-art foundation models
Standardized Serverless ML Inference Platform on Kubernetes
Optimizing inference proxy for LLMs
A set of Docker images for training and serving models in TensorFlow
Deep learning optimization library that makes distributed training easy
Framework dedicated to simplifying neural data processing
Sparsity-aware deep learning inference runtime for CPUs
OpenMMLab Model Deployment Framework
Operating LLMs in production
Uncover insights, surface problems, monitor, and fine-tune your LLM
Visual Instruction Tuning: Large Language-and-Vision Assistant
LMDeploy is a toolkit for compressing, deploying, and serving LLMs
Easiest and laziest way to build multi-agent LLM applications
A Pythonic framework to simplify AI service building
Official inference library for Mistral models
Bring the notion of Model-as-a-Service to life
Single-cell analysis in Python
Replace OpenAI GPT with another LLM in your app
Adversarial Robustness Toolbox (ART) - Python Library for ML security
Unified Model Serving Framework
MII makes low-latency and high-throughput inference possible
State-of-the-art diffusion models for image and audio generation
DoWhy is a Python library for causal inference
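To illustrate the kind of workflow these projects cover, here is a minimal offline-generation sketch using vLLM, listed above as the high-throughput and memory-efficient inference and serving engine. The model name, prompts, and sampling settings are placeholders, and the exact API may differ between vLLM releases.

```python
# Minimal vLLM offline batch generation sketch.
# Assumes vLLM is installed (`pip install vllm`); model choice is illustrative.
from vllm import LLM, SamplingParams

prompts = [
    "Explain the difference between throughput and latency in one sentence.",
    "Name three uses of a local LLM.",
]
sampling = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

llm = LLM(model="facebook/opt-125m")  # small model, purely for illustration
outputs = llm.generate(prompts, sampling)

for output in outputs:
    # Each result carries the original prompt and one or more completions.
    print(output.prompt, "->", output.outputs[0].text)
```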
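Similarly, a short local image-generation sketch for the diffusers entry above. The model ID and the use of the "mps" (Metal) device on Apple Silicon are assumptions; the sketch falls back to CPU when Metal is unavailable.

```python
# Minimal text-to-image sketch with Hugging Face diffusers.
# Assumes `pip install diffusers torch transformers`; model ID is illustrative.
import torch
from diffusers import DiffusionPipeline

# Prefer Apple's Metal backend ("mps") when available, otherwise CPU.
device = "mps" if torch.backends.mps.is_available() else "cpu"

pipe = DiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe = pipe.to(device)

image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```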