DoWhy is a Python library for causal inference
LLMFlows - Simple, Explicit and Transparent LLM Apps
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction
Django friendly finite state machine support
The Open Source Memory Layer For Autonomous Agents
LLM-based data scientist, AI-native data application
Android Application Identifier for Packers, Protectors and Obfuscators
Python package for AutoML on Tabular Data with Feature Engineering
The unofficial Python package that returns responses from Google Bard
BISHENG is an open LLM DevOps platform for next-generation apps
Training data (data labeling, annotation, workflow) for all data types
A comprehensive set of fairness metrics for datasets
A collection of reference Jupyter notebooks and demo AI/ML applications
Open-source observability for your LLM application
Supercharge Your LLM Application Evaluations
Langchain-Chatchat (formerly langchain-ChatGLM), local knowledge
Easily build, customize and control your own LLMs
Refactoring ChatBot+LLM, GPT-3.5-turbo, ChatGPT Bot/Voice Assistant
NVIDIA Federated Learning Application Runtime Environment
A toolkit to optimize ML models for deployment with Keras & TensorFlow
MII makes low-latency and high-throughput inference possible
AI agent that streamlines the entire process of data analysis
A system for quickly generating training data with weak supervision
Basaran, an open-source alternative to the OpenAI text completion API