Open Source Python Deep Learning Frameworks for Mac

Browse free open source Python Deep Learning Frameworks for Mac and related projects below. Use the toggles on the left to filter them by OS, license, programming language, and project status.

  • 1
    spaCy

    Industrial-strength Natural Language Processing (NLP)

    spaCy is a library built on the very latest research for advanced Natural Language Processing (NLP) in Python and Cython. Since its inception it has been designed for real-world applications: building real products and gathering real insights. It comes with pretrained statistical models and word vectors, convolutional neural network models, easy deep learning integration and much more. spaCy is the fastest syntactic parser in the world according to independent benchmarks, with accuracy within 1% of the best available. It is blazing fast, easy to install and comes with a simple and productive API.
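    A minimal sketch of the pipeline-style API, assuming the pretrained en_core_web_sm model has been downloaded (e.g. via python -m spacy download en_core_web_sm):

    ```python
    import spacy

    # Load a pretrained pipeline (tokenizer, tagger, parser, NER, ...).
    nlp = spacy.load("en_core_web_sm")
    doc = nlp("spaCy ships with pretrained pipelines and word vectors.")

    # Each token carries part-of-speech and dependency annotations.
    for token in doc:
        print(token.text, token.pos_, token.dep_, token.head.text)

    # Named entities recognized by the pipeline.
    for ent in doc.ents:
        print(ent.text, ent.label_)
    ```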
    Downloads: 14 This Week
  • 2

    Face Recognition

    World's simplest facial recognition api for Python & the command line

    Face Recognition is the world's simplest face recognition library. It allows you to recognize and manipulate faces from Python or from the command line using the state-of-the-art, deep-learning-based face recognition built into dlib (a C++ toolkit containing machine learning algorithms and tools). Face Recognition is highly accurate and versatile: it can find faces in pictures, manipulate facial features, identify faces, and run face recognition on a folder of images from the command line. It can even do real-time face recognition and blur faces in videos when combined with other Python libraries.
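    A minimal sketch of the Python API, using placeholder image paths; it computes 128-dimensional face encodings and compares them:

    ```python
    import face_recognition

    # Placeholder file names; any images containing one face each will do.
    known = face_recognition.load_image_file("known_person.jpg")
    unknown = face_recognition.load_image_file("unknown_person.jpg")

    known_encoding = face_recognition.face_encodings(known)[0]      # 128-d embedding
    unknown_encoding = face_recognition.face_encodings(unknown)[0]

    # Returns a list of True/False values, one per known encoding supplied.
    results = face_recognition.compare_faces([known_encoding], unknown_encoding)
    print("Same person?", results[0])

    # Locations of all faces found in the unknown image (top, right, bottom, left).
    print(face_recognition.face_locations(unknown))
    ```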
    Downloads: 10 This Week
  • 3
    MMDeploy

    OpenMMLab Model Deployment Framework

    MMDeploy is an open-source deep learning model deployment toolset and part of the OpenMMLab project. Models can be exported to and run on several backends, with more to be supported over time. Every kind of module in the SDK can be extended, such as Transform for image processing, Net for neural network inference, and Module for postprocessing. Install and build your target backend; for example, ONNX Runtime is a cross-platform inference and training accelerator compatible with many popular ML/DNN frameworks. Please read getting_started for the basic usage of MMDeploy.
    Downloads: 8 This Week
  • 4
    DGL

    Python package built to ease deep learning on graph

    Build your models with PyTorch, TensorFlow or Apache MXNet. DGL offers fast and memory-efficient message passing primitives for training Graph Neural Networks, and scales to giant graphs via multi-GPU acceleration and distributed training infrastructure. DGL empowers a variety of domain-specific projects, including DGL-KE for learning large-scale knowledge graph embeddings, DGL-LifeSci for bioinformatics and cheminformatics, and many others. We are keen to bring graphs closer to deep learning researchers, to make it easy to implement the graph neural network model family, and to make the combination of graph-based modules and tensor-based modules (PyTorch or MXNet) as smooth as possible. DGL provides a powerful graph object that can reside on either CPU or GPU and bundles structural data as well as features for better control. We provide a variety of functions for computing with graph objects, including efficient and customizable message passing primitives for Graph Neural Networks.
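    A minimal sketch, assuming DGL is installed with the PyTorch backend: build a tiny graph, attach node features, and run a single GraphConv message-passing layer.

    ```python
    import dgl
    import dgl.nn as dglnn
    import torch

    # A directed graph with 4 nodes and edges 0->1, 1->2, 2->3, 3->0.
    g = dgl.graph(([0, 1, 2, 3], [1, 2, 3, 0]))
    g.ndata["feat"] = torch.randn(4, 8)       # 8-dimensional node features

    # One graph convolution layer: 8 input features -> 16 output features.
    conv = dglnn.GraphConv(8, 16)
    h = conv(g, g.ndata["feat"])
    print(h.shape)                            # torch.Size([4, 16])
    ```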
    Downloads: 6 This Week
  • 5
    Pyro

    Deep universal probabilistic programming with Python and PyTorch

    Pyro is a flexible, universal probabilistic programming language (PPL) built on PyTorch. It allows for expressive deep probabilistic modeling, combining the best of modern deep learning and Bayesian modeling. Pyro is centered on four main principles: Universal, Scalable, Minimal and Flexible. Pyro is universal in that it can represent any computable probability distribution. It scales easily to large datasets with minimal overhead, and has a small yet powerful core of composable abstractions that make it both agile and maintainable. Lastly, Pyro gives you the flexibility of automation when you want it, and control when you need it.
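    A minimal sketch of a Pyro model/guide pair for inferring a coin's bias with stochastic variational inference; the variable names are illustrative.

    ```python
    import torch
    import pyro
    import pyro.distributions as dist
    from pyro.distributions import constraints
    from pyro.infer import SVI, Trace_ELBO
    from pyro.optim import Adam

    data = torch.tensor([1., 1., 0., 1., 1.])     # observed coin flips

    def model(data):
        # Prior over the probability of heads.
        p = pyro.sample("p", dist.Beta(1., 1.))
        with pyro.plate("flips", len(data)):
            pyro.sample("obs", dist.Bernoulli(p), obs=data)

    def guide(data):
        # Variational posterior: a Beta with learnable parameters.
        a = pyro.param("a", torch.tensor(1.), constraint=constraints.positive)
        b = pyro.param("b", torch.tensor(1.), constraint=constraints.positive)
        pyro.sample("p", dist.Beta(a, b))

    svi = SVI(model, guide, Adam({"lr": 0.01}), loss=Trace_ELBO())
    for _ in range(1000):
        svi.step(data)
    print(pyro.param("a").item(), pyro.param("b").item())
    ```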
    Downloads: 6 This Week
  • 6
    Apache MXNet (incubating)

    A flexible and efficient library for deep learning

    Apache MXNet is an open source deep learning framework designed for efficient and flexible research prototyping and production. It contains a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations. On top of this is a graph optimization layer, overall making MXNet highly efficient yet still portable, lightweight and scalable.
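    A minimal sketch of MXNet's imperative NDArray and autograd APIs:

    ```python
    from mxnet import nd, autograd

    x = nd.array([[1.0, 2.0], [3.0, 4.0]])
    x.attach_grad()                      # allocate storage for the gradient

    with autograd.record():              # record operations for differentiation
        y = (x * x).sum()
    y.backward()                         # d(sum(x^2))/dx = 2x

    print(x.grad)
    ```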
    Downloads: 5 This Week
  • 7
    Ray

    A unified framework for scalable computing

    Modern workloads like deep learning and hyperparameter tuning are compute-intensive and require distributed or parallel execution. Ray makes it effortless to parallelize single machine code — go from a single CPU to multi-core, multi-GPU or multi-node with minimal code changes. Accelerate your PyTorch and TensorFlow workload with a more resource-efficient and flexible distributed execution framework powered by Ray. Accelerate your hyperparameter search workloads with Ray Tune. Find the best model and reduce training costs by using the latest optimization algorithms. Deploy your machine learning models at scale with Ray Serve, a Python-first and framework-agnostic model serving framework. Scale reinforcement learning (RL) with RLlib, a framework-agnostic RL library that ships with 30+ cutting-edge RL algorithms including A3C, DQN, and PPO. Easily build out scalable, distributed systems in Python with simple and composable primitives in Ray Core.
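    A minimal Ray Core sketch: turn an ordinary function into a distributed task and fan it out across the available cores.

    ```python
    import ray

    ray.init()                       # start Ray locally (or connect to a cluster)

    @ray.remote
    def square(x):
        return x * x

    # Launch 8 tasks in parallel; each .remote() call returns a future immediately.
    futures = [square.remote(i) for i in range(8)]
    print(ray.get(futures))          # [0, 1, 4, 9, 16, 25, 36, 49]
    ```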
    Downloads: 5 This Week
  • 8
    DeepCTR-Torch

    Easy-to-use, modular and extensible package of deep learning models

    DeepCTR-Torch is an easy-to-use, modular and extensible package of deep-learning-based CTR models, along with lots of core component layers that can be used to easily build custom models. It is compatible with PyTorch, and you can use any complex model with model.fit() and model.predict(). With the great success of deep learning, DNN-based techniques have been widely used in CTR estimation tasks. The data in a CTR estimation task usually includes highly sparse, high-cardinality categorical features and some dense numerical features. A low-order extractor learns feature interactions through products between vectors; Factorization Machines and their variants are widely used to learn low-order feature interactions. A high-order extractor learns feature combinations through complex neural network functions such as MLP, Cross Net, etc.
    Downloads: 4 This Week
  • 9
    Detectron2

    Next-generation platform for object detection and segmentation

    Detectron2 is Facebook AI Research's next-generation software system that implements state-of-the-art object detection algorithms. It is a ground-up rewrite of the previous version, Detectron, and it originates from maskrcnn-benchmark. It is powered by the PyTorch deep learning framework and includes features such as panoptic segmentation, DensePose, Cascade R-CNN, rotated bounding boxes, PointRend, DeepLab, etc. It can be used as a library to support different projects on top of it, and Facebook AI Research plans to open source more research projects in this way. It trains much faster than its predecessor, and models can be exported to TorchScript or Caffe2 format for deployment. With a new, more modular design, Detectron2 is flexible and extensible, and able to provide fast training on single or multiple GPU servers. Detectron2 includes high-quality implementations of state-of-the-art object detection algorithms.
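    A minimal inference sketch using the model zoo; it assumes detectron2, OpenCV and a local input.jpg, and downloads a pretrained Mask R-CNN checkpoint on first use.

    ```python
    import cv2
    from detectron2 import model_zoo
    from detectron2.config import get_cfg
    from detectron2.engine import DefaultPredictor

    cfg = get_cfg()
    config_name = "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"
    cfg.merge_from_file(model_zoo.get_config_file(config_name))
    cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(config_name)
    cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5     # confidence threshold

    predictor = DefaultPredictor(cfg)
    outputs = predictor(cv2.imread("input.jpg"))    # placeholder image path
    print(outputs["instances"].pred_classes)
    print(outputs["instances"].pred_boxes)
    ```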
    Downloads: 4 This Week
  • 10
    AWS Deep Learning Containers

    A set of Docker images for training and serving models in TensorFlow

    AWS Deep Learning Containers (DLCs) are a set of Docker images for training and serving models in TensorFlow, TensorFlow 2, PyTorch, and MXNet. Deep Learning Containers provide optimized environments with TensorFlow and MXNet, Nvidia CUDA (for GPU instances), and Intel MKL (for CPU instances) libraries, and are available in the Amazon Elastic Container Registry (Amazon ECR). The AWS DLCs are used in Amazon SageMaker as the default vehicles for SageMaker jobs such as training, inference, and transforms. They have also been tested for machine learning workloads on the Amazon EC2, Amazon ECS and Amazon EKS services. This project is licensed under the Apache-2.0 License. Ensure you have access to an AWS account, i.e. set up your environment so that the awscli can access your account via either an IAM user or an IAM role.
    Downloads: 2 This Week
  • 11
    Albumentations

    Fast image augmentation library and an easy-to-use wrapper

    Albumentations is a computer vision tool that boosts the performance of deep convolutional neural networks. Albumentations is a Python library for fast and flexible image augmentations. Albumentations efficiently implements a rich variety of image transform operations that are optimized for performance, and does so while providing a concise, yet powerful image augmentation interface for different computer vision tasks, including object classification, segmentation, and detection. Albumentations supports different computer vision tasks such as classification, semantic segmentation, instance segmentation, object detection, and pose estimation. Albumentations works well with data from different domains: photos, medical images, satellite imagery, manufacturing and industrial applications, Generative Adversarial Networks. Albumentations can work with various deep learning frameworks such as PyTorch and Keras.
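    A minimal sketch: compose a few transforms and apply them to a single image (a random array stands in for a real photo).

    ```python
    import albumentations as A
    import numpy as np

    transform = A.Compose([
        A.HorizontalFlip(p=0.5),
        A.RandomBrightnessContrast(p=0.2),
        A.Rotate(limit=15, p=0.5),
    ])

    # Stand-in for an HxWxC uint8 image loaded with OpenCV or PIL.
    image = np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8)
    augmented = transform(image=image)["image"]
    print(augmented.shape)
    ```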
    Downloads: 2 This Week
  • 12
    DIG

    A library for graph deep learning research

    The key difference from current graph deep learning libraries, such as PyTorch Geometric (PyG) and the Deep Graph Library (DGL), is that, while PyG and DGL support basic graph deep learning operations, DIG provides a unified testbed for higher-level, research-oriented graph deep learning tasks, such as graph generation, self-supervised learning, explainability, 3D graphs, and graph out-of-distribution. If you work, or plan to work, on research in graph deep learning, DIG enables you to develop your own methods within an extensible framework and compare them with current baseline methods using common datasets and evaluation metrics without extra effort. It includes unified implementations of data interfaces, common algorithms, and evaluation metrics for several advanced tasks. The goal is to enable researchers to easily implement and benchmark algorithms.
    Downloads: 2 This Week
  • 13
    DeepCTR

    Package of deep-learning based CTR models

    DeepCTR is an easy-to-use, modular and extensible package of deep-learning-based CTR models, along with lots of core component layers that can be used to easily build custom models. You can use any complex model with model.fit() and model.predict(). It provides a tf.keras.Model-like interface for quick experiments and a TensorFlow estimator interface for large-scale data and distributed training, and it is compatible with both TF 1.x and TF 2.x. With the great success of deep learning, DNN-based techniques have been widely used in CTR prediction tasks. The data in a CTR estimation task usually includes highly sparse, high-cardinality categorical features and some dense numerical features. Since DNNs are good at handling dense numerical features, the sparse categorical features are usually mapped to dense numerical vectors through an embedding technique.
    Downloads: 2 This Week
  • 14
    Lightly

    A python library for self-supervised learning on images

    A Python library for self-supervised learning on images. We, at Lightly, are passionate engineers who want to make deep learning more efficient. That's why, together with our community, we want to popularize the use of self-supervised methods to understand and curate raw image data. Our solution can be applied before any data annotation step, and the learned representations can be used to visualize and analyze datasets. This allows selecting the best core set of samples for model training through advanced filtering. We provide PyTorch, PyTorch Lightning and PyTorch Lightning distributed examples for each of the models to kickstart your project. Lightly requires Python 3.6+, but we recommend using Python 3.7+. We recommend installing Lightly in a Linux or OSX environment. With Lightly, you can use the latest self-supervised learning methods in a modular way using the full power of PyTorch. Experiment with different backbones, models, and loss functions.
    Downloads: 2 This Week
  • 15
    Python Outlier Detection

    A Python toolbox for scalable outlier detection

    PyOD is a comprehensive and scalable Python toolkit for detecting outlying objects in multivariate data. This exciting yet challenging field is commonly referred to as outlier detection or anomaly detection. PyOD includes more than 30 detection algorithms, from classical LOF (SIGMOD 2000) to the latest COPOD (ICDM 2020) and SUOD (MLSys 2021). Since 2017, PyOD [AZNL19] has been successfully used in numerous academic research projects and commercial products [AZHC+21, AZNHL19]. PyOD has multiple neural-network-based models, e.g. AutoEncoders, which are implemented in both PyTorch and TensorFlow, and it contains multiple models that also exist in scikit-learn. It is possible to train and predict with a large number of detection models in PyOD by leveraging the SUOD framework. A benchmark is supplied for selected algorithms to provide an overview of the implemented models; in total, 17 benchmark datasets are used for comparison, which can be downloaded at ODDS.
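    A minimal sketch: fit one detector (LOF) on toy data and score new points; all PyOD detectors share this fit/predict interface.

    ```python
    import numpy as np
    from pyod.models.lof import LOF

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, 2))                              # mostly inliers
    X_test = np.vstack([rng.normal(size=(10, 2)), [[6.0, 6.0]]])     # one clear outlier

    clf = LOF(n_neighbors=20)
    clf.fit(X_train)

    print(clf.labels_[:10])                 # 0 = inlier, 1 = outlier (training data)
    print(clf.predict(X_test))              # binary labels for new data
    print(clf.decision_function(X_test))    # raw scores; higher = more anomalous
    ```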
    Downloads: 2 This Week
  • 16
    Best-of Machine Learning with Python

    A ranked list of awesome machine learning Python libraries

    This curated list contains 900 awesome open-source projects with a total of 3.3M stars, grouped into 34 categories. All projects are ranked by a project-quality score, which is calculated from various metrics automatically collected from GitHub and different package managers. If you would like to add or update projects, feel free to open an issue, submit a pull request, or directly edit projects.yaml. Contributions are very welcome! Categories include general-purpose machine learning and deep learning frameworks.
    Downloads: 1 This Week
  • 17
    Darts

    A python library for easy manipulation and forecasting of time series

    Darts is a Python library for easy manipulation and forecasting of time series. It contains a variety of models, from classics such as ARIMA to deep neural networks. The models can all be used in the same way, using fit() and predict() functions, similar to scikit-learn. The library also makes it easy to backtest models, combine the predictions of several models, and take external data into account. Darts supports both univariate and multivariate time series and models. The ML-based models can be trained on potentially large datasets containing multiple time series, and some of the models offer rich support for probabilistic forecasting. We recommend first setting up a clean Python environment for your project with at least Python 3.7, using your favorite tool (conda, venv, virtualenv with or without virtualenvwrapper).
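    A minimal sketch using the bundled AirPassengers dataset; all models share the same fit()/predict() pattern.

    ```python
    from darts.datasets import AirPassengersDataset
    from darts.models import ExponentialSmoothing

    series = AirPassengersDataset().load()        # a darts TimeSeries
    train, val = series[:-12], series[-12:]       # hold out the last 12 months

    model = ExponentialSmoothing()
    model.fit(train)
    forecast = model.predict(n=12)                # forecast 12 steps ahead

    print(forecast.values()[:3])
    ```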
    Downloads: 1 This Week
  • 18
    FATE

    An industrial grade federated learning framework

    FATE (Federated AI Technology Enabler) is the world's first industrial-grade open source federated learning framework, enabling enterprises and institutions to collaborate on data while protecting data security and privacy. It implements secure computation protocols based on homomorphic encryption and multi-party computation (MPC). Supporting various federated learning scenarios, FATE now provides a host of federated learning algorithms, including logistic regression, tree-based algorithms, deep learning and transfer learning. FATE became open source in February 2019, and the FATE TSC was established to lead the FATE open-source community, with members from major domestic cloud computing and financial service enterprises. FedAI is a community that helps businesses and organizations build AI models effectively and collaboratively, using data in accordance with user privacy protection, data security, data confidentiality and government regulations.
    Downloads: 1 This Week
  • 19
    SageMaker MXNet Inference Toolkit

    Toolkit for allowing inference and serving with MXNet in SageMaker

    SageMaker MXNet Inference Toolkit is an open-source library for serving MXNet models on Amazon SageMaker. This library provides default pre-processing, prediction and post-processing for certain MXNet model types, and utilizes the SageMaker Inference Toolkit for starting up the model server, which is responsible for handling inference requests. It is typically used through the AWS Deep Learning Containers (DLCs), the Docker images that Amazon SageMaker uses as the default vehicles for training and inference jobs (see the AWS Deep Learning Containers entry above).
    Downloads: 1 This Week
  • 20
    tvm

    Open deep learning compiler stack for cpu, gpu, etc.

    Apache TVM is an open source machine learning compiler framework for CPUs, GPUs, and machine learning accelerators. It aims to enable machine learning engineers to optimize and run computations efficiently on any hardware backend. The vision of the Apache TVM Project is to host a diverse community of experts and practitioners in machine learning, compilers, and systems architecture to build an accessible, extensible, and automated open-source framework that optimizes current and emerging machine learning models for any hardware platform. TVM supports compilation of deep learning models from Keras, MXNet, PyTorch, TensorFlow, CoreML, DarkNet and more. Start using TVM with Python today, and build out production stacks using C++, Rust, or Java the next day.
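    A hedged sketch of compiling a tiny PyTorch model with the Relay frontend and running it with the graph executor on CPU; the module paths follow recent TVM releases and may differ across versions.

    ```python
    import torch
    import tvm
    from tvm import relay
    from tvm.contrib import graph_executor

    # A toy model traced with TorchScript.
    model = torch.nn.Sequential(torch.nn.Linear(4, 2)).eval()
    example = torch.randn(1, 4)
    scripted = torch.jit.trace(model, example)

    # Import into Relay and build for the local CPU (LLVM) target.
    mod, params = relay.frontend.from_pytorch(scripted, [("input0", (1, 4))])
    lib = relay.build(mod, target="llvm", params=params)

    dev = tvm.cpu()
    runtime = graph_executor.GraphModule(lib["default"](dev))
    runtime.set_input("input0", tvm.nd.array(example.numpy()))
    runtime.run()
    print(runtime.get_output(0).numpy())
    ```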
    Downloads: 1 This Week
  • 21
    AI-Agent-Host

    The AI Agent Host is a module-based development environment.

    The AI Agent Host integrates several advanced technologies and offers a unique combination of features for the development of language model-driven applications. The AI Agent Host is a module-based environment designed to facilitate rapid experimentation and testing. It includes a docker-compose configuration with QuestDB, Grafana, Code-Server and Nginx. The AI Agent Host provides a seamless interface for managing and querying data, visualizing results, and coding in real-time. The AI Agent Host is built specifically for LangChain, a framework dedicated to developing applications powered by language models. LangChain recognizes that the most powerful and distinctive applications go beyond simply utilizing a language model and strive to be data-aware and agentic. Being data-aware involves connecting a language model to other sources of data, enabling a comprehensive understanding and analysis of information.
    Downloads: 0 This Week
  • 22
    AWS Neuron

    Powering Amazon custom machine learning chips

    AWS Neuron is a software development kit (SDK) for running machine learning inference using AWS Inferentia chips. It consists of a compiler, runtime, and profiling tools that enable developers to run high-performance and low-latency inference using AWS Inferentia-based Amazon EC2 Inf1 instances. Using Neuron, developers can easily train their machine learning models on any popular framework such as TensorFlow, PyTorch, and MXNet, and run them optimally on Amazon EC2 Inf1 instances. You can continue to use the same ML frameworks you use today and migrate your software onto Inf1 instances with minimal code changes and without being tied to vendor-specific solutions. Neuron is pre-integrated into popular machine learning frameworks like TensorFlow, MXNet and PyTorch to provide a seamless training-to-inference workflow. It includes a compiler, runtime driver, and debug and profiling utilities with a TensorBoard plugin for visualization.
    Downloads: 0 This Week
  • 23
    AutoKeras

    AutoML library for deep learning

    AutoKeras is an AutoML system based on Keras, developed by the DATA Lab at Texas A&M University. The goal of AutoKeras is to make machine learning accessible to everyone. AutoKeras only supports Python 3; if you installed TensorFlow in a virtualenv, you can simply activate that virtualenv. Currently, AutoKeras is only compatible with Python >= 3.7 and TensorFlow >= 2.8.0. AutoKeras supports several tasks with an extremely simple interface and will search for the best detailed configuration for you. Moreover, you can override the base classes to create your own blocks.
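    A minimal sketch on MNIST; max_trials=1 and epochs=1 keep the search tiny, raise both for a real run.

    ```python
    import autokeras as ak
    from tensorflow.keras.datasets import mnist

    (x_train, y_train), (x_test, y_test) = mnist.load_data()

    # AutoKeras searches over architectures and hyperparameters automatically.
    clf = ak.ImageClassifier(max_trials=1, overwrite=True)
    clf.fit(x_train, y_train, epochs=1)

    print(clf.predict(x_test[:5]))
    print(clf.evaluate(x_test, y_test))
    ```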
    Downloads: 0 This Week
  • 24
    Awesome Graph Classification

    Graph embedding, classification and representation learning papers

    A collection of graph classification methods, covering embedding, deep learning, graph kernel and factorization papers with reference implementations. Relevant graph classification benchmark datasets are available. Similar collections about community detection, classification/regression tree, fraud detection, Monte Carlo tree search, and gradient boosting papers with implementations.
    Downloads: 0 This Week
  • 25
    ChainerRL

    ChainerRL is a deep reinforcement learning library

    ChainerRL is a deep reinforcement learning library that implements various state-of-the-art deep reinforcement learning algorithms in Python using Chainer, a flexible deep learning framework. PFRL is the PyTorch analog of ChainerRL. ChainerRL ships with a set of accompanying visualization tools to aid developers in understanding and debugging their RL agents; with these tools, the behavior of ChainerRL agents can be easily inspected from a browser UI. Environments that support the subset of OpenAI Gym's interface (reset and step methods) can be used.
    Downloads: 0 This Week