Autograd can automatically differentiate native Python and NumPy code. It handles a large subset of Python's features, including loops, ifs, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. Because the gradient of a function is itself a function, you can keep differentiating as many times as you like, and NumPy's vectorization lets you evaluate the derivative of a scalar-valued function across many input values at once. The main intended application of Autograd is gradient-based optimization. For more information, check out the tutorial and the examples directory.
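
For instance, here is a short sketch in the spirit of the tutorial's tanh example, using autograd's grad and elementwise_grad (the finite-difference line is just a sanity check):

    import autograd.numpy as np                      # Thinly-wrapped NumPy
    from autograd import grad                        # Differentiates scalar-valued functions
    from autograd import elementwise_grad as egrad   # Vectorized elementwise derivative

    def tanh(x):
        y = np.exp(-2.0 * x)
        return (1.0 - y) / (1.0 + y)

    grad_tanh = grad(tanh)                 # d(tanh)/dx, returned as an ordinary function
    print(grad_tanh(1.0))                  # Exact derivative at x = 1.0
    print((tanh(1.0001) - tanh(0.9999)) / 0.0002)   # Finite-difference check

    # Gradients are themselves functions, so we can keep differentiating,
    # and evaluate each derivative at many inputs at once:
    x = np.linspace(-7, 7, 200)
    first = egrad(tanh)(x)                 # First derivative, elementwise
    third = egrad(egrad(egrad(tanh)))(x)   # Third derivative, elementwise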

Features

  • Simple neural net (see the sketch after this list)
  • Convolutional neural net
  • Recurrent neural net
  • LSTM
  • Neural Turing Machine
  • Backpropagating through a fluid simulation
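
The first of these, a simple neural net, might look like the following minimal sketch: a single-hidden-layer network trained by plain gradient descent, with gradients taken with respect to a dict of weight arrays. The layer sizes, toy data, and learning rate here are illustrative placeholders, not the code of the bundled example.

    import autograd.numpy as np
    import autograd.numpy.random as npr
    from autograd import grad

    def predict(weights, inputs):
        # One hidden layer with a tanh nonlinearity; weights is a dict of arrays.
        hidden = np.tanh(np.dot(inputs, weights['W1']) + weights['b1'])
        return np.dot(hidden, weights['W2']) + weights['b2']

    def loss(weights, inputs, targets):
        return np.mean((predict(weights, inputs) - targets) ** 2)

    loss_grad = grad(loss)   # Gradient with respect to the first argument (weights)

    npr.seed(0)              # Toy data and initialization (illustrative only)
    weights = {'W1': 0.1 * npr.randn(3, 4), 'b1': np.zeros(4),
               'W2': 0.1 * npr.randn(4, 1), 'b2': np.zeros(1)}
    inputs, targets = npr.randn(8, 3), npr.randn(8, 1)

    for step in range(100):  # Plain gradient descent on the mean-squared error
        g = loss_grad(weights, inputs, targets)
        weights = {name: w - 0.1 * g[name] for name, w in weights.items()}

Because autograd differentiates through native Python control flow and containers, loss_grad returns a dict with the same structure as weights, so the update step is a plain dict comprehension.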

License

MIT License

Additional Project Details

Programming Language: Python
Related Categories: Python Source Code Analysis Tool
Registered: 2021-10-12