This package makes it easy to compute gradients of complicated, deeply nested functions. It is designed for Machine Learning, where computing gradients of complex Neural Networks is routine.
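For a rough sense of what this looks like, here is a minimal sketch that differentiates f(x) = sum(x*x). The identifiers below (autofunc.Variable, autofunc.Mul, autofunc.SumAll, autofunc.NewGradient, and the linalg.Vector type from github.com/unixpickle/num-analysis/linalg) are assumptions about the package's exported API; consult the godoc for the authoritative version.

    package main

    import (
        "fmt"

        "github.com/unixpickle/autofunc"
        "github.com/unixpickle/num-analysis/linalg"
    )

    func main() {
        // f(x) = sum(x*x); its gradient is 2x.
        // (API names here are assumptions; see the package docs.)
        x := &autofunc.Variable{Vector: linalg.Vector{1, 2, 3}}
        f := autofunc.SumAll(autofunc.Mul(x, x))

        // Back-propagate an upstream gradient of 1 through the
        // scalar output to fill in the gradient map.
        grad := autofunc.NewGradient([]*autofunc.Variable{x})
        f.PropagateGradient(linalg.Vector{1}, grad)

        fmt.Println(f.Output()) // should print [14]
        fmt.Println(grad[x])    // should print [2 4 6]
    }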
In addition to basic backpropagation (a form of reverse-mode automatic differentiation), autofunc can perform forward-mode automatic differentiation with respect to a single variable, an operation known as the R operator. This makes autofunc suitable for approximating various aspects of a function's Hessian, such as the magnitudes of the Hessian's rows (as suggested in this paper).
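As a hedged sketch of how the R operator might be used, the following computes a Hessian-vector product for f(x) = sum(x*x), whose Hessian is 2I, so the result should equal 2v. The R-flavored names (autofunc.RVector, autofunc.NewRVariable, autofunc.MulR, autofunc.SumAllR, autofunc.NewRGradient, PropagateRGradient) are assumptions about the API rather than guaranteed signatures.

    package main

    import (
        "fmt"

        "github.com/unixpickle/autofunc"
        "github.com/unixpickle/num-analysis/linalg"
    )

    func main() {
        // f(x) = sum(x*x) has Hessian 2I, so H*v should be 2v.
        // (API names here are assumptions; see the package docs.)
        x := &autofunc.Variable{Vector: linalg.Vector{1, 2, 3}}
        v := linalg.Vector{1, 0, -1}

        // The R operator runs forward-mode differentiation in the
        // direction v alongside the normal computation.
        rx := autofunc.NewRVariable(x, autofunc.RVector{x: v})
        rf := autofunc.SumAllR(autofunc.MulR(rx, rx))

        // Back-propagating through the R results yields the
        // directional derivative of the gradient, i.e. H*v.
        rgrad := autofunc.NewRGradient([]*autofunc.Variable{x})
        rf.PropagateRGradient(linalg.Vector{1}, linalg.Vector{0}, rgrad, nil)

        fmt.Println(rgrad[x]) // should print [2 0 -2]
    }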
Installation is easy:
    $ go get github.com/unixpickle/autofunc
To see how you might create something like a Multilayer Perceptron network, check out bench/mlp.go.