
It's a simple feed-forward neural network built from scratch using only NumPy. I created this project as a tutorial for my study group


neural_network_tuts

A feed-forward neural network built from scratch with NumPy and applied to the MNIST dataset.

Project structure

There are two modules, layers and nn. The layers module contains handy layers such as ReLU, Softmax, and Dense, while nn contains a simple feed-forward neural net. A Jupyter notebook under the scripts package runs the neural net on the MNIST dataset.
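To give a feel for what such a layers module looks like, here is a minimal sketch of Dense, ReLU, and Softmax layers sharing a Layer base class. The class names mirror the ones mentioned above, but the exact signatures, initialization, and update rule are assumptions, not the repository's actual code.

```python
import numpy as np

class Layer:
    """Base interface: forward pass and backward pass (assumed design)."""
    def forward(self, x):
        raise NotImplementedError
    def backward(self, grad):
        raise NotImplementedError

class Dense(Layer):
    """Fully connected layer with a built-in SGD step (lr is a hypothetical parameter)."""
    def __init__(self, n_in, n_out, lr=0.1):
        self.w = np.random.randn(n_in, n_out) * 0.01  # small random init (assumption)
        self.b = np.zeros(n_out)
        self.lr = lr
    def forward(self, x):
        self.x = x                      # cache input for the backward pass
        return x @ self.w + self.b
    def backward(self, grad):
        grad_in = grad @ self.w.T       # gradient w.r.t. the layer input
        self.w -= self.lr * self.x.T @ grad
        self.b -= self.lr * grad.sum(axis=0)
        return grad_in

class ReLU(Layer):
    def forward(self, x):
        self.mask = x > 0               # remember which units were active
        return x * self.mask
    def backward(self, grad):
        return grad * self.mask

class Softmax(Layer):
    def forward(self, x):
        e = np.exp(x - x.max(axis=1, keepdims=True))  # shift for numerical stability
        self.out = e / e.sum(axis=1, keepdims=True)
        return self.out
```

Caching the forward-pass input in each layer is what lets backward compute gradients locally, so the network only needs to chain the layers in reverse order.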

SimpleNn

The simple_nn.SimpleNn class supports building arbitrarily complex neural nets by adding layers of base type Layer. The notebook contains an example that creates a three-layer neural net with ReLU activation for the hidden layers and softmax for the output layer.
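A container like this can be sketched in a few lines. The SimpleNn below is a minimal approximation of the idea, not the repository's actual API; the add method, the layer constructors, and the 784-128-64-10 architecture are all assumptions chosen to match the MNIST setting described above.

```python
import numpy as np

class SimpleNn:
    """Hypothetical sketch of a layer container; the real simple_nn.SimpleNn may differ."""
    def __init__(self):
        self.layers = []
    def add(self, layer):
        self.layers.append(layer)
        return self  # return self so calls can be chained
    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

# Minimal forward-only layers, just enough to run the container.
class Dense:
    def __init__(self, n_in, n_out):
        rng = np.random.default_rng(0)
        self.w = rng.normal(0.0, 0.01, (n_in, n_out))
        self.b = np.zeros(n_out)
    def forward(self, x):
        return x @ self.w + self.b

class ReLU:
    def forward(self, x):
        return np.maximum(x, 0)

class Softmax:
    def forward(self, x):
        e = np.exp(x - x.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

# Three Dense layers: ReLU on the hidden ones, softmax on the output.
net = SimpleNn()
net.add(Dense(784, 128)).add(ReLU()) \
   .add(Dense(128, 64)).add(ReLU()) \
   .add(Dense(64, 10)).add(Softmax())

probs = net.forward(np.zeros((2, 784)))  # dummy batch of two flattened 28x28 images
```

The forward pass simply threads the batch through the layer list, so any stack of Layer objects works without changes to the container.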

Excerpts from the notebook

The model tends to converge in under 10 epochs and starts overfitting after that.

(Accuracy and loss curves)

Samples from the test set with their predictions

(predictions image)

and the confusion matrix

(conf_matrix image)
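A confusion matrix like the one in the notebook is easy to compute with plain NumPy. The notebook's own implementation is not shown here, so this is just one common way to do it, using ufunc.at for in-place accumulation; the function name and class count are illustrative.

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes=10):
    """Rows are true classes, columns are predicted classes."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    # np.add.at handles repeated (true, pred) index pairs correctly.
    np.add.at(cm, (y_true, y_pred), 1)
    return cm

y_true = np.array([0, 1, 2, 2])
y_pred = np.array([0, 2, 2, 2])
cm = confusion_matrix(y_true, y_pred, n_classes=3)
# cm[2, 2] == 2: both true 2s predicted correctly
# cm[1, 2] == 1: one sample of class 1 misclassified as 2
```

Plain fancy-indexed assignment (cm[y_true, y_pred] += 1) would silently drop duplicate index pairs, which is why np.add.at is used instead.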
