
Activation maximization experiments

Activation maximization is a technique mainly used to generate adversarial examples. Its utility in interpreting and explaining the decisions made by neural networks is somewhat limited for now. I believe that with some improvements this method can shed more light on the decision-making process of neural networks.

Files:

  • MNIST.py is a Python script containing the training loop and the neural network architecture
  • model_MNIST.pt is the trained model
  • AM.py is a Python script containing the activation maximization algorithm (a simplified sketch is shown below)
  • relu x-0.01 are pictures of the results for digit x using regularization factor lambda = 0.01
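
For orientation, here is a minimal sketch of what such an activation maximization loop can look like. The actual implementation lives in AM.py; the input shape, optimizer settings, step count, and the L2 penalty form below are assumptions for illustration, not taken from this repository.

```python
import torch

def activation_maximization(model, target_digit, lam=0.01, steps=500, lr=0.1):
    """Gradient-ascent sketch: optimize an input image so that the logit
    for `target_digit` is maximized, with an L2 penalty weighted by `lam`.
    Assumes an MNIST-style network: 1x28x28 input, 10 output logits."""
    model.eval()
    # Start from a random grayscale image and optimize its pixels directly.
    image = torch.randn(1, 1, 28, 28, requires_grad=True)
    optimizer = torch.optim.Adam([image], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        logits = model(image)
        # Maximize the target activation; penalize large pixel magnitudes.
        loss = -logits[0, target_digit] + lam * image.norm() ** 2
        loss.backward()
        optimizer.step()
    return image.detach()

# Usage sketch (assumes model_MNIST.pt stores the whole module object;
# if it stores a state_dict, the architecture from MNIST.py must be
# instantiated first and loaded with load_state_dict):
# model = torch.load("model_MNIST.pt")
# img = activation_maximization(model, target_digit=3, lam=0.01)
```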
