The Differentiable Neural Computer (DNC), developed by Google DeepMind, is a neural network architecture augmented with dynamic external memory, enabling it to learn algorithms and solve complex reasoning tasks. It was introduced in the 2016 Nature paper “Hybrid computing using a neural network with dynamic external memory.” The DNC combines the pattern-recognition power of neural networks with a memory matrix that can be written to and read from in a fully differentiable way, so the model can learn to store and retrieve information over long time horizons, much like a conventional computer. The architecture is modular: a controller (typically an LSTM or feedforward network) issues read/write commands, an access module carries out the corresponding memory operations, and submodules track temporal linkage between writes and memory allocation.
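To make the "differentiable memory" idea concrete, here is a minimal NumPy sketch of content-based addressing, the core read mechanism the DNC uses: a lookup key is compared to every memory row by cosine similarity, sharpened by a strength parameter, and normalized with a softmax so the read is a smooth weighted sum. This is an illustrative standalone sketch, not code from this repository; the function names (`content_weights`, `read`) are chosen here for exposition.

```python
import numpy as np

def content_weights(memory, key, beta):
    """Content-based addressing: cosine similarity between a lookup key
    and each memory row, sharpened by strength beta, then softmaxed.
    (Illustrative sketch; names are not from this repo.)"""
    eps = 1e-8
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + eps
    sim = memory @ key / norms            # cosine similarity per slot
    scores = beta * sim
    e = np.exp(scores - scores.max())     # numerically stable softmax
    return e / e.sum()

def read(memory, weights):
    """A read vector is a weighted sum of memory rows, so the whole
    operation is differentiable w.r.t. both memory and weights."""
    return weights @ memory

# Three memory slots of word size 3; query for something close to slot 0.
memory = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.5, 0.5, 0.0]])
w = content_weights(memory, key=np.array([1.0, 0.0, 0.0]), beta=10.0)
r = read(memory, w)
```

Because every step is smooth, gradients flow from the read vector back into both the memory contents and the addressing parameters, which is what lets the controller learn *where* to read and write by gradient descent.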
Features
- Full TensorFlow and Sonnet implementation of the Differentiable Neural Computer
- Modular RNN-based architecture with external differentiable memory
- Access, controller, and addressing modules for flexible experimentation
- Example training script for algorithmic memory-based learning tasks
- Configurable parameters for memory size, sequence length, and optimization settings
- Supports model checkpointing and resuming for long training experiments
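The memory-allocation tracking mentioned above can also be sketched briefly. In the DNC, a usage value per slot determines where new data is written: the freest slots receive the highest allocation weight, computed by sorting slots by usage and discounting each slot by the usage of the freer slots before it. The sketch below is a simplified illustration of that scheme, not the repository's implementation (the real module computes this differentiably inside the access module).

```python
import numpy as np

def allocation_weights(usage):
    """DNC-style allocation weighting (simplified sketch): sort slots by
    usage ascending; slot j in that order receives (1 - u_j) times the
    product of the usages of all freer slots ranked before it."""
    order = np.argsort(usage)             # free list: least-used first
    a = np.zeros_like(usage)
    running = 1.0
    for idx in order:
        a[idx] = (1.0 - usage[idx]) * running
        running *= usage[idx]
    return a

usage = np.array([0.9, 0.1, 0.5])         # slot 1 is the least used
a = allocation_weights(usage)
```

The resulting weights form a soft distribution (summing to at most 1) that concentrates writes on unused memory, which is how the DNC reuses freed slots without overwriting data it still needs.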