Google Integrates Our Neural Turing Machine Implementation into TensorFlow
Google has integrated our implementation of a Neural Turing Machine (NTM) into its official TensorFlow release 1.14.0. The implementation is based on the work of our student Mark Collier. The work was previously published at the 27th International Conference on Artificial Neural Networks (ICANN), where it received the best-paper award.
Neural Turing Machines (NTMs) are an instance of Memory-Augmented Neural Networks, a class of recurrent neural networks that decouples computation from memory by introducing an external memory unit. The controller network reads from and writes to this memory through differentiable attention mechanisms, so the whole system can be trained end-to-end. Neural Turing Machines have demonstrated superior performance over Long Short-Term Memory (LSTM) cells on several sequence-learning tasks.
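To illustrate the core idea of an external memory, the following is a minimal NumPy sketch of content-based addressing, the mechanism an NTM controller uses to focus on memory slots whose contents resemble an emitted key. The function names and toy values here are illustrative only and are not taken from the released implementation.

```python
import numpy as np

def cosine_similarity(key, memory):
    # Cosine similarity between the key and every memory row.
    dots = memory @ key
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    return dots / norms

def content_addressed_read(memory, key, beta):
    """Read from an external memory via content-based addressing.

    memory: (N, M) array holding N memory slots of width M
    key:    (M,) query vector emitted by the controller
    beta:   scalar key strength; larger values sharpen the focus
    """
    scores = beta * cosine_similarity(key, memory)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()          # softmax: attention over the N slots
    return weights @ memory, weights  # read vector and the soft weighting

# Toy usage: a 4-slot memory queried with a key resembling slot 0.
memory = np.array([[1.0, 0.0], [0.0, 1.0], [0.9, 0.1], [0.5, 0.5]])
read_vec, weights = content_addressed_read(memory, np.array([1.0, 0.0]), beta=10.0)
```

Because the read is a soft (differentiable) weighted sum rather than a hard lookup, gradients flow through the addressing step, which is what lets the memory interactions be learned jointly with the controller.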
Neural Turing Machines were introduced five years ago, but the original authors did not release their source code. A number of researchers wrote open-source implementations of Neural Turing Machines, but these were unstable during training and/or failed to replicate the reported performance. We are, to the best of our knowledge, the first to release a stable open-source implementation of a Neural Turing Machine.
Our implementation has already received considerable interest from the machine-learning community, including 443 stars on GitHub and a tweet by David Ha (Google Brain) with over 500 likes and 200 retweets ("After 4 years, someone finally implemented a stable version of NTM"). We are glad that Google has now decided to make our implementation the official reference implementation of NTMs in TensorFlow, and we hope this integration enables more researchers and machine-learning engineers to use Neural Turing Machines in their work.