Machine Learning

An Empirical Comparison of Syllabuses for Curriculum Learning (Pre-Print)

We have published a pre-print (now available on arXiv) which outlines our work comparing different syllabuses for curriculum learning. Neural networks are typically trained by repeatedly selecting examples from a dataset uniformly at random and taking steps of stochastic gradient descent. Curriculum learning is an alternative approach, inspired by human learning, in which training examples are presented according to a syllabus, typically of increasing “difficulty”. Curriculum learning has shown some impressive empirical results, but little is known about the relative merits of different syllabuses. In this work we provide an empirical comparison of a number of syllabuses found in the literature.
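To make the contrast concrete, here is a minimal sketch of the two sampling regimes. The linear pacing schedule, the `difficulty` scoring function, and the toy integer "dataset" are illustrative assumptions for this post, not the specific syllabuses compared in the paper.

```python
import random

def random_syllabus(examples, steps, batch_size, seed=0):
    """Standard training: sample each batch uniformly at random."""
    rng = random.Random(seed)
    return [rng.sample(examples, batch_size) for _ in range(steps)]

def curriculum_syllabus(examples, difficulty, steps, batch_size):
    """Curriculum training: present examples in order of increasing difficulty.

    Early steps draw only from the easiest examples; the candidate pool
    grows linearly until the full dataset is available (an assumed,
    illustrative pacing schedule).
    """
    ordered = sorted(examples, key=difficulty)
    batches = []
    for step in range(steps):
        # Fraction of the sorted dataset available at this step.
        frac = (step + 1) / steps
        pool = ordered[: max(batch_size, int(frac * len(ordered)))]
        batches.append(random.Random(step).sample(pool, batch_size))
    return batches

# Toy "dataset": integers, with magnitude standing in for difficulty.
data = list(range(100))
batches = curriculum_syllabus(data, difficulty=abs, steps=10, batch_size=4)
# The first batch is drawn only from the easiest tenth of the data.
assert all(x < 10 for x in batches[0])
```

A real syllabus would replace `abs` with a task-specific difficulty measure (e.g. sequence length or model loss), which is exactly the ad-hoc, per-task choice the paper examines.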

By Mark Collier
Machine Learning

A Stable Neural Turing Machine (NTM) Implementation (Source Code and Pre-Print)

We have released an open-source implementation of a Neural Turing Machine for TensorFlow, and published on arXiv the corresponding paper, which we will present at ICANN 2018. Neural Turing Machines are notoriously difficult to implement and train. Our contribution is to explain why previous implementations have not successfully replicated the results in the original Neural Turing Machines paper. We then went on to produce an open-source implementation which trains reliably and quickly. The work was done as part of my undergraduate thesis in the group of Prof. Joeran Beel at the School of Computer Science and Statistics and the ADAPT Centre at Trinity College Dublin. We published the implementation and pre-print only a few days ago, and have received considerable interest.

By Mark Collier