A fast and simple algorithm for training neural probabilistic language models

Andriy Mnih, Research Associate at University College London, London, U.K.

In spite of their superior performance, neural probabilistic language models (NPLMs) remain far less widely used than n-gram models due to their notoriously long training times. Training NPLMs is computationally expensive because they are explicitly normalized, which means every word in the vocabulary must be considered when computing the log-likelihood gradients. We propose a fast and simple algorithm for training NPLMs based on noise-contrastive estimation.
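The abstract's central point, that explicit normalization forces a sum over the whole vocabulary while noise-contrastive estimation only needs a handful of noise samples, can be made concrete with a short sketch. The following is a minimal NumPy illustration, not the paper's implementation: the bilinear scoring model, the uniform noise distribution, and all names (`score`, `nce_loss`, `k`, `p_noise`) are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

V, D, k = 10_000, 50, 25           # vocabulary size, embedding dim, noise samples
C = rng.normal(0.0, 0.1, (V, D))   # context-word embeddings (hypothetical model)
W = rng.normal(0.0, 0.1, (V, D))   # target-word embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def score(ctx, w):
    """Unnormalized log-score of word(s) w in context ctx; note that no
    softmax over the full vocabulary is ever computed."""
    return W[w] @ C[ctx]

def nce_loss(ctx, target, noise_probs):
    """Noise-contrastive estimation loss for one (context, target) pair.

    The true word is classified against k samples from the noise
    distribution, so the gradient touches only k + 1 words rather
    than all V of them.
    """
    noise = rng.choice(V, size=k, p=noise_probs)
    # Log-odds that a word came from the data rather than the noise:
    # s_theta(w, ctx) - log(k * p_noise(w)).
    delta_data = score(ctx, target) - np.log(k * noise_probs[target])
    delta_noise = score(ctx, noise) - np.log(k * noise_probs[noise])
    # Logistic loss: the true word should look like data, noise words like noise.
    return -(np.log(sigmoid(delta_data)) + np.log(sigmoid(-delta_noise)).sum())

# Unigram noise distribution; uniform here purely for simplicity.
p_noise = np.full(V, 1.0 / V)
print(nce_loss(ctx=3, target=42, noise_probs=p_noise))
```

With an explicitly normalized model, the equivalent loss would require scoring all V words per training example; here the cost per example depends only on k, which is the source of the speedup the abstract describes.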


Published on: Friday 24 October 2014