EPIA 2015, 9-11 September 2015
Abstract: There is an increasing interest in spectral methods for learning latent-variable language models in the form of weighted automata and context-free grammars. Spectral methods provide an algebraic formulation of the learning problem that directly exploits the recurrence relations satisfied by these functions.
I will review the spectral method from an algebraic perspective that exploits Hankel matrices to represent functions. The key idea is that if a function is computed by a finite state process, then its corresponding Hankel matrix has low rank. Using this fact, we can reduce the problem of searching over the space of finite state functions to searching over the space of low-rank Hankel matrices.
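To make the low-rank property concrete, here is a small illustrative sketch (not part of the abstract; the automaton weights are invented for the example). A weighted automaton with n states computes f(x) = α⊤ A_{x1}⋯A_{xk} β, and the Hankel matrix H[u, v] = f(uv) then factors as a product of an n-column prefix matrix and an n-row suffix matrix, so its rank is at most n:

```python
# Sketch: a 2-state weighted automaton over {a, b}; its Hankel matrix
# has rank at most 2 (the number of states). All weights are made up
# purely for illustration.
import itertools
import numpy as np

alpha = np.array([1.0, 0.0])   # initial weight vector
beta = np.array([0.0, 1.0])    # final weight vector
A = {                          # one transition matrix per symbol
    "a": np.array([[0.5, 0.5], [0.0, 0.2]]),
    "b": np.array([[0.1, 0.0], [0.3, 0.4]]),
}

def f(x):
    """Weight the automaton assigns to the string x."""
    v = alpha
    for symbol in x:
        v = v @ A[symbol]
    return float(v @ beta)

# Hankel matrix H[u, v] = f(uv), indexed by all strings of length <= 3.
strings = ["".join(w) for k in range(4)
           for w in itertools.product("ab", repeat=k)]
H = np.array([[f(u + v) for v in strings] for u in strings])

print(H.shape)                              # (15, 15)
print(np.linalg.matrix_rank(H, tol=1e-10))  # 2, the number of states
```

Although H is 15 x 15 here (and infinite in principle), its rank never exceeds the state count, which is what lets spectral methods recover a finite state model from a truncated Hankel block.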