Maximum marginal likelihood estimation for nonnegative dictionary learning

30th January 2014

Cédric Févotte, research scientist at CNRS, Laboratoire Lagrange, Nice, France

Abstract:

Nonnegative data decompositions form a currently popular research topic in signal processing and machine learning, with applications in text retrieval, audio source separation, image inpainting, hyperspectral unmixing, and more. In this talk I will review recent probabilistic models for nonnegative data (additive Gaussian, Poisson, multinomial, multiplicative Gamma) and discuss two approaches to estimation: maximum joint likelihood estimation (MJLE, closely related to penalized nonnegative matrix factorization) and the less common maximum marginal likelihood estimation (MMLE), which involves integrating out the expansion coefficients. MMLE provides a better-posed estimator than MJLE and furthermore embeds automatic model order selection, a surprising result validated empirically. I will present examples of decomposition of word counts from song lyrics for semantic analysis, and of decomposition of spectrograms for audio source separation.

Bibliography:

  • [1] O. Dikmen and C. Févotte. Maximum marginal likelihood estimation for nonnegative dictionary learning in the Gamma-Poisson model. IEEE Transactions on Signal Processing, 2012.
  • [2] O. Dikmen and C. Févotte. Nonnegative dictionary learning in the exponential noise model for adaptive music signal representation. NIPS, 2011.
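To make the MJLE/MMLE distinction concrete, the following is a minimal sketch of MJLE in the Poisson model mentioned in the abstract: with V ~ Poisson(WH), maximizing the joint likelihood over (W, H) is equivalent to NMF under the generalized Kullback-Leibler divergence. All dimensions, iteration counts, and initializations below are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)
F, N, K = 20, 100, 3  # features x samples, K components (assumed sizes)

# Synthetic nonnegative data from the Poisson observation model V ~ Poisson(W H).
W_true = rng.gamma(2.0, 1.0, size=(F, K))
H_true = rng.gamma(2.0, 1.0, size=(K, N))
V = rng.poisson(W_true @ H_true).astype(float)

def kl_divergence(V, WH):
    """Generalized KL divergence D(V || WH): the negative Poisson
    log-likelihood of V given WH, up to an additive constant."""
    mask = V > 0
    return np.sum(WH - V) + np.sum(V[mask] * np.log(V[mask] / WH[mask]))

# MJLE via the standard multiplicative updates for KL-NMF.
# MMLE would instead integrate H out under a prior (e.g. Gamma, as in the
# Gamma-Poisson model of [1]); that marginalization is not done here.
W = rng.random((F, K)) + 0.5
H = rng.random((K, N)) + 0.5
kl_start = kl_divergence(V, W @ H)
for _ in range(200):
    W *= ((V / (W @ H)) @ H.T) / H.sum(axis=1)
    H *= (W.T @ (V / (W @ H))) / W.sum(axis=0)[:, None]
kl_end = kl_divergence(V, W @ H)
```

The multiplicative updates preserve nonnegativity of W and H and monotonically decrease the KL divergence, which is why they are a standard baseline for the joint (MJLE) approach.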