Integrating MAP, Marginals, and Unsupervised Language Model Adaptation

Citation

Wang, W., & Stolcke, A. (2007). Integrating MAP, marginals, and unsupervised language model adaptation. In Proceedings of the Eighth Annual Conference of the International Speech Communication Association (Interspeech 2007).

Abstract

We investigate the integration of several language model adaptation approaches for a cross-genre adaptation task, with the goal of improving Mandarin ASR performance on a recently introduced genre, broadcast conversation (BC). The adaptation strategies are evaluated by their effect on ASR performance; they include unsupervised language model adaptation from ASR transcripts as well as ways to integrate supervised Maximum A Posteriori (MAP) and marginal adaptation within the unsupervised adaptation framework. We found that by effectively combining these adaptation approaches, we can reduce the final recognition error rate in the BC genre by as much as 1.3% absolute (6% relative).
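For readers unfamiliar with the two supervised techniques named above, the following is a minimal sketch of their standard formulations from the language model adaptation literature; the symbols (background model $p_B$, adaptation counts $c_A$, prior weight $\tau$, scaling exponent $\beta$) are conventional notation and the paper's exact parameterization may differ. MAP adaptation mixes adaptation-data n-gram counts with the background model acting as a prior:

$$
p_{\mathrm{MAP}}(w \mid h) \;=\; \frac{c_A(h w) \;+\; \tau \, p_B(w \mid h)}{c_A(h) \;+\; \tau},
$$

while unigram marginal adaptation rescales the background model toward the target-domain unigram distribution and renormalizes:

$$
p_{\mathrm{marg}}(w \mid h) \;\propto\; \left(\frac{p_A(w)}{p_B(w)}\right)^{\beta} p_B(w \mid h).
$$

In a cross-genre setting such as this one, the adaptation statistics ($c_A$, $p_A$) can come either from supervised in-genre text or, in the unsupervised case, from automatic transcripts produced by a first recognition pass.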

