Variable-length class sequences based on a hierarchical approach: MCnv
Abstract
In contrast to conventional n-gram approaches, which are the most widely used language models in continuous speech recognition systems, the multigram approach models a stream of variable-length sequences. Motivated by the success of class-based methods in language modeling, we explore their potential use in a multigram framework. To overcome the independence assumption of the classical multigram model, we propose in this paper a hierarchical model, called MCnv, which successively relaxes this assumption. The estimation of the model parameters can be formulated as a Maximum Likelihood estimation problem from incomplete data used at the different levels (j ∈ {1, ..., v}). We show that estimates of the model parameters can be computed through an iterative Expectation-Maximization (EM) algorithm. Experiments were carried out on a class corpus extracted from the automatically labeled French "Le Monde" word corpus. Results show that MCnv outperforms the class-based multigram and the interpolated class trigram models.
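For reference, a minimal sketch of the Maximum Likelihood / EM setting the abstract refers to, written in the standard multigram formulation (the notation below is illustrative and not the paper's own): a class stream $C = c_1 \dots c_T$ is viewed as a concatenation of independent variable-length class sequences of at most $n$ classes, and the likelihood sums over all segmentations $S$ of the stream:

$$
\mathcal{L}(C) \;=\; \sum_{S} \prod_{i=1}^{q(S)} p(s_i),
\qquad
p^{(k+1)}(s) \;=\; \frac{\sum_{S} P^{(k)}(S \mid C)\, n(s, S)}{\sum_{S} P^{(k)}(S \mid C)\, q(S)},
$$

where $s_i$ is the $i$-th sequence of segmentation $S$, $q(S)$ the number of sequences in $S$, and $n(s, S)$ the number of occurrences of sequence $s$ in $S$. The right-hand formula is the standard multigram EM re-estimation (expected count of a sequence divided by the expected total number of sequences); the hierarchical MCnv model described in the abstract would apply such estimation at the successive levels $j \in \{1, \dots, v\}$.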