AMBV: An Optimized Generic Viterbi Algorithm for Bayesian Networks
Abstract
Bayesian networks are a family of machine learning models widely used in applications such as speech recognition, protocol reverse engineering, and, more generally, the retrieval of hidden information in data. These models, at the crossroads of probability theory and graph theory, allow intuitive modelling and straightforward interpretation of results. In this paper, we are interested in inferring the most probable state of a discrete Bayesian network. In the case of a Hidden Markov Model, the Viterbi algorithm is usually used. However, although exact, it is not optimized, and when generalized to an arbitrary number of variables per time slice, its complexity grows exponentially. We have therefore developed an optimized version of the Viterbi algorithm, Automatic Markov Boundaries construction optimized Viterbi (AMBV), which takes advantage of the fact that once a model is trained, it is typically used to analyze many observations. Moreover, when the probability distributions of the variables are sparse, an additional level of optimization is applied. Finally, in order to make inference of the most probable state possible for any discrete Bayesian network, we propose a mechanism for automatically generating a set of Markov Boundaries for the network. We show that AMBV performs significantly better than the classical Viterbi algorithm as soon as the network is sufficiently complex.
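For reference, the classical Viterbi algorithm mentioned above computes, by dynamic programming, the most probable hidden-state sequence of a Hidden Markov Model given a sequence of observations. The sketch below is a minimal textbook implementation in Python/NumPy with hypothetical parameter names; it illustrates the baseline algorithm the paper compares against, not the AMBV method itself.

```python
import numpy as np

def viterbi(observations, start_prob, trans_prob, emit_prob):
    """Classical Viterbi decoding for a discrete HMM.

    observations : sequence of observation indices
    start_prob   : (n_states,)           initial state distribution
    trans_prob   : (n_states, n_states)  transition probabilities
    emit_prob    : (n_states, n_obs)     emission probabilities
    Returns the most probable hidden-state sequence.
    """
    n_states = len(start_prob)
    T = len(observations)
    # delta[t, s]: probability of the best path ending in state s at time t
    delta = np.zeros((T, n_states))
    # psi[t, s]: predecessor of state s on that best path
    psi = np.zeros((T, n_states), dtype=int)

    delta[0] = start_prob * emit_prob[:, observations[0]]
    for t in range(1, T):
        for s in range(n_states):
            scores = delta[t - 1] * trans_prob[:, s]
            psi[t, s] = np.argmax(scores)
            delta[t, s] = scores[psi[t, s]] * emit_prob[s, observations[t]]

    # Backtrack from the most probable final state
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return list(reversed(path))
```

With a single hidden variable per time slice, the cost is O(T·n²) for T observations and n states; generalizing to several variables per slice multiplies the effective state space, which is the exponential blow-up the abstract refers to.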