Journal article in IEEE Transactions on Signal Processing, 2023

Expressivity of hidden Markov chains vs. Recurrent neural networks from a system theoretic viewpoint

Abstract

Hidden Markov Chains (HMC) and Recurrent Neural Networks (RNN) are two well-known tools for predicting time series. Even though these solutions were developed independently in distinct communities, they share some similarities when considered as probabilistic structures. In this paper, we first consider HMC and RNN as generative models and embed both structures in a common generative unified model (GUM). We then conduct a comparative study of the expressivity of these models. To that end, we further assume that the models are linear and Gaussian. The probability distributions produced by these models are characterized by structured covariance series; as a consequence, expressivity reduces to comparing sets of structured covariance series, which enables us to appeal to stochastic realization theory (SRT). We finally provide conditions under which a given covariance series can be realized by a GUM, an HMC, or an RNN.
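
To make the covariance-series characterization concrete, the following is a minimal illustrative sketch (not taken from the paper; the state-space notation F, H, Q, R is an assumption) of a linear-Gaussian hidden Markov chain viewed as a generative model, together with the structured covariance series of its stationary output. Comparing expressivity then amounts to asking which such covariance series each model class can realize.

```python
import numpy as np

# Illustrative linear-Gaussian HMC (state-space) model, with assumed notation:
#   x_{t+1} = F x_t + u_t,   u_t ~ N(0, Q)
#   y_t     = H x_t + v_t,   v_t ~ N(0, R)
# In the stationary regime, the output distribution is characterized by the
# covariance series gamma(k) = Cov(y_{t+k}, y_t).

F = np.array([[0.9, 0.1],
              [0.0, 0.7]])      # state transition (stable: eigenvalues inside the unit circle)
H = np.array([[1.0, 0.5]])      # observation matrix
Q = 0.1 * np.eye(2)             # state noise covariance
R = np.array([[0.05]])          # observation noise covariance

# Stationary state covariance P solves the discrete Lyapunov equation P = F P F^T + Q;
# a simple fixed-point iteration converges because F is stable.
P = np.zeros_like(Q)
for _ in range(1000):
    P = F @ P @ F.T + Q

def gamma(k):
    """Output covariance gamma(k) = Cov(y_{t+k}, y_t) of the stationary model."""
    if k == 0:
        return H @ P @ H.T + R
    return H @ np.linalg.matrix_power(F, k) @ P @ H.T

print([round(gamma(k).item(), 4) for k in range(5)])
```

Under this (assumed) parameterization, each choice of (F, H, Q, R) yields one covariance series gamma(k); the paper's question is which covariance series are attainable by a GUM, an HMC, or an RNN.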

Dates and versions

hal-03746170, version 1 (12-08-2022)


Cite

François Desbouvries, Yohan Petetin, Achille Salaün. Expressivity of hidden Markov chains vs. Recurrent neural networks from a system theoretic viewpoint. IEEE Transactions on Signal Processing, 2023, 71, pp.4178-4191. ⟨10.1109/TSP.2023.3328108⟩. ⟨hal-03746170⟩