Conference paper, Year: 2024

Cascade of phase transitions in the training of Energy-based models

Abstract

In this paper, we investigate the feature encoding process in a prototypical energy-based generative model, the Restricted Boltzmann Machine (RBM). We start with an analytical investigation using simplified architectures and data structures, and conclude with a numerical analysis of trainings on real datasets. Our study tracks the evolution of the model's weight matrix through its singular value decomposition, revealing a series of phase transitions associated with the progressive learning of the principal modes of the empirical probability distribution. The model first learns the center of mass of the modes and then progressively resolves all modes through a cascade of phase transitions. We first describe this process in a controlled setup that allows us to study the training dynamics analytically. We then validate our theoretical results by training the Bernoulli-Bernoulli RBM on real datasets. By using datasets of increasing dimension, we show that learning indeed leads to sharp phase transitions in the high-dimensional limit. Moreover, we propose and test a mean-field finite-size scaling hypothesis, which shows that the first phase transition is in the same universality class as the one we studied analytically, reminiscent of the mean-field paramagnetic-to-ferromagnetic phase transition.
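The sketch below is not the authors' code; it is a minimal illustration of the kind of measurement the abstract describes: training a Bernoulli-Bernoulli RBM with one-step contrastive divergence (CD-1) on toy clustered binary data and tracking the singular values of the weight matrix, whose successive take-offs correspond to the cascade of (pseudo-)transitions. All sizes, the learning rate, the synthetic data model, and the choice of CD-1 are illustrative assumptions.

```python
# Hypothetical sketch: Bernoulli-Bernoulli RBM trained with CD-1 on toy data,
# monitoring the singular values of the weight matrix W during training.
import numpy as np

rng = np.random.default_rng(0)
n_vis, n_hid, n_samples = 100, 20, 2000

# Toy clustered binary data: a few "modes" (cluster centers) with bit-flip noise.
centers = rng.integers(0, 2, size=(4, n_vis))
labels = rng.integers(0, 4, size=n_samples)
data = np.abs(centers[labels] - (rng.random((n_samples, n_vis)) < 0.1)).astype(float)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Small random initialization so the model starts close to the "paramagnetic" point.
W = 1e-3 * rng.standard_normal((n_vis, n_hid))
b_v = np.zeros(n_vis)
b_h = np.zeros(n_hid)

lr, batch_size, n_epochs = 0.05, 100, 200
singular_values = []  # trajectory of the leading singular values of W

for epoch in range(n_epochs):
    for start in range(0, n_samples, batch_size):
        v0 = data[start:start + batch_size]
        # Positive phase: hidden activations driven by the data.
        p_h0 = sigmoid(v0 @ W + b_h)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        # One Gibbs step (CD-1): reconstruct visibles, then hidden probabilities.
        p_v1 = sigmoid(h0 @ W.T + b_v)
        v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
        p_h1 = sigmoid(v1 @ W + b_h)
        # Gradient ascent on the CD-1 approximation of the log-likelihood.
        W += lr * (v0.T @ p_h0 - v1.T @ p_h1) / batch_size
        b_v += lr * (v0 - v1).mean(axis=0)
        b_h += lr * (p_h0 - p_h1).mean(axis=0)
    # Record the spectrum of W: modes are learned one by one, each singular
    # value growing away from zero at its own transition.
    s = np.linalg.svd(W, compute_uv=False)
    singular_values.append(s[:4])

print(np.array(singular_values)[::50])  # growth of the leading modes over training
```

In this toy setting, plotting the recorded singular values against the epoch index shows them departing from their small initial values one after another, which is the qualitative signature the paper studies quantitatively via finite-size scaling.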
No file deposited

Dates and versions

hal-04844953, version 1 (18-12-2024)

Identifiers

Cite

Dimitrios Bachtis, Giulio Biroli, Aurélien Decelle, Beatriz Seoane. Cascade of phase transitions in the training of Energy-based models. NeurIPS 2024, Dec 2024, Vancouver, Canada. ⟨10.48550/arXiv.2405.14689⟩. ⟨hal-04844953⟩