Conference paper, Year: 2019

An overview of asynchronous delay iterations: mathematical analysis and algorithms

Abstract

Asynchronous delay iterations from discrete mathematics have long been used to accelerate convergence in high-performance computing. Their particularity is to be defined on an infinite (discrete) set that can be seen as a Cartesian product: at each iteration, a bit vector of fixed size is updated by a function whose output also depends on a parameter, obtained as the first term of a sequence that is shifted at each iteration. The resulting product space is infinite, yet only bounded integers are ever handled. Moreover, only the bit vector needs to be stored in the finite-state machine: a new term of the infinite sequence can simply be read at each iteration. We thus obtain a discrete dynamical system that can be implemented as is on our computers but which, by receiving new data at each clock tick, iterates over an infinite set. Any algorithm that computes the function above and iterates so as to update the machine's memory (the bit vector) from a new input is therefore, in the end, a discrete dynamical system iterating over an infinite discrete set, whose chaos can be studied and measured. This approach has proved rich in perspectives. Indeed, provided the framework above is respected, it is possible to design algorithms whose realization on the machine corresponds exactly to the discrete dynamical system studied mathematically: the Cartesian product mentioned above is the set of all bit vectors that can be stored in memory, coupled with all the bit sequences that can be provided as inputs of the machine, which is indeed an infinite set. This exact correspondence therefore makes it possible to define and design, concretely and without any approximation, programs that are mathematically proven to be chaotic, and finite-state machines, such as Turing machines, that are proven to be chaotic in the same precise sense. This framework has been used over the past decade to design algorithms in computer security and pseudorandom number generation, for the Internet of Things, and more recently to design new artificial intelligence algorithms. The purpose of this article is to review the various theoretical advances on these asynchronous iterations, concerning such notions as the characterization of Devaney chaos, topological and metric entropy, the Lyapunov exponent, and ergodicity, as well as the concrete algorithmic applications published over the past ten years.
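
To make the construction concrete, here is a minimal sketch in Python, assuming the standard asynchronous (chaotic) iteration formulation this literature builds on: a function f on bit vectors, a stored bit vector x, and an input sequence (the "strategy") whose first term selects, at each step, which coordinate of f(x) is written back. The function name asynchronous_iterations, the vectorial negation, and the particular strategy below are illustrative choices, not code from the paper.

from itertools import cycle
from typing import Callable, Iterator, List

def asynchronous_iterations(f: Callable[[List[int]], List[int]],
                            x: List[int],
                            strategy: Iterator[int],
                            steps: int) -> List[int]:
    """Iterate `steps` times: at each step, read one new term of `strategy`
    and overwrite only that coordinate of x with the corresponding
    coordinate of f(x). The machine stores only x; the (potentially
    infinite) strategy is consumed one term at a time."""
    x = list(x)
    for _ in range(steps):
        i = next(strategy)   # the "parameter" read from the input sequence
        x[i] = f(x)[i]       # asynchronous update of a single coordinate
    return x

# Illustrative use with the vectorial negation and an arbitrary periodic
# strategy; both are assumptions made for the example only.
if __name__ == "__main__":
    negation = lambda v: [1 - b for b in v]
    print(asynchronous_iterations(negation, [0, 1, 0, 1], cycle([0, 2, 1, 3]), 10))

The state of the underlying dynamical system is the pair (remaining strategy, bit vector), which ranges over the infinite Cartesian product evoked in the abstract, even though the program itself only ever stores the fixed-size bit vector.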
No file deposited

Dates and versions

hal-03221929, version 1 (20-05-2021)

Identifiers

  • HAL Id: hal-03221929, version 1

Cite

Christophe Guyeux. An overview of asynchronous delay iterations: mathematical analysis and algorithms. International Conference on Chaotic Modeling, Simulation and Applications, Jun 2019, Chania, Greece. ⟨hal-03221929⟩
