Analysis of fast versions of the Euclid Algorithm
Abstract
There exist fast variants of the gcd algorithm which are all based on principles due to Knuth and Schönhage. On inputs of size n, these algorithms use a Divide and Conquer approach, perform FFT multiplications and stop the recursion at a depth slightly smaller than lg n. A rough estimate of the worst-case complexity of these fast versions provides the bound O(n(log n)² log log n). However, this estimate is based on some heuristics and is not actually proven. Here, we provide a precise probabilistic analysis of some of these fast variants, and we prove that their average bit-complexity on random inputs of size n is Θ(n(log n)² log log n), with a precise remainder term. We view such a fast algorithm as a sequence of what we call interrupted algorithms, and we obtain three results about the (plain) Euclid Algorithm which may be of independent interest. We precisely describe the evolution of the distribution during the execution of the (plain) Euclid Algorithm; we obtain a sharp estimate for the probability that all the quotients produced by the (plain) Euclid Algorithm are small enough; we also exhibit a strong regularity phenomenon, which proves that these interrupted algorithms are locally "similar" to the total algorithm. This finally leads to the precise evaluation of the average bit-complexity of these fast algorithms. This work uses various tools, and is based on a precise study of generalised transfer operators related to the dynamical system underlying the Euclid Algorithm.
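To make the notion of an interrupted algorithm concrete, the following minimal Python sketch (not taken from the paper) shows the plain Euclid Algorithm recording its quotient sequence, together with an "interrupted" variant that stops as soon as the remainder has lost a prescribed number of bits. The stopping parameter delta and the function names are illustrative assumptions, not the paper's definitions.

```python
def euclid_quotients(u, v):
    """Plain Euclid Algorithm on u >= v > 0; returns the gcd and the quotient sequence."""
    quotients = []
    while v > 0:
        q, r = divmod(u, v)
        quotients.append(q)
        u, v = v, r
    return u, quotients


def interrupted_euclid(u, v, delta):
    """Run Euclid steps only until the remainder has lost `delta` bits relative to u.

    `delta` is a hypothetical stopping parameter used here only to illustrate
    the idea of an interrupted run; it is not the paper's formal definition.
    """
    target = u.bit_length() - delta
    quotients = []
    while v > 0 and v.bit_length() > target:
        q, r = divmod(u, v)
        quotients.append(q)
        u, v = v, r
    return (u, v), quotients


if __name__ == "__main__":
    # Consecutive Fibonacci numbers give the longest quotient sequences.
    g, qs = euclid_quotients(832040, 514229)
    print(g, qs)                                  # gcd = 1
    print(interrupted_euclid(832040, 514229, 8))  # stops once 8 bits are lost
```

A fast gcd in the Knuth–Schönhage style can be thought of as chaining such interrupted runs, with the quotients of each run combined via matrix products and fast (FFT-based) multiplication; the sketch above only illustrates the stopping behaviour, not that recursive machinery.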