We study a gcd algorithm directed by Least Significant Bits, the so-called LSB algorithm, and provide a precise average-case analysis of its main parameters (number of iterations, number of shifts, etc.). This analysis is based on a precise study of the dynamical systems which provide a continuous extension of the algorithm, and, here, it proves convenient to use both a 2-adic extension and a real one. This leads to the framework of products of random matrices, and our results thus involve a constant $\gamma$ which is the Lyapunov exponent of the set of matrices relative to the algorithm. The algorithm can be viewed as a race between a dyadic hare with a speed of 2 bits per step and a ``real'' tortoise with a speed equal to $\gamma/\log 2 \sim 0.05$ bits per step. Even if the tortoise starts before the hare, the hare easily catches up with the tortoise (unlike in Aesop's fable [Ae]\ldots), and the algorithm terminates.
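For intuition only, here is a minimal Python sketch of a shift-driven gcd together with the kind of parameters analyzed here (number of iterations, number of shifts). It implements the classical binary (Stein) gcd, not the LSB division step studied in the paper, and the counters are illustrative instrumentation, not the paper's cost model.

```python
# A minimal sketch: the classical binary (Stein) gcd, shown only to
# illustrate a gcd computation driven by least-significant-bit shifts.
# The LSB algorithm analyzed in the paper uses a different, 2-adic
# division step that is not reproduced here.

def binary_gcd(u: int, v: int) -> tuple[int, int, int]:
    """Return (gcd(u, v), iterations, shifts) for positive integers u, v."""
    iterations = 0
    shifts = 0
    # Factor out the power of 2 common to both operands.
    common = 0
    while u % 2 == 0 and v % 2 == 0:
        u //= 2
        v //= 2
        common += 1
        shifts += 1
    while v != 0:
        iterations += 1
        # Shift out trailing zero bits (the least-significant-bit steps).
        while u % 2 == 0:
            u //= 2
            shifts += 1
        while v % 2 == 0:
            v //= 2
            shifts += 1
        # Both operands are now odd: subtract the smaller from the larger.
        if u > v:
            u, v = v, u
        v -= u
    return (u << common, iterations, shifts)

if __name__ == "__main__":
    print(binary_gcd(48, 18))  # (6, <iterations>, <shifts>)
```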