Euclidean Algorithm
One common trick taught in competitive programming is the formula to calculate the greatest common divisor of two numbers. That is, for integers $a$ and $b$, the GCD is the largest integer $g$ such that $g \mid a$ and $g \mid b$. The algorithm can be thought of as the following: given two integers $a$ and $b$ with $b \neq 0$, we can say that
$$\gcd(a, b) = \gcd(b, a \bmod b), \qquad \gcd(a, 0) = a.$$
The latter follows from the fact that $a$ is the largest integer to divide itself, and every integer divides $0$ (because $0 = 0 \cdot a$).
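To make the recurrence concrete, here is a minimal iterative sketch in Python (the function name `gcd` and the loop form are my choices, not from the text; the loop applies $\gcd(a, b) = \gcd(b, a \bmod b)$ until the base case $\gcd(a, 0) = a$ is reached):

```python
def gcd(a: int, b: int) -> int:
    """Euclidean algorithm for nonnegative integers."""
    while b != 0:
        # Replace (a, b) with (b, a mod b); the GCD is unchanged.
        a, b = b, a % b
    # Base case: gcd(a, 0) = a.
    return a


print(gcd(48, 18))  # prints 6
```

Each iteration strictly shrinks the second argument, so the loop always terminates.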
But how do we know the former to be true? Well, let's prove it. Take $x = \gcd(a, b)$ and $y = \gcd(b, a \bmod b)$. If we can show that $x \leq y$ and $y \leq x$, then it must be true that $x = y$, which means the GCDs are indeed equal.
By definition of GCD, $x \mid a$ and $x \mid b$. This means that $a = kx$ for some integer $k$, and $b = jx$ for some integer $j$. To show that $x \leq y$, we look to first show that $x \mid (a \bmod b)$, since it already divides $b$. By definition, $a \bmod b = a - qb$ for some integer $q$. Substituting in our earlier definitions for $a$ and $b$, we get $a \bmod b = kx - qjx$. Rearranging the equation, we get $a \bmod b = x(k - qj)$. By definition of divisibility, $x \mid (a \bmod b)$, and so $x$ is a common divisor of $b$ and $a \bmod b$. Because $y$ is their GCD, this means that $x \leq y$ has been established.
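This direction of the argument can be checked on a concrete pair of numbers (the values $a = 252$, $b = 105$ are my own illustrative choice):

```python
import math

a, b = 252, 105
x = math.gcd(a, b)   # x = gcd(a, b) = 21
r = a % b            # a mod b = 252 - 2*105 = 42
# x divides a mod b, as the rearrangement a mod b = x(k - qj) predicts.
assert r % x == 0
# Hence x is a common divisor of b and a mod b, so x <= their GCD y.
y = math.gcd(b, r)
assert x <= y
```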
Now, we want to show that $y \leq x$. We will do something similar to the first part, and show that $y \mid a$. We're given that $y \mid b$ and $y \mid (a \bmod b)$. This gives us $b = k'y$ and $a \bmod b = j'y$ for some integers $k'$, $j'$. By definition, $a = qb + (a \bmod b)$ for some integer $q$. With substitutions, we get that $a = qk'y + j'y = y(qk' + j')$. By definition of divisibility, $y \mid a$, and thus $y$ divides $a$. Since $x$ is the GCD of $a$ and $b$, and $y$ divides both $a$ and $b$, it must hold that $y \leq x$.
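Since both inequalities hold, the two GCDs agree. As a sanity check, the conclusion can be verified exhaustively over a small range using Python's `math.gcd` (the range is an arbitrary choice for illustration):

```python
import math

for a in range(1, 50):
    for b in range(1, 50):
        x = math.gcd(a, b)
        y = math.gcd(b, a % b)
        assert a % y == 0  # y divides a, as the second half of the proof shows
        assert x == y      # the two GCDs coincide
print("all pairs check out")
```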