Since elementary school, I have known one method for multiplying two numbers: form a sub-product by multiplying each digit of one number by the other number, then add the sub-products up. Today, however, I read Chapter 5.5 (Integer Multiplication) of the book Algorithm Design, and it occurred to me for the first time that the job can be done in less than O(n * n) time!
The method has a divide-and-conquer spirit. Say x and y are two n-digit base-k numbers. We split each of them in half:
x = x1 * k ^ (n/2) + x2
y = y1 * k ^ (n/2) + y2
Multiplying them out, we get:
x * y = x1y1 * k ^ n + (x2y1 + x1y2) * k ^ (n/2) + x2y2
The four sub-products we need here are x1y1, x1y2, x2y1, and x2y2.
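To convince myself, I checked the split-and-recombine identity on a small base-10 example (the numbers are my own choice, not from the book):

```python
# Base-10 example: x = 1234, y = 5678, so n = 4 and k = 10.
x1, x2 = 12, 34   # high and low halves of x: 1234 = 12 * 10^2 + 34
y1, y2 = 56, 78   # high and low halves of y: 5678 = 56 * 10^2 + 78

# x * y = x1y1 * k^n + (x2y1 + x1y2) * k^(n/2) + x2y2
product = x1 * y1 * 10**4 + (x2 * y1 + x1 * y2) * 10**2 + x2 * y2
assert product == 1234 * 5678
```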
Right now the recurrence is T(n) = 4 * T(n/2) + O(n), which solves to O(n * n) — no better than the old way. So we need to cut the recursive calls from four down to three. Notice that x2y1 + x1y2 = (x1 + x2) * (y1 + y2) - x1y1 - x2y2, and (x1 + x2) * (y1 + y2) costs only one more recursive multiplication. Therefore, if we "circle around" like this, the recurrence magically turns into T(n) = 3 * T(n/2) + O(n) = O(n ^ log2(3)), which is roughly O(n ^ 1.59)!
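Putting the pieces together, here is a minimal sketch of the three-multiplication recursion in Python for non-negative base-10 integers (the function name and the digit-counting details are my own choices, not taken from the book):

```python
def karatsuba(x, y):
    """Multiply non-negative integers x and y with three recursive products."""
    # Base case: single-digit factors are multiplied directly.
    if x < 10 or y < 10:
        return x * y
    # Split both numbers around half the digit count of the larger one.
    n = max(len(str(x)), len(str(y)))
    k = 10 ** (n // 2)
    x1, x2 = divmod(x, k)   # high and low halves of x
    y1, y2 = divmod(y, k)   # high and low halves of y
    # Three recursive multiplications instead of four.
    a = karatsuba(x1, y1)                       # x1*y1
    c = karatsuba(x2, y2)                       # x2*y2
    b = karatsuba(x1 + x2, y1 + y2) - a - c     # = x2*y1 + x1*y2
    return a * k * k + b * k + c

print(karatsuba(1234, 5678))   # same answer as 1234 * 5678
```

Of course, CPython's built-in int multiplication already uses this trick for large numbers, so this sketch is only for seeing the recursion spelled out.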
The more I learn, the more I realize that a lot of what they taught us in school isn't the best way to go. To be fair, though, it's much harder for kids to run recursions like computers do than to calculate sub-products.