31-3 Three algorithms for Fibonacci numbers
This problem compares the efficiency of three methods for computing the $n$th Fibonacci number $F_n$, given $n$. Assume that the cost of adding, subtracting, or multiplying two numbers is $O(1)$, independent of the size of the numbers.
a. Show that the running time of the straightforward recursive method for computing $F_n$ based on recurrence $\text{(3.22)}$ is exponential in $n$. (See, for example, the FIB procedure on page 775.)
b. Show how to compute $F_n$ in $O(n)$ time using memoization.
c. Show how to compute $F_n$ in $O(\lg n)$ time using only integer addition and multiplication. ($\textit{Hint:}$ Consider the matrix
$$ \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix} $$
and its powers.)
d. Assume now that adding two $\beta$-bit numbers takes $\Theta(\beta)$ time and that multiplying two $\beta$-bit numbers takes $\Theta(\beta^2)$ time. What is the running time of these three methods under this more reasonable cost measure for the elementary arithmetic operations?
a. In order to solve $\text{FIB}(n)$, we need to compute $\text{FIB}(n - 1)$ and $\text{FIB}(n - 2)$. Therefore we have the recurrence
$$T(n) = T(n - 1) + T(n - 2) + \Theta(1).$$
An easy upper bound is $T(n) = O(2^n)$, but it is not tight.
The Fibonacci recurrence is defined as
$$F(n) = F(n - 1) + F(n - 2).$$
The characteristic equation for this function will be
$$ \begin{aligned} x^2 & = x + 1 \\ x^2 - x - 1 & = 0. \end{aligned} $$
The quadratic formula gives the roots $x = \frac{1 \pm \sqrt 5}{2}$.
We know the solution of a linear recursive function is given as
$$ \begin{aligned} F(n) & = c_1 \alpha_1^n + c_2 \alpha_2^n \\ & = c_1 \bigg(\frac{1 + \sqrt 5}{2}\bigg)^n + c_2 \bigg(\frac{1 - \sqrt 5}{2}\bigg)^n, \end{aligned} $$
where $\alpha_1$ and $\alpha_2$ are the roots of the characteristic equation, and the constants $c_1$ and $c_2$ are determined by the initial conditions $F(0) = 0$ and $F(1) = 1$ (which give $c_1 = -c_2 = 1 / \sqrt 5$).
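As a quick numeric sanity check (a sketch added here, not part of the original solution), these constants yield Binet's formula, which we can verify for small $n$:

```python
import math

# Sanity check of the closed form: with F(0) = 0 and F(1) = 1 the
# constants work out to Binet's formula F(n) = (phi^n - psi^n)/sqrt(5).
sqrt5 = math.sqrt(5)
phi = (1 + sqrt5) / 2   # dominant root
psi = (1 - sqrt5) / 2   # |psi| < 1, so this term vanishes as n grows
fibs = [round((phi**n - psi**n) / sqrt5) for n in range(10)]
print(fibs)  # → [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

For small $n$ the floating-point error is far below $1/2$, so rounding recovers the exact integer values.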
Since $T(n)$ satisfies the same recurrence as $F(n)$ up to an additive $\Theta(1)$ term, the two grow at the same asymptotic rate. Because $|\alpha_2| < 1$, the second term vanishes as $n$ grows. Hence,
$$T(n) = \Theta\bigg(\bigg(\frac{1 + \sqrt 5}{2}\bigg)^n\bigg) \approx \Theta(1.618^n),$$
which is exponential in $n$.
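To see the blowup concretely, here is a minimal sketch of the naive recursion with a hypothetical call counter (the counter is ours, added only to expose the number of invocations):

```python
# Naive recursion from part (a); 'calls' is a hypothetical counter we
# add to count the exponential number of recursive invocations.
def fib_naive(n, calls):
    calls[0] += 1
    if n <= 1:
        return n
    return fib_naive(n - 1, calls) + fib_naive(n - 2, calls)

calls = [0]
print(fib_naive(20, calls))  # → 6765
print(calls[0])              # → 21891, and in general 2*F(n+1) - 1 calls
```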
b. This is the same as Exercise 15.1-5: memoize the recursion (or fill a table bottom up) so that each $F_i$ is computed exactly once, giving $n - 1$ additions and thus $O(n)$ time.
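A minimal bottom-up sketch (the names are ours): each $F_i$ is filled in once, so with unit-cost arithmetic the total time is $O(n)$.

```python
# Bottom-up memoization for part (b): table[i] holds F_i, and each
# entry is computed once with a single O(1) addition.
def fib_memo(n):
    table = [0, 1]                     # F_0 and F_1
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]

print(fib_memo(50))  # → 12586269025
```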
c. Assume that all integer multiplications and additions can be done in $O(1)$. First, we want to show that
$$ \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}^k = \begin{pmatrix} F_{k - 1} & F_k \\ F_k & F_{k + 1} \end{pmatrix} . $$
By induction on $k$: the base case $k = 1$ holds by inspection, and for the inductive step,
$$ \begin{aligned} \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}^{k + 1} & = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}^k \\ & = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} F_{k - 1} & F_k \\ F_k & F_{k + 1} \end{pmatrix} \\ & = \begin{pmatrix} F_k & F_{k + 1} \\ F_{k - 1} + F_k & F_k + F_{k + 1} \end{pmatrix} \\ & = \begin{pmatrix} F_k & F_{k + 1} \\ F_{k + 1} & F_{k + 2} \end{pmatrix} . \end{aligned} $$
We show how to compute the given matrix to the power $n - 1$ in $O(\lg n)$ time; its bottom-right entry is then $F_n$.
Note that any two $2 \times 2$ matrices can be multiplied using 8 multiplications and 4 additions, so each matrix multiplication takes constant time; we therefore only need to bound the number of matrix multiplications.
The algorithm $\text{MATRIX-POW}(A, n - 1)$ runs in $O(\lg n)$ time because each step halves $n$ and performs a constant amount of work.
The recurrence is
$$T(n) = T(n / 2) + \Theta(1).$$
MATRIX-POW(A, n)
    if n == 0
        return I          // the 2 × 2 identity matrix
    if n % 2 == 1
        return A * MATRIX-POW(A^2, (n - 1) / 2)
    return MATRIX-POW(A^2, n / 2)
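The pseudocode can be made concrete; here is a sketch in Python (the helper names `mat_mul`, `mat_pow`, and `fib` are ours), reading $F_n$ off the bottom-right entry of the $(n - 1)$st power:

```python
# 2x2 matrices as nested tuples; mat_mul uses the 8 multiplications
# and 4 additions mentioned above.
def mat_mul(A, B):
    return ((A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]),
            (A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]))

def mat_pow(A, n):
    if n == 0:
        return ((1, 0), (0, 1))          # 2x2 identity matrix
    if n % 2 == 1:
        return mat_mul(A, mat_pow(mat_mul(A, A), (n - 1) // 2))
    return mat_pow(mat_mul(A, A), n // 2)

def fib(n):
    if n == 0:
        return 0
    M = mat_pow(((0, 1), (1, 1)), n - 1)
    return M[1][1]                       # bottom-right entry is F_n

print(fib(10))  # → 55
```

Each call halves the exponent, so `fib(n)` performs $O(\lg n)$ matrix multiplications, matching the recurrence above.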
d. First, note that the numbers being added and multiplied are the Fibonacci numbers themselves. Since $F_n = \Theta(\phi^n)$, where $\phi = (1 + \sqrt 5) / 2$, the number $F_n$ occupies $\beta = \Theta(n)$ bits.
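A quick check of the operand sizes (a sketch; the iterative helper is ours): the bit length of $F_n$ grows linearly, at about $n \lg \phi \approx 0.694\,n$ bits.

```python
# The operands in part (d) are the Fibonacci numbers themselves;
# their bit length grows linearly in n, so beta = Theta(n).
def fib_iter(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

for n in (100, 200, 400):
    print(n, fib_iter(n).bit_length())  # roughly 0.694 * n bits
```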

For part (a),
the addition performed at argument $k$ operates on $\Theta(k)$-bit numbers, so the recurrence becomes
$$ \begin{aligned} T(n) & = T(n - 1) + T(n - 2) + \Theta(\beta) \\ & = T(n - 1) + T(n - 2) + \Theta(n), \end{aligned} $$
which has the same solution $T(n) = \Theta\Big(\big(\frac{1 + \sqrt 5}{2}\big)^n\Big)$, since the polynomial factor $\Theta(n)$ is absorbed by the exponential.

For part (b),
computing $F_i$ from $F_{i - 1}$ and $F_{i - 2}$ costs $\Theta(i)$ for the addition, so the recurrence of the memoized version becomes
$$M(n) = M(n - 1) + \Theta(n).$$
This has the solution $\sum_{i = 2}^n \Theta(i) = \Theta(n^2)$.

For part (c),
each of the $\Theta(\lg n)$ squaring steps performs a constant number of additions and multiplications, but the matrix entries roughly double in bit length at every step. A multiplication of $\beta$-bit entries costs $\Theta(\beta^2)$, so the total cost is
$$\sum_{i = 1}^{\lg n} \Theta\big((2^i)^2\big) = \Theta(4^{\lg n}) = \Theta(n^2),$$
dominated by the final squaring on $\Theta(n)$-bit numbers. Equivalently, the recurrence $P(n) = P(n / 2) + \Theta(n^2)$ solves to $\Theta(n^2)$ by case 3 of the master theorem. Under this more realistic cost measure, the memoized and matrix methods therefore both run in $\Theta(n^2)$ time.