GVSU CIS 263
Week 12 / Day 1
Big-O for divide and conquer
- In general, divide and conquer algorithms fit this pattern:
T(n) = aT(n/b) + O(n^k)
- b is the number of pieces that you cut the input into. (e.g., b = 2 for merge sort.)
- a is the number of recursive calls.
- It is common for a == b.
- O(n^k) is the amount of work needed to put the pieces together. (For merge sort, this is O(n); for the matrix multiply it is O(N^2).)
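As a quick sanity check (my own sketch, not part of the lecture), we can evaluate the recurrence directly for merge sort's parameters (a = 2, b = 2, k = 1) and see that it matches the familiar O(n log n) growth exactly when T(1) = 1:

```python
import math

def T(n, a=2, b=2, k=1):
    """Evaluate T(n) = a*T(n/b) + n^k with T(1) = 1, assuming n is a power of b."""
    if n <= 1:
        return 1
    return a * T(n // b, a, b, k) + n ** k

for n in [2, 16, 256]:
    # For a = b = 2, k = 1, the recurrence works out to exactly n*(log2(n) + 1)
    print(n, T(n), n * (int(math.log2(n)) + 1))
```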
- This recursive formula is called a recurrence relation.
- Let’s see if we can get the formula in a closed form.
- For simplicity, assume N is a power of b: N = b^m
- For example, if b is 2, we assume N is a power of 2.
  T(b^m) = aT(b^m / b) + (b^m)^k
- Simplify and rewrite (b^m)^k:
  T(b^m) = aT(b^(m-1)) + (b^k)^m
- Now for an algebra trick: divide everything by a^m:
  T(b^m)/a^m = T(b^(m-1))/a^(m-1) + ((b^k)/a)^m
- At this point, we can see how to “unwind” the recursion.
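A numeric check of the unwinding (my own sketch, not from the lecture): telescoping the divided-through recurrence gives T(b^m)/a^m = T(1) + the sum of (b^k/a)^j for j = 1..m, which we can verify directly:

```python
def T(n, a, b, k):
    """T(n) = a*T(n/b) + n^k with T(1) = 1, n a power of b."""
    if n <= 1:
        return 1
    return a * T(n // b, a, b, k) + n ** k

# Example parameters (my choice): a = b = 2, k = 2, m = 6, so n = 64.
a, b, k, m = 2, 2, 2, 6
n = b ** m
lhs = T(n, a, b, k) / a ** m
rhs = 1 + sum((b ** k / a) ** j for j in range(1, m + 1))
print(lhs, rhs)  # the two sides agree
```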
- Switch to paper notes
Week 12 / Day 2
Dynamic Programming
Sometimes the algorithm naturally repeats work. Make sure you save the result instead of repeating the work.
- Recursive Fibonacci
- Can create a lookup table on the side, and keep the recursion; but, it’s faster to recognize you need to fill the table from bottom to top, so we can just do that with a loop and skip the recursion.
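A sketch of both approaches above: keep the recursion but add a lookup table on the side (memoization), or recognize the table fills from the bottom up and just use a loop:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    """Recursive Fibonacci with a lookup table bolted on via lru_cache."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

def fib_loop(n):
    """Bottom-up: fill the 'table' in order, keeping only the last two entries."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_memo(30), fib_loop(30))  # both print 832040
```

Both run in O(n) calls/iterations; the loop version also skips the recursion overhead and the stored table entirely.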
- Ordering Matrix Multiplications
- Given non-square matrices ABCD, it matters whether we do
- (A)(B(CD))
- (AB)(CD)
- (ABC)(D)
- etc.
- Interestingly, none of the obvious greedy algorithms work.
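To see just how much the ordering matters, here is a small illustration (the dimensions are my own example, not from the notes): multiplying a (p x q) matrix by a (q x r) matrix costs p*q*r scalar multiplications.

```python
def mult_cost(p, q, r):
    """Scalar multiplications needed for a (p x q) times (q x r) product."""
    return p * q * r

# A: 10x100, B: 100x5, C: 5x50
ab_then_c = mult_cost(10, 100, 5) + mult_cost(10, 5, 50)    # (AB)C
a_then_bc = mult_cost(100, 5, 50) + mult_cost(10, 100, 50)  # A(BC)
print(ab_then_c, a_then_bc)  # 7500 vs 75000 -- a 10x difference
```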
- What is the obvious recursive algorithm?
optimal_parens(start, end) {
    if (start == end) return 0    // a single matrix needs no multiplications
    min = infinity
    for i = start to end - 1 {
        a = optimal_parens(start, i)
        b = optimal_parens(i + 1, end)
        c = cost of multiplying the (start..i) result by the (i+1..end) result
        min = minimum(a + b + c, min)
    }
    return min
}
- However, just as with Fibonacci, a lot of work would get duplicated.
- (e.g., (ABC) and (ABCD) would evaluate many of the same subtrees.)
- How many different recursive calls are here?
- Just O(N^2): each combination of i and j where 1 <= i < j <= N.
- Thus, if we store each result in a table the first time we calculate it, the algorithm tops out at O(N^3).
- As with Fibonacci, we can save time by avoiding the recursion and recognizing the order in which the table will get filled: start on the diagonal of the table and move up toward the corner.
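A sketch of that bottom-up table fill (my own implementation of the standard technique): m[i][j] holds the cheapest cost of multiplying matrices i..j, where matrix i has shape dims[i] x dims[i+1]. We fill along the diagonal — chains of length 2, then 3, and so on — so every subproblem is ready before it is needed, for O(N^3) total work:

```python
def matrix_chain(dims):
    """Minimum scalar multiplications to multiply the chain described by dims."""
    n = len(dims) - 1                 # number of matrices
    m = [[0] * n for _ in range(n)]   # diagonal (single matrices) costs 0
    for length in range(2, n + 1):    # chain length, moving away from diagonal
        for i in range(n - length + 1):
            j = i + length - 1
            # try every split point s: (i..s)(s+1..j)
            m[i][j] = min(
                m[i][s] + m[s + 1][j] + dims[i] * dims[s + 1] * dims[j + 1]
                for s in range(i, j)
            )
    return m[0][n - 1]

# Example (my own dimensions): A 10x100, B 100x5, C 5x50
print(matrix_chain([10, 100, 5, 50]))  # 7500
```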