
THE UNIVERSITY OF NEW SOUTH WALES

3. INTEGER MULTIPLICATION I

Raveen de Silva,

office: K17 202

Course Admin: ,

School of Computer Science and Engineering UNSW Sydney

Term 3, 2021

Table of Contents

1. Master Theorem
   1.1 Statement
   1.2 Examples
   1.3 Proof

2. Arithmetic Operations
   2.1 Applying D&C to multiplication of large integers
   2.2 The Karatsuba trick

3. Puzzle


Master Theorem

Setup

Let:

a ≥ 1 be an integer and b > 1 be a real number;

f (n) > 0 be a non-decreasing function defined on the positive integers;

T(n) be the solution of the recurrence

T (n) = a T (n/b) + f (n).

Define the critical exponent c* = log_b a and the critical polynomial n^(c*).

Master Theorem

Theorem

1. If f(n) = O(n^(c*−ε)) for some ε > 0, then T(n) = Θ(n^(c*));

2. If f(n) = Θ(n^(c*)), then T(n) = Θ(n^(c*) · log n);

3. If f(n) = Ω(n^(c*+ε)) for some ε > 0, and for some c < 1 and some n₀,

   a · f(n/b) ≤ c · f(n)    (1)

   holds for all n > n₀, then T(n) = Θ(f(n)).

Exercise

Prove that f(n) = Ω(n^(c*+ε)) is a consequence of (1).

Master Theorem

Theorem (continued)

4. If none of these conditions hold, the Master Theorem is NOT applicable.

Often, the proof of the Master Theorem can be tweaked to (asymptotically) solve such recurrences anyway! An example is T (n) = 2T (n/2) + n log n.
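For the common special case f(n) = Θ(n^k), the three cases can be mechanised. The following is a minimal sketch (the function name and the polynomial-only restriction are my own assumptions, not part of the theorem's statement); for polynomial f the regularity condition of Case 3 holds automatically, since a · f(n/b) = (a / b^k) · f(n).

```python
import math

def master_theorem(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) via the Master Theorem.

    Handles only polynomial f(n) = Theta(n^k); for such f the regularity
    condition of Case 3 holds automatically, because
    a * f(n/b) = (a / b**k) * f(n), and a / b**k < 1 iff k > log_b(a).
    Comparing floats for equality is fine for exact powers like log_2(4),
    but fragile in general.
    """
    c_star = math.log(a, b)  # critical exponent c* = log_b(a)
    if k < c_star:           # Case 1: f(n) = O(n^(c* - eps))
        return f"Theta(n^{c_star:.4g})"
    if k == c_star:          # Case 2: f(n) = Theta(n^c*)
        return f"Theta(n^{c_star:.4g} log n)"
    return f"Theta(n^{k})"   # Case 3: f(n) = Omega(n^(c* + eps))
```

For instance, `master_theorem(4, 2, 1)` reproduces the answer Θ(n²) for T(n) = 4·T(n/2) + n.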

Master Theorem

Remark

Recall that for a, b > 1 we have log_a n = Θ(log_b n), so we can omit the base and simply write statements of the form f(n) = Θ(g(n) · log n).

However, n^(log_a x) is not interchangeable with n^(log_b x); the base must be specified in such expressions.


Master Theorem

Example 1

Let T(n) = 4 · T(n/2) + n.

Then the critical exponent is c* = log_b a = log₂ 4 = 2, so the critical polynomial is n².

Now, f(n) = n = O(n^(2−ε)) for small ε (e.g. 0.1).

This satisfies the condition for Case 1, so T(n) = Θ(n²).

Master Theorem

Example 2

Let T(n) = 2 · T(n/2) + 5n.

Then the critical exponent is c* = log_b a = log₂ 2 = 1, so the critical polynomial is n.

Now, f(n) = 5n = Θ(n).

This satisfies the condition for Case 2, so T(n) = Θ(n log n).
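This Θ(n log n) answer can be sanity-checked by evaluating the recurrence directly for powers of two. A minimal sketch (the base case T(1) = 1 is an assumption; base-case constants do not affect the Θ bound):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/2) + 5n, with T(1) = 1 assumed as the base case
    if n <= 1:
        return 1
    return 2 * T(n // 2) + 5 * n

# For n = 2^k the recurrence solves exactly to T(n) = n*(5k + 1), so
# T(n) / (n * log2 n) = 5 + 1/k, which settles to the constant 5 --
# consistent with T(n) = Theta(n log n).
ratios = [T(2 ** k) / (2 ** k * k) for k in range(1, 21)]
```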

Master Theorem

Example 3

Let T(n) = 3 · T(n/4) + n.

Then the critical exponent is c* = log₄ 3 ≈ 0.7925, so the critical polynomial is n^(log₄ 3).

Now, f(n) = n = Ω(n^(log₄ 3 + ε)) for small ε (e.g. 0.1). Also,

a · f(n/b) = 3 · f(n/4) = (3/4) · n = (3/4) · f(n),

so condition (1) holds with c = 3/4 < 1.

This satisfies the conditions for Case 3, so T(n) = Θ(n).

Master Theorem

Example 4

Let T(n) = 2 · T(n/2) + n log₂ n.

The critical exponent is c* = log₂ 2 = 1, so the critical polynomial is n.

Now, f(n) = n log₂ n is not O(n^(1−ε)) for any ε > 0, and it is not Θ(n). It is also not Ω(n^(1+ε)) for any ε > 0, because

for all ε > 0, c > 0 and N > 0 there is some n > N such that log₂ n < c · n^ε.    (2)

Therefore the Master Theorem does not apply!

Master Theorem

Exercise

Prove (2), that is, for all ε > 0, c > 0 and N > 0 there is some n > N such that log₂ n < c · n^ε.

Hint

Use L'Hôpital's rule to show that (log n) / n^ε → 0.
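A quick numerical illustration of this hint (not a proof; ε = 0.1 is an arbitrary choice): the ratio (log n)/n^ε peaks near n = e^(1/ε) and then decays toward 0.

```python
import math

eps = 0.1  # any fixed eps > 0 works; 0.1 is chosen arbitrarily
vals = [math.log(10 ** k) / (10 ** k) ** eps for k in (2, 4, 8, 16)]
# The ratio rises until about n = e^(1/eps) ~ 2.2e4, then decays toward 0,
# so log n is eventually smaller than c * n^eps for every c > 0.
```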
Master Theorem
Suppose T(n) satisfies the recurrence

T(n) = a · T(n/b) + f(n).    (3)

The T(n/b) term can itself be reduced using the recurrence:

T(n/b) = a · T(n/b²) + f(n/b).

Substituting into (3) and simplifying gives

T(n) = a · ( a · T(n/b²) + f(n/b) ) + f(n)
     = a² · T(n/b²) + a · f(n/b) + f(n).
Master Theorem
We have now established

T(n) = a² · T(n/b²) + a · f(n/b) + f(n).    (4)

But why stop there? We can now reduce the T(n/b²) term, again using (3):

T(n/b²) = a · T(n/b³) + f(n/b²).

We now substitute this into (4) and simplify to get

T(n) = a² · ( a · T(n/b³) + f(n/b²) ) + a · f(n/b) + f(n)
     = a³ · T(n/b³) + a² · f(n/b²) + a · f(n/b) + f(n).

We can see a pattern emerging!
Master Theorem
Continuing in this way, we find that

T(n) = a^k · T(n/b^k) + a^(k−1) · f(n/b^(k−1)) + ... + a · f(n/b) + f(n)
     = a^k · T(n/b^k) + Σ_{i=0}^{k−1} a^i · f(n/b^i).

We stop when k = ⌊log_b n⌋, since this gives n/b^k ≈ 1:

T(n) ≈ a^(log_b n) · T(n / b^(log_b n)) + Σ_{i=0}^{⌊log_b n⌋−1} a^i · f(n/b^i).
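The unrolled formula can be checked numerically for exact powers of b. A sketch with the arbitrary choices a = 3, b = 2, f(n) = n and T(1) = 1:

```python
def T(n):
    # T(n) = 3*T(n/2) + n, with T(1) = 1 (assumed base case)
    if n == 1:
        return 1
    return 3 * T(n // 2) + n

# For n = b^m the unrolling terminates exactly at k = m:
#   T(n) = a^m * T(1) + sum_{i=0}^{m-1} a^i * f(n / b^i)
a, b, m = 3, 2, 10
n = b ** m  # 1024, so every n / b^i below is an exact integer
unrolled = a ** m * 1 + sum(a ** i * (n // b ** i) for i in range(m))
```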
Master Theorem
Now we have

T(n) ≈ a^(log_b n) · T(n / b^(log_b n)) + Σ_{i=0}^{⌊log_b n⌋−1} a^i · f(n/b^i).

We can use the identity a^(log_b n) = n^(log_b a) to get

T(n) ≈ n^(log_b a) · T(1) + Σ_{i=0}^{⌊log_b n⌋−1} a^i · f(n/b^i).    (5)

Importantly, we have not assumed anything about f(n) yet! We will now analyse the sum S = Σ_{i=0}^{⌊log_b n⌋−1} a^i · f(n/b^i) in the simplest case of the Master Theorem, namely Case 2.
Master Theorem: Case 2
Suppose f(n) = Θ(n^(log_b a)). Then

S = Σ_{i=0}^{⌊log_b n⌋−1} a^i · f(n/b^i)
  = Σ_{i=0}^{⌊log_b n⌋−1} a^i · Θ((n/b^i)^(log_b a))
  = Θ( Σ_{i=0}^{⌊log_b n⌋−1} a^i · (n/b^i)^(log_b a) ),

using the sum property and scalar multiple property of Θ.
Master Theorem: Case 2
S = Θ( Σ_{i=0}^{⌊log_b n⌋−1} a^i · (n/b^i)^(log_b a) )
  = Θ( n^(log_b a) · Σ_{i=0}^{⌊log_b n⌋−1} a^i · (1/b^i)^(log_b a) ),

as n^(log_b a) is common to every term of the sum,

  = Θ( n^(log_b a) · Σ_{i=0}^{⌊log_b n⌋−1} (a / b^(log_b a))^i ).
Master Theorem: Case 2
S = Θ( n^(log_b a) · Σ_{i=0}^{⌊log_b n⌋−1} (a / b^(log_b a))^i )
  = Θ( n^(log_b a) · Σ_{i=0}^{⌊log_b n⌋−1} (a/a)^i ), as b^(log_b a) = a,
  = Θ( n^(log_b a) · Σ_{i=0}^{⌊log_b n⌋−1} 1 )
  = Θ( n^(log_b a) · ⌊log_b n⌋ ).
Master Theorem: Case 2
Finally, we return to (5). Substituting our result for S gives

T(n) ≈ n^(log_b a) · T(1) + Θ( n^(log_b a) · ⌊log_b n⌋ )
     = Θ( n^(log_b a) · log n ),

as logarithms of any base are equivalent up to a constant factor, completing the proof.
Master Theorem: Other cases
The proof of Case 1 is very similar to the above. The main difference is that the constant 1 in each term is replaced by (b^ε)^i, forming a geometric series, which can be summed using the identity

1 + r + r² + ... + r^(k−1) = (r^k − 1) / (r − 1).
In Case 3, we need to prove that T (n) = Θ(f (n)), that is, both:
T (n) = Ω(f (n)), which follows directly from the recurrence T (n) = a T (n/b) + f (n), and
T(n) = O(f (n)) (not as obvious).
Master Theorem: Case 3
Exercise
Prove that in Case 3, T(n) = O(f (n)).
Hint
You will need to bound

S = Σ_{i=0}^{⌊log_b n⌋−1} a^i · f(n/b^i)

from above. Try using the inequality

a · f(n/b) ≤ c · f(n)

to relate each term of S to f(n).
Basics revisited: how do we add two integers?
C C C C C        carry
  X X X X X      first integer
+ X X X X X      second integer
-----------
X X X X X X      result
Adding 3 bits can be done in constant time.
It follows that the whole algorithm runs in linear time, i.e., O(n) many steps.
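A runnable sketch of the school addition just described, assuming a list-of-bits representation with the least significant bit first (my own choice for illustration):

```python
from itertools import zip_longest

def add_binary(a_bits, b_bits):
    """Add two binary numbers given as bit lists, least significant bit first.

    One O(1) step per bit position, hence O(n) total, matching the analysis.
    """
    result, carry = [], 0
    for x, y in zip_longest(a_bits, b_bits, fillvalue=0):
        s = x + y + carry  # adding 3 bits takes constant time
        result.append(s & 1)
        carry = s >> 1
    if carry:
        result.append(carry)
    return result
```

For example, `add_binary([1, 0, 1], [1, 1])` adds 5 + 3 in this representation.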
Basics revisited: how do we add two integers?
Question
Can we add two n-bit numbers faster than in linear time?
Answer
No! There is no asymptotically faster algorithm, because we must read every bit of the input, which already takes Ω(n) time.
Basics revisited: how do we multiply two integers?
        X X X X    <- first input integer
      * X X X X    <- second input integer
      ---------
        X X X X
      X X X X
    X X X X
  X X X X
  ---------------
  X X X X X X X X  <- result of length 2n

This takes O(n²) intermediate operations: O(n²) elementary multiplications plus O(n²) elementary additions.

We assume that two X's can be multiplied in O(1) time (each X could be a bit or a digit in some other base).

Thus the above procedure runs in time O(n²).
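The grid above translates into the classic double loop. A sketch on digit lists (least significant digit first; the base parameter is my own illustrative generalisation, with base 2 giving bits):

```python
def school_multiply(a_digits, b_digits, base=2):
    """O(n^2) long multiplication on digit lists, least significant first."""
    result = [0] * (len(a_digits) + len(b_digits))  # product has <= 2n digits
    for i, x in enumerate(a_digits):
        carry = 0
        for j, y in enumerate(b_digits):            # n^2 elementary mults/adds
            s = result[i + j] + x * y + carry
            result[i + j] = s % base
            carry = s // base
        result[i + len(b_digits)] += carry
    return result
```

For instance, `school_multiply([2, 1], [3, 1], base=10)` multiplies 12 by 13 digit by digit.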
Basics revisited: how do we multiply two integers?
Question
Can we multiply two n-bit numbers in linear time, like addition?
Answer
No one knows! “Simple” problems can actually turn out to be difficult!
Question
Can we do it faster than in quadratic time? Let's try divide and conquer.
Multiplying large integers by divide-and-conquer
Split the two input numbers A and B into halves:
A0, B0 - the least significant n/2 bits;
A1, B1 - the most significant n/2 bits.

A = A1 · 2^(n/2) + A0
B = B1 · 2^(n/2) + B0

AB can now be calculated recursively using the following equation:

AB = A1·B1 · 2^n + (A1·B0 + A0·B1) · 2^(n/2) + A0·B0.
Multiplying large integers by divide-and-conquer
1: function Mult(A, B)
2:    if |A| = |B| = 1 then return A·B
3:    else
4:        A1 ← MoreSignificantPart(A);
5:        A0 ← LessSignificantPart(A);
6:        B1 ← MoreSignificantPart(B);
7:        B0 ← LessSignificantPart(B);
8:        X ← Mult(A0, B0);
9:        Y ← Mult(A0, B1);
10:       Z ← Mult(A1, B0);
11:       W ← Mult(A1, B1);
12:       return W·2^n + (Y + Z)·2^(n/2) + X
13:   end if
14: end function
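A runnable Python sketch of this algorithm (Python integers are already arbitrary precision, so this is purely illustrative; the shift-and-mask split is my own rendering of the More/LessSignificantPart helpers):

```python
def mult(a, b):
    """Four-way divide-and-conquer multiplication of non-negative integers."""
    if a < 2 or b < 2:
        return a * b
    half = max(a.bit_length(), b.bit_length()) // 2  # split point n/2
    a1, a0 = a >> half, a & ((1 << half) - 1)  # most / least significant halves
    b1, b0 = b >> half, b & ((1 << half) - 1)
    x = mult(a0, b0)
    y = mult(a0, b1)
    z = mult(a1, b0)
    w = mult(a1, b1)
    # W * 2^n + (Y + Z) * 2^(n/2) + X, with n/2 = half
    return (w << (2 * half)) + ((y + z) << half) + x
```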
Multiplying large integers by divide-and-conquer
How many steps does this algorithm take?
Each multiplication of two n-digit numbers is replaced by four multiplications of n/2-digit numbers (A1·B1, A1·B0, A0·B1, A0·B0), plus a linear overhead to shift and add:

T(n) = 4 · T(n/2) + c·n.
Let’s use the Master Theorem!
Multiplying large integers by divide-and-conquer
T(n) = 4 · T(n/2) + c·n

The critical exponent is c* = log₂ 4 = 2, so the critical polynomial is n².

Then f(n) = c·n = O(n^(2−0.1)), so Case 1 applies.

We conclude that T(n) = Θ(n^(c*)) = Θ(n²), i.e., we gained nothing with our divide-and-conquer!
Multiplying large integers by divide-and-conquer
Question
Is there a smarter multiplication algorithm taking less than O(n2) many steps?
Answer
Remarkably, there is!
History
In 1952, one of the most famous mathematicians of the 20th century, Andrey Kolmogorov, conjectured that two n-digit numbers cannot be multiplied in fewer than quadratically many elementary operations.

In 1960, Anatoly Karatsuba, then a 23-year-old student, found a "divide-and-conquer" algorithm (as it was later called) that multiplies two n-digit numbers in Θ(n^(log₂ 3)) ≈ Θ(n^1.58...) elementary steps, thus disproving the conjecture!!
Kolmogorov was shocked!
The Karatsuba trick
Once again we split each of our two input numbers A and B into halves:

A = A1 · 2^(n/2) + A0
B = B1 · 2^(n/2) + B0.

Previously we saw that

AB = A1·B1 · 2^n + (A1·B0 + A0·B1) · 2^(n/2) + A0·B0,

but rewriting the bracketed expression gives

AB = A1·B1 · 2^n + ( (A1 + A0)(B1 + B0) − A1·B1 − A0·B0 ) · 2^(n/2) + A0·B0,

saving one multiplication at each round of the recursion!
The Karatsuba trick
1: function Mult(A, B)
2:    if |A| = |B| = 1 then return A·B
3:    else
4:        A1 ← MoreSignificantPart(A);
5:        A0 ← LessSignificantPart(A);
6:        B1 ← MoreSignificantPart(B);
7:        B0 ← LessSignificantPart(B);
8:        U ← A1 + A0;
9:        V ← B1 + B0;
10:       X ← Mult(A0, B0);
11:       W ← Mult(A1, B1);
12:       Y ← Mult(U, V);
13:       return W·2^n + (Y − X − W)·2^(n/2) + X
14:   end if
15: end function
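A runnable Python sketch of the Karatsuba algorithm above, with three recursive multiplications instead of four (the shift-and-mask split is my own rendering of the More/LessSignificantPart helpers; Python's built-in multiplication makes this purely illustrative):

```python
def karatsuba(a, b):
    """Karatsuba multiplication of non-negative integers."""
    if a < 2 or b < 2:
        return a * b
    half = max(a.bit_length(), b.bit_length()) // 2  # split point n/2
    a1, a0 = a >> half, a & ((1 << half) - 1)
    b1, b0 = b >> half, b & ((1 << half) - 1)
    x = karatsuba(a0, b0)            # A0*B0
    w = karatsuba(a1, b1)            # A1*B1
    y = karatsuba(a1 + a0, b1 + b0)  # (A1+A0)*(B1+B0)
    # the middle coefficient A1*B0 + A0*B1 equals Y - X - W
    return (w << (2 * half)) + ((y - x - w) << half) + x
```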
The Karatsuba trick
How fast is this algorithm?
Addition takes linear time, so we are only concerned with the number of multiplications.
We need A1·B1, A0·B0 and (A1 + A0)(B1 + B0); thus

T(n) = 3 · T(n/2) + c·n.
The Karatsuba trick
Clearly, the run time T(n) satisfies the recurrence

T(n) = 3 · T(n/2) + c·n.    (6)

Now the critical exponent is c* = log₂ 3 ≈ 1.58, and f(n) = c·n = O(n^(c*−ε)). Once again we are in Case 1 of the Master Theorem, but this time

T(n) = Θ(n^(log₂ 3)) = Θ(n^1.58...) = o(n²),

disproving Kolmogorov's conjecture.
Next time: can we do even better?
PUZZLE!
Problem
You are given a 2^n × 2^n board with one of its cells missing (i.e., the board has a hole). The position of the missing cell can be arbitrary.
You are also given a supply of "trominoes": L-shaped tiles, each of which covers three cells (a 2 × 2 square with one cell removed).
PUZZLE!
Problem (continued)
Your task is to design an algorithm which covers the entire board (except for the hole) with these “trominoes”.
Hint
Do a divide-and-conquer recursion!
That’s All, Folks!!