http://imgs.xkcd.com/comics/efficiency.png
$\sum_{i=0}^{n} i = \sum_{i=1}^{n} i = 1 + 2 + \dots + n = \frac{n(n+1)}{2}$
Proof:
$S = 1 + 2 + \dots + (n-1) + n$
$S = n + (n-1) + \dots + 2 + 1$
$2S = (n+1) + (n+1) + \dots + (n+1) + (n+1)$
$2S = n(n+1)$
$S = \frac{n(n+1)}{2}$
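A quick numeric sanity check of the closed form, as a small Python sketch (the sampled values of n are arbitrary):

```python
# Illustrative check that sum_{i=1}^{n} i equals n(n+1)/2 for a few sample n.
for n in [1, 5, 10, 100]:
    assert sum(range(1, n + 1)) == n * (n + 1) // 2
print("closed form n(n+1)/2 matches the direct sum")
```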
$\sum_{i=l}^{u} c a_i = c \sum_{i=l}^{u} a_i$
Proof:
$\sum_{i=l}^{u} c a_i = c a_l + c a_{l+1} + \dots + c a_u$
$= c(a_l + a_{l+1} + \dots + a_u)$
$= c \sum_{i=l}^{u} a_i$
$\sum_{i=l}^{u} (a_i \pm b_i) = \sum_{i=l}^{u} a_i \pm \sum_{i=l}^{u} b_i$
$\sum_{i=l}^{u} 1 = u - l + 1$
$\sum_{i=1}^{n} (a_i - a_{i-1}) = a_n - a_0$
$(a_n - a_{n-1}) + (a_{n-1} - a_{n-2}) + \dots + (a_1 - a_0) = a_n - a_0$ (the intermediate terms cancel)
$\sum_{i=0}^{n} 2^i = 2^{n+1} - 1$
General: $\sum_{i=0}^{n} a^i = \frac{a^{n+1} - 1}{a - 1}$ where $a \neq 1$
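A small Python sketch checking the manipulation rules above on sample data (the array values and the constant c are made up for illustration):

```python
# Check the constant-factor, telescoping, and geometric-series identities
# on an arbitrary sample sequence a_0..a_n.
a = [3, 1, 4, 1, 5, 9, 2, 6]   # sample values, chosen arbitrarily
n = len(a) - 1
c = 7                          # arbitrary constant

# sum of c*a_i equals c times sum of a_i
assert sum(c * a[i] for i in range(n + 1)) == c * sum(a)

# telescoping: sum_{i=1}^{n} (a_i - a_{i-1}) = a_n - a_0
assert sum(a[i] - a[i - 1] for i in range(1, n + 1)) == a[n] - a[0]

# geometric: sum_{i=0}^{n} 2^i = 2^(n+1) - 1
assert sum(2 ** i for i in range(n + 1)) == 2 ** (n + 1) - 1

print("summation identities hold for the sample data")
```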
- Sorting
- Searching
- String Processing
- Graph Problems
- Combinatorial Problems
- Geometric Problems
- Numerical Problems
- Brute Force/Exhaustive
- Greedy
- Divide and Conquer
- Decrease and Conquer
- Transform and Conquer
- Space and Time Tradeoff
- Dynamic Programming
- Iterative Improvement
- Backtracking
- Branch and Bound
- List
- Stack
- Queue
- Priority Queue/Heap
- Graph
- Tree
- Set
- Dictionary
Issues:
Approaches
- Theoretical Analysis (with math)
- Time efficiency is analyzed by determining the number of repetitions of the basic operation as a function of input size
- Basic operation: the operation that contributes most toward the running time of the algorithm
$T(n) \approx c_{op} C(n)$
where $T(n)$ is the running time,
$n$ is the input size,
$c_{op}$ is the execution time of the basic operation,
and $C(n)$ is the number of times the basic operation is executed.
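As an illustration of how this estimate is used (the operation count C(n) = n(n-1)/2 and the value of c_op below are assumptions for the example, not taken from these notes):

```python
# Hypothetical example: a quadratic algorithm with C(n) = n(n-1)/2 basic
# operations and an assumed cost of 1 microsecond per operation.
c_op = 1e-6                        # seconds per basic operation (assumed)
def C(n): return n * (n - 1) / 2   # assumed operation count

for n in (1_000, 10_000):
    print(f"n = {n}: T(n) ~ {c_op * C(n):.3f} s")

# Doubling the input size roughly quadruples the running time,
# independent of the constant c_op.
n = 10_000
print("T(2n)/T(n) ~", C(2 * n) / C(n))
```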
| Problem | Input Size | Basic Operation |
| --- | --- | --- |
| Searching for a key in a list of n items | Number of the list's items | Key comparison |
| Multiplication of two matrices | Matrix dimensions or total number of elements | Multiplication of two numbers |
| Checking primality of a given integer n | n's size = number of digits (say, in binary representation) | Division |
| Typical graph problem | Number of nodes and/or edges | Visiting a node or traversing an edge |
- Empirical Analysis
- Select a specific (typical) sample of inputs
- Use a physical unit of time (e.g. milliseconds) or count the actual number of basic operation executions
- Analyze the empirical data
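A minimal sketch of what an empirical measurement might look like in Python, timing a run and counting basic-operation executions (the input sizes and search key are arbitrary; the routine anticipates the SequentialSearch example discussed below):

```python
import random
import time

def search_count(A, K):
    """Sequential search that also counts key comparisons (the basic operation)."""
    comparisons = 0
    for i, x in enumerate(A):
        comparisons += 1
        if x == K:
            return i, comparisons
    return -1, comparisons

for n in (10_000, 20_000, 40_000):
    A = [random.random() for _ in range(n)]
    start = time.perf_counter()
    _, comps = search_count(A, -1.0)          # key absent: forces the worst case
    ms = (time.perf_counter() - start) * 1000
    print(f"n = {n}: {comps} comparisons, {ms:.2f} ms")
```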
- For some algorithms, efficiency depends on the form of the input
- Worst case: $C_{worst}(n)$ = maximum over inputs of size $n$
- Best case: $C_{best}(n)$ = minimum over inputs of size $n$
- Average case: $C_{avg}(n)$ = "average" over inputs of size $n$
- The number of times the basic operation will be executed on typical input
- NOT the average of the worst and best cases
- The expected number of basic operations is treated as a random variable under some assumption about the probability distribution of all possible inputs
Search for a given value in an array
SequentialSearch
Input: An array, A, of length n and a search key K
Output: The index of the first element of A that matches K or −1 if there are no matching elements
i <- 0
while i < n and A[i] != K do
    i <- i + 1
if i < n
    return i
else
    return -1
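The same pseudocode as a runnable Python sketch, with a small usage example:

```python
def sequential_search(A, K):
    """Return the index of the first element of A equal to K, or -1 if absent."""
    i = 0
    n = len(A)
    while i < n and A[i] != K:   # key comparison: the basic operation
        i += 1
    return i if i < n else -1

print(sequential_search([4, 8, 15, 16, 23, 42], 16))  # 3
print(sequential_search([4, 8, 15, 16, 23, 42], 7))   # -1
```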
- Worst case: $C_{worst}(n) = n$
Analyze the algorithm to see what kind of inputs yield the largest value of the basic operation's count $C(n)$ among all possible inputs of size $n$. Clearly, in this case $C_{worst}(n) = n$ if counting key comparisons.
- Best case: $C_{best}(n) = 1$ (the first element matches the key)
- Average case:
To analyze the algorithm's average-case efficiency we must make some assumptions about possible inputs of size $n$:
(a) Assume the probability of a successful search is equal to $p$ ($0 \le p \le 1$)
(b) The probability of the first match occurring in the $i$th position of the list is the same for every $i$
In the case of a successful search, the probability of the first match occurring in the $i$th position is $\frac{p}{n}$ for every $i$, and the number of comparisons is $i$. In the case of an unsuccessful search, the number of comparisons is $n$, and the probability of such a search is $(1-p)$.
- $C_{avg}(n) = n(1-p) + \left[1 \cdot \frac{p}{n} + 2 \cdot \frac{p}{n} + \dots + i \cdot \frac{p}{n} + \dots + n \cdot \frac{p}{n}\right]$
$= \sum_{i=1}^{n} i \cdot \frac{p}{n} + n(1-p)$
$= \frac{p}{n} \sum_{i=1}^{n} i + n(1-p)$
$= \frac{p}{n} \cdot \frac{n(n+1)}{2} + n(1-p)$
$= \frac{p(n+1)}{2} + n(1-p)$
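A Monte Carlo sanity check of this formula under the stated assumptions (the values of n, p, and the trial count below are arbitrary):

```python
import random

def comparisons(n, p):
    """Number of key comparisons under the stated probabilistic model."""
    if random.random() < p:           # successful search
        return random.randint(1, n)   # first match equally likely at positions 1..n
    return n                          # unsuccessful search scans all n items

n, p, trials = 100, 0.7, 200_000
simulated = sum(comparisons(n, p) for _ in range(trials)) / trials
predicted = p * (n + 1) / 2 + n * (1 - p)
print(f"simulated: {simulated:.2f}   formula: {predicted:.2f}")
```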
- Is the average case important? YES! But it is rarely easy to analyze, and the probabilistic assumptions are hard to verify.
- For most algorithms, we will simply quote the already-proven average-case analysis
'Big Oh'
O(g(n))
The set of all functions with a lower or the same order of growth as g(n)
Examples:
$n \in O(n^2)$
$1000n + 45 \in O(n^2)$
$\frac{n(n-1)}{2} \in O(n^2)$
$n^3 \notin O(n^2)$
$10000n^3 \notin O(n^2)$
$2^n \notin O(n^2)$
Ω(g(n))
The set of all functions with a higher or the same order of growth as g(n)
$n^3 \in \Omega(n^2)$
$\frac{n(n-1)}{2} \in \Omega(n^2)$
$2^n \in \Omega(n^2)$
$1000n \notin \Omega(n^2)$
$\frac{1}{n} \cdot \frac{n^2}{2} \notin \Omega(n^2)$
$n + 5 \notin \Omega(n^2)$
Θ(g(n))
The set of all functions with the same order of growth as g(n)
$t(n) \in O(g(n))$ if $t(n)$ is bounded above by some constant multiple of $g(n)$ for all large $n$.
$12n + 44 \in O(n)$
$12n + 44 \le 12n + 44n, \quad \forall n \ge 1$
$\le 56n$, so we can take $c = 56$ and $n_0 = 1$
$t(n) \in \Omega(g(n))$ if $t(n)$ is bounded below by some constant multiple of $g(n)$ for all large $n$.
$t(n) \in \Theta(g(n))$ if $t(n)$ is bounded both above and below by some constant multiple of $g(n)$ for all large $n$.
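A quick check of the witness constants from the $12n + 44 \in O(n)$ example above (the tested range of n is arbitrary):

```python
# With c = 56 and n0 = 1, t(n) = 12n + 44 never exceeds c*n for the tested n.
c, n0 = 56, 1
assert all(12 * n + 44 <= c * n for n in range(n0, 10_001))
print("12n + 44 <= 56n for all tested n >= 1")
```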
Example:
$\log(n!) \in \Theta(n \log n)$
$\log(n!) = \log(1) + \log(2) + \dots + \log(n)$
$\le \log(n) + \log(n) + \dots + \log(n) = n \log(n)$
$\log(n!) \ge \log(n/2) + \log(n/2 + 1) + \dots + \log(n)$
$\ge \log(n/2) + \log(n/2) + \dots + \log(n/2)$
$\ge \frac{n}{2} \log(n/2)$, which is in $\Theta(n \log n)$
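A numeric illustration that $\log(n!)$ grows like $n \log n$ (the sampled n values are arbitrary; `math.lgamma(n + 1)` gives $\ln(n!)$ without overflow):

```python
import math

for n in (10, 100, 1_000, 10_000, 100_000):
    ratio = math.lgamma(n + 1) / (n * math.log(n))
    print(f"n = {n}: log(n!) / (n log n) = {ratio:.4f}")  # slowly approaches 1
```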
General Plan for Analysis (Nonrecursive algorithms)
- Decide on a parameter n indicating input size
- Identify the algorithm's basic operation
- Determine the best, worst, and average cases for input size n
- Set up a sum for the number of times the basic operation is executed
- Simplify the sum using standard formulas and rules (Appendix A in the textbook)
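An illustrative walk-through of this plan on a simple nonrecursive algorithm not covered above (finding the maximum element, chosen as an assumed example): the input size is n = len(A), the basic operation is the comparison A[i] > maxval, and it executes exactly n - 1 times for every input, so C(n) = n - 1 ∈ Θ(n).

```python
def max_element(A):
    """Return the maximum of A and the number of comparisons performed."""
    maxval = A[0]
    comparisons = 0
    for i in range(1, len(A)):
        comparisons += 1          # basic operation: element comparison
        if A[i] > maxval:
            maxval = A[i]
    return maxval, comparisons

val, comps = max_element([7, 2, 9, 4, 9, 1])
print(val, comps)   # 9 5  (n = 6, so C(n) = n - 1 = 5)
```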
Limits?
Basic Efficiency Classes and Descriptions (p59)
Interval Scheduling