ALGORITHM ANALYSIS AND COMPLEXITIES

Instructor: A. Prof. Dr. Rashidah Funke Olanrewaju
Email: [email protected]
Analysis of Algorithms
An algorithm is a step-by-step procedure for solving a
problem in a finite amount of time.
Algorithm Analysis
▪ The components of an algorithm-analysis methodology include:
  ▪ A language for describing algorithms (pseudocode)
  ▪ A computational model that the algorithms execute within (the RAM model)
  ▪ A metric for measuring algorithm running time (counting primitive operations)
  ▪ An approach for characterizing running times, such as those of recursive and iterative algorithms
Running Time (§1.1)
▪ Running time is a natural measure of "goodness," since time is a precious resource: computer solutions should run as fast as possible.
▪ The running time of an algorithm or data structure operation increases with the input size, although it may also vary for different inputs of the same size.
▪ Running time is affected by the hardware environment (e.g., the processor, clock rate, memory, disk) and the software environment (e.g., the operating system, programming language) in which the algorithm is implemented and executed.
Running Time (§1.2)
▪ Most algorithms transform input objects into output objects.
▪ The running time of an algorithm typically grows with the input size.
▪ Average-case time is often difficult to determine.
▪ We focus on the worst-case running time.
  ▪ Easier to analyze
  ▪ Crucial to applications such as games, finance, and robotics

[Figure: best-case, average-case, and worst-case running time versus input size (1000-4000)]
Experimental Studies (§1.6)
▪ Write a program implementing the algorithm
▪ Run the program with inputs of varying size and composition
▪ Use a method like System.currentTimeMillis() to get an accurate measure of the actual running time
▪ Plot the results (see the Java sketch below)

[Figure: measured running time, Time (ms) versus input size]
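A minimal Java sketch of such an experiment. It times java.util.Arrays.sort as a stand-in for the algorithm under test; swap in whatever algorithm you want to measure:

    import java.util.Arrays;
    import java.util.Random;

    public class TimingExperiment {
        public static void main(String[] args) {
            Random rnd = new Random(42);                  // fixed seed for repeatable inputs
            for (int n = 10_000; n <= 10_000_000; n *= 10) {
                int[] a = rnd.ints(n).toArray();          // random input of size n
                long start = System.currentTimeMillis();
                Arrays.sort(a);                           // algorithm under test
                long elapsed = System.currentTimeMillis() - start;
                System.out.println(n + "\t" + elapsed + " ms");
            }
        }
    }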
Limitations of Experiments
▪ It is necessary to implement the algorithm, which may be difficult
▪ Results may not be indicative of the running time on other inputs not included in the experiment
▪ In order to compare two algorithms, the same hardware and software environments must be used
Theoretical Analysis
▪ Uses a high-level description of the algorithm instead of an implementation
▪ Characterizes running time as a function of the input size, n
▪ Takes into account all possible inputs
▪ Allows us to evaluate the speed of an algorithm independent of the hardware/software environment
Pseudocode (§1.1)
▪ High-level description of an algorithm
▪ More structured than English prose
▪ Less detailed than a program
▪ Preferred notation for describing algorithms
▪ Hides program design issues

Example: find the max element of an array

Algorithm arrayMax(A, n)
  Input: array A of n integers
  Output: maximum element of A
  currentMax ← A[0]
  for i ← 1 to n − 1 do
    if A[i] > currentMax then
      currentMax ← A[i]
  return currentMax
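For comparison, a direct Java translation of the pseudocode (a minimal sketch; assumes A is non-empty):

    public static int arrayMax(int[] A) {
        int currentMax = A[0];               // currentMax <- A[0]
        for (int i = 1; i < A.length; i++)   // for i <- 1 to n - 1 do
            if (A[i] > currentMax)           // if A[i] > currentMax then
                currentMax = A[i];           //     currentMax <- A[i]
        return currentMax;
    }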
Pseudocode Details

Control flow
◼ if … then … [else …]
◼ while … do …
◼ repeat … until …
◼ for … do …
◼ Indentation replaces braces

Method declaration
◼ Algorithm method(arg [, arg…])
    Input …
    Output …

Method call
◼ var.method(arg [, arg…])

Return value
◼ return expression

Expressions
◼ ← Assignment (like = in Java)
◼ = Equality testing (like == in Java)
◼ n² Superscripts and other mathematical formatting allowed
The Random Access Machine (RAM) Model
◼ A CPU
◼ A potentially unbounded bank of memory cells (numbered 0, 1, 2, …), each of which can hold an arbitrary number or character
◼ Accessing any cell in memory takes unit time
Primitive Operations
◼ Basic computations performed by an algorithm
◼ Identifiable in pseudocode
◼ Largely independent from the programming language
◼ Exact definition not important (we will see why later)
◼ Assumed to take a constant amount of time in the RAM model

Examples:
◼ Evaluating an expression
◼ Assigning a value to a variable
◼ Indexing into an array
◼ Performing an arithmetic operation
◼ Calling a method
◼ Returning from a method
◼ Comparing two numbers
Counting Primitive Operations (§1.1)
▪ By inspecting the pseudocode, we can determine the maximum number of primitive operations executed by an algorithm, as a function of the input size

Algorithm arrayMax(A, n)                # operations
  currentMax ← A[0]                     2
  for i ← 1 to n − 1 do                 1 + n
    if A[i] > currentMax then           2(n − 1)
      currentMax ← A[i]                 2(n − 1)
    { increment counter i }             2(n − 1)
  return currentMax                     1
                               Total    7n − 2
Estimating Running Time
◼ Algorithm arrayMax executes 7n − 2 primitive operations in the worst case. Define:
    a = time taken by the fastest primitive operation
    b = time taken by the slowest primitive operation
◼ Let T(n) be the worst-case time of arrayMax. Then
    a(7n − 2) ≤ T(n) ≤ b(7n − 2)
◼ Hence, the running time T(n) is bounded by two linear functions
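As a small numeric illustration (with assumed operation times): if the fastest primitive operation takes a = 1 ns and the slowest takes b = 10 ns, then for n = 1000 arrayMax executes 7(1000) − 2 = 6998 primitive operations, so 6998 ns ≤ T(1000) ≤ 69,980 ns.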
Growth Rate of Running Time
▪ Changing the hardware/software environment
  ▪ affects T(n) by a constant factor, but
  ▪ does not alter the growth rate of T(n)
▪ The linear growth rate of the running time T(n) is an intrinsic property of algorithm arrayMax
Execution of Primitive Steps
◼ Loop condition, for a loop that runs n times = n + 1 steps
◼ Body of loop = n steps
◼ Conditional statement = 1 step
◼ Assignment statement = 1 step
◼ Constant = 0 steps
Examples

S/N  Statement            Executable  Frequency  Total steps
1    Algorithm Sum(a, n)  0           -          0
2    {                    0           -          0
3    s ← 0                1           1          1
4    for i ← 1 to n do    1           n + 1      n + 1
5    s ← s + a[i]         1           n          n
6    return s             1           1          1
7    }                    0           -          0

Total steps for the algorithm to run: 2n + 3 (a Java version follows)
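A runnable Java version of the Sum algorithm (a sketch; note the pseudocode indexes a[1..n] while Java arrays index from 0):

    public static int sum(int[] a, int n) {
        int s = 0;                    // 1 step
        for (int i = 0; i < n; i++)   // condition tested n + 1 times
            s += a[i];                // body executes n times
        return s;                     // 1 step
    }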
Growth Rates
◼ Growth rates of functions:
  ◼ Linear: n
  ◼ Quadratic: n²
  ◼ Cubic: n³
Growth-Rate Functions
▪ O(1): constant time; the time is independent of n, e.g., array look-up
▪ O(log n): logarithmic time; usually the log is base 2, e.g., binary search (sketch below)
▪ O(n): linear time, e.g., linear search, tree traversal
▪ O(n log n): e.g., efficient sorting algorithms
▪ O(n²): quadratic time, e.g., selection sort
▪ O(nᵏ): polynomial time (where k is some constant)
▪ O(2ⁿ): exponential time, very slow!

Order of growth of some common functions:
• O(1) < O(log n) < O(log² n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ)
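To make the O(log n) entry concrete, a standard binary search over a sorted array (a sketch):

    // Halves the search range on each iteration, so at most O(log n) comparisons
    public static int binarySearch(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;      // midpoint, written to avoid overflow
            if (a[mid] == key) return mid;
            else if (a[mid] < key) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;                             // key not found
    }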
Order-of-Magnitude Analysis and Big O Notation

[Figure: a comparison of growth-rate functions, (a) in tabular form and (b) in graphical form]
Note on Constant Time
▪ We write O(1) to indicate something that takes a constant amount of time
▪ E.g., finding the minimum element of an ordered array takes O(1) time, because the min is either at the beginning or the end of the array (sketch below)
▪ Important: constants can be huge, so in practice O(1) is not necessarily efficient; all it tells us is that the algorithm will run at the same speed no matter the size of the input we give it
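A minimal sketch of the ordered-array example:

    // Min of an array sorted in either direction: O(1), independent of length
    public static int minOfSorted(int[] a) {
        return Math.min(a[0], a[a.length - 1]);
    }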
Constant Factors
◼ The growth rate is not affected by
  ◼ constant factors or
  ◼ lower-order terms
◼ Examples
  ◼ 10²n + 10⁵ is a linear function
  ◼ 10⁵n² + 10⁸n is a quadratic function
Big-Oh Notation (§1.2)
◼ Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are a positive constant c and a positive integer constant n₀ such that
    f(n) ≤ c·g(n) for all n ≥ n₀
◼ Example: 2n + 10 is O(n)
  ◼ 2n + 10 ≤ cn
  ◼ (c − 2)n ≥ 10
  ◼ n ≥ 10/(c − 2)
  ◼ Pick c = 3 and n₀ = 10 (spot-checked in the sketch below)
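The chosen constants can also be checked empirically for a range of n (a minimal sketch, not a proof):

    public class BigOhCheck {
        public static void main(String[] args) {
            int c = 3, n0 = 10;
            for (int n = n0; n <= 1_000_000; n++)   // spot-check 2n + 10 <= c * n
                if (2 * n + 10 > c * n)
                    throw new AssertionError("bound fails at n = " + n);
            System.out.println("2n + 10 <= 3n holds for all tested n >= 10");
        }
    }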
Big-Oh Example
◼ Example: the function n² is not O(n)
  ◼ we would need n² ≤ cn
  ◼ i.e., n ≤ c
  ◼ this inequality cannot be satisfied for all n, since c must be a constant
More Big-Oh Examples
◼ 7n − 2 is O(n)
    need positive constants c and n₀ such that 7n − 2 ≤ c·n for all n ≥ n₀
    this is true for c = 7 and n₀ = 1
◼ 3n³ + 20n² + 5 is O(n³)
    need positive constants c and n₀ such that 3n³ + 20n² + 5 ≤ c·n³ for all n ≥ n₀
    this is true for c = 4 and n₀ = 21
◼ 3 log n + log log n is O(log n)
    need positive constants c and n₀ such that 3 log n + log log n ≤ c·log n for all n ≥ n₀
    this is true for c = 4 and n₀ = 2
Big-Oh and Growth Rate
◼ The big-Oh notation gives an upper bound on the growth rate of a function
◼ The statement "f(n) is O(g(n))" means that the growth rate of f(n) is no more than the growth rate of g(n)
◼ We can use the big-Oh notation to rank functions according to their growth rate

                   f(n) is O(g(n))   g(n) is O(f(n))
g(n) grows more    Yes               No
f(n) grows more    No                Yes
Same growth        Yes               Yes
Big-Oh Rules
◼ If f(n) is a polynomial of degree d, then f(n) is O(nᵈ), i.e.,
  1. drop lower-order terms
  2. drop constant factors
◼ Use the smallest possible class of functions
  ◼ Say "2n is O(n)" instead of "2n is O(n²)"
◼ Use the simplest expression of the class
  ◼ Say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)"
Searching and Sorting Costs Using O-Notation
▪ Linear search
  ▪ Best case: O(1)
  ▪ Average case: O(n)
  ▪ Worst case: O(n)
▪ Binary search
  ▪ Best case: O(1)
  ▪ Average case: O(log n)
  ▪ Worst case: O(log n)
▪ Selection sort
  ▪ Best case: O(n²) (can vary with implementation)
  ▪ Average case: O(n²)
  ▪ Worst case: O(n²)
▪ Insertion sort (see the sketch below)
  ▪ Best case: O(n) (can vary with implementation)
  ▪ Average case: O(n²)
  ▪ Worst case: O(n²)
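To connect these entries to code, a standard insertion sort (a sketch): on already-sorted input the inner loop body never executes, giving the O(n) best case, while reverse-sorted input forces the full O(n²) worst case:

    public static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {   // shift larger elements right
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;                  // drop key into its sorted position
        }
    }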
Asymptotic Algorithm Analysis
◼ The asymptotic analysis (as n grows toward infinity) of an algorithm determines the running time in big-Oh notation
◼ To perform the asymptotic analysis:
  ◼ we find the worst-case number of primitive operations executed as a function of the input size
  ◼ we express this function with big-Oh notation
◼ Example:
  ◼ we determined that algorithm arrayMax executes at most 7n − 2 primitive operations
  ◼ we say that algorithm arrayMax "runs in O(n) time"
◼ Since constant factors and lower-order terms are eventually dropped anyhow, we can disregard them when counting primitive operations
Relatives of Big-Oh
◼ big-Omega
  ◼ f(n) is Ω(g(n)) if there are a constant c > 0 and an integer constant n₀ ≥ 1 such that f(n) ≥ c·g(n) for all n ≥ n₀
◼ big-Theta
  ◼ f(n) is Θ(g(n)) if there are constants c′ > 0 and c″ > 0 and an integer constant n₀ ≥ 1 such that c′·g(n) ≤ f(n) ≤ c″·g(n) for all n ≥ n₀
◼ little-oh
  ◼ f(n) is o(g(n)) if, for any constant c > 0, there is an integer constant n₀ ≥ 1 such that f(n) < c·g(n) for all n ≥ n₀
◼ little-omega
  ◼ f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n₀ ≥ 1 such that f(n) > c·g(n) for all n ≥ n₀
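For instance, 5n² is Θ(n²): taking c′ = 1, c″ = 5, and n₀ = 1 gives 1·n² ≤ 5n² ≤ 5·n² for all n ≥ 1, so 5n² is both Ω(n²) and O(n²).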
Intuition for Asymptotic Notation
◼ big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)
◼ big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)
◼ big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
◼ little-oh: f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)
◼ little-omega: f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)