
Marathwada Mitramandal’s

COLLEGE OF ENGINEERING
Karvenagar, Pune
Accredited with ‘A’ Grade by NAAC
Unit - 01
Part - B
Algorithm Analysis

DSA
Complexity

• A solution is said to be efficient if it solves the problem within its resource constraints:
  • Space
  • Time
• The cost of a solution is the amount of resources that the solution consumes.

Space Complexity

• the amount of memory a program occupies
• usually measured in bytes, KB, or MB

Space Complexity

• Why is this of concern?
  • We could be running on a multi-user system where programs are allocated a specific amount of space.
  • We may not have sufficient memory on our computer.
  • There may be multiple solutions, each having different space requirements.
  • The space complexity may define an upper bound on the data that the program can handle.

Space Complexity

• Components of Program Space
  • Program space = instruction space + data space + stack space
• The instruction space is dependent on several factors:
  • the compiler that generates the machine code
  • the compiler options that were set at compilation time
  • the target computer

Space Complexity

• Data space
  • very much dependent on the computer architecture and compiler
• The magnitude of the data that a program works with is another factor.

  Type    Size   Type           Size
  char    1      float          4
  short   2      double         8
  int     4      long double    10
  long    4      pointer        2

  Unit: bytes
Space Complexity

• Data space
  • Choosing a "smaller" data type has an effect on the overall space usage of the program.
  • Choosing the correct type is especially important when working with arrays (see the sketch below).
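A quick way to check these sizes on a given platform is the sizeof operator. A minimal sketch (the printed values are compiler and architecture dependent, as noted above):

#include <iostream>

int main() {
    // Sizes vary by platform; on a typical modern 64-bit system a
    // pointer is 8 bytes, not 2 as in the table above.
    std::cout << "char:        " << sizeof(char)        << '\n';
    std::cout << "short:       " << sizeof(short)       << '\n';
    std::cout << "int:         " << sizeof(int)         << '\n';
    std::cout << "long:        " << sizeof(long)        << '\n';
    std::cout << "float:       " << sizeof(float)       << '\n';
    std::cout << "double:      " << sizeof(double)      << '\n';
    std::cout << "long double: " << sizeof(long double) << '\n';
    std::cout << "pointer:     " << sizeof(int*)        << '\n';
    // With arrays, the element size is multiplied by the length:
    // 10000 ints take four times the space of 10000 chars.
    std::cout << "int[10000]:  " << sizeof(int[10000])  << '\n';
    return 0;
}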

Space Complexity

• Environment Stack Space
  • Every time a function is called, the following data are saved on the stack (illustrated below):
    • the return address
    • the values of all local variables and value formal parameters
    • the bindings of all reference and const reference parameters
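As an illustration (a sketch, not from the slides; sumTo is a made-up example): each recursive call below pushes a new stack frame holding the return address, the value parameter n, and the local variable half, so the peak stack space grows in proportion to the recursion depth n.

#include <iostream>

// Recursion depth is n, so the environment stack holds up to n
// frames at once, each with a return address, the value parameter
// n, and the local variable 'half'.
long sumTo(int n) {
    if (n <= 0) return 0;        // base case: deepest frame
    long half = sumTo(n - 1);    // each call pushes one more frame
    return half + n;             // the frame is popped on return
}

int main() {
    std::cout << sumTo(100) << '\n';   // 5050, using ~100 nested frames
    return 0;
}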

Time Complexity

• execution time
• usually measured by the number of statement executions (the frequency count) rather than by wall-clock time

Time Complexity

Best time
  The minimum amount of time required by the algorithm for any input of size n.
  Seldom of interest.

Worst time (our focus)
  The maximum amount of time required by the algorithm for any input of size n.

Average time
  The average amount of time required by the algorithm over all inputs of size n.

Similar definitions can be given for space complexity. (The sketch below illustrates the three cases.)
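Linear search is a standard illustration of the three cases (this example is an addition, not from the slides): the key may sit at the first position (best case, 1 comparison), at the last position or nowhere (worst case, n comparisons), or anywhere with equal likelihood (average case, about n/2 comparisons).

#include <iostream>

// Best case:    key at index 0 (1 comparison).
// Worst case:   key at index n-1 or absent (n comparisons).
// Average case: key present, positions equally likely (about n/2).
int linearSearch(const int a[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (a[i] == key) return i;   // found after i+1 comparisons
    }
    return -1;                       // not found after n comparisons
}

int main() {
    int a[] = {7, 3, 9, 4, 1};
    std::cout << linearSearch(a, 5, 7) << ' '    // best case:  0
              << linearSearch(a, 5, 1) << ' '    // worst case: 4
              << linearSearch(a, 5, 8) << '\n';  // worst case: -1
    return 0;
}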

Worst Case Complexity

• Of the three cases, the only really useful one (from the standpoint of program design) is the worst case.

• Worst-case analysis helps answer the software lifecycle question:
  • If it's good enough today, will it be good enough tomorrow?

Frequency Count

Suppose there is an assignment statement

    x := x + 1

in your program. We'd like to determine:
  • the time a single execution would take
  • the number of times it is executed (the frequency count)

Frequency Count - Example
Examine a piece of code and predict the number of instructions to be executed: for each instruction, predict how many times it will be encountered as the code runs.

Inst #  Code                        F.C.
1       for (int i=0; i<n; i++)     n+1
2       { cout << i;                n
3         p = p + i;                n
        }                           ____
                                    3n+1

Totaling the counts produces the F.C. (frequency count); the instrumented sketch below verifies it.
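The count can be checked empirically by instrumenting the code with a counter (a sketch; fc, n, and p are placeholder names):

#include <iostream>

int main() {
    int n = 10, p = 0;
    long fc = 0;                 // frequency counter
    for (int i = 0; i < n; i++) {
        fc++;                    // counts: cout << i;
        std::cout << i << ' ';
        fc++;                    // counts: p = p + i;
        p = p + i;
    }
    fc += n + 1;                 // the loop test itself ran n+1 times
    std::cout << "\nF.C. = " << fc
              << ", 3n+1 = " << 3 * n + 1 << '\n';   // both print 31
    return 0;
}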


Frequency Count - Example

Inst #  Code                        F.C.      F.C.
1       for (int i=0; i<n; i++)     n+1       n+1
2       for (int j=0; j<n; j++)     n(n+1)    n²+n
3       { cout << i;                n*n       n²
4         p = p + i;                n*n       n²
        }                           ____      ____
                                              3n²+2n+1

Discarding the constant term produces: 3n²+2n
Clearing coefficients: n²+n
Picking the most significant term: n²

Big O = O(n²)
Frequency Count

• The product of time and frequency is the total time taken.
• The frequency count will vary from data set to data set.
• Since the execution time will be very machine dependent (and compiler dependent), we neglect it and concentrate on the frequency count.

Consider the following examples:

Frequency Count

Program 1:
    x := x + 1

Program 2:
    FOR i := 1 to n DO
        x := x + 1
    END

Program 3:
    FOR i := 1 to n DO
        FOR j := 1 to n DO
            x := x + 1
        END
    END
Frequency Count

Program 1:
  – The statement is not contained in a loop (implicitly or explicitly).
  – The frequency count is 1.

Program 2:
  – The statement is executed n times.

Program 3:
  – The statement is executed n² times.
Frequency Count

1, n, and n² are said to be different and increasing orders of magnitude (e.g., for n = 10 they evaluate to 1, 10, and 100).

We are chiefly interested in determining the order of magnitude of an algorithm.
Frequency Count: Example 1

sum = 0.0;
for (int i = 0; i < n; i++) {
    sum += array[i];    // a primitive operation
}

Ignoring the update to the loop variable i, we note the marked statement above is executed n times, so the cost function is C(n) = n.

Frequency Count: Example 2

sum = 0.0;
for (int i = 0; i < n; i += 2) {
    sum += array[i];    // a primitive operation
}

Here, however, since the counter variable i goes up by 2 with each pass through the loop, the number of times the marked statement gets executed is halved, leading to C(n) = n/2.

Frequency Count: Example 3

for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        int x = i * j;    // a primitive operation
        sum += x;         // a primitive operation
    }
}

When nested loops are encountered, notice that the two primitive operations each get executed n² times, yielding C(n) = 2n².

Frequency Count: Example 4

for (int i = 0; i < n; i++) {
    for (int j = i; j < n; j++) {
        sum += i * j;    // a primitive operation
    }
}

Here, the cost function is a little more complicated to calculate. Note that the primitive operation is executed

    n + (n−1) + (n−2) + ⋯ + 3 + 2 + 1 times.

However, an identity from algebra will help consolidate this into C(n) = n(n+1)/2. [C(n) is O(n²); the derivation is spelled out below.]
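Spelled out, the identity is the triangular-number sum (the inner loop runs n−i times for each outer value i):

C(n) = \sum_{i=0}^{n-1}(n-i) = n + (n-1) + \cdots + 2 + 1 = \sum_{k=1}^{n} k = \frac{n(n+1)}{2} \le n^2, \quad \text{so } C(n) = O(n^2).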
Efficiency of an algorithm

• Determined in terms of the order of growth of the function.
• We compare only the rate of growth (order of magnitude) of the functions: how fast does the algorithm grow as the size of the input, N, grows?
• Normally, the growth function will approach some simpler function asymptotically.
• For growth functions having different orders of magnitude, exact frequency counts and the constants become insignificant.

Classifying Functions

Asymptotic growth: the rate of growth of a function.

Given a particular differentiable function f(n), all other differentiable functions fall into three classes:
  • growing with the same rate
  • growing faster
  • growing slower

For example, relative to f(n) = n², the function 3n² + n grows at the same rate, n³ grows faster, and n log n grows slower.

Asymptotic Notation

Describes the behavior of the time or space complexity for large instance characteristics.

Big Oh (O) notation provides an upper bound for the function f.

Omega (Ω) notation provides a lower bound.

Theta (Θ) notation is used when an algorithm can be bounded both from above and below by the same function.

Little oh (o) defines a loose upper bound (one that is not asymptotically tight).

(Formal definitions are collected below.)
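For reference, the standard definitions behind these four notations (only Big Oh is spelled out later in these slides; the others follow the same pattern):

\begin{aligned}
f(n) = O(g(n)) &\iff \exists\, c > 0,\ n_0 : |f(n)| \le c\,|g(n)| \text{ for all } n \ge n_0 \\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 : |f(n)| \ge c\,|g(n)| \text{ for all } n \ge n_0 \\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n)) \\
f(n) = o(g(n)) &\iff \lim_{n \to \infty} f(n)/g(n) = 0
\end{aligned}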

Order of Growth

O(1)      Constant computing time
O(n)      Linear computing time
O(n²)     Quadratic computing time
O(n³)     Cubic computing time
O(2ⁿ)     Exponential computing time

• O(log n) is faster than O(n) for sufficiently large n.
• O(n log n) is faster than O(n²) for sufficiently large n.

Order of Growth

Algorithms are grouped according to the order of growth of their time complexity functions:

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < … < O(2ⁿ)

Big Oh Notation

The notation f(n) = O(g(n)) has a precise mathematical definition.

Read f(n) = O(g(n)) as "f of n equals big oh of g of n".

Definition:
f(n) = O(g(n)) iff there exist two positive constants c and n0 such that
|f(n)| ≤ c|g(n)| for all n ≥ n0.
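As a worked instance of the definition (using the first function from the examples slide at the end):

f(n) = 10n + 5 \le 10n + n = 11n \text{ for all } n \ge 5, \quad \text{so with } c = 11 \text{ and } n_0 = 5,\ f(n) = O(n).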

Big Oh Notation

f(n) will normally represent the computing time of some algorithm:
  – time complexity T(n)

f(n) can also represent the amount of memory an algorithm will need to run:
  – space complexity S(n)

Big Oh Notation

If an algorithm has a time complexity of O(g(n)), it means that its execution will take no longer than a constant times g(n).

n is typically the size of the data set.

Practical Complexities

log n   n     n log n   n²      n³       2ⁿ
0       1     0         1       1        2
1       2     2         4       8        4
2       4     8         16      64       16
3       8     24        64      512      256
4       16    64        256     4096     65536
5       32    160       1024    32768    4294967296
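The table can be reproduced with a short program (a sketch; note that 2^32 requires a 64-bit integer type):

#include <cstdint>
#include <iostream>

int main() {
    std::cout << "log n\tn\tn log n\tn^2\tn^3\t2^n\n";
    for (int logn = 0; logn <= 5; logn++) {
        std::uint64_t n = 1ULL << logn;        // n = 2^logn
        std::cout << logn << '\t' << n << '\t'
                  << logn * n << '\t'          // n log n
                  << n * n << '\t'             // n^2
                  << n * n * n << '\t'         // n^3
                  << (1ULL << n) << '\n';      // 2^n; n <= 32 fits in 64 bits
    }
    return 0;
}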

Big Oh Notation - Example

f(n) = 10n + 5        → O(n)

f(n) = 3n² + 4n + 1   → O(n²)

f(n) = 5n²            → O(n²)

f(n) = 7n − 2         → O(n)

f(n) = 2n² + 5n + 4   → O(n²)