Unit 2 Sorting

1

Divide and Conquer


•Divide the problem into a number of subproblems that
are smaller instances of the same problem.
•Conquer the subproblems by solving them recursively.
If they are small enough, solve the subproblems as base
cases.
•Combine the solutions to the subproblems into the
solution for the original problem.

2
Algorithms covered under the divide-and-conquer technique

•Merge sort

•Quick Sort

•Heap Sort

•Binary search

3
Merge sort

•Divide and Conquer

•Recursive

•Out-of-place

•Space complexity: O(n)

•Time complexity: O(n logn) in worst case

4
Merge Sort Working
Process:
Merge sort is a recursive algorithm that repeatedly splits the array in half until it cannot be
divided further. If the array is empty or has only one element left, the dividing stops; this is
the base case that ends the recursion. If the array has multiple elements, it is split into two
halves and merge sort is invoked recursively on each half. Finally, when both halves are sorted,
the merge operation is applied. Merging is the process of taking two smaller sorted arrays and
combining them into one larger sorted array.

5
Pseudo-Code

Merge_sort(A, p, r)
{
    If p < r
        q = ⌊(p+r)/2⌋
        Merge_sort(A, p, q)
        Merge_sort(A, q+1, r)
        Merge(A, p, q, r)
}
6
Pseudo-Code

Merge(A, p, q, r)
{
    n1 = q - p + 1          // number of elements in the first list
    n2 = r - q              // number of elements in the second list
    Let L[1…n1+1] and R[1…n2+1] be two new arrays
    For i = 1 to n1
        L[i] = A[p+i-1]     // copy the first half into L
    For j = 1 to n2
        R[j] = A[q+j]       // copy the second half into R
    L[n1+1] = ∞             // sentinels mark the end of each list
    R[n2+1] = ∞
    i = 1
    j = 1
7
    For k = p to r          // k is incremented at every step
        If L[i] <= R[j]
            A[k] = L[i]
            i = i + 1
        Else
            A[k] = R[j]
            j = j + 1
}

8
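The pseudo-code above can be turned into a runnable Python sketch. This version uses 0-based indexing and copies any leftover elements instead of the infinity sentinels, but follows the same divide, recurse, and merge structure:

```python
def merge(a, p, q, r):
    """Merge the sorted slices a[p..q] and a[q+1..r] (inclusive, 0-based)."""
    left = a[p:q + 1]
    right = a[q + 1:r + 1]
    i = j = 0
    k = p
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            a[k] = left[i]
            i += 1
        else:
            a[k] = right[j]
            j += 1
        k += 1
    # Copy any leftovers; this replaces the infinity sentinels of the pseudo-code.
    a[k:r + 1] = left[i:] + right[j:]

def merge_sort(a, p=0, r=None):
    if r is None:
        r = len(a) - 1
    if p < r:
        q = (p + r) // 2        # split point
        merge_sort(a, p, q)     # sort the left half
        merge_sort(a, q + 1, r) # sort the right half
        merge(a, p, q, r)       # combine the sorted halves

nums = [15, 10, 5, 20, 25, 30, 40, 35]  # the practice input from slide 13
merge_sort(nums)
print(nums)  # [5, 10, 15, 20, 25, 30, 35, 40]
```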
Merge sort with Example

9
Time complexity of Merge Sort

Merge_sort(A, p, r) ………………… T(n)
{
    If p < r ……………………………… (1)
    q = ⌊(p+r)/2⌋ ……………………… (1)
    Merge_sort(A, p, q) …………… T(n/2)
    Merge_sort(A, q+1, r) ……… T(n/2)
    Merge(A, p, q, r) …………… (n)
}
10
Recurrence Relation of merge sort

T(n) = 1,             n = 1
T(n) = 2T(n/2) + n,   n > 1

Solve by Recursive Tree/Master Theorem


T(n)= O(nlogn)

Space Complexity=O(n)

11
Merge sort Recursive tree

12
Practice:

sort the following number using


merge sort:
<15,10,5,20,25,30,40,35>
And show the recursive tree

13
HEAP SORT

14
Combines the better attributes of merge sort and
insertion sort.
Like merge sort, but unlike insertion sort,
running time is O(n lg n).
Like insertion sort, but unlike merge sort, sorts
in place.

15
1. A heap is a binary tree.
2. Every heap is an almost complete binary tree.
3. Each level is filled from left to right before moving to the next level.

Heap Properties:

Heap length: the total number of elements in the array.
Heap size: the number of elements (from the start of the array) that currently form a heap.
16
Position of elements in a heap:

If the position of the parent node is i, then:

Position of left child = 2i
Position of right child = 2i + 1

Types of Heap:

Max heap: root should always be maximum

Min heap: root should always be minimum

17
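The 1-based parent/child index relations above can be checked with a small Python sketch (the helper names `left`, `right`, and `parent` are illustrative, not part of the slides):

```python
def left(i):
    # 1-based index of the left child
    return 2 * i

def right(i):
    # 1-based index of the right child
    return 2 * i + 1

def parent(i):
    # 1-based index of the parent (integer division)
    return i // 2

# For a heap stored as [25, 14, 16, 13, 10, 8, 12] at positions 1..7:
print(left(2), right(2))  # children of position 2 sit at 4 and 5
print(parent(7))          # position 7 hangs under position 3
```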
Elements                    A.length    A.heap_size
25, 12, 16, 13, 10, 8, 14      7            1
25, 14, 16, 13, 10, 8, 12      7            7
25, 14, 13, 16, 10, 8, 12
25, 14, 12, 13, 10, 8, 16
14, 13, 12, 10, 8
14, 12, 13, 8, 10
14, 13, 8, 12, 10
14, 13, 12, 8, 10
89, 19, 40, 17, 12, 10
2, 5, 7, 11, 6, 9, 70

18
MAX HEAP

•getMax(): returns the root element of a max heap. The time complexity of this
operation is O(1).
•If the array is in descending order, then it is already a max heap.

19
•getMin(): returns the root element of a min heap. The time complexity of this
operation is O(1).
•If the array is in ascending order, then it is already a min heap.

20
Algorithm

Heap_sort(A)
{
    Build_max_heap(A)
    For i = length[A] down to 2
        Exchange A[1] with A[i]              // move the current maximum to the end
        Heap_size[A] = Heap_size[A] - 1      // shrink the heap by one
        Max_Heapify(A, 1)                    // restore the max-heap property
}
21
Algorithm

Build_Max_Heap(A)
{
    A.heap_size = A.length               // initially treat the whole array as a heap
    For i = ⌊A.length/2⌋ down to 1       // start from the last non-leaf node
        Max_Heapify(A, i)
}

22
Max_heapify(A, i)
{
    l = 2i
    r = 2i + 1
    If l <= A.heap_size and A[l] > A[i]      // left child exists and is greater than parent
        largest = l
    Else
        largest = i
    If r <= A.heap_size and A[r] > A[largest]
        largest = r
    If largest != i
        Exchange A[i] with A[largest]
        Max_heapify(A, largest)
}
23
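The three routines above combine into a runnable Python sketch. Note the index shift: with 0-based arrays the children of node i are at 2i+1 and 2i+2, not 2i and 2i+1:

```python
def max_heapify(a, heap_size, i):
    # Sift a[i] down until the subtree rooted at i satisfies the max-heap property.
    l, r = 2 * i + 1, 2 * i + 2
    largest = i
    if l < heap_size and a[l] > a[largest]:
        largest = l
    if r < heap_size and a[r] > a[largest]:
        largest = r
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        max_heapify(a, heap_size, largest)

def build_max_heap(a):
    # Heapify from the last non-leaf node up to the root.
    for i in range(len(a) // 2 - 1, -1, -1):
        max_heapify(a, len(a), i)

def heap_sort(a):
    build_max_heap(a)
    for end in range(len(a) - 1, 0, -1):
        a[0], a[end] = a[end], a[0]   # move the current maximum to its final slot
        max_heapify(a, end, 0)        # shrink the heap and restore the property

nums = [25, 12, 16, 13, 10, 8, 14]  # the array from the table on slide 18
heap_sort(nums)
print(nums)  # [8, 10, 12, 13, 14, 16, 25]
```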
Time-Complexity of Heap Sort:

The height of a heap


The height of a node in a heap is the number of edges on the longest simple
downward path from the node to a leaf, and the height of the heap is the
height of the root, which is Θ(lg n).
For example (in the heap shown on the slide):
the height of node 2 is 2
the height of the heap is 3

24
The number of calls to Max_heapify depends on the height of the node that
"i" points to.

Height of node "i"          No. of times Max_heapify() is called
1                           1
2                           2
3                           3
…                           …
log n (height of heap)      log n

•The Max_Heapify function runs in O(log n) time to maintain the heap property.

•Each call to Max_Heapify made by Build_Max_Heap costs O(log n),
and there are O(n) such calls.

⇒ running time is O(n log n) 25
a.) Finding the largest element in a max heap: O(1).
b.) Deleting the largest element (the root) and restoring the heap: O(log n).
c.) Deleting all n elements from a max heap: O(n log n).

Now, sorting with a heap requires:

1. arranging (inserting) the elements into a heap, i.e. O(n log n), and

2. deleting the elements one by one, each followed by a swap and heapify: O(n log n).

This gives heap sort complexity = O(n log n) + O(n log n)

                               = 2·O(n log n) = O(n log n).

Space complexity: O(1)

26
Practice:

sort the following number using Heap sort:


<27,13,3,16,13,10,1,5,7,12,4,8,9,0>
show every step

27
QUICK-SORT

28
Quick sort

•Divide and Conquer

•Recursive

•Out-of-place

Cases           Time Complexity     Space Complexity

Best Case       O(n log n)          O(log n)

Average Case    O(n log n)          O(log n)

Worst Case      O(n^2)              O(n)

29
Quicksort picks an element as the pivot, and then it partitions the given
array around the picked pivot element. In quick sort, a large array is
divided into two sub-arrays: one holds values that are smaller than the
specified value (the pivot), and the other holds the values that are greater
than the pivot.
After that, the left and right sub-arrays are also partitioned using the
same approach. This continues until a single element remains in each
sub-array.

Choosing the pivot:
•The pivot can be either the rightmost or the leftmost element of the given
array.
•Alternatively, select the median as the pivot element.

30
Partition(A, p, r)
{
    x = A[r]                     // pivot
    i = p - 1
    For j = p to r-1
    {
        If A[j] <= x
        {
            i = i + 1
            Exchange A[i] with A[j]
        }
    }
    Exchange A[i+1] with A[r]    // place the pivot in its final position
    Return i + 1
}

31
Quick_sort(A, p, r)
{
    If p < r
    {
        q = Partition(A, p, r)
        Quick_sort(A, p, q-1)
        Quick_sort(A, q+1, r)
    }
}
32
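The Partition and Quick_sort routines above translate into the following Python sketch (Lomuto partition scheme with the rightmost element as pivot, 0-based indexing):

```python
def partition(a, p, r):
    # Lomuto partition: a[r] is the pivot; return its final index.
    x = a[r]
    i = p - 1
    for j in range(p, r):
        if a[j] <= x:
            i += 1
            a[i], a[j] = a[j], a[i]   # grow the "<= pivot" region
    a[i + 1], a[r] = a[r], a[i + 1]   # place the pivot in its sorted position
    return i + 1

def quick_sort(a, p=0, r=None):
    if r is None:
        r = len(a) - 1
    if p < r:
        q = partition(a, p, r)
        quick_sort(a, p, q - 1)   # sort elements left of the pivot
        quick_sort(a, q + 1, r)   # sort elements right of the pivot

nums = [36, 15, 40, 1, 60, 20, 55, 25, 50, 20]  # the practice input from slide 37
quick_sort(nums)
print(nums)  # [1, 15, 20, 20, 25, 36, 40, 50, 55, 60]
```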
Best case Time Complexity of
Quick Sort
The best-case time complexity of quick sort occurs when the sorted position
of the pivot element falls near the middle of the array, so that the array
is partitioned into two nearly equal halves

Balanced Tree

33
Best case Time Complexity of
Quick Sort
Quick_sort(A,p,r)……………….T(n) times
{
If(p<r)…………………………….1 times
{
Q=Partition(A, p, r)…………….n times
Quick_sort (A,p,q-1)…………..T(n/2) time
Quick_sort (A,q+1,r)…………..T(n/2)
}
}

Hence the recurrence relation of quick sort in best case:


T(n)=2*T(n/2)+n

Solve either by substitution, Master or Recursive tree


Best case Time complexity of Quick Sort comes: O(nlogn)

34
Worst case Time Complexity of
Quick Sort
The worst-case time complexity of quick sort occurs when the sorted position of the
pivot element partitions the array so that one side has (n-1) elements and
the other side has zero elements.

Unbalanced Tree

35
Worst case Time Complexity
of Quick Sort
Quick_sort(A,p,r)……………….T(n) times
{
If(p<r)…………………………….1 times
{
Q=Partition(A, p, r)…………….n times
Quick_sort (A,p,q-1)…………..T(n-1) time
Quick_sort (A,q+1,r)…………..T(0)
}
}

Hence the recurrence relation of quick sort in the worst case:


T(n) = T(n-1) + n

Solve either by substitution or by recursion tree


The worst-case time complexity of quick sort comes to: O(n²)

36
Practice:

sort the following number using Quick sort:


<36,15,40,1,60,20,55,25,50,20>
show every step

37
Binary Search Algorithm:

The basic steps to perform Binary Search are:

•Begin by comparing the search key with the middle element of the whole
array.
•If the value of the search key is equal to that element, return the
element's index.
•If the value of the search key is less than the element in the middle of
the interval, narrow the interval to the lower half.
•Otherwise, narrow it to the upper half.
•Repeat from the second step until the value is found or the interval is
empty.

38
Binary Search Algorithm

binarySearch(arr, x, low, high)

    repeat while low <= high
        mid = (low + high)/2
        if (x == arr[mid])
            return mid

        else if (x > arr[mid])   // x is on the right side
            low = mid + 1

        else                     // x is on the left side
            high = mid - 1

39
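The pseudo-code above can be sketched as a runnable Python function; this variant takes only the array and the key, computing `low` and `high` internally, and returns -1 when the key is absent:

```python
def binary_search(arr, x):
    # Iterative binary search on a sorted list; returns the index of x or -1.
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == x:
            return mid
        elif x > arr[mid]:       # x is on the right side
            low = mid + 1
        else:                    # x is on the left side
            high = mid - 1
    return -1                    # interval became empty: x is not present

data = [5, 10, 15, 20, 25, 30, 35, 40]
print(binary_search(data, 25))  # 4
print(binary_search(data, 7))   # -1
```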
40
Time Complexity of Binary Search

Best Case Time Complexity of Binary Search


The best case of binary search occurs when the target element is at the
central index. In this situation there is only one comparison, so the
best-case time complexity of binary search is O(1).

Average/Worst Case Time Complexity of Binary Search

In each iteration, the size of the subarray is reduced using the result of
the previous comparison.
Initial length of array = n
Iteration 1: length of array = n/2
Iteration 2: length of array = (n/2)/2 = n/2^2
Iteration k: length of array = n/2^k
After k iterations, the size of the array becomes 1 (narrowed down to a
single element).
Length of array = n/2^k = 1
⇒ k = log₂(n) = O(log n)

41
Time and Space Complexity

•Best Case Complexity - In binary search, the best case occurs when the element to
search is found in the first comparison, i.e., when the first middle element itself is
the element to be searched. The best-case time complexity of binary search is O(1).
•Worst Case Complexity - In binary search, the worst case occurs when we
have to keep reducing the search space until it has only one element. The
worst-case time complexity of binary search is O(log n).

•The space complexity of binary search is O(1).

42
Linear Search
Linear search is a sequential search algorithm that starts at one end and
goes through each element of a list until the desired element is found;
otherwise the search continues to the end of the data set. It is the
simplest searching algorithm.

Write the algorithm of linear search yourself

Time complexity: O(N)


Auxiliary Space: O(1)

43
