Comparison-Based Sorting Algorithms
Searching Algorithms
Algorithm: LinearSearch(A, x)
  for i ← 0 to A.Size() − 1 do
    if A[i] == x then
      Return i;
    end
  end
  Return -1;
Runtime Analysis:
Worst-Case: Θ(n), when x is not in the array (or its first occurrence is at an index in Ω(n))
Best-Case: Θ(1), when x is the first element (or its first occurrence is at an index in O(1))
Average-Case: Usually Θ(n), but it depends on the nature of the input specification
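The pseudocode above translates almost directly into Python (the function name is mine):

```python
def linear_search(a, x):
    """Return the first index of x in a, or -1 if absent; Theta(n) worst case."""
    for i, val in enumerate(a):
        if val == x:
            return i
    return -1
```
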
Binary Search: Efficient for sorted arrays, divides the search space in half.
Algorithm: BinarySearch(A, x)
  l ← 0;
  r ← A.Size() − 1;
  while l ≤ r do
    m ← ⌊(l + r)/2⌋;
    if A[m] == x then
      Return m;
    end
    else if A[m] > x then
      r ← m − 1;
    end
    else
      l ← m + 1;
    end
  end
  Return -1;
Runtime Analysis:
Worst-Case: Θ(log n)
Best-Case: Θ(1), if the target x is at the first index we check (the middle).
Average-Case: Θ(log n)
Space Complexity:
The iterative variant requires Θ(1) auxiliary space for all cases.
The recursive variant generally requires Θ(log n) auxiliary space for the recursive calls. However,
many languages and compilers incorporate tail call elimination, which reduces the space
complexity to Θ(1) as well.
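A minimal Python sketch of the iterative variant, which uses Θ(1) auxiliary space:

```python
def binary_search(a, x):
    """Binary search on a sorted list: Theta(log n) time, Theta(1) space."""
    l, r = 0, len(a) - 1
    while l <= r:
        m = (l + r) // 2      # middle index; each step halves the search space
        if a[m] == x:
            return m
        elif a[m] > x:
            r = m - 1         # discard the right half
        else:
            l = m + 1         # discard the left half
    return -1
```
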
Selection Problem
Given an array A of n elements and a rank k (0-indexed), return the element that would appear at index k if A were sorted.
SortSelect: Simple algorithm involving sorting the array. Θ(n log n) runtime with HeapSort, Θ(1) space.
HeapSelect: Uses a min-heap to find the element of rank k (the 0-indexed k-th smallest). Θ(n + k log n) runtime.
Algorithm: HeapSelect(A, k)
  Min-Heapify(A);
  repeat k times
    A.DeleteMin();
  end
  Return A.Root();
Runtime Analysis:
Worst-Case: Θ(n + k log n); Min-Heapify takes Θ(n) and each of the k DeleteMin calls takes Θ(log n).
Average-Case: Θ(n + k log n)
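The same idea in Python, using the standard-library `heapq` module as the min-heap (function name mine):

```python
import heapq

def heap_select(a, k):
    """Return the element of rank k (0-indexed): Theta(n + k log n)."""
    heap = list(a)            # work on a copy so the input survives
    heapq.heapify(heap)       # Min-Heapify in Theta(n)
    for _ in range(k):        # k DeleteMin operations, Theta(log n) each
        heapq.heappop(heap)
    return heap[0]            # the root now holds the rank-k element
```
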
QuickSelect: Algorithm based on partitioning and pivoting, aiming to solve the selection problem
efficiently.
Algorithm: ChoosePivotFirst(A)
  Return A[0];
Algorithm: Partition(A, p)
  i ← 1, j ← n − 1;
  while True do
    while i < n and A[i] < p do i++;
    while A[j] > p do j − −;
    if i ≥ j then Break;
    else
      Swap(A[i], A[j]);
      i + +, j − −;
    end
  end
  Swap(A[0], A[j]);
  Return j;
Algorithm: QuickSelect(A, k)
  p ← ChoosePivotFirst(A);
  pvind ← Partition(A, p);
  if k == pvind then
    Return A[pvind];
  end
  else if k < pvind then
    Return QuickSelect(A[0 .. pvind − 1], k);
  end
  else
    Return QuickSelect(A[pvind + 1 .. n − 1], k − pvind − 1);
  end
QuickSelect Analysis
Average-case runtime: Analysis based on random pivot selection, resulting in Θ(n) expected runtime.
Best-Case runtime: Θ(n) if the element at rank k is the pivot after the first call to Partition, which runs in
Θ(n) time.
Randomized QuickSelect: Replaces ChoosePivotFirst with ChoosePivotRandom, which picks a uniformly random pivot. This yields Θ(n) expected runtime on every input, whereas always choosing the first element degrades to Θ(n²) on sorted arrays.
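A possible Python sketch of randomized QuickSelect. For brevity it partitions into new lists rather than swapping in place, so it trades the Θ(1) extra space of the in-place version for clarity:

```python
import random

def quick_select(a, k):
    """Return the element of rank k (0-indexed) in expected Theta(n) time."""
    pv = random.choice(a)                     # ChoosePivotRandom
    less = [x for x in a if x < pv]
    equal = [x for x in a if x == pv]
    if k < len(less):                         # rank k lies left of the pivot
        return quick_select(less, k)
    if k < len(less) + len(equal):            # rank k is the pivot itself
        return pv
    greater = [x for x in a if x > pv]        # rank k lies right of the pivot
    return quick_select(greater, k - len(less) - len(equal))
```
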
2. Sorting Algorithms:
Formal definition: Given an input array A = [a0, a1, ..., an−1], return an output array B = [b0, b1, ..., bn−1] that is a permutation of A with b0 ≤ b1 ≤ ... ≤ bn−1.
Applicability: Sorting is useful for speeding up searching (e.g., binary search) and selection.
SelectionSort:
Idea: Find the smallest element, move it to the first index, repeat.
Algorithm: SelectionSort(A)
  for i ← 0 to A.Size − 1 do
    mn ← i;
    for j ← i + 1 to A.Size − 1 do
      if A[j] < A[mn] then
        mn ← j;
      end
    end
    Swap A[i] with A[mn];
  end
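A direct Python rendering of SelectionSort (in place, Θ(n²) comparisons in every case):

```python
def selection_sort(a):
    """Sort a in place by repeatedly moving the smallest remaining element forward."""
    n = len(a)
    for i in range(n):
        mn = i
        for j in range(i + 1, n):
            if a[j] < a[mn]:          # track the index of the smallest remainder
                mn = j
        a[i], a[mn] = a[mn], a[i]     # move it to position i
    return a
```
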
BubbleSort:
Idea: Repeatedly swap adjacent out-of-order pairs; each pass bubbles the largest remaining element to the end.
Algorithm: BubbleSort(A)
  for i ← 1 to A.Size − 1 do
    for j ← 1 to A.Size − i do
      if A[j] < A[j − 1] then
        Swap(A[j], A[j − 1]);
      end
    end
  end
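The same loops in Python; after pass i, the i largest elements are in their final positions:

```python
def bubble_sort(a):
    """In-place bubble sort: Theta(n^2) comparisons in the worst case."""
    n = len(a)
    for i in range(1, n):
        for j in range(1, n - i + 1):         # j runs 1 .. n - i, as in the pseudocode
            if a[j] < a[j - 1]:
                a[j - 1], a[j] = a[j], a[j - 1]
    return a
```
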
3. InsertionSort:
Idea: Maintain a sorted prefix, inserting each new element at its appropriate position.
Analysis: Θ(n²) worst-case, Θ(n) best-case (sorted array), in-place (Θ(1) space).
Algorithm: InsertionSort(A)
  for i ← 1 to A.Size − 1 do
    val ← A[i];
    j ← i;
    while j > 0 and A[j − 1] > val do
      A[j] ← A[j − 1];
      j − −;
    end
    A[j] ← val;
  end
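In Python, with the shift loop made explicit:

```python
def insertion_sort(a):
    """In-place insertion sort: Theta(n^2) worst case, Theta(n) on sorted input."""
    for i in range(1, len(a)):
        val = a[i]
        j = i
        while j > 0 and a[j - 1] > val:
            a[j] = a[j - 1]       # shift larger elements one slot to the right
            j -= 1
        a[j] = val                # drop val into the gap
    return a
```
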
4. QuickSort:
Idea: Partition around a pivot, then recursively sort both sides.
Algorithm: QuickSort(A)
  if A.Size ≤ 1 then
    Return;
  end
  p ← ChoosePivotRandom(A);
  pvind ← Partition(A, p);
  QuickSort(A[0 .. pvind − 1]);
  QuickSort(A[pvind + 1 .. A.Size − 1]);
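A possible in-place Python sketch with a random pivot. Note it uses a Lomuto-style partition (single forward scan) rather than the two-pointer Partition shown earlier, which is simpler to get right in a few lines:

```python
import random

def quick_sort(a, l=0, r=None):
    """In-place QuickSort with a random pivot: expected Theta(n log n)."""
    if r is None:
        r = len(a) - 1
    if l >= r:
        return a
    p = random.randint(l, r)              # ChoosePivotRandom
    a[l], a[p] = a[p], a[l]               # move the pivot to the front
    pivot, j = a[l], l
    for i in range(l + 1, r + 1):         # Lomuto partition over a[l+1 .. r]
        if a[i] < pivot:
            j += 1
            a[i], a[j] = a[j], a[i]
    a[l], a[j] = a[j], a[l]               # pivot lands at its final index j
    quick_sort(a, l, j - 1)
    quick_sort(a, j + 1, r)
    return a
```
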
5. MergeSort:
Idea: Recursively sort each half, then merge the two sorted halves.
Algorithm: Merge(A, B)
  Initialize C as an empty output array;
  i ← 0;
  j ← 0;
  while i < A.Size or j < B.Size do
    if j ≥ B.Size, or i < A.Size and A[i] ≤ B[j] then
      C.Append(A[i]);
      i ← i + 1;
    end
    else
      C.Append(B[j]);
      j ← j + 1;
    end
  end
  Return C;
Algorithm: MergeSort(A, l, r)
  if l ≥ r then
    Return;
  end
  m ← ⌊(l + r)/2⌋;
  MergeSort(A, l, m);
  MergeSort(A, m + 1, r);
  A[l .. r] ← Merge(A[l .. m], A[m + 1 .. r]);
Runtime: Θ(n log n) in all cases; Space: Θ(n) auxiliary; always stable (ties favor the left half).
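A compact Python sketch, copy-based for clarity (the `<=` in the merge keeps equal elements in their original order, which is what makes MergeSort stable):

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list; stable (ties favor a)."""
    c, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            c.append(a[i]); i += 1
        else:
            c.append(b[j]); j += 1
    c.extend(a[i:])               # at most one of these has anything left
    c.extend(b[j:])
    return c

def merge_sort(a):
    """Top-down MergeSort: Theta(n log n) time, Theta(n) extra space."""
    if len(a) <= 1:
        return a
    m = len(a) // 2
    return merge(merge_sort(a[:m]), merge_sort(a[m:]))
```
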
Conclusion:
MergeSort for stability, HeapSort for space efficiency, QuickSort for practical speed.
Non-Comparison-Based Sorting:
2. BucketSort:
Algorithm: BucketSort(A, R)
  B ← Array of R empty buckets;
  for i ← 0 to A.Size − 1 do
    Append A[i] to bucket B[Key(A[i])];
  end
  idx ← 0;
  for i ← 0 to R − 1 do
    for each element x of B[i], in insertion order do
      A[idx] ← x;
      idx + +;
    end
  end
  Return A;
Complexity:
Runtime: Θ(n + R).
Space: Θ(n + R).
Stability:
Always stable, provided each bucket is drained in first-in, first-out order.
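In Python, using lists as buckets (the FIFO append/extend order is what preserves stability):

```python
def bucket_sort(a, R, key=lambda x: x):
    """BucketSort for keys in 0..R-1: Theta(n + R) time and space, stable."""
    buckets = [[] for _ in range(R)]
    for x in a:
        buckets[key(x)].append(x)     # stable: insertion order kept per bucket
    out = []
    for b in buckets:                 # drain buckets in increasing key order
        out.extend(b)
    return out
```
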
3. CountSort:
Algorithm: StableCountSort(A, R)
  C ← Array of size R, initialized to 0;
  for i ← 0 to A.Size − 1 do
    C[Key(A[i])]++;
  end
  P ← Array of size R;
  P[0] ← 0;
  for i ← 1 to R − 1 do
    P[i] ← P[i − 1] + C[i − 1];
  end
  B ← Array of size A.Size;
  for i ← 0 to A.Size − 1 do
    k ← Key(A[i]);
    idx ← P[k];
    B[idx] ← A[i];
    P[k]++;
  end
  Return B;
Complexity:
Runtime: Θ(n + R).
Space: Θ(n + R).
Stability:
Always stable.
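A Python rendering; the left-to-right placement scan is what keeps equal keys in their original order:

```python
def stable_count_sort(a, R, key=lambda x: x):
    """Stable counting sort: Theta(n + R) time and space."""
    C = [0] * R
    for x in a:
        C[key(x)] += 1                   # count occurrences of each key
    P = [0] * R
    for i in range(1, R):
        P[i] = P[i - 1] + C[i - 1]       # P[k] = first output slot for key k
    B = [None] * len(a)
    for x in a:                          # scan left to right: stability
        k = key(x)
        B[P[k]] = x
        P[k] += 1
    return B
```
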
4. Space-Saving CountSort:
Swaps elements into place within the input array instead of using a separate output array.
Algorithm: SpaceSavingCountSort(A, R)
  C ← Array of size R, initialized to 0;
  for i ← 0 to A.Size − 1 do
    C[Key(A[i])]++;
  end
  P ← Array of size R;
  P[0] ← 0;
  for i ← 1 to R − 1 do
    P[i] ← P[i − 1] + C[i − 1];
  end
  S ← Copy of P;  // S[k] marks where key k's region starts
  i ← 0;
  while i < A.Size do
    k ← Key(A[i]);
    if i == P[k] then
      i++, P[k]++;  // A[i] sits exactly in the next free slot for k
    end
    else if S[k] ≤ i < P[k] then
      i++;  // A[i] was already swapped into its region earlier
    end
    else
      idx ← P[k];
      Swap(A[i], A[idx]), P[k]++;  // send A[i] home; a new element arrives at i
    end
  end
  Return A;
Complexity:
Runtime: Θ(n + R).
Space: Θ(R).
Stability:
Not stable: swapping can reorder equal keys.
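The same swap loop in Python. The extra array `S` of region starts (my addition to the placement test) is needed to recognize elements that earlier swaps already put into their final region:

```python
def space_saving_count_sort(a, R, key=lambda x: x):
    """In-place counting sort: Theta(n + R) time, Theta(R) extra space, not stable."""
    C = [0] * R
    for x in a:
        C[key(x)] += 1
    P = [0] * R
    for i in range(1, R):
        P[i] = P[i - 1] + C[i - 1]    # P[k] = next write slot for key k
    S = P[:]                          # region starts, for the "already placed" test
    i = 0
    while i < len(a):
        k = key(a[i])
        if i == P[k]:                 # a[i] sits exactly in its next slot
            i += 1
            P[k] += 1
        elif S[k] <= i < P[k]:        # a[i] was swapped into place earlier
            i += 1
        else:                         # send a[i] home; examine the newcomer at i
            a[i], a[P[k]] = a[P[k]], a[i]
            P[k] += 1
    return a
```
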
Asymptotic Analysis:
6. LSD-RadixSort:
Sorts keys digit by digit, from the least significant to the most significant, using a stable sort for each pass.
Algorithm: LSDRadixSort(A, R, d)
  for j ← d downto 1 do
    Sort A with a stable sort, defining Key(x) = j-th digit of the original key of x;
  end
  Return A;
Complexity:
Runtime: Θ(d(n + R)) when StableCountSort is used for each pass.
Space: Θ(n + R).
Stability:
Always stable.
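A Python sketch for non-negative integers with d base-R digits; each pass is a bucket-based stable sort on one digit, extracted arithmetically:

```python
def lsd_radix_sort(a, R, d):
    """LSD radix sort: d stable passes, least significant digit first."""
    for j in range(d):                            # j = 0 is the least significant digit
        buckets = [[] for _ in range(R)]
        for x in a:
            buckets[(x // R ** j) % R].append(x)  # stable: FIFO within each bucket
        a = [x for b in buckets for x in b]
    return a
```
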
7. MSD-RadixSort:
Sorts keys digit by digit, from the most significant to the least significant, recursing on each group of equal digits.
Algorithm: MSDRadixSort(A, l, r, j)
  if l ≥ r or j > d then
    Return A;
  end
  Sort A[l .. r], defining Key(x) = j-th digit of the original key of x;
  for each maximal run of equal j-th digits, call MSDRadixSort on that run with digit j + 1;
  Return A;
Complexity:
Runtime: Θ(d · n · R) worst-case for the space-saving variant.
Space: Θ(d + n + R).
Stability:
Not stable when the space-saving variant is used for each pass.
Advantages:
9. Conclusion:
Algorithm Overview:
Provides a summary of algorithms, their runtimes, space complexities, stability, and conditions of
use.
| Algorithm | Runtime | Space | Stable | Conditions of use |
| --- | --- | --- | --- | --- |
| BucketSort / CountSort | Θ(n + R) | Θ(n + R) | Yes | Elements from a known set of size R |
| LSD-RadixSort | Θ(d(n + R)) | Θ(n + R) | Yes | Radix-R, d digits, e.g., 0 to R − 1 |
| MSD-RadixSort (Space-Saving) | Θ(dnR) | Θ(d + R) | No | Radix-R, d digits, e.g., 0 to R − 1 |