
unit1 daa

The document provides an overview of algorithms, their characteristics, and complexities, including time and space complexities. It discusses various sorting algorithms like Quick Sort, Merge Sort, and Heap Sort, detailing their workings, advantages, and disadvantages. Additionally, it covers performance measurement, trade-offs between time and space, and specific techniques like the Master Theorem and Counting Sort.

Uploaded by Sunil Kumar
Algorithms
- A step-by-step procedure for solving a problem or performing a task.
- Characteristics:
  - Finite: an algorithm must terminate after a limited number of steps.
  - Definite: each step must be clearly defined.
  - Input/Output: algorithms have zero or more inputs and one or more outputs.
  - Effective: every step in an algorithm should be basic enough to be performed easily.

Scenario - Recipe for Cooking Pasta:
- Step 1: Boil water (input: water and heat source).
- Step 2: Add pasta and cook for a specific time (input: pasta, timing).
- Step 3: Drain the water (output: cooked pasta).
Just like a recipe, an algorithm defines the steps to achieve the desired outcome.

Analyzing Algorithms
- Time complexity: how the running time of an algorithm increases with the input size.
- Space complexity: how much extra memory the algorithm needs.

Complexity of Algorithms
- Big O notation (O): represents the upper bound of an algorithm's running time (worst-case scenario).
  - Examples: constant time O(1), linear time O(n), quadratic time O(n^2).
- Omega (Ω): best-case scenario.
- Theta (Θ): average-case scenario.

Growth of Functions
- The growth of a function describes how the runtime of an algorithm increases with the size of its input.
- It helps us understand and compare the efficiency of algorithms.
- Common growth rates:
  - Constant time O(1): the runtime does not increase with the input size. Example: accessing an element of an array by its index.
  - Logarithmic time O(log n): the runtime grows slowly as the input size increases. Example: binary search on a sorted array.
  - Linear time O(n): the runtime grows proportionally with the input size. Example: scanning through an array of n elements.
  - Linearithmic time O(n log n): the runtime grows faster than linear but slower than quadratic. Example: efficient sorting algorithms like Merge Sort.
  - Quadratic time O(n^2): the runtime grows with the square of the input size.
    Example: comparing all pairs of elements in a list (nested loops).
  - Exponential time O(2^n): the runtime doubles with each increase in input size. Example: recursive algorithms like solving the Tower of Hanoi.

Scenario - Finding a Name in a Phonebook:
- If you search the phonebook name by name (linear search), it will take O(n) time to go through each name.
- However, using a binary search (divide and conquer method), you can find the name in O(log n) time, which is much faster as the input size grows.

Asymptotic Notations
- Asymptotic notations describe the limiting behavior of an algorithm's runtime as the input size grows.
- They help us classify algorithms based on their performance in the best, worst, and average cases.
- The three most commonly used asymptotic notations are:

1. Big O notation (O) - worst case
- Describes the upper bound of the runtime, giving the worst-case scenario.
- Big O gives the highest curve that the algorithm will touch as the input size grows.
- Purpose: it guarantees that the algorithm will never take more time than the specified amount.
- Example: in Bubble Sort, the time complexity is O(n^2) because in the worst case it compares every element with every other element.

2. Omega notation (Ω) - best case
- Describes the lower bound of the runtime, giving the best-case scenario.
- Omega gives the lowest curve, representing the best case.
- Purpose: it shows the minimum amount of time an algorithm will take.
- Example: for Linear Search, the best-case time complexity is Ω(1), when the target element is the first in the list.

3. Theta notation (Θ) - average case
- Describes both the upper and lower bounds, giving the average-case scenario.
- Theta shows the middle curve, representing how the algorithm behaves on average.
- Purpose: it provides a more realistic idea of the algorithm's performance by covering both best- and worst-case behavior.
- Example: in Merge Sort, the time complexity is Θ(n log n) in all cases, meaning it performs consistently.
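The phonebook contrast described earlier (O(n) linear search versus O(log n) binary search) can be sketched in Python; the `names` list is an invented example:

```python
def linear_search(names, target):
    """O(n): check each entry in turn until the target is found."""
    for i, name in enumerate(names):
        if name == target:
            return i
    return -1

def binary_search(names, target):
    """O(log n): repeatedly halve the sorted search range."""
    lo, hi = 0, len(names) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if names[mid] == target:
            return mid
        elif names[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

names = ["Asha", "Bilal", "Chen", "Divya", "Elena", "Farid"]  # already sorted
print(linear_search(names, "Divya"))  # 3
print(binary_search(names, "Divya"))  # 3
```

Both return the same index; the difference is how many entries they inspect as the phonebook grows.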
Scenario - Sorting a Deck of Cards:
- If the deck is already sorted, you only need to check it once (best case, Ω(n)).
- If the deck is completely unsorted, you'll have to go through the entire process (worst case, O(n^2)).
- Most of the time it's somewhat shuffled, and you'll need a middle amount of effort (average case, Θ(n log n)).

Performance Measurement
- Performance measurement involves evaluating how efficiently an algorithm performs in terms of both time and space.
- It helps in comparing different algorithms and selecting the best one for a specific task.

Trade-offs Between Time and Space
- Often there is a trade-off between time and space.
- For example, an algorithm might take less time but use more memory, or vice versa.
- Selecting the best algorithm often depends on whether time or space is the more critical factor.

Scenario - File Compression:
- Time complexity: faster algorithms can compress files more quickly, which is crucial when dealing with large files.
- Space complexity: some algorithms save more space by compressing data more efficiently, at the cost of taking more time.

[Worked example from the source: showing that a given f(n) is O(2^n), using the sum of a geometric (GP) series.]

Substitution Method
- The substitution method is a technique used to solve recurrence relations, often found in the analysis of recursive algorithms.
- Using an iterative approach within the substitution method involves repeatedly substituting the recurrence relation to find a pattern and derive a closed-form solution.

Master Theorem
- The Master Theorem is a quick way to find the time complexity of divide-and-conquer algorithms that split problems into equal parts and merge the solutions, saving you from complicated math.

T(n) = aT(n/b) + f(n)

where:
- T(n) is the time complexity function,
- a >= 1 is the number of subproblems,
- n is the problem size,
- b > 1 is the division factor for the problem size,
- f(n) is the cost of dividing the problem and merging the solutions.
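As a numerical sanity check on this recurrence form, the merge-sort-style instance T(n) = 2T(n/2) + n (a = 2, b = 2, f(n) = n, which the Master Theorem resolves to Θ(n log n)) can be unrolled directly; the base value T(1) = 1 is an assumption for the sketch:

```python
from math import log2

def T(n):
    """Unroll T(n) = 2*T(n/2) + n with T(1) = 1 (n a power of two)."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# The ratio T(n) / (n log2 n) should settle toward a constant,
# consistent with T(n) = Theta(n log n).
for n in [2**6, 2**10, 2**14]:
    print(n, round(T(n) / (n * log2(n)), 3))
```

For powers of two this unrolls to T(n) = n(log2 n + 1), so the printed ratio approaches 1 as n grows.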
The theorem compares f(n) with the watershed function n^(log_b a):
1. If f(n) = O(n^(log_b a - ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n).
3. If f(n) = Ω(n^(log_b a + ε)) for some ε > 0 (and the regularity condition holds), then T(n) = Θ(f(n)).

Recursion Tree
- It is a visual representation of the recursive calls made by a recursive algorithm.
- It helps in analyzing the time complexity of recursive algorithms by illustrating how the problem is divided into subproblems and how many subproblems are created at each step.

Scenario: A recursion tree is like a family tree of procrastinators: everyone passes the problem to their "kids" until the simplest version is left for the youngest to solve. The "kids" do all the hard work while the parents relax at the top and then take credit once the solution comes back!

Shell Sort
- Shell Sort is an in-place comparison-based algorithm.
- Introduced by Donald Shell in 1959 as an improvement over Insertion Sort.

Working of Shell Sort
- Start with a large gap and reduce it progressively.
- For each gap, perform a gapped insertion sort on elements that are the "gap" distance apart.
- After each pass, reduce the gap and repeat.
- When gap = 1, perform a standard insertion sort to complete the sorting.

Example 1: 35, 33, 42, 10, 14, 18, 27, 40, 26, 31
[Pass-by-pass trace of the gapped insertion sorts for gaps 5, 2, and 1.]
Example 2: 12, 4, 3, 8, 18, 7, 2, 17, 13, 1, 55

Time Complexity
- Best case: O(n log n)
- Average case: depends on the gap sequence (around O(n^1.5) for Shell's original gaps)
- Worst case: O(n^2)

Advantages
- In-place: uses no extra memory.
- Efficient for medium-sized arrays.
- Easy to implement.

Disadvantages
- Not stable: may not preserve the order of equal elements.
- Performance depends on the gap sequence.

Algorithm of Shell Sort
ShellSort(arr, n):
  gap = n // 2
  while gap > 0:
    for i = gap to n - 1:
      temp = arr[i]
      j = i
      while j >= gap and arr[j - gap] > temp:
        arr[j] = arr[j - gap]
        j = j - gap
      arr[j] = temp
    gap = gap // 2
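The pseudocode above can be sketched in runnable Python, using the same gap sequence n//2, n//4, ..., 1 and the data of Example 1 (a minimal sketch, not a tuned implementation):

```python
def shell_sort(arr):
    """Gapped insertion sorts with the gap sequence n//2, n//4, ..., 1."""
    n = len(arr)
    gap = n // 2
    while gap > 0:
        for i in range(gap, n):
            temp = arr[i]
            j = i
            # Shift earlier gap-spaced elements right until temp's slot is found.
            while j >= gap and arr[j - gap] > temp:
                arr[j] = arr[j - gap]
                j -= gap
            arr[j] = temp
        gap //= 2
    return arr

print(shell_sort([35, 33, 42, 10, 14, 18, 27, 40, 26, 31]))
# [10, 14, 18, 26, 27, 31, 33, 35, 40, 42]
```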
Quick Sort
- Quick Sort is an efficient, in-place, comparison-based sorting algorithm.
- It uses a divide and conquer approach.
- A pivot element is selected from the array.
- The array is partitioned into two parts:
  - elements smaller than the pivot,
  - elements larger than the pivot.
- The same process is recursively applied to the sub-arrays.
- The recursion continues until the entire array is sorted.

Working of Quick Sort
- Choose pivot: select a pivot element from the array (commonly the first, last, middle, or a random element).
- Partitioning:
  - Rearrange the elements so that all elements less than the pivot are on the left and all elements greater than the pivot are on the right.
  - The pivot is now in its correct sorted position.
- Recursively apply Quick Sort: apply the same process to the left and right sub-arrays (excluding the pivot).
- Base case: the recursion stops when sub-arrays have fewer than two elements, meaning they are already sorted.

Characteristics
- In-place: requires only a small amount of extra memory.
- Not stable: may not preserve the relative order of equal elements.
- Efficient: Quick Sort is widely used due to its good average performance.

Time Complexity
- Best case: O(n log n)
- Average case: O(n log n)
- Worst case: O(n^2), which occurs when the pivot selection leads to unbalanced partitions (e.g., always picking the smallest or largest element as pivot).

Advantages
- Fast average case: O(n log n) time complexity.
- In-place: requires minimal additional memory.

Disadvantages
- Worst-case performance: can degrade to O(n^2).
- Unstable: doesn't maintain the order of equal elements.

Scenario: "Quick Sort: named for its fast average performance of O(n log n), despite a worst case of O(n^2). It's often faster than Merge Sort in practice." In Python, the sorted() function uses Timsort, which combines Merge Sort and Insertion Sort techniques for efficiency.

QuickSort(A, start, end):
1. if start < end then
2.   pivot = PARTITION(A, start, end)
3.   QUICKSORT(A, start, pivot - 1)   // sort left part
4.   QUICKSORT(A, pivot + 1, end)     // sort right part
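The recursive scheme can be sketched in Python. This minimal version uses a Lomuto-style partition with the last element as pivot, one common textbook choice (the notes allow several pivot strategies):

```python
def partition(a, start, end):
    """Lomuto partition: last element as pivot; return its final index."""
    pivot = a[end]
    i = start - 1
    for j in range(start, end):
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[end] = a[end], a[i + 1]  # place pivot in its sorted position
    return i + 1

def quick_sort(a, start=0, end=None):
    """Recursively sort the sub-arrays on either side of the pivot."""
    if end is None:
        end = len(a) - 1
    if start < end:
        p = partition(a, start, end)
        quick_sort(a, start, p - 1)  # sort left part
        quick_sort(a, p + 1, end)    # sort right part
    return a

print(quick_sort([9, 3, 7, 1, 8, 2]))  # [1, 2, 3, 7, 8, 9]
```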
PARTITION(A, start, end):
1. pivot = A[end]
2. i = start - 1
3. for j = start to end - 1:
4.   if A[j] <= pivot then
5.     i = i + 1
6.     swap A[i] with A[j]
7. swap A[i + 1] with A[end]   // place pivot in the right position
8. return i + 1                // return the pivot index

Merge Sort
- Merge Sort is a sorting algorithm that uses the idea of divide and conquer.
- The algorithm divides the array into two subarrays, sorts them separately, and then merges them.

Complexity of Merge Sort
- Best, average, worst TC: O(n log n)
- SC: O(n)

Advantages
- Stable: preserves the order of equal elements.
- Consistent performance: O(n log n) time complexity.

Disadvantages
- Space-intensive: requires O(n) additional memory.
- Not in-place: needs extra storage for merging.

MergeSort(arr, l, r):
1. if l < r then
2.   mid = (l + r) / 2
3.   MERGESORT(arr, l, mid)        // sort the left half
4.   MERGESORT(arr, mid + 1, r)    // sort the right half
5.   MERGE(arr, l, mid, r)         // merge the sorted halves

Merge(arr, l, m, r):
  n1 = m - l + 1                   // size of left sub-array
  n2 = r - m                       // size of right sub-array
  initialize Left[], Right[]       // create temporary arrays
  for i = 0 to n1 - 1: Left[i] = arr[l + i]        // copy the left data
  for j = 0 to n2 - 1: Right[j] = arr[m + 1 + j]   // copy the right data
  set i = 0, j = 0, k = l          // initialize indices
  while i < n1 and j < n2:
    if Left[i] <= Right[j] then arr[k] = Left[i]; i = i + 1
    else arr[k] = Right[j]; j = j + 1
    k = k + 1
  while i < n1: arr[k] = Left[i]; i = i + 1; k = k + 1    // copy remaining elements from Left
  while j < n2: arr[k] = Right[j]; j = j + 1; k = k + 1   // copy remaining elements from Right

Heapify(A, i):
1. l = LEFT(i), r = RIGHT(i)
2. if l <= heap-size and A[l] > A[i] then largest = l else largest = i
3. if r <= heap-size and A[r] > A[largest] then largest = r
4. if largest != i then
5.   swap A[i] with A[largest]
6.   HEAPIFY(A, largest)

HeapSort(A):
1. BUILD-HEAP(A)
2. for i = length(A) down to 2:
3.   swap A[1] with A[i]
4.   reduce the heap size by 1 and HEAPIFY(A, 1)

Counting Sort
- Counting Sort is an integer sorting algorithm that operates in O(n + k) time complexity,
- where n is the number of elements in the array and k is the range of input values.

Working
- Find range: get the min and max values.
- Create count array: initialize a count array for the range.
- Store counts: count occurrences of each element.
- Cumulative count: update the counts to running totals, giving final positions.
- Sort: place elements in sorted order using the count array.
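The steps above can be sketched in Python. This minimal version assumes non-negative integers (the min value is taken as 0 rather than computed, a simplification of the "find range" step):

```python
def counting_sort(a):
    """Stable counting sort for non-negative integers with small range k."""
    if not a:
        return []
    k = max(a)
    count = [0] * (k + 1)
    for x in a:                  # store counts
        count[x] += 1
    for v in range(1, k + 1):    # cumulative counts -> final positions
        count[v] += count[v - 1]
    out = [0] * len(a)
    for x in reversed(a):        # backwards pass keeps the sort stable
        count[x] -= 1
        out[count[x]] = x
    return out

print(counting_sort([4, 2, 2, 8, 3, 3, 1]))  # [1, 2, 2, 3, 3, 4, 8]
```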
Time and Space Complexity: O(n + k)

CountingSort(A, n, k):
1. Create count array C[0..k] of zeros
2. for i = 1 to n: C[A[i]] = C[A[i]] + 1        // store counts
3. for j = 1 to k: C[j] = C[j] + C[j - 1]       // cumulative counts
4. Create output array B of size n
5. for i = n down to 1:
     B[C[A[i]]] = A[i]
     C[A[i]] = C[A[i]] - 1
6. Copy B back to A

Scenario: Counting Sort is like sorting the clothes in your wardrobe: first count how many clothes of each kind you have, then give each kind its own spot, and place every piece straight into its place, so everything ends up in order.

Radix Sort
- Radix Sort is a non-comparative, integer-based sorting algorithm that sorts numbers by processing individual digits.
- It works by sorting the numbers digit by digit, starting from the least significant digit (LSD) to the most significant digit (MSD), or vice versa.
- Radix Sort is often used for sorting integers or strings with fixed lengths, such as dates or alphanumeric codes.

Time Complexity: O(d(n + k)), where d is the number of digits.
Space Complexity: O(n + k)

Algorithm RadixSort(A, n):
1. Find the maximum number in A to determine the number of digits (d).
2. For each digit, from least significant to most significant:
     apply a stable sorting algorithm (e.g., Counting Sort) on that digit.

Bucket Sort
- Bucket Sort works by dividing the input elements into several groups called buckets.
- Each bucket contains elements within a specific range.
- Once the elements are distributed into these buckets, they are individually sorted, usually with another sorting algorithm such as Insertion Sort, Quick Sort, or even recursively Bucket Sort.
- Finally, all the sorted buckets are combined to form the sorted output.

Time Complexity
- Best and average case: O(n + k)
- Worst case: O(n^2)
Space Complexity: O(n + k)

BucketSort(A, n):
1. Create n empty buckets.
2. Scatter: place each element into its bucket according to its range.
3. Sort each bucket individually.
4. Gather: concatenate the sorted buckets in order.
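The scatter/sort/gather steps can be sketched in Python for inputs in [0, 1), a common textbook assumption; each bucket is sorted here with Python's built-in sort rather than an explicit Insertion Sort:

```python
def bucket_sort(a, num_buckets=10):
    """Bucket sort for floats in [0, 1): scatter, sort each bucket, gather."""
    buckets = [[] for _ in range(num_buckets)]
    for x in a:
        buckets[int(x * num_buckets)].append(x)  # scatter by value range
    result = []
    for b in buckets:
        result.extend(sorted(b))                 # sort each bucket, then gather
    return result

print(bucket_sort([0.78, 0.17, 0.39, 0.26, 0.72, 0.94, 0.21, 0.12, 0.23, 0.68]))
# [0.12, 0.17, 0.21, 0.23, 0.26, 0.39, 0.68, 0.72, 0.78, 0.94]
```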
