Sorting has always been daunting for me to comprehend.
I am taking the Coursera course Algorithms, and I am starting to build an intuition for why merge sort is a better algorithm for sorting. Why?
Answer: divide and conquer.
Divide and conquer is a very powerful technique, used in sorting, the FFT, and many other applications. I am now eager to learn more about its applications.
1. Learning about Merge Sort
In merge sort, the input array is split exactly in half, each half is sorted recursively, and the two sorted halves are then merged back together.
According to the master method, which applies to recurrences of the form T(n) <= a*T(n/b) + O(n^d):
a = 2 (we make two recursive calls, one for the left half and one for the right half)
b = 2 (each recursive call gets half of the input length)
d = 1 (the merge step does O(n) work at each level; the loop over the temporary array for two halves of length k runs about 2k times)
Here a = b^d (2 = 2^1). Therefore, O(merge sort) = O(n^d log n) = O(n log n).
Note that the master method is useful for calculating the order of growth of recursive algorithms when a, b, and d are clearly defined.
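To make a, b, and d concrete, here is a minimal merge sort sketch in Python (my own illustration, not code from the course; the names merge_sort and merge are mine). The two recursive calls give a = 2, each call gets half the array (b = 2), and the merge loop does linear work (d = 1).

def merge_sort(arr):
    # Base case: a list of length 0 or 1 is already sorted.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # first recursive call, on half the input
    right = merge_sort(arr[mid:])   # second recursive call (so a = 2, b = 2)
    return merge(left, right)       # O(n) work per level (so d = 1)

def merge(left, right):
    merged = []
    i = j = 0
    # Repeatedly take the smaller front element of the two sorted halves.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One half may still have leftover (already sorted) elements.
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged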
2. Quick Sort
In quicksort, the split depends on the choice of pivot, so a, b, and d vary from call to call and the master method does not apply directly. Instead, probability theory is used to derive the average-case running time of quicksort, O(n log n).
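To see why the subproblem sizes vary, here is a minimal quicksort sketch in Python with a uniformly random pivot (my own illustration; it builds new lists for clarity rather than partitioning in place, which may differ from the course's version):

import random

def quick_sort(arr):
    # Base case: nothing to sort.
    if len(arr) <= 1:
        return arr
    pivot = random.choice(arr)  # random pivot: the split is rarely an exact half
    less    = [x for x in arr if x < pivot]
    equal   = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    # The sizes of 'less' and 'greater' depend on the pivot, so the recursion tree is uneven.
    return quick_sort(less) + equal + quick_sort(greater)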
3. Another example of the master method
Computing the power of a number (a^b)
The naive way is to multiply a by itself b times, which takes O(b) multiplications. A better way uses the approach below, where [b/2] denotes b/2 rounded down.
FastPower(a, b):
    if b = 1:
        return a
    otherwise:
        c := a * a
        ans := FastPower(c, [b/2])
        if b is odd:
            return a * ans
        otherwise:
            return ans
end
a = ?, b = ?, d = ?
Note that in this case,
a = 1 (one recursive call at each level)
b = 2 (the problem, the exponent, is cut to half its length at each level)
d = 0 (the work outside the recursive call is constant, independent of the problem length)
So a = 1 and b^d = 2^0 = 1.
Since a = b^d, O(algorithm) = O(n^d log n) = O(log n). This tallies up: there are about log n levels of recursion and O(1) work is done at each level.
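For completeness, here is a direct Python translation of the FastPower pseudocode above (my own sketch, assuming the exponent b is a positive integer, since the pseudocode's base case is b = 1), with a quick check against Python's built-in pow:

def fast_power(a, b):
    # Assumes b >= 1, matching the pseudocode's base case.
    if b == 1:
        return a
    c = a * a                    # square the base
    ans = fast_power(c, b // 2)  # one recursive call on an exponent half as large
    if b % 2 == 1:
        return a * ans           # odd exponent: one extra factor of a
    return ans

# Quick sanity check against the built-in pow.
assert all(fast_power(3, b) == pow(3, b) for b in range(1, 20))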