Question 1

What is the recurrence for the worst case of QuickSort, and what is its worst-case time complexity?

- Recurrence is T(n) = T(n-2) + O(n) and time complexity is O(n^2)
- Recurrence is T(n) = T(n-1) + O(n) and time complexity is O(n^2)
- Recurrence is T(n) = 2T(n/2) + O(n) and time complexity is O(nLogn)
- Recurrence is T(n) = T(n/10) + T(9n/10) + O(n) and time complexity is O(nLogn)

**Analysis of Algorithms**

**Sorting**

**QuickSort**


Question 1 Explanation:

The worst case of QuickSort occurs when the picked pivot is always an extreme (smallest or largest) element of its subarray, for example when the array is already sorted and the first or last element is chosen as the pivot. In that case QuickSort recursively solves one subproblem of size 0 and another of size (n-1), so the recurrence is
T(n) = T(n-1) + T(0) + O(n)
which simplifies to
T(n) = T(n-1) + O(n)
This solves to O(n^2).
```c
void exchange(int *a, int *b)
{
    int temp;
    temp = *a;
    *a = *b;
    *b = temp;
}

/* Lomuto partition: places the pivot arr[ei] at its final sorted
   position and returns that index. */
int partition(int arr[], int si, int ei)
{
    int x = arr[ei];
    int i = (si - 1);
    int j;
    for (j = si; j <= ei - 1; j++)
    {
        if (arr[j] <= x)
        {
            i++;
            exchange(&arr[i], &arr[j]);
        }
    }
    exchange(&arr[i + 1], &arr[ei]);
    return (i + 1);
}

/* Implementation of Quick Sort
   arr[] --> Array to be sorted
   si    --> Starting index
   ei    --> Ending index
*/
void quickSort(int arr[], int si, int ei)
{
    int pi; /* Partitioning index */
    if (si < ei)
    {
        pi = partition(arr, si, ei);
        quickSort(arr, si, pi - 1);
        quickSort(arr, pi + 1, ei);
    }
}
```

Question 2

Suppose we have an O(n) time algorithm that finds the median of an unsorted array. Now consider a QuickSort implementation that first finds the median using this algorithm and then uses the median as the pivot. What will be the worst-case time complexity of this modified QuickSort?

- O(n^2 Logn)
- O(n^2)
- O(n Logn Logn)
- O(nLogn)

**Analysis of Algorithms**

**Sorting**

**QuickSort**


Question 2 Explanation:

If we use the median as the pivot element, the partition is always perfectly balanced, so the recurrence for all cases becomes
T(n) = 2T(n/2) + O(n)
This recurrence can be solved using the Master Method; it falls in case 2, giving O(nLogn).
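The master-method step can be spelled out (a standard derivation, added here for illustration):

```latex
T(n) = 2\,T(n/2) + \Theta(n), \qquad a = 2,\; b = 2,\; f(n) = \Theta(n)

n^{\log_b a} = n^{\log_2 2} = n = \Theta(f(n))
\;\Rightarrow\; \text{case 2:}\quad T(n) = \Theta(n \log n)
```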

Question 3

Which of the following is not a stable sorting algorithm in its typical implementation?

- Insertion Sort
- Merge Sort
- Quick Sort
- Bubble Sort

**Sorting**

**QuickSort**

**InsertionSort**

**MergeSort**


Question 3 Explanation:

Quick Sort is not stable in its typical implementation: partitioning can swap equal elements past one another, changing their relative order. See the following for details:
https://tutorialspoint.dev/slugresolver/stability-in-sorting-algorithms/

Question 4

Which of the following sorting algorithms, in its typical implementation, gives the best performance when applied to an array that is sorted or almost sorted (at most one or two elements are misplaced)?

- Quick Sort
- Heap Sort
- Merge Sort
- Insertion Sort

**Sorting**

**QuickSort**

**InsertionSort**

**HeapSort**


Question 4 Explanation:

Insertion sort takes linear time when the input array is sorted or almost sorted (at most one or two elements are misplaced).
All the other sorting algorithms mentioned above take more than linear time in their typical implementations.

Question 5

Given an unsorted array with the property that every element is at most k positions away from its position in the sorted array, where k is a positive integer smaller than the size of the array. Which sorting algorithm can be easily modified to sort this array, and what time complexity is obtainable?

- Insertion Sort with time complexity O(kn)
- Heap Sort with time complexity O(nLogk)
- Quick Sort with time complexity O(kLogk)
- Merge Sort with time complexity O(kLogk)

**Analysis of Algorithms**

**Sorting**

**QuickSort**

**HeapSort**


Question 5 Explanation:

A min-heap over a sliding window of k+1 elements sorts such an array in O(nLogk) time. See https://tutorialspoint.dev/slugresolver/nearly-sorted-algorithm/ for the explanation and implementation.

Question 6

Consider a situation where the swap operation is very costly. Which of the following sorting algorithms should be preferred so that the number of swap operations is minimized in general?

- Heap Sort
- Selection Sort
- Insertion Sort
- Merge Sort

**Sorting**

**SelectionSort**

**InsertionSort**

**MergeSort**


Question 6 Explanation:

Selection sort makes O(n) swaps (at most one per outer-loop pass), which is the minimum among the sorting algorithms mentioned above.

Question 7

Which of the following is not true about comparison-based sorting algorithms?

- The minimum possible time complexity of a comparison-based sorting algorithm is O(nLogn) for a random input array
- Any comparison-based sorting algorithm can be made stable by using position as a criterion when two elements are compared
- Counting Sort is not a comparison-based sorting algorithm
- Heap Sort is not a comparison-based sorting algorithm.

**Analysis of Algorithms**

**Sorting**

**HeapSort**

**CountingSort**


Question 7 Explanation:

For A, see https://tutorialspoint.dev/slugresolver/lower-bound-on-comparison-based-sorting-algorithms/. For B, see https://tutorialspoint.dev/slugresolver/stability-in-sorting-algorithms/. C is true: Counting Sort is an integer sorting algorithm, not a comparison-based one. D is false, since Heap Sort orders elements entirely through comparisons, and is therefore the answer.

Question 8

Suppose we are sorting an array of eight integers using QuickSort, and we have just finished the first partitioning, with the array looking like this:

2 5 1 7 9 12 11 10

Which statement is correct?

- The pivot could be either the 7 or the 9.
- The pivot could be the 7, but it is not the 9.
- The pivot is not the 7, but it could be the 9.
- Neither the 7 nor the 9 is the pivot.

**Sorting**

**QuickSort**


Question 8 Explanation:

Both 7 and 9 are at their correct positions (as in the sorted array). Moreover, all elements to the left of 7 are smaller than 7 and all elements to its right are greater, and the same holds for 9. So either of them could have been the pivot.

Question 9

Suppose we are sorting an array of eight integers using heapsort, and we have just finished some heapify (either max-heapify or min-heapify) operations. The array now looks like this:

16 14 15 10 12 27 28

How many heapify operations have been performed on the root of the heap?

- 1
- 2
- 3 or 4
- 5 or 6

**Sorting**

**Heap**

**HeapSort**


Question 9 Explanation:

In heapsort, we first build a heap, then repeat the following operations until the heap size becomes 1:
a) Swap the root with the last element
b) Call heapify on the root
c) Reduce the heap size by 1
In this question, heapify has been called a few times, and the last two elements of the given array are the two maximum elements of the array. So the situation is clear: it is max-heapify that has been called, and it has been called 2 times.

Question 10

What is the best-case time complexity of bubble sort?

- N^2
- NlogN
- N
- N(logN)^2

**Analysis of Algorithms**

**Sorting**

**BubbleSort**


Question 10 Explanation:

Bubble sort is at its best when the input data is already sorted, i.e., in the same order as the expected output. This case can be detected with one boolean variable that records whether any values were swapped in the inner loop; if none were, the array is sorted and the algorithm can stop after a single pass, giving O(N) time.
Consider the following code snippet:
```c
#include <stdio.h>

void swap(int *a, int *b)
{
    int temp = *a;
    *a = *b;
    *b = temp;
}

int main()
{
    int arr[] = {10, 20, 30, 40, 50}, i, j, isSwapped;
    int n = sizeof(arr) / sizeof(*arr);
    isSwapped = 1;
    for (i = 0; i < n - 1 && isSwapped; ++i)
    {
        isSwapped = 0;
        for (j = 0; j < n - i - 1; ++j)
            if (arr[j] > arr[j + 1])
            {
                swap(&arr[j], &arr[j + 1]);
                isSwapped = 1;
            }
    }
    for (i = 0; i < n; ++i)
        printf("%d ", arr[i]);
    return 0;
}
```
Observe that on this already-sorted input no swap occurs in the first pass, so the outer loop runs only once.

There are 58 questions to complete.
