
# QuickSort Algorithm

Quicksort is a divide-and-conquer algorithm that picks a pivot element from the given array and partitions (rearranges) the array so that every element in the subarray before the pivot is less than or equal to the pivot, and every element in the subarray after the pivot is greater than the pivot (though the two subarrays are not necessarily sorted yet).

It is an in-place sorting algorithm (it needs only a small amount of additional memory, for the recursive calls). On average, quicksort makes O(n log n) comparisons to sort n elements, while in the worst case it makes O(n²) comparisons. When implemented properly, quicksort is typically two to three times faster than merge sort and heapsort.

##### Let’s understand the working of QuickSort

The working of the partition function

1. Choose the last element of the current subarray as the pivot element.
2. Take an index, say m (m = start), pointing to the start of the subarray; m marks the boundary of the region of elements known to be smaller than the pivot.
3. Scan the subarray with an index i. If arr[i] is less than the pivot, swap arr[i] with arr[m] and increment m by 1; otherwise do nothing.
4. Repeat step 3 for every element of the current subarray except the last one, i.e., for indices in the range [start, end-1] inclusive.
5. Finally, swap the pivot into position m and return m (m is now the pivot element’s final position).

The working of the quicksort function

1. If the start index is less than the end index, pass the current subarray to the partition function and get back the pivot’s position.
2. The pivot splits the current subarray into a left and a right subarray.
3. Call the quicksort function recursively on the left subarray (ranging from start to pivot-1).
4. Call the quicksort function recursively on the right subarray (ranging from pivot+1 to end).
5. The recursion repeats these steps on smaller and smaller subarrays.
6. When start >= end, the subarray has at most one element, so the recursion stops.

The following diagram represents the working of quicksort. The element marked orange represents the pivot element in the current array. After calling the partition function, the array is split around the pivot into the left and right subarrays.

### Algorithm

```
// the partition algorithm
partition(arr, start, end)
{
    pivot = arr[end]
    m = start
    for (i in range start to end-1) {
        // if the current element is less than the pivot,
        // move it into the region left of the pivot's final position
        if (arr[i] < pivot) {
            swap(arr[i], arr[m])
            m++
        }
    }
    swap(arr[m], arr[end])   // place pivot at its correct position
    return m
}

// the quicksort algorithm
quicksort(arr, start, end)
{
    if (start < end) {
        pivot = partition(arr, start, end)
        // recurse on the left and right subarrays of the pivot
        quicksort(arr, start, pivot-1)
        quicksort(arr, pivot+1, end)
    }
}
```

### Implementation in C++

```
#include <bits/stdc++.h>
using namespace std;

void swap(int *a, int *b)
{
    int temp = *a;
    *a = *b;
    *b = temp;
}

int partition(int *arr, int start, int end)
{
    int pivot = arr[end];
    int m = start;

    for (int i = start; i < end; i++)
    {
        if (arr[i] < pivot)
        {
            swap(&arr[i], &arr[m]);
            m++;
        }
    }
    swap(&arr[m], &arr[end]);   // place pivot at its correct position
    return m;
}

void quicksort(int *arr, int start, int end)
{
    if (start < end)
    {
        int pivot = partition(arr, start, end);
        quicksort(arr, start, pivot - 1);
        quicksort(arr, pivot + 1, end);
    }
}

int main()
{
    int arr[] = {9, 26, 11, 20, 18, 5, 15};
    int size = sizeof(arr) / sizeof(arr[0]);

    quicksort(arr, 0, size - 1);
    for (int i = 0; i < size; i++)
    {
        cout << arr[i] << " ";
    }
    cout << endl;

    return 0;
}
```

#### Output

`5 9 11 15 18 20 26`

### Time Complexity

The best- and average-case time complexity of QuickSort is O(NlogN), where N is the number of elements in the array, and the worst-case time complexity is O(N²). With this implementation (last element as pivot), the worst case occurs when the elements are already in sorted order (either ascending or descending).

###### Best case recurrence relation
`T(n) = 2 T(n/2) + O(n)`
###### Worst case recurrence relation
`T(n) = T(n-1) + T(1) + O(n)`
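Expanding the worst-case relation step by step shows where O(N²) comes from (a standard telescoping argument, with c the constant hidden in O(n)):

```latex
T(n) = T(n-1) + cn
     = T(n-2) + c(n-1) + cn
     = \dots
     = T(1) + c\bigl(2 + 3 + \dots + n\bigr)
     = O(n^2)
```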

QuickSort properties at a glance:

1. Best case time complexity = O(NlogN)
2. Worst-case time complexity = O(N²)
3. Auxiliary space requirement = O(logN)
4. Number of comparisons in best case = O(NlogN)
5. Number of comparisons in worst case = O(N²)

Did we miss something, or do you want to add some other key points?