Common Data Structure Operations
Data Structure | Access (avg) | Search (avg) | Insertion (avg) | Deletion (avg) | Access (worst) | Search (worst) | Insertion (worst) | Deletion (worst) | Space (worst) |
---|---|---|---|---|---|---|---|---|---|
Array | Θ(1) | Θ(n) | Θ(n) | Θ(n) | O(1) | O(n) | O(n) | O(n) | O(n) |
Stack | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Queue | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Singly-Linked List | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Doubly-Linked List | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Skip List | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n log(n)) |
Hash Table | N/A | Θ(1) | Θ(1) | Θ(1) | N/A | O(n) | O(n) | O(n) | O(n) |
Binary Search Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n) |
Cartesian Tree | N/A | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | N/A | O(n) | O(n) | O(n) | O(n) |
B-Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
Red-Black Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
Splay Tree | N/A | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | N/A | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
AVL Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
KD Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n) |
Array Sorting Algorithms
Algorithm | Time (Best) | Time (Average) | Time (Worst) | Space (Worst) |
---|---|---|---|---|
Quicksort | Ω(n log(n)) | Θ(n log(n)) | O(n^2) | O(log(n)) |
Mergesort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(n) |
Timsort | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n) |
Heapsort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(1) |
Bubble Sort | Ω(n) | Θ(n^2) | O(n^2) | O(1) |
Insertion Sort | Ω(n) | Θ(n^2) | O(n^2) | O(1) |
Selection Sort | Ω(n^2) | Θ(n^2) | O(n^2) | O(1) |
Tree Sort | Ω(n log(n)) | Θ(n log(n)) | O(n^2) | O(n) |
Shell Sort | Ω(n log(n)) | Θ(n(log(n))^2) | O(n(log(n))^2) | O(1) |
Bucket Sort | Ω(n+k) | Θ(n+k) | O(n^2) | O(n) |
Radix Sort | Ω(nk) | Θ(nk) | O(nk) | O(n+k) |
Counting Sort | Ω(n+k) | Θ(n+k) | O(n+k) | O(k) |
Cubesort | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n) |
This is a summary of the key features that make up each algorithm.
Motivation
While it is easy to understand the concept of each sorting algorithm, I find it difficult to remember the key steps that define or characterize an algorithm. Such a step may prove pivotal in implementing the algorithm, yet seem insignificant from the macro view of its concept.
Hence, I decided it would be better to jot the key pointers down.
Quicksort
Key Steps
- Gist: using a pivot value, distribute the array into two halves that are not themselves ordered, but are collectively smaller on the left side and collectively larger on the right side.
- The array is mutated.
- The pivot value of each iteration finds its rightful position in the array, eventually leading to a sorted array.
Discussions
Let’s start with the `recursion` function.
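The original listing is not reproduced here, so below is a minimal Ruby sketch of what the `recursion` function could look like; the parameter name `start` and the exact structure are assumptions.

```ruby
# A sketch of the recursive half of quicksort. `start` and `finish`
# are indices into the array, not lengths.
def recursion(array, start, finish)
  # guard: only recurse when the slice has at least 2 elements
  return unless start < finish

  # the pivot finds its final position; partition returns that position
  pivot_index = partition(array, start, finish)

  # recurse on either side of the pivot; the pivot itself is excluded
  recursion(array, start, pivot_index - 1)
  recursion(array, pivot_index + 1, finish)
end
```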
In the `recursion` function, note that the arguments are indices of the array, not the length. Keep this at the back of your mind so that you can understand when to end a loop.
The guard at the top of the function ensures we only recurse when there are at least 2 elements to iterate over.
The two recursive calls then recurse on either side of the pivot index from that iteration. Note that the pivot index does not participate in the next recursion, since it is already where it belongs.
Now for the `partition` function.
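Again as a sketch rather than the original listing, the `partition` function could look like this (a Lomuto-style partition with the pivot taken from the `finish` index):

```ruby
# A sketch of the partition step: the pivot sits at `finish` and does
# not participate in the reordering loop.
def partition(array, start, finish)
  pivot = array[finish]
  pivot_index = start

  # the non-inclusive range (...) stops just before `finish`, the pivot
  (start...finish).each do |index|
    if array[index] <= pivot
      # push values smaller than or equal to the pivot to the left
      array[index], array[pivot_index] = array[pivot_index], array[index]
      pivot_index += 1
    end
  end

  # the pivot ascends the throne: swap it into its final position
  array[pivot_index], array[finish] = array[finish], array[pivot_index]
  pivot_index
end
```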
In the `partition` function, the pivot does not participate in the reordering. The non-inclusive range constructor operator ensures the loop ends before reaching the last index, `finish`, which is the pivot.
Inside the loop, we are trying to push the values smaller than or equal to the pivot to the left. It is also OK to use `<`.
We do so by swapping them with elements that are bigger than the pivot but sit to the left of those smaller elements.
The `pivot_index` increments at each swap and remembers the last position that was swapped. Hence, at the end of the loop, it holds the position of the first value that is bigger than the pivot. Everything on the left is either smaller than or equal to the pivot.
This is where the pivot belongs in the array. We swap the pivot into that position. Ascend the throne!
This last swap does not change the partition invariant: all elements to the left of the pivot are still smaller than or equal to the pivot, while all elements to the right are still bigger than the pivot. Both sides remain unsorted.
The function returns the pivot’s position to the parent `recursion` function, which needs it to know where to split the array for the next iteration.
Lastly, let’s go back to the calling function where the initial `recursion` call is triggered. Make sure to pass in the last index of the array instead of its length.
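As a sketch, assuming the entry point is named `quicksort`, the caller might look like this:

```ruby
def quicksort(array)
  # pass the last index (length - 1), not the length
  recursion(array, 0, array.length - 1)
  array
end

quicksort([5, 3, 8, 1, 9, 2])  # => [1, 2, 3, 5, 8, 9]
```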
Mergesort
Key Steps
- Gist: first recursively halve the array until we are dealing with single elements, then recursively merge the pieces back in sorted order until we get back an array of the original size, which is now sorted
- A recursive function that consists of 2 parts, in order: recursively split, then recursively merge
- The array will be mutated
- During the merge, we continuously append the smaller of the first elements of the left and right arrays (see the sketch after this list)
- When one side has been fully appended while the other still has elements left, the remainder is appended wholesale. Since each side is already sorted at that point, we can just append the whole remainder
- Remember the base case that stops the recursion when there are fewer than 2 elements
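The original listing is not shown here; a minimal Ruby sketch of an in-place recursive merge sort, with assumed names `merge_sort` and `merge`, could look like this:

```ruby
def merge_sort(array, start = 0, finish = array.length - 1)
  # base case: fewer than 2 elements is already sorted
  return array if finish - start < 1

  # recursively split
  middle = (start + finish) / 2
  merge_sort(array, start, middle)
  merge_sort(array, middle + 1, finish)

  # recursively merge the two sorted halves back into the array
  merge(array, start, middle, finish)
  array
end

def merge(array, start, middle, finish)
  left = array[start..middle]
  right = array[(middle + 1)..finish]
  index = start

  # keep appending the smaller of the two front elements
  until left.empty? || right.empty?
    array[index] = left.first <= right.first ? left.shift : right.shift
    index += 1
  end

  # one side is exhausted; the other is already sorted, so append it whole
  (left + right).each do |value|
    array[index] = value
    index += 1
  end
end
```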
Unfortunately, while this use of recursion is elegant, the recursion depth may become excessive and cause a “stack level too deep” error.
We may need to prepare an alternative in case the stack overflows.
The alternative sketched below carries out the operation with while loops instead of recursion. It mutates the array along the way.
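As a hedged sketch of such an alternative, a bottom-up merge sort (the name `merge_sort_iterative` and the structure are assumptions, reusing the `merge` helper above):

```ruby
def merge_sort_iterative(array)
  width = 1

  # repeatedly merge runs of doubling width until one run covers the array
  while width < array.length
    start = 0
    while start < array.length
      middle = [start + width - 1, array.length - 1].min
      finish = [start + 2 * width - 1, array.length - 1].min
      merge(array, start, middle, finish) if middle < finish
      start += 2 * width
    end
    width *= 2
  end

  array
end
```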
Insertion sort
Key Steps
- Gist: insert elements one by one from the unsorted part of the array into the sorted part
- Divide the array into a sorted portion and an unsorted portion
- The sorted portion always starts with the first element, as an array of 1 element is always sorted
- The first element of the unsorted portion shifts forward until it reaches the start of the sorted portion OR until it meets an element smaller than or equal to itself
- The order of the sorted portion is maintained
- The last element of the sorted portion ends up taking the inserted element's original place
- The next iteration starts on the first element of the remaining unsorted portion
- The loop mutates the array (see the sketch below)
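A minimal Ruby sketch of this (the name `insertion_sort` is an assumption, not the original listing):

```ruby
def insertion_sort(array)
  # everything before index i is the sorted portion
  (1...array.length).each do |i|
    j = i
    # shift the first unsorted element forward until it meets
    # an element smaller than or equal to itself
    while j > 0 && array[j - 1] > array[j]
      array[j - 1], array[j] = array[j], array[j - 1]
      j -= 1
    end
  end
  array
end
```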
Discussions
- The best case is an already sorted array, so no elements shift when moving from the unsorted to the sorted portion, resulting in a time complexity of `n`
- The worst case is a reverse-sorted array, which results in the whole sorted portion having to shift on each iteration. The first element of the unsorted portion is always the smallest so far and needs to go to the front of the sorted portion. Time complexity is `n^2`
Selection sort
Key Steps
- Gist: scan the array to find the smallest element and exclude it from the next iterations
- Swap the smallest element with the front-most element
- Scan the array in the next iteration excluding the smallest element(s)
- The last remaining element will be the largest value, so iterations only need to run up to index `n - 2` (see the sketch below)
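A minimal Ruby sketch (the name `selection_sort` is an assumption, not the original listing):

```ruby
def selection_sort(array)
  # the last element falls into place by itself, so stop at index n - 2
  (0..array.length - 2).each do |i|
    # scan the unsorted remainder for the smallest element
    smallest = i
    ((i + 1)...array.length).each do |j|
      smallest = j if array[j] < array[smallest]
    end
    # swap it with the front-most unsorted element
    array[i], array[smallest] = array[smallest], array[i]
  end
  array
end
```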
Discussions
- Time complexity is `n^2`
Bubble sort
Key Steps
- Gist: keep swapping adjacent elements if the left is larger than the right as you move down the array, and repeat this pass for as many times as there are elements in the array. The final pass will have no swaps occur, which declares the array sorted (see the sketch below).
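A minimal Ruby sketch (the name `bubble_sort` is an assumption, not the original listing):

```ruby
def bubble_sort(array)
  loop do
    swapped = false
    # walk down the array, swapping adjacent elements that are out of order
    (0...array.length - 1).each do |i|
      if array[i] > array[i + 1]
        array[i], array[i + 1] = array[i + 1], array[i]
        swapped = true
      end
    end
    # a pass with no swaps means the array is sorted
    break unless swapped
  end
  array
end
```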
Discussions
- Time complexity is `n^2`