We have seen that sorting algorithms such as bubblesort, insertion sort, selection sort, and quicksort have worst-case running time \(\Theta(n^2)\), while mergesort runs in \(\Theta(n \log n)\) in the worst case. Can we do better than \(n \log n\)? If we have not found a faster algorithm, that does not mean there isn't one.
Can we say that it is impossible to sort faster than \(n \log n\) in the worst case? If we could, then this would be what is called a lower bound.
A lower bound for a problem is the worst-case running time of the best possible algorithm for that problem.
To prove a lower bound of \(\Omega(n \log n)\) for sorting, we would have to prove that no algorithm, however smart, could possibly be faster than \(n \log n\) in the worst case. So how do we prove a lower bound without going through all possible algorithms?
To approach this type of problem we have to go back and remind ourselves what an algorithm is. The power of an algorithm depends on the primitive instructions we allow, which in turn depend on the machine model.
How fast we can solve a problem depends on the model of computation: the more powerful the machine model, the smaller the lower bound. (In the extreme, if our model allowed sort as a primitive operation, then we could sort in one instruction, hence in constant time!)
All sorting algorithms that we have seen so far use only comparisons to gain information about the input. We will prove that any such algorithm has to do \(\Omega(n \log n)\) comparisons in the worst case.
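To make this concrete, here is a small instrumentation sketch in Python: it wraps each element so that every comparison is counted, then runs Python's built-in (comparison-based) sort. The `Counted` class and the variable names are illustrative, not part of these notes.

```python
import random
from functools import total_ordering

@total_ordering
class Counted:
    """Wraps a value and counts every comparison made on it."""
    comparisons = 0  # class-level counter shared by all wrapped values

    def __init__(self, value):
        self.value = value

    def __eq__(self, other):
        Counted.comparisons += 1
        return self.value == other.value

    def __lt__(self, other):
        Counted.comparisons += 1
        return self.value < other.value

n = 1024
data = [Counted(x) for x in random.sample(range(10 * n), n)]
sorted(data)  # a comparison sort gains information about the input only here
print(f"n = {n}, comparisons = {Counted.comparisons}")
# For an efficient comparison sort this count is on the order of
# n * log2(n) = 10240 and, as proved below, it can never be better
# than log2(n!) in the worst case.
```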
To prove the bound, we need a formal model: the comparison (or decision tree) model. In this model an algorithm is a binary tree: each internal node is a comparison between two input elements, the two branches out of a node correspond to the two outcomes of the comparison, and each leaf is an output, i.e., a permutation of the input.
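As an illustration, here is the decision tree for \(n = 3\), written as nested comparisons in Python (the function name `sort3` is my own; any equivalent nesting would do). Each `if` is an internal node and each `return` is a leaf.

```python
def sort3(a, b, c):
    """Decision tree sorting three elements: each `if` is an
    internal node, each `return` one of the 3! = 6 leaves."""
    if a <= b:
        if b <= c:
            return (a, b, c)   # a <= b <= c
        elif a <= c:
            return (a, c, b)   # a <= c < b
        else:
            return (c, a, b)   # c < a <= b
    else:
        if a <= c:
            return (b, a, c)   # b < a <= c
        elif b <= c:
            return (b, c, a)   # b <= c < a
        else:
            return (c, b, a)   # c < b <= a
```

This tree has \(3! = 6\) leaves and height 3, and indeed \(\lceil \log_2 6 \rceil = 3\): no comparison-based algorithm can sort three elements with fewer than 3 comparisons in the worst case.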
The worst-case number of comparisons performed by an algorithm corresponds to the height of its tree; therefore a lower bound on the height of every decision tree for sorting is a lower bound on the number of comparisons needed to sort.
Theorem: Any decision tree sorting \(n\) elements has height \(\Omega(n \log n)\).
Proof: A binary tree of height \(h\) has at most \(2^h\) leaves. On the other hand, a decision tree that sorts \(n\) elements must have at least \(n!\) leaves: each of the \(n!\) possible input orderings requires a different output permutation, so each must reach its own leaf. Hence \(2^h \ge n!\), that is, \(h \ge \log_2(n!)\).
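The remaining step is a standard estimate of \(\log_2(n!)\); a minimal derivation, using only the fact that the largest \(n/2\) factors of \(n!\) are each at least \(n/2\):

\[
h \;\ge\; \log_2(n!) \;\ge\; \log_2\!\left(\left(\frac{n}{2}\right)^{n/2}\right) \;=\; \frac{n}{2}\log_2\frac{n}{2} \;=\; \Omega(n \log n).
\]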