Asymptotic Analysis
- The running time of an algorithm is a function of the size of its input.
- How fast does the running time grow as the input size grows? This is the rate of growth of the running time.
- There are three kinds of asymptotic notation (formal definitions are sketched after this list):
- Big Theta: provides an asymptotically tight bound on the running time of an algorithm. If f(n) is Θ(g(n)), then f(n) grows asymptotically at the same rate as g(n).
- Big Omega: provides an asymptotic lower bound on the running time of an algorithm. If f(n) is Ω(g(n)), then f(n) grows asymptotically no slower than g(n).
- Big O: provides an asymptotic upper bound on the running time of an algorithm. If f(n) is O(g(n)), then f(n) grows asymptotically no faster than g(n).
- e.g. consider binary search: its worst-case running time grows proportionally to log(n).
- So the actual running time of binary search, i.e. f(n), grows asymptotically no faster than log(n), i.e. g(n). Hence we say binary search is O(log(n)) (see the sketch after this list).
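The three notations above are stated informally; for reference, the standard textbook formulation of the same three bounds (the constants c and n0 below are the usual convention, not something defined on this page) can be written as:

```latex
% Big O: f is eventually bounded above by a constant multiple of g.
f(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \text{ such that } 0 \le f(n) \le c\,g(n) \text{ for all } n \ge n_0

% Big Omega: f is eventually bounded below by a constant multiple of g.
f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \text{ such that } 0 \le c\,g(n) \le f(n) \text{ for all } n \ge n_0

% Big Theta: f is bounded both above and below, i.e. a tight bound.
f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \text{ and } f(n) \in \Omega(g(n))
```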
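To make the binary search example concrete, here is a minimal iterative sketch in Java (the class and method names are illustrative, not from this page). Each loop iteration halves the remaining search range, so a sorted array of n elements needs at most about log2(n) + 1 iterations in the worst case, which is where the O(log(n)) bound comes from.

```java
// Minimal binary search sketch, assuming a sorted int[] input.
public class BinarySearchDemo {

    // Returns the index of 'key' in the sorted array 'a', or -1 if absent.
    static int binarySearch(int[] a, int key) {
        int lo = 0;
        int hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   // midpoint, written to avoid (lo + hi) overflow
            if (a[mid] == key) {
                return mid;
            } else if (a[mid] < key) {
                lo = mid + 1;               // discard the lower half
            } else {
                hi = mid - 1;               // discard the upper half
            }
        }
        return -1;                          // key not present
    }

    public static void main(String[] args) {
        int[] sorted = {2, 3, 5, 7, 11, 13, 17, 19};
        System.out.println(binarySearch(sorted, 11));  // prints 4
        System.out.println(binarySearch(sorted, 6));   // prints -1
    }
}
```

In the terms used above, the total work done by the loop is the f(n) of the discussion, and log(n) is the g(n) that bounds it from above.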