Asymptotic Analysis
From Suhrid.net Wiki
Latest revision as of 19:45, 4 May 2015
- Running time of an algorithm is a function of the size of its input.
- How fast does an algorithm's running time grow with input size? This is the rate of growth of the running time.
- Three kinds of asymptotic notation:
- Big Theta: provides an asymptotically tight bound on the running time of an algorithm. If f(n) is Theta(g(n)), then f(n) grows (asymptotically) at the same rate as g(n).
- Big Omega: provides an asymptotic lower bound on the running time of an algorithm. If f(n) is Omega(g(n)), then f(n) grows (asymptotically) no slower than g(n).
- Big O: provides an asymptotic upper bound on the running time of an algorithm. If f(n) is O(g(n)), then f(n) grows (asymptotically) no faster than g(n).
- E.g. consider binary search: its worst-case running time grows like log(n).
- So the running time of binary search, f(n), grows no faster than g(n) = log(n). Hence we say binary search is O(log n).
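The binary search example above can be sketched in code. This is a minimal illustrative version (not from the original notes): each iteration halves the search interval, so the worst case performs roughly log2(n) iterations, which is why the running time is O(log n).

```python
# Minimal iterative binary search over a sorted list.
# Returns (index, steps) on success, (-1, steps) on failure.
# Each loop iteration halves the interval [lo, hi], so the
# number of iterations is at most floor(log2(n)) + 1.
def binary_search(arr, target):
    lo, hi = 0, len(arr) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

data = list(range(1024))            # n = 1024, log2(n) = 10
idx, steps = binary_search(data, 777)
print(idx, steps)                   # steps is at most 11 here
```

The `steps` counter is only there to make the logarithmic growth visible; doubling `n` adds at most one extra iteration in the worst case.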
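The Big Theta definition above can also be checked numerically for a concrete function. As a hedged sketch (the function f(n) = 3n + 5 and the witness constants are chosen here for illustration, not taken from the notes): f(n) is Theta(n) because constants c1, c2 and a threshold n0 exist with c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.

```python
# Sketch: checking the Theta definition for f(n) = 3n + 5, g(n) = n.
# Witness constants (one valid choice): c1 = 3, c2 = 4, n0 = 5,
# since 3n <= 3n + 5 always, and 3n + 5 <= 4n whenever n >= 5.
def f(n):
    return 3 * n + 5

def g(n):
    return n

c1, c2, n0 = 3, 4, 5
ok = all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print(ok)  # True: the sandwich holds on the tested range
```

A finite check like this is not a proof, but it is a quick way to sanity-check candidate constants before writing the inequality argument by hand.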