
algorithm - What is the difference between O, Ω, and Θ?

I am learning algorithm analysis. I am having trouble understanding the difference between O, Ω, and Θ.

The way they're defined is as follows (restated with explicit quantifiers just after the list):

  • f(n) = O(g(n)) means c · g(n) is an upper bound on f(n). Thus there exists some constant c such that f(n) is always ≤ c · g(n) for large enough n (i.e., n ≥ n₀ for some constant n₀).
  • f(n) = Ω(g(n)) means c · g(n) is a lower bound on f(n). Thus there exists some constant c such that f(n) is always ≥ c · g(n) for all n ≥ n₀.
  • f(n) = Θ(g(n)) means c₁ · g(n) is an upper bound on f(n) and c₂ · g(n) is a lower bound on f(n), for all n ≥ n₀. Thus there exist constants c₁ and c₂ such that f(n) ≤ c₁ · g(n) and f(n) ≥ c₂ · g(n). This means that g(n) provides a tight bound on f(n).
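In quantifier form, the same three definitions read (a LaTeX restatement of the list above; c, c₁, c₂ and n₀ are the same constants):

    % Quantifier form of the three asymptotic bounds
    f(n) = O(g(n))      \iff \exists c > 0 \; \exists n_0 \; \forall n \ge n_0 : \; f(n) \le c \cdot g(n)
    f(n) = \Omega(g(n)) \iff \exists c > 0 \; \exists n_0 \; \forall n \ge n_0 : \; f(n) \ge c \cdot g(n)
    f(n) = \Theta(g(n)) \iff \exists c_1, c_2 > 0 \; \exists n_0 \; \forall n \ge n_0 : \; c_2 \cdot g(n) \le f(n) \le c_1 \cdot g(n)

In particular, f(n) = Θ(g(n)) holds exactly when both f(n) = O(g(n)) and f(n) = Ω(g(n)) hold.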

The way I have understood this is:

  • O(f(n)) gives the worst-case complexity of the given function/algorithm.
  • Ω(f(n)) gives the best-case complexity of the given function/algorithm.
  • Θ(f(n)) gives the average-case complexity of the given function/algorithm.

Please correct me if I am wrong. If that is the case, the time complexity of every algorithm should be expressible in all three notations. But I have observed that it is usually expressed with just one of O, Ω, or Θ; why not all three?


1 Reply

It is important to remember that this notation, whether O, Ω, or Θ, expresses the asymptotic growth of a function; it does not intrinsically have anything to do with algorithms. The function in question may be the "complexity" (running time) of an algorithm, whether worst-case, best-case, or average-case, but the notation is independent of where the function comes from.

For example, the function f(n) = 3n² + 5 is (a quick numerical check follows the list):

  • O(n²); it is also O(n² log n), O(n³), O(n⁴) and so on, but it is not O(n).
  • Ω(n²); it is also Ω(n log n), Ω(n) and so on, but it is not Ω(n³).
  • Θ(n²). It is not even Θ(n² log n) or Θ(n²/log n).
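As a quick sanity check of these claims, here is a minimal Python sketch; the witness constants c₁ = 4, c₂ = 3, n₀ = 3 are one workable choice among many, not part of the original answer:

    # Sanity-check that f(n) = 3n^2 + 5 is Theta(n^2):
    # pick witnesses c1 = 4, c2 = 3, n0 = 3 and test a range of n.
    def f(n):
        return 3 * n ** 2 + 5

    c1, c2, n0 = 4, 3, 3
    for n in range(n0, 100_000):
        assert f(n) <= c1 * n ** 2   # upper bound: f(n) = O(n^2)
        assert f(n) >= c2 * n ** 2   # lower bound: f(n) = Omega(n^2)
    print("3n^2 + 5 lies between 3n^2 and 4n^2 for all tested n >= 3")

A finite loop cannot prove an asymptotic statement, of course; here the algebra is easy (3n² + 5 ≤ 4n² exactly when n² ≥ 5, i.e., n ≥ 3), and the loop merely illustrates what the constants are doing.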

Now, usually the function considered is the worst-case complexity of an algorithm, and which of the three notations is used depends on what we want to say about it and on how carefully we do the analysis. For example, we may observe that because there are two nested loops, the worst-case running time is at most O(n²), without caring whether this bound is actually achieved for some input. (Usually it is obvious that it is.) Or, we may say that the worst-case running time of comparison-based sorting is Ω(n log n), because there must be some inputs on which it takes at least c · n log n steps. Or, we may look at a particular mergesort algorithm, see that it takes at most O(n log n) steps in the worst case and that some input makes it take n log n steps, and conclude that the worst-case running time is Θ(n log n).
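To make the nested-loop observation concrete, here is an illustrative function (my own example, not one from the answer above). Counting comparisons gives the upper bound O(n²), and a duplicate-free input actually forces all of them, which supplies the matching Ω(n²) lower bound:

    def has_duplicate(items):
        # Two nested loops: at most n(n-1)/2 comparisons, so O(n^2).
        # A duplicate-free input forces all of them, so the worst case
        # is also Omega(n^2), hence Theta(n^2).
        n = len(items)
        for i in range(n):
            for j in range(i + 1, n):
                if items[i] == items[j]:
                    return True   # early exit: the best case is O(1)
        return False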

Note that in all three examples above, it was still the same (worst-case) running time that was being analyzed. We may analyze the best case or the average case instead, but again, which of the three notations we use depends on what we want to say: whether we want to give an upper bound, a lower bound, or a tight bound on the order of growth of the same function.
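For instance, insertion sort (a standard textbook example, not one the answer names) makes the best-case/worst-case distinction concrete: both statements below are tight Θ bounds, but on two different functions of the same algorithm.

    def insertion_sort(a):
        # Best case (already-sorted input): the while-condition fails
        # immediately, so each of the n-1 outer steps does O(1) work
        # -> Theta(n).
        # Worst case (reverse-sorted input): the inner loop shifts i
        # elements at step i, totalling n(n-1)/2 shifts -> Theta(n^2).
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]   # shift larger elements one slot right
                j -= 1
            a[j + 1] = key
        return a

Saying only "insertion sort is O(n²)" is still correct: it is an upper bound that is valid for every case at once.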

