
I am currently learning Landau notation and am stuck on the following True/False question.

Question

What seems a little confusing to me is the use of Big-Theta notation to describe worst-case run time. What is the difference between saying "worst-case run time Θ(n log n)" and simply "run time O(n log n)"? Giving the worst case a tight bound seems a lot like simply giving an upper bound on the run time.

If I go by that reasoning, my guess would be that the statement is true, as an O(n) algorithm will be faster than an O(n log n) algorithm once n grows large enough. But I am not sure if my reasoning is correct and if I get the big picture here.
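For illustration, here is a quick sketch (just comparing the two growth rates, not any particular algorithm) that prints n next to n log₂ n for a few values of n; the gap keeps widening as n grows:

```python
import math

# Illustrative only: compare the growth of n against n * log2(n).
for n in (10, 1_000, 1_000_000):
    print(f"n = {n:>9}    n log n ≈ {n * math.log2(n):>12.0f}")
```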


1 Answer


Θ(n log n) means the algorithm is both O(n log n) and Ω(n log n), and a Θ(n log n) algorithm is asymptotically slower than a Θ(n) one, so your reasoning is correct.
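To make that precise, recall the usual definitions:

$$f(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 \text{ such that } f(n) \le c \cdot g(n) \text{ for all } n \ge n_0$$
$$f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 \text{ such that } f(n) \ge c \cdot g(n) \text{ for all } n \ge n_0$$
$$f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \text{ and } f(n) \in \Omega(g(n))$$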

But there's another layer: Landau notation denotes asymptotic bounds on functions. Saying "worst case" restricts attention to the inputs that produce the highest run time. For example, take binary search: it has a best-case run time of Θ(1) and a worst-case run time of Θ(log n).
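For concreteness, here is a minimal iterative binary search sketch (the function name and details are only illustrative); the best case returns on the first probe, while the worst case keeps halving the search range until it is empty:

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if absent."""
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid          # best case: found on the first probe -> Θ(1)
        elif sorted_list[mid] < target:
            lo = mid + 1        # discard the lower half
        else:
            hi = mid - 1        # discard the upper half
    return -1                   # worst case: about log2(n) halvings -> Θ(log n)
```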

To answer your question about the difference between "worst-case Θ(n log n)" and "run-time O(n log n)"*: the first one gives us more information. It tells us the algorithm runs in about n log n time, whereas the second only tells us it runs in at most n log n time.

*Technically the second one doesn't say which case (best, average, or worst) the bound refers to, but when nothing is specified most people mean "in the worst case this algorithm runs in O(n log n)".

