
I have the following recursion:

T(n) = T(2*n / 3) + T(n / 3) + O(n log n) 

I need to know the exact asymptotic bound, and I know the Master Theorem won't help me here.

Please tell me how to approach such recursions in general. I need the complexity, and I want to understand how to solve problems like this.

Thanks in advance.

  • Why do you think the Master Theorem won't help you? What is your exact question? In general, the Master Theorem is the correct way to determine the asymptotic run-time for most (not all) recursive calls. Commented May 8, 2014 at 6:58
  • Do you see that your equation is T(n) = T(n) + O(n log n), which is very weird? Commented May 8, 2014 at 9:20
  • @pkacprzak That's not at all the case. Commented May 8, 2014 at 11:50
  • @pkacprzak T(2*n / 3) + T(n / 3) != T(n) Commented May 8, 2014 at 12:01
  • @MatthewHerbst The Master Theorem requires that the 2n/3 and n/3 be the same. You can convert the n/3 to 2n/3 to get an upper bound, as AbcAeffchen did. Commented May 8, 2014 at 20:11

2 Answers


A generalization of the Master Theorem is the Akra-Bazzi method.

Assuming that your O(n log n) is really Θ(n log n), we have g(x) = x log x, a₁ = a₂ = 1, b₁ = 2/3 and b₂ = 1/3. Then b₁^p + b₂^p = 1 when p = 1, g(u)/u^(p+1) = (log u)/u has integral (log² u)/2, and T(x) is Θ(x log² x).
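As a sanity check (not part of the Akra-Bazzi derivation itself), you can iterate the recurrence numerically. This is a minimal sketch that assumes the Θ(n log n) term is exactly n · log₂ n and floors the subproblem sizes to integers:

```python
from functools import lru_cache
from math import log2

# Memoized evaluation of T(n) = T(floor(2n/3)) + T(floor(n/3)) + n*log2(n).
# The additive term n*log2(n) is an assumed stand-in for the Theta(n log n) cost.
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return T(2 * n // 3) + T(n // 3) + n * log2(n)

for n in (10**3, 10**4, 10**5, 10**6):
    # If T(n) = Theta(n log^2 n), this ratio should level off near a constant.
    print(n, T(n) / (n * log2(n) ** 2))
```

The ratio T(n) / (n log₂² n) settles toward a constant as n grows, which is consistent with the Θ(x log² x) result.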


You could use a simplification to get an upper bound.

First way:

T(n) = T(2/3 ⋅ n) + T(1/3 ⋅ n) + O(n log n) ≤ T(2/3 ⋅ n) + T(2/3 ⋅ n) + O(n log n) = 2 T(2/3 ⋅ n) + O(n log n) 

Now you can use the master theorem. This gives O(n^(log_{3/2} 2)) = O(n^1.7095). Because we used an inequality (≤), we get O instead of Θ.

Second way:
You can count the nodes in the recursion tree, which is a binary tree in your case. The tree has depth at most log_{3/2}(n). At every level the subproblem sizes sum to at most n, so there are at most n leaves and therefore O(n) nodes. Each node contributes at most O(n log n), so you get O(n) ⋅ O(n log n) = O(n² log n) in total.

Of course, this is very imprecise. We can do better if we split the tree into two parts:

  1. The upper part, above depth (1/2) ⋅ log n: it has O(2^((1/2) ⋅ log n)) = O(sqrt(n)) nodes. These nodes each have weight O(n log n).
  2. The lower part, below depth (1/2) ⋅ log n: it has O(2^(log n)) = O(n) nodes. These nodes each have weight O(sqrt(n) ⋅ log n), since their subproblems have size at most sqrt(n).

In total you get: T(n) = O(sqrt(n)) ⋅ O(n log n) + O(n) ⋅ O(sqrt(n) ⋅ log n) = O(n^1.5 ⋅ log n).

I hope this helps.

EDIT: O(n^1.5 ⋅ log n) ⊂ O(n^1.7095).

EDIT: You can write O(n^1.5 ⋅ log n) as O(n ⋅ sqrt(n) ⋅ log n). As @Teepeemm showed, the exact complexity is Θ(n log²(n)), which improves on the upper bound I showed by replacing the factor sqrt(n) with log(n).

2 Comments

Replacing T(2/3 ⋅ n) + T(n/3) with T(2/3 ⋅ n) + T(2/3 ⋅ n) totally changes the result. T(n) = c ⋅ n ⋅ log² n matches the recursion formula just fine.
Sure, both results are only upper bounds, but you can get them without any heavy theory beyond the master theorem.
