I've been asked to develop a recursive function and then analyze the asymptotic time complexity.
f(N) = 0, if N < N1
f(N1) = C1
f(N) = A1 + M1*f(M2*N/D1 - S1) Op M3*f(M4*N/D2 - S2), if N > N1

We're to assume that:
S1 = S2 = 0
M2 = M4 = 1
D1 = D2 > 1
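
If I substitute those assumptions in (my own restatement, not something given in the assignment), both recursive calls end up receiving the exact same argument, and the recurrence collapses to

\[
f(N) = A_1 + M_1\, f\!\left(\frac{N}{D_1}\right) \;\mathrm{Op}\; M_3\, f\!\left(\frac{N}{D_1}\right), \qquad N > N_1,
\]

and since my code implements Op as addition, that is the same as f(N) = A1 + (M1 + M3) * f(N/D1).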
    // The user enters N1, C1, A1, M1, M2, M3, M4, D1, D2, S1, S2, and then ARG;
    // n1, c1, etc. below are globals holding those inputs.
    int Recursion_Plus(int ARG)
    {
        if (ARG < n1)           // base case: below the threshold
        {
            return 0;
        }
        else if (ARG == n1)     // base case: exactly at the threshold
        {
            return c1;
        }
        else                    // ARG > n1: recurse on two shrinking arguments
        {
            return a1 + m1 * Recursion_Plus(m2 * ARG / d1 - s1)
                      + m3 * Recursion_Plus(m4 * ARG / d2 - s2);
        }
    }

I've tested my recursive function against the instructor's program and it behaves identically, so I've moved on to my analysis, where I've hit a wall.
I'm struggling to wrap my brain around this, so please bear with me.
My attempt at a partial solution:
The two comparisons (ARG < N1 and ARG == N1) take 1 unit of time each.
The constants a1, m1, and m3 don't affect the asymptotic growth because they sit outside the recursive calls; each contributes only constant work:
a1 + m1*_ = 1 unit of time (addition)
m1*_ = 1 unit of time (multiplication)
adding the 2 recursive calls together is 1 unit of time
m3*_ = 1 unit of time (multiplication)
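
To sanity-check this counting, here's a minimal instrumented sketch of the function (the global call counter, the hard-coded parameter values, and the main harness are my own additions, not part of the assignment):

    #include <stdio.h>

    /* Assumed globals; the real program reads these from the user. */
    int n1 = 1, c1 = 1, a1 = 1, m1 = 1, m2 = 1, m3 = 1, m4 = 1,
        d1 = 2, d2 = 2, s1 = 0, s2 = 0;

    long calls = 0;  /* counts every invocation, base cases included */

    int Recursion_Plus(int ARG)
    {
        calls++;
        if (ARG < n1)
            return 0;
        else if (ARG == n1)
            return c1;
        else
            return a1 + m1 * Recursion_Plus(m2 * ARG / d1 - s1)
                      + m3 * Recursion_Plus(m4 * ARG / d2 - s2);
    }

    int main(void)
    {
        /* Double ARG each time and watch how the call count grows. */
        for (int arg = 2; arg <= 1024; arg *= 2)
        {
            calls = 0;
            Recursion_Plus(arg);
            printf("ARG = %5d  calls = %ld\n", arg, calls);
        }
        return 0;
    }

With d1 = d2 = 2, doubling ARG roughly doubles the call count; with a larger common divisor the count grows noticeably slower than ARG.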
From the instructions we're given, both recursive calls are made with the same number every time (since m2 = m4 = 1 and s1 = s2 = 0), and every successive argument is smaller than the last because d1 = d2 > 1.
So the larger ARG is compared to n1, the longer it takes to reach the base case, and the larger the result will be. Does that mean the algorithm takes O(ARG) time?
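
Writing out what I think the recursion tree looks like (my own sanity check, not from the course notes): every level of recursion divides the argument by the common divisor D1, and every call spawns two more, so

\[
\text{depth} \approx \log_{D_1}\!\left(\frac{ARG}{N_1}\right), \qquad
\text{calls} \approx 2^{\text{depth}} = \left(\frac{ARG}{N_1}\right)^{\log_{D_1} 2},
\]

which would be linear in ARG when D1 = 2 but sublinear when D1 > 2, so maybe O(ARG) only holds in the D1 = 2 case?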
I'd appreciate it if anyone could let me know whether I'm on the right track. Thanks!