Consider an algorithm that takes in an integer N and divides out all the factors of 2, then 3, then 4, and so on, all the way up to about sqrt(N). If it takes unit time to add, subtract, multiply, and divide integers, how fast is this algorithm? What if it instead takes Theta(n) time to add, subtract, multiply, and divide two n-digit binary numbers?
My thoughts:
In order to better understand the problem, I first wrote some pseudocode outlining the algorithm. I think it looks something like this.
```
factorize(N) {
    for (i = 2; i <= sqrt(N); i++) {
        while (N mod i == 0) {  # recompute the remainder each pass,
            N = N / i           # otherwise the loop never terminates
            print i             # print the prime factors of N
        }
    }
}
```

I'm pretty sure the algorithm is O(sqrt(N)) just from trial and error (plugging in values). I think this worst case is exhibited when N is prime. However, I'm not completely sure about this.
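Here is a runnable Python sketch of the same idea, for experimenting with inputs (the function name and the handling of the leftover prime factor are my own additions, not part of the original pseudocode):

```python
import math

def factorize(N):
    """Trial division: divide out each i = 2, 3, 4, ... up to sqrt(N)."""
    factors = []
    i = 2
    while i <= math.isqrt(N):   # sqrt of the *current* N, recomputed as N shrinks
        while N % i == 0:       # divide out every copy of the factor i
            N //= i
            factors.append(i)
        i += 1
    if N > 1:                   # whatever remains after the loop is itself prime
        factors.append(N)
    return factors
```

For example, `factorize(360)` returns `[2, 2, 2, 3, 3, 5]`, and for a prime input like `factorize(97)` the loop runs all the way to sqrt(N) without ever dividing.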
Now I'm also confused about how this runtime changes if it takes Theta(n) time to add, subtract, multiply, and divide two n-digit binary numbers. This seems a lot harder to analyze for some reason, since we're performing a division on every iteration. If I had to guess, I would say O(N^(3/2)), which is just my old answer multiplied by N, but I'm even less sure about this answer.
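To make the cost concrete, here is a small experiment (my own instrumentation, not part of the algorithm) that counts how many division/remainder operations the loop actually performs, so you can compare a prime input against a highly composite one before multiplying by the per-operation cost:

```python
import math

def count_divisions(N):
    """Count the mod/division operations performed by trial division."""
    ops = 0
    i = 2
    while i <= math.isqrt(N):
        while N % i == 0:
            ops += 1            # one successful divisibility test
            N //= i
        ops += 1                # the final failed test for this i
        i += 1
    return ops

# Compare a prime near 10**6 with a power of two of similar size:
# count_divisions(999983) is close to sqrt(N), while
# count_divisions(2**19) is only about log2(N) operations.
```

This at least supports the intuition that a prime N forces roughly sqrt(N) arithmetic operations; the second part of the question is then about how much each of those operations costs when operands have n = Theta(log N) digits.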
Could someone please help me?