$\begingroup$

I've been reading Brent's original paper and struggling with his worst-case bounds derivation. (I won't even attempt the expected-behavior analysis.) I follow the analysis of Floyd's algorithm, as that's straightforward.

For simplicity, take $q = 2, u = 0$ as in practice. This is the algorithm in pseudocode, without multiple statements compressed into one line:

    y = x0
    r = 1
    k = 0
    done = False
    while not done:
        x = y
        j = k
        r *= 2
        while not (done or k >= r):
            k += 1
            y = f(y)
            done = (x == y)
    n = k - j

Intuitively after each outer loop iteration, the values after the inner loop exits are

  1. $0 = j < k \le r = 2$
  2. $2 = j < k \le r = 4$
  3. $4 = j < k \le r = 8$
  4. etc.

so eq 3.3, $j = 2^{s-1} [s > 1]$, and eq 3.4, $k = j+n \le 2^s$, make perfect sense.
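As a sanity check on these equations, the pseudocode can be instrumented to record $(j, k, r)$ after each inner loop. This is a quick sketch; the iteration function $f(x) = (x^2 + 1) \bmod 255$ and the seed $3$ are arbitrary illustrative choices, not from the paper:

```python
def brent_trace(f, x0):
    """Run the question's pseudocode, recording (j, k, r) after each inner loop."""
    y, r, k = x0, 1, 0
    done = False
    trace = []
    while not done:
        x = y
        j = k
        r *= 2
        while not (done or k >= r):
            k += 1
            y = f(y)
            done = (x == y)
        trace.append((j, k, r))
    return trace

# Arbitrary example: f(x) = (x^2 + 1) mod 255 starting at 3
for j, k, r in brent_trace(lambda x: (x * x + 1) % 255, 3):
    assert j < k <= r  # matches the pattern 0 = j < k <= r = 2, etc.
    print(j, k, r)
```

For this seed the trace is $(0,2,2), (2,4,4), (4,8,8), (8,14,16)$, consistent with $j = 2^{s-1}$ and $k \le 2^s$.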

I don't understand the derivations from 3.7 to 3.10. Given in eq 3.5 that $\bar s $ is the smallest positive integer such that $2^{\bar s-1} \ge \max(m, n+1)$, eq 3.6 says $2^{\bar s} - 2^{\bar s - 1} - 1 \ge n$. Fine. But how does he conclude $s \le \bar s$?

If I compare $k \le 2^s$ with $2^{\bar s}$, and $j = 2^{s-1}$ with $2^{\bar s-1}$, which is what I guess is implied, I can only get $2^s - 2^{s-1} \ge k - j = n$. So both $2^{\bar s - 1} - 1$ and $2^{s-1}$ are $\ge n$, but it doesn't tell me about how $s$ compares to $\bar s$. If the second inequality were the other way around with $2^{s-1} \le n$, then it would make sense to say $s \le \bar s$.

I also don't understand the leap of logic to eq 3.7. The best I can come up with is an argument by contradiction from eq 3.7: suppose $j > 2 \max(m, n+1)$. Then $j/2 = 2^{(s-1)-1} \ge \max(m, n+1)$, so $s-1$ satisfies eq 3.5, which contradicts the least property of $\bar s$ in eq 3.5. But this seems roundabout, and I'm not sure of it, when it seems like there should be a simpler approach.

$\endgroup$

1 Answer

$\begingroup$

At any point in the algorithm, $y = x_k$, and $r = 2^n$ for some $n$.

Take a sequence with $x_0$ arbitrary and $x_{j+1} = f(x_j)$ for every $j \ge 0$, with finitely many possible values $x_j$. Then the sequence ends in a cycle, depending on $f$ and $x_0$: there is a smallest $k \ge 0$ and a smallest $l \ge 1$ such that $x_k = x_{k+l} = x_{k+2l} = \cdots$. Floyd's and Brent's algorithms find some $k$ and $l$, not necessarily the smallest ones.

The inner loop ends each time with $y = x_r = x_{2^n}$ (unless a match ends it early). If $r < k$ then the saved value $x_r$ is not part of the cycle, so $x_r \ne x_{r+i}$ for all $i \ge 1$, and no match can occur. So the algorithm first runs until it reaches the first $r = 2^n$ with $r \ge k$. It then does up to $r$ more steps; if $r < l$ then $x_r \ne x_{r+i}$ for all $1 \le i \le r$, and again no match occurs. This repeats until we reach an $r$ with both $r \ge k$ and $r \ge l$; at that point another $l$ steps find the cycle. So the algorithm finds the smallest $r = 2^n$ such that $r \ge k$ and $r \ge l$, and needs $r + l$ steps in total.
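The two phases this implies, first finding the cycle length and then the tail length, can be sketched in runnable form. This is the common textbook shape of Brent's method; the names `lam` and `mu` for the answer's $l$ and $k$ are my own, not the paper's:

```python
def brent(f, x0):
    """Return (lam, mu): cycle length and tail length of the sequence x0, f(x0), ..."""
    # Phase 1: find the cycle length lam by comparing a saved value
    # against positions whose distance grows in powers of two.
    power = lam = 1
    tortoise, hare = x0, f(x0)
    while tortoise != hare:
        if power == lam:      # time to start a new power of two
            tortoise = hare
            power *= 2
            lam = 0
        hare = f(hare)
        lam += 1
    # Phase 2: find the tail length mu by advancing a second pointer
    # lam steps ahead and walking both pointers until they meet.
    tortoise = hare = x0
    for _ in range(lam):
        hare = f(hare)
    mu = 0
    while tortoise != hare:
        tortoise = f(tortoise)
        hare = f(hare)
        mu += 1
    return lam, mu
```

For example, with $f(x) = (x^2 + 1) \bmod 255$ and $x_0 = 3$, this returns a cycle length of 6 and a tail length of 2.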

Note that any such algorithm must calculate $x_{k+l}$; otherwise it would stop before the end of the first cycle. It therefore makes sense to compare the running time of the algorithm with $k + l$, the theoretical lower limit. If $r$ is the step where a cycle is found, then one can show that $k + l > r/4$, and $k + l > 3r/8$ if $1/2 \le l/k \le 2$.

If comparing two values $x_i = x_j$ is fast relative to computing $x_{j+1} = f(x_j)$, you can observe that the number of steps depends strongly on how $k$ and $l$ compare to $r$ times powers of two. You might, for example, run the search from several initial values of $r$ at once, say $r = 5, 6, 7, 8$: replace the single saved value $x$ with the four values $x_5$, $x_6$, $x_7$ and $x_8$, do one calculation of $y$, and compare the result against all four. The average number of steps might shrink enough to make this faster.
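One way to realize this idea is to keep a small window of recent checkpoints and compare each new value against all of them. The following is only a sketch of that suggestion under my own assumptions; the window size and the checkpoint schedule (still doubling) are illustrative choices, not from Brent's paper:

```python
from collections import OrderedDict

def cycle_multi(f, x0, window=4):
    """Cycle search comparing each new y against the last `window`
    saved checkpoints instead of a single saved value (a sketch)."""
    saved = OrderedDict([(x0, 0)])  # checkpoint value -> index where it was seen
    y, k, r = x0, 0, 1
    while True:
        k += 1
        y = f(y)
        if y in saved:
            return k - saved[y]       # a multiple of the cycle length
        if k >= r:                    # checkpoint reached: save y, double r
            saved[y] = k
            if len(saved) > window:
                saved.popitem(last=False)  # drop the oldest checkpoint
            r *= 2
```

Whether this is actually faster depends on the relative cost of comparisons and evaluations of $f$, as the answer says.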

You should also check whether doubling $r$ gives the optimal number of steps. The values of $r$ can follow any sufficiently fast-growing sequence; for example, multiply $r$ by some constant $c > 1$ and round to an integer.
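That variant can be sketched by replacing the doubling step in the question's pseudocode with a general growth factor. The parameter `c` and the rounding rule here are my own illustrative choices; `c = 2.0` recovers the original doubling schedule:

```python
import math

def brent_growth(f, x0, c=2.0):
    """Brent-style search where the comparison point advances by
    factor c instead of doubling (a sketch; c is a tuning knob)."""
    y, r, k = x0, 0, 1
    y, r, k = x0, 1, 0
    done = False
    while not done:
        x = y
        j = k
        r = max(r + 1, math.ceil(r * c))  # ensure r strictly grows
        while not (done or k >= r):
            k += 1
            y = f(y)
            done = (x == y)
    # k - j equals the cycle length once the schedule has passed
    # both the tail length and the cycle length.
    return k - j
```

With $f(x) = (x^2 + 1) \bmod 255$ and $x_0 = 3$, both `c = 2.0` and `c = 1.5` recover the cycle length 6, but with different step counts.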

$\endgroup$
  • $\begingroup$ This doesn't answer my specific question. The paper already has notation for $k,l$. $\endgroup$ Commented Oct 8, 2024 at 23:50
  • $\begingroup$ 1. I like Brent's algorithm. 2. It may very well help someone else. 3. Do you expect me to rewrite this paper? $\endgroup$ Commented Oct 9, 2024 at 9:18
  • $\begingroup$ I expect an answer to answer the question, which was not to give a general exposition of Brent's algorithm. $\endgroup$ Commented Oct 9, 2024 at 15:47
