Consider the following algorithm:
    function randomQueries(A, n <- A.size)
        if n > 1 then
            p <- n - 1
            x <- partition(A, p)
            randomQueries(A[0, 1, ..., x-1], x)
            randomQueries(A[x+1, ..., n-1], n-1-x)
            for y = 1 to x do
                for z = x + 1 to n - 1 do
                    query(A, y, z)
        else if n = 1 then
            query(A, 0, 0)

Here partition is Hoare's partition routine (so after the partition, all keys in A[0, ..., x-1] are <= A[x] and all keys in A[x+1, ..., n-1] are >= A[x]). But I think it only really matters that partition performs n key comparisons. Also, query(A, y, z) uses one key comparison.
Deduce, with justification, asymptotically tight bounds for the worst- and best-case number of key comparisons.
I think the best case is $\Theta(n^2)$; choosing $x=n/2$ each time (which is possible if you arrange the array so that the last element of each subarray visited by randomQueries is always the median of that subarray) shows that the best case number satisfies $T^{best}(n) \leq n + \frac n2\left(\frac n2-1\right) + T^{best}(n/2) + T^{best}(n/2-1) \leq n^2 + 2T^{best}(n/2)$ for $n$ large enough, which solves to $O(n^2)$. For the matching lower bound, observing that there are $\Omega(n)$ calls (each call "removes" one index, $x$) is not enough by itself: a call on a subarray of size $m$ performs $m$, not $n$, partition comparisons, so this only gives $\Omega(n)$ overall. Instead, writing the cost of a call on $m$ keys as $T(m) = m + x(m-1-x) + T(x) + T(m-1-x)$, I believe one can show $T(m) \geq m^2/4$ by induction regardless of where the pivots land (using $ab + \frac{a^2+b^2}4 \geq \frac{(a+b)^2}4$ with $a=x$, $b=m-1-x$), so every input costs $\Omega(n^2)$.
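As a sanity check on the best case, the recurrence $T(n) = n + x(n-1-x) + T(x) + T(n-1-x)$ (my restatement, assuming partition on $n$ keys costs exactly $n$ and with $T(0)=0$, $T(1)=1$) can be minimized over the pivot position $x$ with a small bottom-up DP:

```python
# Bottom-up evaluation of the best-case recurrence
#   T(n) = n + min_x [ x*(n-1-x) + T(x) + T(n-1-x) ],  T(0) = 0, T(1) = 1,
# assuming (as stated above) partition on a size-n subarray costs exactly
# n key comparisons and each query costs one.
def best_case(N):
    T = [0] * (N + 1)
    if N >= 1:
        T[1] = 1
    for n in range(2, N + 1):
        T[n] = n + min(x * (n - 1 - x) + T[x] + T[n - 1 - x]
                       for x in range(n))
    return T

T = best_case(500)
for n in (10, 100, 500):
    print(n, T[n], T[n] / n**2)   # the ratio appears to settle near a constant
```

This is numerical evidence only, of course, not a proof; but the ratio $T(n)/n^2$ staying bounded away from $0$ is consistent with a $\Theta(n^2)$ best case.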
Similarly, if the pivot always ends up in the last position ($x = n-1$, e.g. on a sorted array), the query loops are empty and the worst case number should be at least $n + T^{worst}(n-1)$, which evaluates to $\Omega(n^2)$. I'm not sure how to find an upper bound for the worst case.
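The same DP idea, with max in place of min, at least lets me probe the worst case numerically (again under the assumption that partition on $n$ keys costs exactly $n$; this is evidence, not a proof):

```python
# Bottom-up evaluation of the worst-case recurrence
#   T(n) = n + max_x [ x*(n-1-x) + T(x) + T(n-1-x) ],  T(0) = 0, T(1) = 1.
def worst_case(N):
    T = [0] * (N + 1)
    if N >= 1:
        T[1] = 1
    for n in range(2, N + 1):
        T[n] = n + max(x * (n - 1 - x) + T[x] + T[n - 1 - x]
                       for x in range(n))
    return T

T = worst_case(500)
for n in (10, 100, 500):
    print(n, T[n], T[n] / n**2)   # stays bounded, consistent with O(n^2)
```

In the runs I tried, $T(n) \leq n^2$ for every $n$ up to 500, which suggests the worst case is also $O(n^2)$, i.e. $\Theta(n^2)$ overall, but I'd like a proof.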