After checking this post again, I'm surprised this isn't on here yet.
Caution: I don't have a proof for this, but it should work for most of the common recurrences you'll run into. Just be a bit wary of using Landau notation with this; it's probably best to stick to precise equality where possible.
Domain Transformation / Change of Variables
When dealing with recurrences, it's sometimes useful to change your domain if it's unclear how deep the recursion stack will go.
For instance, take the following recurrence:
$$T(n) = T(2^{2^{\sqrt{\log \log n}}}) + \log \log \log n$$
How could we ever solve this? We could expand out the series, but I promise this will get gross really fast. Instead, let's consider how our input changes with each call.
We first have:
- $n$, then
- $2^{2^{\sqrt{\log \log n}}}$, then
- $2^{2^{\sqrt{\log \log (2^{2^{\sqrt{\log \log n}}})}}}$, and so on.
The goal of a domain transformation is to change our recurrence into an equivalent recurrence $S(k)$ so that, instead of the transitions above, we simply have $k, k-1, k-2, \ldots$.
For example, if we let $n = 2^{2^{2^{2^k}}}$, then this is what we get for our above recurrence: $$\begin{align*} T(2^{2^{2^{2^k}}}) & = T(2^{2^{\sqrt{\log \log 2^{2^{2^{2^{k}}}}}}}) + \log \log \log (2^{2^{2^{2^{k}}}})\\ & = T(2^{2^{2^{2^{k-1}}}}) + 2^k \end{align*}$$ Writing $S(k) = T(2^{2^{2^{2^k}}})$, we can simply rewrite this as: $$S(k) = S(k-1) + 2^k = \sum_{i = 0}^{k} 2^i = 2^{k+1} - 1$$ (taking the base case $S(0) = 1$). Then all you have to do is convert $k$ back to $n$ to get: $$T(n) = 2^{(\log \log \log \log n) + 1} - 1 = O(\log \log \log n)$$
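If you want to convince yourself that the substitution really does step $k$ down by exactly one per call, here is a short Python sanity check (not part of the original derivation; the helper names `g`, `f_step`, `ilog2`, and `isqrt_exact` are my own). Exact integer arithmetic only stays feasible for $k \le 2$, since $k = 3$ already produces a number with roughly $10^{76}$ digits.

```python
# Sanity check of the substitution n = 2^(2^(2^(2^k))), using exact integers.
import math

def ilog2(n):
    """Exact base-2 log, assuming n is a power of two."""
    assert n & (n - 1) == 0
    return n.bit_length() - 1

def isqrt_exact(n):
    """Exact integer square root, assuming n is a perfect square."""
    r = math.isqrt(n)
    assert r * r == n
    return r

def g(k):
    """The change of variables: g(k) = 2^(2^(2^(2^k)))."""
    return 2 ** (2 ** (2 ** (2 ** k)))

def f_step(n):
    """The recursive argument of the example: f(n) = 2^(2^sqrt(log2 log2 n))."""
    return 2 ** (2 ** isqrt_exact(ilog2(ilog2(n))))

# k = 3 is already out of reach (g(3) has ~10^76 digits), so check k = 1, 2.
for k in (1, 2):
    assert f_step(g(k)) == g(k - 1)              # one call steps k down by exactly 1
    assert ilog2(ilog2(ilog2(g(k)))) == 2 ** k   # per-call work: log log log g(k) = 2^k

# The transformed recurrence S(k) = S(k-1) + 2^k with S(0) = 1 sums to 2^(k+1) - 1.
S = 1
for k in range(1, 20):
    S += 2 ** k
    assert S == 2 ** (k + 1) - 1
```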
With this example, we can now see our goal.
Assume $$T(n) = \begin{cases} h(1) & n = 1\\ a\cdot T(f(n)) + h(n) & \mathrm{otherwise} \end{cases}$$ for some constant $a$ and functions $f(n)$ and $h(n)$.
We are now trying to find some function $g$ with $g(k) = n$ and $f(g(k)) = g(k-1)$, so that $$\begin{align*} T(g(k)) &= a\cdot T(f(g(k))) + h(g(k))\\ & = a\cdot T(g(k-1)) + h(g(k)) \end{align*}$$
More generally, we want $f^{(i)}(n) = g(k - i)$, where $f^{(i)}(n)$ is the repeated application of $f$ to $n$, $i$ times (e.g. $f^{(2)}(n) = f(f(n))$). This lets $g(k)$ act as the "iterating" function: at depth $i$ of the recursion, the work done is simply $h(g(k-i))$.
Then we can easily convert this to $S(k) = T(g(k))$ so that $$S(k) = a\cdot S(k-1) + h(g(k))$$ Now we only have to sum up the (weighted) $h(g(i))$ terms from the base case up to $k$. That is, $$S(k) = \sum_{i = g^{-1}(1)}^{k} a^{k - i}\, h(g(i))$$
If we can determine $S(k) = \gamma(k)$ for some closed form $\gamma$ function, then we can determine $T(n)$ as $$T(n) = \gamma( g^{-1}(n))$$
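To make the bookkeeping concrete, here is a minimal Python sketch checked on a deliberately simple instance of my own choosing (not one from the post): $T(n) = 2\,T(n/2) + n$ with $T(1) = h(1) = 1$, i.e. $a = 2$, $f(n) = \lfloor n/2 \rfloor$, $h(n) = n$, $g(k) = 2^k$, and $g^{-1}(1) = 0$; the closed form here is $\gamma(k) = (k+1)2^k$.

```python
# A minimal sketch of the general bookkeeping, checked on a simple instance
# of my own choosing: T(n) = 2*T(n/2) + n with T(1) = h(1) = 1, so a = 2,
# f(n) = n // 2, h(n) = n, g(k) = 2^k, and g^{-1}(1) = 0.

def T_direct(n, a, f, h):
    """Evaluate T(n) straight from the recurrence T(n) = a*T(f(n)) + h(n), T(1) = h(1)."""
    return h(1) if n == 1 else a * T_direct(f(n), a, f, h) + h(n)

def S_unrolled(k, a, h, g, k0):
    """The unrolled sum: S(k) = sum_{i = k0}^{k} a^(k-i) * h(g(i))."""
    return sum(a ** (k - i) * h(g(i)) for i in range(k0, k + 1))

a = 2
f = lambda n: n // 2
h = lambda n: n
g = lambda k: 2 ** k
k0 = 0  # g^{-1}(1) = 0, since g(0) = 1

for k in range(0, 12):
    n = g(k)
    S_k = S_unrolled(k, a, h, g, k0)
    assert T_direct(n, a, f, h) == S_k   # S(k) really is T(g(k))
    assert S_k == (k + 1) * 2 ** k       # closed form: gamma(k) = (k+1) * 2^k

# Converting back: T(n) = gamma(g^{-1}(n)) = (log2(n) + 1) * n for n a power of two.
```

The same skeleton carries over to the example at the top with $a = 1$, $h(n) = \log\log\log n$, and $g(k) = 2^{2^{2^{2^k}}}$; only the base case and the sheer size of the intermediate values differ.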
Then we use this to get a bound on $T(n)$ via one of the other methods above. You could obviously modify this method a little bit to suit your particular recurrence, but in general you're trying to find an iterating function $g(k)$ to turn $T(n)$ into a simple recursion.
I don't know of an exact way to determine $g(k)$ at this point, but I will keep thinking about it and update if it becomes clearer (or if any commenter has some tips!). I have mostly found my $g(k)$ functions through trial and error in the past (see here, here, here, and here for examples).