Preamble
I will present a packaged, automated solution that uses deques and metaprogramming to automate caching. It should work for most ordinary pattern-based functions.
Deques
I will use Daniel Lichtblau's implementation of a deque, taken from his excellent account on Data Structures and Efficient Algorithms in Mathematica. Here it is:
newQueue[sym_Symbol] := {Unique[sym], 0, Null, 0, 0}

Clear[queueLength, emptyQueue, enQueue, deQueue];
qptr = 1; qlen = 2; qelem = 3; qfront = 4; qback = 5;

queueLength[Q_] := Q[[qlen]]
emptyQueue[Q_] := Q[[qlen]] == 0

SetAttributes[enQueue, HoldFirst]
enQueue[Q_, elem_] :=
  (Q[[qlen]]++; Q[[qfront]]++; Q[[qptr]][Q[[qfront]]] = elem;)

SetAttributes[deQueue, HoldFirst]
deQueue[Q_] :=
  (Q[[qlen]]--; Q[[qback]]++; Q[[qelem]] = Q[[qptr]][Q[[qback]]];
   Q[[qptr]][Q[[qback]]] =.; Q[[qelem]])
I refer to his treatment for details on how this works.
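To get a quick feel for the API, here is a small usage sketch (the names q and qsym below are placeholders I introduce just for this illustration; qsym should be a fresh symbol with no value):

q = newQueue[qsym];
enQueue[q, "a"]; enQueue[q, "b"]; enQueue[q, "c"];
queueLength[q]  (* 3 *)
deQueue[q]      (* "a" -- elements come out in FIFO order *)
queueLength[q]  (* 2 *)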
Automating caching via metaprogramming
The idea is to inject caching code into the definition of a function. The deque stores the arguments of the last given number of function calls; when the function is called again and the cache is full, the oldest cached value is removed. Here is an implementation:
ClearAll[makeCached];
SetAttributes[makeCached, HoldAll];
makeCached::incrlim =
  "The cache size is too small. Increase the cache size. Further computation will use an uncached function";

makeCached[SetDelayed[f_[args___], rhs_], limit_Integer] :=
  Module[{qq, faux, queue, cache, myHold, dv},
    dv = DownValues[f];
    SetAttributes[myHold, HoldAll];
    queue = newQueue[qq];
    (* Return the cached result, if one exists *)
    faux[a : PatternSequence[args]] :=
      With[{result = cache[a]},
        result /; ! MatchQ[result, _cache]];
    (* Cache not yet full: compute, enqueue the arguments, and cache *)
    faux[a : PatternSequence[args]] /; queueLength[queue] < limit :=
      (enQueue[queue, {a}]; cache[a] = rhs);
    (* Cache full: evict the oldest entry, or bail out of the computation *)
    faux[a : PatternSequence[args]] :=
      With[{argums = Sequence @@ deQueue[queue]},
        If[Head[cache[argums]] === cache,
          Message[makeCached::incrlim];
          Throw[myHold[f[a]], makeCached],
          (* else *)
          cache[argums] =.; faux[a]]];
    (* Public definition: delegate to faux; on bail-out, re-run the
       original (uncached) definition via the Block trick *)
    f[argums___] :=
      Catch[faux[argums], makeCached] /.
        myHold[code_] :>
          Block[{f},
            DownValues[f] = dv;
            f[args] := rhs;
            code]];
What is happening here is that we introduce cache as storage for the cached results, and queue and qq to keep the queue of function arguments, and we define f so that it delegates the actual computation to faux. If the cache becomes full in the middle of a computation (which may happen for deeply recursive functions), we bail out of the computation via Throw and then use the un-cached definition of f to compute that piece of code (technically, I use the Block trick to accomplish that across the execution stack).
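For those unfamiliar with the Block trick, here is a minimal illustration unrelated to makeCached, using a throwaway function g: Block temporarily clears g within its dynamic extent, so we can reinstall saved DownValues there without disturbing the outer definition.

ClearAll[g];
g[x_] := x^2;
saved = DownValues[g];  (* save the original definition *)
g[x_] := x + 1;         (* replace it with a new one *)
Block[{g}, DownValues[g] = saved; g[3]]  (* 9 -- the old definition is in effect *)
g[3]                                     (* 4 -- the new definition is back outside Block *)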
Example
Here will be our function:
ClearAll[ff];
ff[_?Negative, _?Negative] = 0;
makeCached[ff[x_, y_] := 1 + ff[x - 1, y - 1], 1000];
and we also define its non-cached counterpart:
ClearAll[fff];
fff[_?Negative, _?Negative] = 0;
fff[x_, y_] := 1 + fff[x - 1, y - 1];
Now, the benchmarks:
MapThread[ff, {#, #} &@Range[10000]] // Short // Timing

(* {0.796875, {2, 3, 4, 5, 6, <<9990>>, 9997, 9998, 9999, 10000, 10001}} *)
while for non-cached version:
Block[{$RecursionLimit = Infinity},
  MapThread[fff, {#, #} &@Range[10000]]] // Short // Timing

(* {107.437500, {2, 3, 4, 5, 6, <<9990>>, 9997, 9998, 9999, 10000, 10001}} *)
Remarks
The above code should be able to deal with most pattern-based functions. It may have problems with functions that hold their arguments, or with functions that have many definitions to be cached. The non-cached definitions (such as the base cases of a recursion) must be given before makeCached is called.
Another problem which I have not yet dealt with here is that the internal variables such as faux, cache, etc. won't be garbage-collected even after we Remove ff. This can be handled by introducing an explicit function to release that memory.
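One possible (untested) way to do that would be to add, inside the Module in makeCached, an UpValue on f that clears the Module-generated symbols on demand; releaseCache below is a hypothetical helper name, and the other names are those already used in the code above:

(* Hypothetical addition to the body of makeCached *)
f /: releaseCache[f] := (ClearAll[faux, cache]; Clear[Evaluate[queue[[qptr]]]];);

One would then call releaseCache[ff] before Remove[ff], so that the cached values held by the Module-local symbols are released.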