
Is there a built-in way of doing in-memory caching in Scala, like a MemoryCache class that can be used without any additional dependencies, for a simple LRU cache with a size limit? I've found many possibilities, but they all require external dependencies.

  • java.util.LinkedHashMap can be configured to evict in LRU order. Commented May 30, 2018 at 3:28
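Expanding on that comment: a minimal sketch of a size-limited LRU cache built on `java.util.LinkedHashMap`, with no external dependencies. Passing `accessOrder = true` to the constructor makes iteration (and eviction) follow least-recently-accessed order, and overriding `removeEldestEntry` evicts once the size limit is exceeded. The `lruCache` helper name and the limit of 2 are illustrative choices, not a standard API.

```scala
import java.util.{LinkedHashMap, Map => JMap}

// Size-limited LRU cache: accessOrder = true orders entries by access,
// and removeEldestEntry is consulted after each put to trigger eviction.
def lruCache[K, V](maxSize: Int): JMap[K, V] =
  new LinkedHashMap[K, V](16, 0.75f, true) {
    override def removeEldestEntry(eldest: JMap.Entry[K, V]): Boolean =
      size > maxSize
  }

val cache = lruCache[String, Int](2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     // touch "a", so "b" is now least recently used
cache.put("c", 3)  // exceeds the limit, evicting "b"
```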

1 Answer


There's nothing in the Standard Library built specifically for memory caching, but it's easy enough to roll your own.

// memoize this function (arity 1)
def memo1[A,R](f: A=>R): (A=>R) =
  new collection.mutable.WeakHashMap[A,R] {
    override def apply(a: A) = getOrElseUpdate(a, f(a))
  }

The reason for using WeakHashMap is that its entries can be dropped (forgotten) by the garbage collector once their keys are no longer strongly referenced elsewhere, so the cache sheds entries in a memory-challenged environment instead of growing without bound.

So this can be used to cache (memoize) existing methods/functions...

def s2l(s :String) :Long = ???
val s2lM = memo1(s2l)                 //memoize this String=>Long method
val bigNum :Long = s2lM(inputString)  //common inputs won't be recalculated

...or you can define the function logic directly.

//memoized Long-to-Double calculation
val l2dM = memo1{ n:Long =>
  //Long=>Double code goes here
}

For functions with larger arity, use a tuple as the Map key.

def memo3[A,B,C,R](f :(A,B,C)=>R) :(A,B,C)=>R = {
  val cache = new collection.mutable.WeakHashMap[(A,B,C),R]
  (a:A, b:B, c:C) => cache.getOrElseUpdate((a,b,c), f(a,b,c))
}
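For example, usage looks like this. The `volume` function and its invocation counter are hypothetical, added here only to make the caching behavior observable:

```scala
// memo3 as defined above: tuple of arguments as the cache key
def memo3[A,B,C,R](f :(A,B,C)=>R) :(A,B,C)=>R = {
  val cache = new collection.mutable.WeakHashMap[(A,B,C),R]
  (a:A, b:B, c:C) => cache.getOrElseUpdate((a,b,c), f(a,b,c))
}

// hypothetical 3-argument function; calls counts actual invocations
var calls = 0
def volume(l :Double, w :Double, h :Double) :Double = { calls += 1; l * w * h }

val volumeM = memo3(volume)
volumeM(2.0, 3.0, 4.0)  // computed: calls == 1
volumeM(2.0, 3.0, 4.0)  // served from the cache: calls is still 1
```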