  • Excellent, thanks! One remark: I've already tried it, and it works very well, except that cutting the cache down slows it terribly. A way out seems to be to introduce one more parameter, a cut amount: when the size limit is reached, cut by that amount. I've tried it with a limit of one billion and a cut of one million, and it runs much faster this way (a rough sketch of this limit-plus-cut scheme follows these comments). Commented Sep 21, 2018 at 3:48
  • It has now been running for a while, and I noticed one problem, though I don't know whether it is solvable. As I said, the values most frequently used for a given n are those for m around one third of n. This means that when I erase the less-used part of the cache, I also lose values (those with m larger than one third of the current n) that are going to be used more frequently later, once I reach larger n... Commented Sep 21, 2018 at 9:55
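A minimal sketch of the limit-plus-cut idea from the first comment, assuming a Python memoization cache. The recurrence `f`, the limit/cut values, and the use-count bookkeeping are illustrative assumptions, not the code from the original answer: once the cache reaches `CACHE_LIMIT` entries, the `CUT_AMOUNT` least-used entries are dropped in one bulk pass instead of trimming entry by entry.

```python
from collections import Counter

CACHE_LIMIT = 10_000_000   # maximum number of cached entries (assumed value)
CUT_AMOUNT = 100_000       # how many entries to drop once the limit is hit

cache = {}                 # memoized results
uses = Counter()           # how often each cached key has been read back

def trim_cache():
    """Drop the CUT_AMOUNT least-used entries in one bulk pass."""
    victims = sorted(cache, key=lambda k: uses[k])[:CUT_AMOUNT]
    for k in victims:
        del cache[k]
        uses.pop(k, None)

def f(n):
    """Placeholder recurrence with memoization (not the original sequence)."""
    if n < 2:
        return n
    if n in cache:
        uses[n] += 1
        return cache[n]
    value = f(n - 1) + f(n - 2)   # stand-in for the real recurrence
    if len(cache) >= CACHE_LIMIT:
        trim_cache()              # one bulk cut instead of evicting per insert
    cache[n] = value
    return value
```

Note that, as the second comment observes, a purely usage-based cut like this one can also evict values (those with m above roughly one third of the current n) that will be needed again once n grows.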