One of the fastest zero-allocation LRU caches for Golang (Go). Maximizes memory use with no garbage collection issues. Uses a memory size limit instead of a fixed item capacity, with batch eviction for faster processing
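The memory-size-limit and batch-eviction idea described above can be sketched roughly as follows. This is a minimal illustration under assumed names (`Cache`, `budget`, `batchSize`), not that project's code, and it ignores the zero-allocation aspect: when the total stored bytes exceed the budget, a whole batch of the least recently used entries is dropped at once.

```go
package lrubudget

import "container/list"

// entry pairs a key with its stored value so eviction can also
// delete the map entry.
type entry struct {
	key string
	val []byte
}

// Cache is a hypothetical LRU bounded by total value bytes rather
// than by item count; when the budget is exceeded it evicts a whole
// batch of the least recently used entries at once.
type Cache struct {
	budget    int // maximum total bytes of stored values
	batchSize int // number of oldest entries dropped per eviction pass
	used      int
	order     *list.List               // front = most recently used
	items     map[string]*list.Element // key -> node in order
}

func New(budget, batchSize int) *Cache {
	if batchSize < 1 {
		batchSize = 1
	}
	return &Cache{
		budget:    budget,
		batchSize: batchSize,
		order:     list.New(),
		items:     make(map[string]*list.Element),
	}
}

func (c *Cache) Get(key string) ([]byte, bool) {
	el, ok := c.items[key]
	if !ok {
		return nil, false
	}
	c.order.MoveToFront(el) // mark as most recently used
	return el.Value.(*entry).val, true
}

func (c *Cache) Put(key string, val []byte) {
	if el, ok := c.items[key]; ok {
		e := el.Value.(*entry)
		c.used += len(val) - len(e.val)
		e.val = val
		c.order.MoveToFront(el)
	} else {
		c.items[key] = c.order.PushFront(&entry{key: key, val: val})
		c.used += len(val)
	}
	// Batch eviction: instead of removing one entry per insert,
	// drop batchSize of the oldest entries while over the budget.
	for c.used > c.budget && c.order.Len() > 0 {
		for i := 0; i < c.batchSize && c.order.Len() > 0; i++ {
			oldest := c.order.Back()
			e := oldest.Value.(*entry)
			c.used -= len(e.val)
			delete(c.items, e.key)
			c.order.Remove(oldest)
		}
	}
}
```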
An implementation of an LRU cache using a doubly linked list and a map, with O(1) read and write time complexity. [500+ NPM Downloads]
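That entry is an NPM package, but the technique it names is language-agnostic. Below is a minimal Go sketch (hypothetical names, fixed item capacity) of the same structure: a hash map for O(1) lookup into a doubly linked list kept in recency order, which is where the O(1) reads and writes come from.

```go
package lru

import "container/list"

// pair keeps the key next to the value so eviction can remove the
// corresponding map entry in O(1).
type pair[K comparable, V any] struct {
	key K
	val V
}

// LRU is the textbook structure: a map for lookup plus a doubly
// linked list ordered from most to least recently used.
type LRU[K comparable, V any] struct {
	cap   int
	order *list.List
	items map[K]*list.Element
}

func New[K comparable, V any](capacity int) *LRU[K, V] {
	return &LRU[K, V]{cap: capacity, order: list.New(), items: make(map[K]*list.Element)}
}

// Get is O(1): one map lookup plus a constant-time move to the front.
func (l *LRU[K, V]) Get(key K) (V, bool) {
	if el, ok := l.items[key]; ok {
		l.order.MoveToFront(el)
		return el.Value.(pair[K, V]).val, true
	}
	var zero V
	return zero, false
}

// Put is O(1): push to the front; if over capacity, unlink the back node.
func (l *LRU[K, V]) Put(key K, val V) {
	if el, ok := l.items[key]; ok {
		el.Value = pair[K, V]{key, val}
		l.order.MoveToFront(el)
		return
	}
	l.items[key] = l.order.PushFront(pair[K, V]{key, val})
	if l.order.Len() > l.cap {
		back := l.order.Back()
		delete(l.items, back.Value.(pair[K, V]).key)
		l.order.Remove(back)
	}
}
```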
Simple cache implementation in Java
Least recently used (LRU) in-memory cache.
An LRU cache implemented using an object-oriented approach
cachify is a lightweight, high-performance, thread-safe Least Recently Used (LRU) cache library for Go. It is designed for in-memory caching with optional support for expiration, eviction callbacks, and dynamic capacity adjustment.
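A rough sketch of how those features typically compose in Go follows; it is not cachify's actual API, just an assumed `Cache` type showing a mutex for thread safety, a per-entry TTL checked lazily on reads, and an eviction callback fired whenever an entry is dropped.

```go
package ttllru

import (
	"container/list"
	"sync"
	"time"
)

// item carries the value, its expiry time, and its key for callbacks.
type item struct {
	key     string
	val     any
	expires time.Time
}

// Cache wraps an LRU with a mutex, a TTL, and an eviction callback.
// Illustrative sketch only, not the cachify API.
type Cache struct {
	mu      sync.Mutex
	cap     int
	ttl     time.Duration
	onEvict func(key string, val any)
	order   *list.List
	items   map[string]*list.Element
}

func New(capacity int, ttl time.Duration, onEvict func(string, any)) *Cache {
	return &Cache{
		cap: capacity, ttl: ttl, onEvict: onEvict,
		order: list.New(), items: make(map[string]*list.Element),
	}
}

func (c *Cache) Set(key string, val any) {
	c.mu.Lock()
	defer c.mu.Unlock()
	if el, ok := c.items[key]; ok {
		it := el.Value.(*item)
		it.val, it.expires = val, time.Now().Add(c.ttl)
		c.order.MoveToFront(el)
		return
	}
	c.items[key] = c.order.PushFront(&item{key, val, time.Now().Add(c.ttl)})
	if c.order.Len() > c.cap {
		c.removeLocked(c.order.Back())
	}
}

func (c *Cache) Get(key string) (any, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	el, ok := c.items[key]
	if !ok {
		return nil, false
	}
	it := el.Value.(*item)
	if time.Now().After(it.expires) {
		c.removeLocked(el) // lazily expire on access
		return nil, false
	}
	c.order.MoveToFront(el)
	return it.val, true
}

// removeLocked assumes the caller holds c.mu.
func (c *Cache) removeLocked(el *list.Element) {
	it := el.Value.(*item)
	delete(c.items, it.key)
	c.order.Remove(el)
	if c.onEvict != nil {
		c.onEvict(it.key, it.val)
	}
}
```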
In-memory Go implementation of an LRU cache
Different implementations of the LRU page replacement algorithm and their results
Simple implementations of LRU / LFU cache.
Implementation and analysis of cache replacement policies (Random and Least Recently Used) in a C++-based cache simulator. This project explores cache architecture behavior, evaluates eviction strategies, and measures performance metrics such as cache hits, misses, and flush counts.
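The simulator idea above reduces to a short sketch: replay a reference string against a fixed-size cache and count hits and misses per eviction policy. The original project is C++; the version below is an illustrative Go rendition with made-up names and a synthetic Zipf workload instead of real traces (flush counts are omitted).

```go
package main

import (
	"fmt"
	"math/rand"
)

// simulate replays a page reference string against a fixed-size cache and
// counts hits and misses. evictIdx chooses which index in the recency
// order (oldest first) to evict on a miss when the cache is full.
func simulate(refs []int, size int, evictIdx func(order []int) int) (hits, misses int) {
	cache := make(map[int]bool)
	var order []int
	for _, page := range refs {
		if cache[page] {
			hits++
			for i, p := range order {
				if p == page {
					// refresh recency: move the page to the back
					order = append(append(order[:i:i], order[i+1:]...), page)
					break
				}
			}
			continue
		}
		misses++
		if len(cache) == size {
			i := evictIdx(order)
			delete(cache, order[i])
			order = append(order[:i:i], order[i+1:]...)
		}
		cache[page] = true
		order = append(order, page)
	}
	return hits, misses
}

func main() {
	// Synthetic, skewed (Zipf) reference string; a real study would replay traces.
	r := rand.New(rand.NewSource(1))
	zipf := rand.NewZipf(r, 1.2, 1, 199)
	refs := make([]int, 10000)
	for i := range refs {
		refs[i] = int(zipf.Uint64())
	}
	lruHits, lruMiss := simulate(refs, 16, func(order []int) int { return 0 })                     // LRU: evict oldest
	rndHits, rndMiss := simulate(refs, 16, func(order []int) int { return rand.Intn(len(order)) }) // Random victim
	fmt.Printf("LRU:    %d hits / %d misses\n", lruHits, lruMiss)
	fmt.Printf("Random: %d hits / %d misses\n", rndHits, rndMiss)
}
```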