Class ThreadedLRUCacheStrategy<K,V>

java.lang.Object
com.cedarsoftware.util.cache.ThreadedLRUCacheStrategy<K,V>
Type Parameters:
K - the type of keys maintained by this cache
V - the type of mapped values
All Implemented Interfaces:
Map<K,V>

public class ThreadedLRUCacheStrategy<K,V> extends Object implements Map<K,V>
This class provides a thread-safe Least Recently Used (LRU) cache API that evicts the least recently used entries once the capacity threshold is exceeded. It implements the Map interface for convenience.

The Threaded strategy provides O(1), non-blocking access for get(), put(), and remove(). It uses a ConcurrentHashMapNullSafe internally. To keep the capacity honored, each call to put() schedules a cleanup task that removes the least recently used entries if the cache has grown beyond its capacity.

LRUCache supports null for both keys and values.

Note: This implementation uses a shared scheduler for all cache instances to optimize resource usage.
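
Example (not part of the generated documentation): a minimal usage sketch, assuming the class is available from java-util's com.cedarsoftware.util.cache package as declared above. The two-argument constructor matches the Constructor Details below; the class name ThreadedLruUsage is illustrative only.

  import com.cedarsoftware.util.cache.ThreadedLRUCacheStrategy;

  import java.util.Map;

  public class ThreadedLruUsage {
      public static void main(String[] args) {
          // Capacity of 100 entries; a cleanup is scheduled 10 ms after a put()
          // in case the cache has grown past capacity.
          Map<String, String> cache = new ThreadedLRUCacheStrategy<>(100, 10);

          cache.put("alpha", "1");
          cache.put(null, "null keys are supported");   // null key allowed
          cache.put("beta", null);                      // null value allowed

          System.out.println(cache.get("alpha"));       // prints: 1
          System.out.println(cache.containsKey(null));  // prints: true
          System.out.println(cache.size());             // prints: 3
      }
  }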

Author:
John DeRegnaucourt ([email protected])
Copyright (c) Cedar Software LLC

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
  • Constructor Details

    • ThreadedLRUCacheStrategy

      public ThreadedLRUCacheStrategy(int capacity, int cleanupDelayMillis)
      Create an LRUCache with a maximum capacity of 'capacity'. The cleanup task is scheduled to run 'cleanupDelayMillis' milliseconds after a put().
      Parameters:
      capacity - int maximum size for the LRU cache.
      cleanupDelayMillis - int milliseconds before a scheduled cleanup runs, reducing the cache back to capacity if it currently exceeds it (a brief runnable sketch of this behavior appears at the end of this page, after Method Details).
  • Method Details
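
Example (not part of the generated documentation): a hedged sketch of how the constructor's cleanupDelayMillis parameter behaves, assuming that a put() which pushes the cache past capacity only schedules a cleanup, so size() may briefly exceed the capacity until roughly the configured delay has elapsed. The class name CleanupDelayDemo and the 250 ms sleep are illustrative only.

  import com.cedarsoftware.util.cache.ThreadedLRUCacheStrategy;

  public class CleanupDelayDemo {
      public static void main(String[] args) throws InterruptedException {
          // Capacity 3; a cleanup is scheduled 50 ms after each put().
          ThreadedLRUCacheStrategy<Integer, String> cache =
                  new ThreadedLRUCacheStrategy<>(3, 50);

          for (int i = 0; i < 10; i++) {
              cache.put(i, "value-" + i);   // cache may temporarily hold more than 3 entries
          }
          System.out.println("size immediately after puts: " + cache.size()); // may be > 3

          Thread.sleep(250);                // allow the scheduled cleanup to run
          System.out.println("size after cleanup delay:    " + cache.size()); // expected <= 3
      }
  }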