public class LFUCache<K,V> extends AbstractCacheMap<K,V>
Frequency-of-use data is kept for all items. The most frequently used items are kept in the cache. Because of the bookkeeping requirements, cache access overhead increases logarithmically with cache size. The advantage is that long-term usage patterns are captured well, incidentally making the algorithm scan-resistant; the disadvantage, besides the larger access overhead, is that the algorithm doesn't adapt quickly to changing usage patterns, and in particular doesn't help with temporally clustered accesses.
Summary for LFU: not fast, captures frequency of use, scan resistant.
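A minimal behavior sketch of the semantics described above, assuming the class lives in the jodd.cache package and that put/get carry the signatures listed further down this page: an entry with a higher use count survives eviction when a new item is added to a full cache.

```java
import jodd.cache.LFUCache;

public class LfuBehaviorDemo {
    public static void main(String[] args) {
        // Cache bounded to 2 entries, no time-based expiry.
        LFUCache<String, String> cache = new LFUCache<>(2);

        cache.put("a", "alpha");
        cache.put("b", "beta");

        // Reading "a" twice raises its usage count above "b"'s.
        cache.get("a");
        cache.get("a");

        // The cache is full, so adding "c" evicts the least frequently
        // used entry ("b"); "a" survives because of its higher count.
        cache.put("c", "gamma");

        System.out.println(cache.get("a")); // "alpha"
        System.out.println(cache.get("b")); // null - evicted
        System.out.println(cache.get("c")); // "gamma"
    }
}
```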
Fields inherited from class AbstractCacheMap:
cacheMap, cacheSize, existCustomTimeout, hitCount, missCount, timeout
Constructor and Description |
---|
LFUCache(int maxSize) |
LFUCache(int maxSize, long timeout) |
Modifier and Type | Method and Description |
---|---|
protected int | pruneCache(): Prunes expired and, if cache is still full, the LFU element(s) from the cache. |
Methods inherited from class AbstractCacheMap:
clear, createCacheObject, get, getHitCount, getMissCount, isEmpty, isFull, isPruneExpiredActive, isReallyFull, limit, onRemove, prune, put, put, remove, size, snapshot, timeout
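The inherited accessors listed above can be exercised directly. A short sketch, assuming getHitCount()/getMissCount() count successful and failed lookups and that snapshot() returns a copy of the current mappings:

```java
import jodd.cache.LFUCache;

public class LfuStatsDemo {
    public static void main(String[] args) {
        LFUCache<Integer, String> cache = new LFUCache<>(3);

        cache.put(1, "one");
        cache.put(2, "two");

        cache.get(1);   // hit
        cache.get(99);  // miss

        System.out.println("size:     " + cache.size());     // 2
        System.out.println("full:     " + cache.isFull());    // false, limit is 3
        System.out.println("hits:     " + cache.getHitCount());
        System.out.println("misses:   " + cache.getMissCount());
        // snapshot() is listed among the inherited methods; assumed here
        // to expose the current key/value mappings.
        System.out.println("snapshot: " + cache.snapshot());
    }
}
```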
public LFUCache(int maxSize)
public LFUCache(int maxSize, long timeout)
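The two constructors differ only in whether entries also expire on their own. A sketch contrasting them, assuming the timeout is a default per-entry value expressed in milliseconds (the unit is not stated on this page):

```java
import jodd.cache.LFUCache;

public class LfuTimeoutDemo {
    public static void main(String[] args) throws InterruptedException {
        // Size-bounded only: entries never expire on their own.
        LFUCache<String, byte[]> plain = new LFUCache<>(100);
        plain.put("blob", new byte[16]);

        // Size-bounded with a default per-entry timeout
        // (assumed to be in milliseconds).
        LFUCache<String, byte[]> expiring = new LFUCache<>(100, 1000);
        expiring.put("blob", new byte[16]);
        System.out.println(expiring.get("blob") != null); // true

        Thread.sleep(1500);

        // After the timeout the entry is treated as expired
        // and is no longer returned.
        System.out.println(expiring.get("blob") != null); // false
        System.out.println(plain.get("blob") != null);    // still true
    }
}
```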
protected int pruneCache()
Prunes expired and, if cache is still full, the LFU element(s) from the cache.
Specified by: pruneCache in class AbstractCacheMap<K,V>
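The contract above says pruneCache() first drops expired entries and, only if the cache is still full, removes the least frequently used element(s). The following is an illustrative, standalone version of that contract over a plain map, not the code inside LFUCache:

```java
import java.util.Iterator;
import java.util.Map;

// Illustrative only: a pruning routine following the contract described
// above, not the actual implementation in jodd.cache.LFUCache.
class LfuPruneSketch {

    /** Minimal stand-in for a cache entry with usage and expiry bookkeeping. */
    static class Entry {
        long accessCount;
        boolean expired;
    }

    /** Removes expired entries, then the least frequently used ones if still at capacity. */
    static <K> int pruneCache(Map<K, Entry> cacheMap, int cacheSize) {
        int removed = 0;

        // Pass 1: drop everything that has expired.
        Iterator<Map.Entry<K, Entry>> it = cacheMap.entrySet().iterator();
        while (it.hasNext()) {
            if (it.next().getValue().expired) {
                it.remove();
                removed++;
            }
        }

        // Pass 2: if the cache is still full, evict the entries that
        // share the lowest access count.
        if (cacheMap.size() >= cacheSize) {
            long min = Long.MAX_VALUE;
            for (Entry e : cacheMap.values()) {
                min = Math.min(min, e.accessCount);
            }
            it = cacheMap.entrySet().iterator();
            while (it.hasNext()) {
                if (it.next().getValue().accessCount == min) {
                    it.remove();
                    removed++;
                }
            }
        }
        return removed;
    }
}
```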
Copyright © 2003-present Jodd Team