dulwich.lru_cache
Known subclasses: dulwich.lru_cache.LRUSizeCache
| Method | Description |
| --- | --- |
| `add` | Add a new value to the cache. |
| `cache_size` | Get the number of entries we will cache. |
| `keys` | Get the list of keys currently cached. |
| `items` | Get the key:value pairs as a dict. |
| `cleanup` | Clear the cache until it shrinks to the requested size. |
| `__setitem__` | Add a value to the cache; no cleanup function will be registered. |
| `clear` | Clear out all of the cache. |
| `resize` | Change the number of entries that will be cached. |
| `_walk_lru` | Walk the LRU list; only meant to be used in tests. |
| `_record_access` | Record that `key` was accessed. |
| `_remove_lru` | Remove one entry from the LRU list and handle the consequences. |
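To make the interface above concrete, here is a minimal sketch of an LRU cache exposing the same method names. It is an illustrative stand-in built on `collections.OrderedDict`, not dulwich's actual implementation (which maintains its own linked list of nodes); the class name `SimpleLRUCache` is hypothetical.

```python
from collections import OrderedDict


class SimpleLRUCache:
    """Illustrative LRU cache mirroring the method table above (not dulwich's code)."""

    def __init__(self, max_cache=100, after_cleanup_count=None):
        self._cache = OrderedDict()  # least recently used entries come first
        self._max_cache = max_cache
        # After an overflow, shrink down to this many entries.
        self._after_cleanup_count = (
            after_cleanup_count if after_cleanup_count is not None else max_cache)

    def __setitem__(self, key, value):
        self.add(key, value)

    def __getitem__(self, key):
        value = self._cache[key]
        self._cache.move_to_end(key)  # record the access: key is now most recent
        return value

    def add(self, key, value, cleanup=None):
        if key in self._cache:
            del self._cache[key]      # re-adding moves the key to most recent
        self._cache[key] = value
        if len(self._cache) > self._max_cache:
            self.cleanup()

    def cleanup(self):
        # Shrink until we are at or under after_cleanup_count.
        while len(self._cache) > self._after_cleanup_count:
            self._cache.popitem(last=False)  # drop the least recently used entry

    def keys(self):
        return list(self._cache.keys())

    def cache_size(self):
        return self._max_cache
```

Reading an entry refreshes its recency, so a recently accessed key survives an eviction that removes an older, untouched one.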
Add a new value to the cache.
Also, if the entry is ever removed from the cache, call `cleanup(key, value)`.
| Parameter | Description |
| --- | --- |
| `key` | The key to store the value under. |
| `value` | The object to store. |
| `cleanup` | None, or a function taking `(key, value)` that is called when the entry is removed, to indicate that `value` should be cleaned up. |
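The `cleanup` callback semantics can be sketched as follows. This is an assumption-laden illustration, not dulwich's code: the helper `evicting_add` and its plain `OrderedDict` plus `dict` storage are hypothetical, but the callback signature `cleanup(key, value)` matches the parameter table above.

```python
from collections import OrderedDict


def evicting_add(cache, cleanups, max_entries, key, value, cleanup=None):
    """Add key:value, invoking cleanup(key, value) whenever an entry leaves the cache.

    cache: OrderedDict of stored values; cleanups: dict mapping key -> callback.
    (Both are hypothetical stand-ins for the real cache's internal state.)
    """
    if key in cache:
        old_value = cache.pop(key)
        old_cb = cleanups.pop(key, None)
        if old_cb is not None:
            old_cb(key, old_value)        # a replaced entry is cleaned up too
    cache[key] = value
    if cleanup is not None:
        cleanups[key] = cleanup
    while len(cache) > max_entries:
        old_key, old_value = cache.popitem(last=False)  # evict least recently used
        cb = cleanups.pop(old_key, None)
        if cb is not None:
            cb(old_key, old_value)        # call cleanup(key, value) on eviction
```

A typical use is closing file handles or releasing buffers for values as they fall out of the cache.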
Get the list of keys currently cached.
Note that the values for these keys may no longer be available by the time you request them later; this is simply meant as a peek into the current state.
| Returns | An unordered list of keys that are currently cached. |
Clear the cache until it shrinks to the requested size.
This does not completely wipe the cache; it just ensures the cache shrinks to at most `after_cleanup_count` entries.
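The point of `after_cleanup_count` is that an overflow does not empty the cache; it only trims the oldest entries down to the target. A minimal sketch of that shrinking step (the helper name `shrink_to` is hypothetical; dulwich does this internally on its own linked list):

```python
from collections import OrderedDict


def shrink_to(cache, after_cleanup_count):
    # Evict least-recently-used entries until the cache holds at most
    # after_cleanup_count items; everything more recent survives.
    while len(cache) > after_cleanup_count:
        cache.popitem(last=False)


cache = OrderedDict((k, k) for k in 'abcdef')  # 6 entries; 'a' is the oldest
shrink_to(cache, 4)
# only the two oldest entries ('a', 'b') are gone; the cache is not wiped
```

Trimming below the hard maximum in one pass means the cache is not forced to evict on every single insertion once it fills up.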
Remove one entry from the LRU list and handle the consequences.
If nothing else references the entry, it should be removed from the cache entirely.