Provide an in-process cache with a configurable eviction policy.
This component defines a single class template, bdlcc::Cache, implementing a thread-safe in-memory key-value cache with a configurable eviction policy.

The bdlcc::Cache class uses similar template parameters to bsl::unordered_map: the key type (KEY), the value type (VALUE), the optional hash function (HASH), and the optional equality function (EQUAL). bdlcc::Cache does not support the standard allocator template parameter (although bslma::Allocator is supported).
The cache size can be controlled by setting the low watermark and high watermark attributes, which are used instead of a single maximum-size attribute for performance reasons. Eviction of cached items starts when size() >= highWatermark and continues until size() < lowWatermark. A fixed maximum size is obtained by setting the high and low watermarks to the same value.
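For illustration, constructing a cache with distinct watermarks, and one with a fixed maximum size, might look like the following sketch (the key and value types are placeholders):

    // Eviction begins once 'size() >= 64' and removes items until
    // 'size() < 32'.
    bdlcc::Cache<int, bsl::string> cache(bdlcc::CacheEvictionPolicy::e_LRU,
                                         32,   // low watermark
                                         64);  // high watermark

    // Equal watermarks give the cache a fixed maximum size of 100 items.
    bdlcc::Cache<int, bsl::string> fixedCache(
                                        bdlcc::CacheEvictionPolicy::e_FIFO,
                                        100,   // low watermark
                                        100);  // high watermark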
Two eviction policies are supported: LRU (Least Recently Used) and FIFO (First In, First Out). With LRU, the item that has not been accessed for the longest period of time will be evicted first. With FIFO, the eviction order is based on the order of insertion, with the earliest inserted item being evicted first.
The bdlcc::Cache class template is fully thread-safe (see bsldoc_glossary) provided that the allocator supplied at construction and the default allocator in effect during the lifetime of cached items are both fully thread-safe. The thread-safety of the container does not extend to thread-safety of the contained objects; thread-safety for the contained objects, if needed, must be arranged by the user separately.
Threads accessing a bdlcc::Cache may block while waiting for other threads to complete their operations on the cache. Concurrent reading is supported, and neither readers nor writers are starved by the other group.
All of the cache's modifier methods potentially require a write lock. Of particular note is the tryGetValue method, which requires a write lock only if the eviction queue needs to be modified. This means tryGetValue requires only a read lock if the eviction policy is FIFO or the argument modifyEvictionQueue is set to false. For limited cases where contention is likely, temporarily setting modifyEvictionQueue to false might be of value.
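For example, a lookup that deliberately skips the eviction-queue update might be written as follows (a sketch; myCache and the key value 42 are placeholders):

    bsl::shared_ptr<bsl::string> value;

    // Passing 'false' for 'modifyEvictionQueue' lets 'tryGetValue' proceed
    // under a read lock even with the LRU policy, at the cost of not
    // refreshing the item's position in the eviction queue.
    int rc = myCache.tryGetValue(&value, 42, false);
    if (0 == rc) {
        bsl::cout << *value << '\n';
    }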
The visit method acquires a read lock and calls the supplied visitor function for every item in the cache, or until the visitor function returns false. If the supplied visitor is expensive or the cache is very large, calls to modifier methods might be starved until the visit method finishes looping through the cache items. Therefore, the visit method should be used judiciously: either keep the call relatively cheap or ensure that no time-sensitive write operation is done at the same time. A call to visit is inexpensive if the visitor returns quickly, or if the visitor returns false after processing only a subset of the cache items.
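As an illustration, a visitor that searches for a single matching value and stops the traversal as soon as it is found keeps the read lock short (a sketch; FindFirstVisitor is not part of the component, and the visitor is assumed to be invoked with the key and a reference to the value):

    // Record the key of the first item whose value equals a target string,
    // then stop the traversal by returning 'false'.
    class FindFirstVisitor {
        const bsl::string *d_target_p;  // value being searched for
        int                d_foundKey;  // key of the match, if any
        bool               d_found;     // whether a match was found

      public:
        explicit FindFirstVisitor(const bsl::string *target)
        : d_target_p(target), d_foundKey(0), d_found(false) {}

        bool operator()(int key, const bsl::string& value)
        {
            if (value != *d_target_p) {
                return true;             // keep visiting
            }
            d_foundKey = key;
            d_found    = true;
            return false;                // stop visiting
        }

        bool found() const    { return d_found; }
        int  foundKey() const { return d_foundKey; }
    };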
When an item is evicted or erased from the cache, the post-eviction callback previously set via the setPostEvictionCallback method is invoked in the calling thread and is supplied a pointer to the item being removed.
The cache object itself should not be used in a post-eviction callback; otherwise, a deadlock may result. Since a write lock is held during the call to the callback, invoking any operation on the cache that acquires a lock inside the callback will lead to a deadlock.
In this section we show intended use of this component.
This example shows some basic usage of the cache. First, we define a custom post-eviction callback function, myPostEvictionCallback, that simply prints the evicted item to stdout:
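Such a callback might be written as follows (a sketch; the headers shown are what this example is assumed to need, and the callback is assumed to receive a shared pointer to the evicted value):

    #include <bdlcc_cache.h>
    #include <bsl_iostream.h>
    #include <bsl_memory.h>
    #include <bsl_string.h>

    // Print each evicted value to 'stdout'.
    void myPostEvictionCallback(bsl::shared_ptr<bsl::string> value)
    {
        bsl::cout << "Evicted value: " << *value << bsl::endl;
    }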
Then, we define a bdlcc::Cache object, myCache, that maps int to bsl::string and uses the LRU eviction policy:
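Its definition might look like the following; the watermarks of 6 and 7 are chosen to match the eviction behavior described later in this example, and the fragment is assumed to live inside a function such as main:

    // 'myCache' maps 'int' to 'bsl::string', evicts least-recently-used
    // items first, and has a low watermark of 6 and a high watermark of 7.
    bdlcc::Cache<int, bsl::string> myCache(bdlcc::CacheEvictionPolicy::e_LRU,
                                           6,    // low watermark
                                           7);   // high watermark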
Next, we insert 3 items into the cache and verify that the size of the cache has been updated correctly:
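A sketch of this step; the name "Alex" is a placeholder, while "John" and "Rob" are the items referred to later in this example:

    myCache.insert(0, "Alex");
    myCache.insert(1, "John");
    myCache.insert(2, "Rob");
    assert(3 == myCache.size());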
Then, we bulk insert 3 additional items into the cache and verify that the size of the cache has been updated correctly:
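This step might look as follows, assuming insertBulk accepts a vector of key / shared-pointer pairs; the names "Jim", "Jeff", and "Ian" are placeholders (the fragment also assumes <bsl_utility.h> and <bsl_vector.h>):

    typedef bsl::pair<int, bsl::shared_ptr<bsl::string> > PairType;

    bsl::vector<PairType> items;
    items.push_back(PairType(3, bsl::make_shared<bsl::string>("Jim")));
    items.push_back(PairType(4, bsl::make_shared<bsl::string>("Jeff")));
    items.push_back(PairType(5, bsl::make_shared<bsl::string>("Ian")));

    myCache.insertBulk(items);
    assert(6 == myCache.size());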
Next, we retrieve the value of the second item stored in the cache using the tryGetValue method:
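A sketch of this lookup; under the LRU policy the access also moves the item to the most-recently-used end of the eviction queue, which is why "John" survives the evictions below:

    bsl::shared_ptr<bsl::string> value;
    int rc = myCache.tryGetValue(&value, 1);
    assert(0 == rc);
    assert("John" == *value);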
Then, we set the cache's post-eviction callback to myPostEvictionCallback:
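This step reduces to a single call (a sketch):

    myCache.setPostEvictionCallback(myPostEvictionCallback);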
Now, we insert two more items into the cache to trigger the eviction behavior:
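A sketch of these insertions; the assertions reflect the sizes described below:

    myCache.insert(6, "Steve");
    assert(7 == myCache.size());

    myCache.insert(7, "Tim");
    assert(6 == myCache.size());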
Notice that after we insert "Steve", the size of the cache is 7, the high watermark. After the following item, "Tim", is inserted, the size of the cache goes back down to 6, the low watermark.
Finally, we observe the following output to stdout:
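With the illustrative data above, the output would be along these lines (the first line depends on the placeholder name "Alex"; the second corresponds to "Rob"):

    Evicted value: Alex
    Evicted value: Rob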
Notice that the item "John" was not evicted even though it was inserted before "Rob", because "John" was accessed after "Rob" was inserted.
Suppose that a service needs to retrieve some values that are relatively expensive to compute. Clients of the service cannot wait for the values to be computed, so the service should pre-compute and cache them. In addition, the values are only valid for around one hour, so older items must be periodically updated in the cache. This problem can be solved using bdlcc::Cache with a background updater thread.
First, we define the types representing the cached values and the cache itself:
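These definitions might look like the following sketch; the headers listed are what the rest of this example is assumed to need:

    #include <bdlcc_cache.h>
    #include <bdlt_currenttime.h>
    #include <bdlt_datetime.h>
    #include <bsl_vector.h>
    #include <bslmt_threadutil.h>
    #include <bsls_timeinterval.h>

    // A cached value bundles the data of interest with the time at which it
    // was computed, so that stale entries can be identified later.
    struct MyValue {
        int            d_data;       // the expensive-to-compute data
        bdlt::Datetime d_timestamp;  // time at which 'd_data' was computed
    };

    typedef bdlcc::Cache<int, MyValue> MyCache;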
Then, suppose that we have access to a function retrieveValue that returns a MyValue object given an int key:
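Its body is stubbed out here (a sketch; a real implementation would perform the expensive computation or remote lookup):

    // Return a freshly computed value for the specified 'key'.
    MyValue retrieveValue(int key)
    {
        MyValue value;
        value.d_data      = key * 100;                 // placeholder "work"
        value.d_timestamp = bdlt::CurrentTime::utc();
        return value;
    }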
Next, we define a visitor type to aggregate keys of the out-of-date values in the cache:
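Such a visitor might be written as follows (a sketch; the visitor is assumed to be invoked with the key and a reference to the value, and returning true keeps the traversal going):

    // Collect the keys of items whose timestamp is older than a given cutoff.
    class MyVisitor {
        bdlt::Datetime   d_cutoff;   // items older than this are out of date
        bsl::vector<int> d_oldKeys;  // keys of out-of-date items

      public:
        explicit MyVisitor(const bdlt::Datetime& cutoff)
        : d_cutoff(cutoff)
        {
        }

        bool operator()(int key, const MyValue& value)
        {
            if (value.d_timestamp < d_cutoff) {
                d_oldKeys.push_back(key);
            }
            return true;  // visit every item
        }

        const bsl::vector<int>& oldKeys() const { return d_oldKeys; }
    };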
Then, we define the background thread function to find and update the out-of-date values:
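A sketch of such a thread function, using bslmt::ThreadUtil; the one-minute polling interval and the thread-function signature are assumptions:

    // Periodically scan the cache for items older than one hour and replace
    // them with freshly retrieved values.  The loop runs until the process
    // exits.
    extern "C" void *myWorker(void *arg)
    {
        MyCache *cache = static_cast<MyCache *>(arg);

        while (true) {
            bslmt::ThreadUtil::sleep(bsls::TimeInterval(60));  // poll once a
                                                               // minute

            // Collect the keys of values computed more than one hour ago.
            bdlt::Datetime cutoff = bdlt::CurrentTime::utc();
            cutoff.addHours(-1);

            MyVisitor visitor(cutoff);
            cache->visit(visitor);

            // Recompute and re-insert the out-of-date values.
            const bsl::vector<int>& oldKeys = visitor.oldKeys();
            for (bsl::vector<int>::size_type i = 0; i < oldKeys.size(); ++i) {
                cache->insert(oldKeys[i], retrieveValue(oldKeys[i]));
            }
        }

        return 0;
    }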
Finally, we define the entry point of the application:
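A sketch of that entry point; the cache parameters, the number of pre-computed keys, and the use of bslmt::ThreadUtil::create are illustrative:

    int main()
    {
        // Cache with LRU eviction, low watermark 100, high watermark 120.
        MyCache myCache(bdlcc::CacheEvictionPolicy::e_LRU, 100, 120);

        // Pre-compute and cache the values the service will need.
        for (int key = 0; key < 100; ++key) {
            myCache.insert(key, retrieveValue(key));
        }

        // Start the background thread that keeps the cached values fresh.
        bslmt::ThreadUtil::Handle workerHandle;
        int rc = bslmt::ThreadUtil::create(&workerHandle, myWorker, &myCache);
        if (0 != rc) {
            return rc;                                                // RETURN
        }

        // ... serve requests, reading values with 'myCache.tryGetValue' ...

        // The worker loops forever in this sketch; a real application would
        // signal it to stop before joining.
        bslmt::ThreadUtil::join(workerHandle);

        return 0;
    }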