Programming Microsoft ASP.NET 4 - Dino Esposito [383]
    {
        string s = elem.Key.ToString();
        Cache.Remove(s);
    }
}
Even though the ASP.NET cache maintains a neat separation between the application's items and the system's items, it is preferable to delete cached items individually. If you have several items to maintain, you might want to build your own wrapper class that exposes a single method to clear all the cached data.
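As a sketch of such a wrapper (the class name and design here are invented for illustration, not part of ASP.NET), you could centralize removal in one static method:

```csharp
using System.Collections;
using System.Web;

// Hypothetical wrapper class; name and shape are illustrative only.
public static class AppCache
{
    // Removes every item the application has placed in the cache.
    // The public enumerator of Cache only exposes application items,
    // so system items are not touched.
    public static void Clear()
    {
        var cache = HttpRuntime.Cache;
        foreach (DictionaryEntry elem in cache)
        {
            cache.Remove(elem.Key.ToString());
        }
    }
}
```

Routing all cache writes through a class like this also gives you one place to attach naming conventions or dependencies later.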
Cache Synchronization
Whenever you read or write an individual cache item, from a threading perspective you’re absolutely safe. The ASP.NET Cache object guarantees that no other concurrently running threads can ever interfere with what you’re doing. If you need to ensure that multiple operations on the Cache object occur atomically, that’s a different story. Consider the following code snippet:
var counter = -1;
object o = Cache["Counter"];
if (o == null)
{
    // Retrieve the last good known value from a database
    // or return a default value
    counter = RetrieveLastKnownValue();
}
else
{
    counter = (Int32) Cache["Counter"];
    counter++;
    Cache["Counter"] = counter;
}
The Cache object is accessed repeatedly in the context of an atomic operation—incrementing a counter. Although individual accesses to Cache are thread-safe, there’s no guarantee that other threads won’t kick in between the various calls. If there’s potential contention on the cached value, you should consider using additional locking constructs, such as the C# lock statement (SyncLock in Microsoft Visual Basic .NET).
Important
Where should you put the lock? If you directly lock the Cache object, you might run into trouble. ASP.NET uses the Cache object extensively, and directly locking the Cache object might have a serious impact on the overall performance of the application. However, most of the time ASP.NET doesn’t access the cache via the Cache object; rather, it accesses the direct data container—that is, the CacheSingle or CacheMultiple class. In this regard, a lock on the Cache object probably won’t affect many ASP.NET components; regardless, it’s a risk that I wouldn’t like to take.
By locking the Cache object, you also risk blocking HTTP modules and handlers active in the pipeline, as well as other pages and sessions in the application that need to use cache entries different from the ones you want to serialize access to.
The best way out seems to be using a synchronizer—an intermediate but global object that you lock before entering a piece of code that is sensitive to concurrency:
lock (yourSynchronizer)
{
    // Access the Cache here. This pattern must be replicated for
    // each access to the cache that requires serialization.
}
The synchronizer object must be global to the application. For example, it can be a static member defined in the global.asax file.
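For instance (a minimal sketch; the field name, the Global class, and RetrieveLastKnownValue are taken from or invented around the earlier snippet, not a fixed API), the counter example could be serialized as follows:

```csharp
// In the global.asax code-behind class (commonly named Global):
// a single, application-wide synchronization object.
public static readonly object CacheSyncRoot = new object();

// In page or component code, every access to the counter
// happens inside the same lock, making the read-increment-write
// sequence atomic with respect to other threads using this lock.
lock (Global.CacheSyncRoot)
{
    object o = Cache["Counter"];
    int counter = (o == null) ? RetrieveLastKnownValue() : (Int32) o;
    counter++;
    Cache["Counter"] = counter;
}
```

Note that the lock only protects code paths that acquire the same synchronizer; unrelated cache entries elsewhere in the application remain unaffected.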
Per-Request Caching
Although you normally tend to cache only global data and data of general interest, to squeeze out every little bit of performance you can also cache per-request data that is long-lived even though it’s used only by a particular page. You place this information in the Cache object.
Another form of per-request caching is possible to improve performance. Working information shared by all controls and components participating in the processing of a request can be stored in a global container for the duration of the request. In this case, though, you might want to use the Items collection on the HttpContext class (discussed in Chapter 16) to park the data because it is automatically freed up at the end of the request and doesn’t involve implicit or explicit locking like Cache.
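As an illustration (the key name and the stored value are hypothetical), one component can park data in Items early in the request and another can read it back later in the same request:

```csharp
using System.Web;

// A control computes a value during request processing and parks it
// in the per-request Items dictionary. No locking is required because
// Items is scoped to the current request only.
HttpContext.Current.Items["OrderSubtotal"] = 129.99m;

// Later in the same request, another component retrieves it.
// The dictionary is discarded automatically when the request ends.
var subtotal = (decimal) HttpContext.Current.Items["OrderSubtotal"];
```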
Designing a Custom Dependency
Let’s say it up front: writing a custom cache dependency object is no picnic. You should have a very good reason to do so, and you should carefully design the new functionality before proceeding. The CacheDependency class is inheritable—you can derive your own class from it to implement an external source of events to invalidate cached items.
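As a taste of the plumbing involved (a minimal sketch, not production code; the class name and the timer-based trigger are invented for the example), a derived class calls the protected NotifyDependencyChanged method when its external source signals a change:

```csharp
using System;
using System.Threading;
using System.Web.Caching;

// Hypothetical dependency that invalidates the associated cache
// entry after a fixed interval, simulating an external event source.
public class TimerDependency : CacheDependency
{
    private readonly Timer _timer;

    public TimerDependency(int milliseconds)
    {
        // Fire the callback once after the given interval.
        _timer = new Timer(OnTimerElapsed, null, milliseconds, Timeout.Infinite);
    }

    private void OnTimerElapsed(object state)
    {
        // Tells ASP.NET that the dependency has changed, so the
        // cache entry bound to this object must be evicted.
        NotifyDependencyChanged(this, EventArgs.Empty);
    }

    protected override void DependencyDispose()
    {
        // Release the external resource when the dependency is torn down.
        _timer.Dispose();
        base.DependencyDispose();
    }
}
```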
The base CacheDependency class handles all the wiring of the new dependency object to the ASP.NET