Caching Application Block and database backing store
Caching can help to overcome some of the challenges associated with enterprise-scale distributed web applications:
- Performance – Caching improves application performance by storing relevant data as close as possible to the data consumer, avoiding repeated creation, processing, and transport of the same data
- Scalability – Storing information in a cache helps save resources and increases scalability as the demands on the application increase
- Availability – By storing data in a local cache, the application may be able to survive transient problems such as network latency, web service outages, and hardware failures
Out of the box ASP.NET provides three primary forms of caching:
- Page Level output caching – A copy of the HTML that was sent in response to a request is kept in memory, and subsequent requests are then served the cached output until the cache expires. This can result in large performance gains, as sending the cached output is always very fast and fairly constant
- User Control level output caching (fragment caching) – Page level output caching may not be feasible in cases where certain parts of the page are customized for the user. Yet there may be other parts of the page, for example menus and layout elements, that are common to the entire application. The cached controls can be configured to vary based on some set property or any of the variations supported by page level caching. All pages using the same controls share the same cached entries for these controls.
- The Cache API – The real power of caching is exposed via the Cache object. ASP.NET includes an easy-to-use caching mechanism that can be used to store objects in memory that are expensive to create. The .NET Framework includes the ASP.NET cache in the System.Web namespace, which can be accessed through the System.Web.HttpContext.Cache object. Windows Forms applications can also make use of this cache by referencing the System.Web assembly and accessing it through the System.Web.HttpRuntime.Cache object. Instances are private to each application, and the cache's lifetime is tied to that of the corresponding application.
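As a minimal sketch of the Cache API (the `"ProductList"` key and `LoadProducts` loader are hypothetical placeholders, not part of the framework):

```csharp
using System;
using System.Web;
using System.Web.Caching;

// Obtain the application's cache; this works from both web and Windows Forms
// code once the System.Web assembly is referenced.
Cache cache = HttpRuntime.Cache;

// Insert an item with a five-minute absolute expiration and no sliding expiration.
cache.Insert("ProductList", LoadProducts(), null,
             DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration);

// Retrieve it later; the indexer returns null if the item expired or was never added.
object products = cache["ProductList"];
```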
By using the Caching Application Block we can write a consistent form of code to implement caching in any application component, be it the web UI, a Windows service, a Windows Forms desktop application, or a web service. The Caching Application Block is optimized for performance and is both thread safe and exception safe.
The Caching Application Block works alongside the ASP.NET cache and provides a number of features that the ASP.NET cache does not, such as:
- The ability to use a persistent backing store – both isolated storage and database backing store
- The ability to encrypt a cache item’s data – this works only when using a persistent backing store
- Multiple methods of setting expiration times – absolute time, sliding time, extended time format, file dependency, or never expires
- The core settings are described in configuration files and can be changed without recompilation of the project
- Can be extended to create your own expiration policies and storage mechanisms
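As a sketch of how the expiration policies listed above are supplied when adding an item (this assumes the Enterprise Library 2.0 API; the keys, values, and file path are hypothetical):

```csharp
using System;
using Microsoft.Practices.EnterpriseLibrary.Caching;
using Microsoft.Practices.EnterpriseLibrary.Caching.Expirations;

CacheManager cache = CacheFactory.GetCacheManager();

// Absolute time: the item expires at a fixed point in time.
cache.Add("dailyReport", "report data", CacheItemPriority.Normal, null,
          new AbsoluteTime(DateTime.Now.AddHours(1)));

// Sliding time: the item expires ten minutes after its last access.
cache.Add("userSession", "session data", CacheItemPriority.Normal, null,
          new SlidingTime(TimeSpan.FromMinutes(10)));

// File dependency: the item expires when the watched file changes.
cache.Add("settings", "settings data", CacheItemPriority.Normal, null,
          new FileDependency(@"C:\app\settings.xml"));
```

An item added without any expiration arguments never expires and is removed only by scavenging or an explicit Remove or Flush.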
To use the Caching Application Block you need to add references to the following assemblies in your project:
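For Enterprise Library 2.0 the core set is typically the following (assumed version; the exact names may differ in other releases):

```
Microsoft.Practices.EnterpriseLibrary.Caching.dll
Microsoft.Practices.EnterpriseLibrary.Common.dll
Microsoft.Practices.ObjectBuilder.dll
```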
The following namespaces need to be included in the classes that use the Caching Block:
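Assuming the Enterprise Library 2.0 namespace layout, the typical using directives are:

```csharp
using Microsoft.Practices.EnterpriseLibrary.Caching;
using Microsoft.Practices.EnterpriseLibrary.Caching.Expirations;
```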
If there is a requirement for a database-backed persistent backing store, the Data Access Application Block also needs to be included:
If there is a requirement to encrypt data in the persistent backing store, the Cryptography Application Block also needs to be included:
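Under the same assumption (Enterprise Library 2.0), the additional assemblies for a database backing store and for encryption are typically:

```
Microsoft.Practices.EnterpriseLibrary.Data.dll
Microsoft.Practices.EnterpriseLibrary.Caching.Database.dll
Microsoft.Practices.EnterpriseLibrary.Security.Cryptography.dll
Microsoft.Practices.EnterpriseLibrary.Caching.Cryptography.dll
```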
Configuring the Caching Application Block
In-Memory Cache
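A purely in-memory cache needs only a cacheManager entry whose backing store is the null store. A configuration sketch follows; the element and type names assume Enterprise Library 2.0, so verify them against your version:

```xml
<configuration>
  <configSections>
    <section name="cachingConfiguration"
             type="Microsoft.Practices.EnterpriseLibrary.Caching.Configuration.CacheManagerSettings, Microsoft.Practices.EnterpriseLibrary.Caching" />
  </configSections>
  <cachingConfiguration defaultCacheManager="Default Cache Manager">
    <cacheManagers>
      <add name="Default Cache Manager"
           expirationPollFrequencyInSeconds="60"
           maximumElementsInCacheBeforeScavenging="1000"
           numberToRemoveWhenScavenging="10"
           backingStoreName="Null Storage" />
    </cacheManagers>
    <backingStores>
      <add name="Null Storage"
           type="Microsoft.Practices.EnterpriseLibrary.Caching.BackingStoreImplementations.NullBackingStore, Microsoft.Practices.EnterpriseLibrary.Caching" />
    </backingStores>
  </cachingConfiguration>
</configuration>
```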
Cache Using Backing Store
Use the database backing store provider when your application is deployed on a web farm across multiple computers, or runs in multiple processes on the same machine. To use the database backing store you need to first create the cache database on SQL Server. The script to do this can be found in <Enterprise Library Source Dir>\App Blocks\Src\Caching\Database\Scripts.
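A configuration sketch for the database backing store follows. The attribute and type names assume Enterprise Library 2.0, and the instance, partition, and connection-string values shown ("CachingDatabase", "WebFarmPartition") are hypothetical placeholders:

```xml
<cachingConfiguration defaultCacheManager="Default Cache Manager">
  <cacheManagers>
    <add name="Default Cache Manager"
         expirationPollFrequencyInSeconds="60"
         maximumElementsInCacheBeforeScavenging="1000"
         numberToRemoveWhenScavenging="10"
         backingStoreName="Database Cache Storage" />
  </cacheManagers>
  <backingStores>
    <add name="Database Cache Storage"
         databaseInstanceName="CachingDatabase"
         partitionName="WebFarmPartition"
         type="Microsoft.Practices.EnterpriseLibrary.Caching.Database.DataBackingStore, Microsoft.Practices.EnterpriseLibrary.Caching.Database" />
  </backingStores>
</cachingConfiguration>
<connectionStrings>
  <add name="CachingDatabase"
       providerName="System.Data.SqlClient"
       connectionString="Server=...;Database=Caching;Integrated Security=true" />
</connectionStrings>
```

The partitionName lets several applications share one cache database without seeing each other's items; servers that should share cached data must use the same instance and partition.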
Cache Application Block Class Reference
CacheFactory Class – The CacheFactory uses the supplied configuration information to determine the type of cache object to construct
GetCacheManager – The GetCacheManager method returns a CacheManager object determined by the configuration information
CacheManager Class – The CacheManager class acts as the interface between the application and the rest of the caching block. It provides all the methods required to manage the application's cached data
GetData – The GetData method returns an object from the cache containing the data that matches the supplied ID. If the data does not exist or has expired, null is returned
Add – The Add method will add an item to the cache
Contains – The Contains method returns true if the item exists in the cache
Remove – The Remove method will delete an item from the cache
Flush – The Flush method removes all items from the cache. If an error occurs during the flush the cache is left unchanged
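Putting these methods together, typical usage looks like the following sketch (Enterprise Library 2.0 assumed; the `"ProductList"` key and `LoadProducts` loader are hypothetical):

```csharp
using Microsoft.Practices.EnterpriseLibrary.Caching;

// The factory reads the configuration file and builds the configured manager.
CacheManager cache = CacheFactory.GetCacheManager();

// Add an item; with no expiration arguments it never expires, and any
// existing item with the same key is replaced.
cache.Add("ProductList", LoadProducts());

// Contains reports whether the key is currently cached.
if (cache.Contains("ProductList"))
{
    // GetData returns null if the item is missing or has expired.
    object products = cache.GetData("ProductList");
}

cache.Remove("ProductList");  // delete a single item
cache.Flush();                // remove all items
```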
Monitoring Your Cache Performance
Monitoring your cache usage and performance can help you understand whether your cache is performing as expected and helps you to fine tune your cache solution. You can use the Windows performance monitor application (Perfmon) to view and analyze your cache performance data when it is not delivering the expected performance.
To monitor cache performance
- Monitor the Cache Insert and Cache Retrieve Times under different cache loads (for example, number of items and size of cache) to identify where your performance problem is coming from. These two performance counters should be as low as possible for your cache to be responsive to the application. Note that insert and retrieve times should remain roughly constant regardless of the number of items in the cache.
- Check your Cache Hit/Miss ratio. A cache hit occurs when you request an item from the cache and that item is available and returned to you. A cache miss occurs when you request an item from the cache and that item is not available. If the hit ratio is low, it indicates that items are rarely in cache when you need them. Possible causes for this include:
- Your cache loading technique is not effective.
- Your maximum allowed cache size is too small, causing frequent scavenging operations, which results in cached items being removed to free up memory.
- Faulty application design, resulting in improper use of the cache.
Regular monitoring of your cache should highlight any changes in data use and any bottlenecks that these might introduce. This is the main management task associated with the post-deployment phase of using a caching system.
Synchronizing Caches in a Server Farm
A common problem for distributed application developers is how to synchronize cached data between all servers in the farm. Generally speaking, if you have a situation in which your cache needs to be synchronized in your server farm, it almost always means that your original design is faulty. You should design your application with clustering in mind and avoid such situations in the first place.
You can configure the Caching Application Block to share the backing store between servers in a web farm. All machines in the farm can use the same cache instance and partition and can read from and write to the store. However, the in-memory version of the cache is always unique to each server in the farm.
However, if you have one of those rare situations where such synchronization is absolutely required, you should use file dependencies to invalidate the cache when the information in the main data store changes.
To create file dependencies for cache synchronization
- Create a database trigger that is activated when a record in your data store is changed.
- Implement this trigger to create an empty file in the file system to be used for notification. This file should be placed on the computer running SQL Server, on a Storage Area Network (SAN), or on another central server.
- Use Application Center replication services to activate a service that copies the file from the central server to all disks in the server farm.
- Make the creation of the file on each server trigger a dependency event to expire the cached item in the ASP.NET cache on each of the servers in the farm.
NOTE: Because replicating a file across the server farm can take time, this approach is inefficient in cases where the cached data changes every few seconds.
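The last step above can be sketched with the ASP.NET cache's file-based CacheDependency; the notification-file path and the `LoadProducts` loader are hypothetical:

```csharp
using System.Web;
using System.Web.Caching;

// The cached item is invalidated automatically as soon as the replicated
// notification file on this server is created or changed.
string notifyFile = @"C:\CacheNotify\products.chg";   // hypothetical path
HttpRuntime.Cache.Insert("ProductList", LoadProducts(),
                         new CacheDependency(notifyFile));
```

On the next request after the file changes, the lookup misses and the application reloads fresh data from the main data store.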
The Caching Application Block should not be used if:
- The ASP.NET cache provides all the caching functionality that the application requires
- If security is an issue. While the persistent cache allows data to be encrypted, there is no support for encrypting the in-memory cache. If a malicious user gains access to the system, they can potentially retrieve cached data. Do not store sensitive information such as passwords and credit card numbers in the cache if this is an issue for the application.
- If multiple applications need to share the cache or the cache and application need to reside on separate systems.
Ideally, the cache should be used to store data that is either expensive to create or expensive to transport and that is at least semi-static in nature. It is generally not a good idea to cache transactional data.