
Caching Application Block and database backing store

February 6, 2009

Caching can help to overcome some of the challenges associated with enterprise-scale distributed web applications:

  • Performance – Caching improves application performance by storing relevant data as close as possible to the data consumer, avoiding repetitive data creation, processing, and transportation.
  • Scalability – Storing information in a cache saves resources and helps the application scale as demand increases.
  • Availability – By storing data in a local cache, the application may be able to survive system problems such as network latency, web service outages, and hardware failures.

Out of the box ASP.NET provides three primary forms of caching:

  • Page-level output caching – A copy of the HTML sent in response to a request is kept in memory, and subsequent requests are served the cached output until the cache expires. This can result in large performance gains because serving the cached output is very fast and fairly constant.
  • User-control-level output caching (fragment caching) – Page-level output caching may not be feasible when certain parts of the page are customized for the user. Yet other parts of the page, e.g. menus and layout elements, may be common to the entire application. The cached controls can be configured to vary based on a property setting or on any of the variations supported by page-level caching. All pages using the same controls share the cached entries for those controls.
  • The Cache API – The real power of caching is exposed via the Cache object. ASP.NET includes an easy-to-use caching mechanism for storing objects in memory that are expensive to create. The .NET Framework exposes the ASP.NET cache in the System.Web namespace, accessible through the System.Web.HttpContext.Cache object. WinForms applications can also use this cache by referencing the System.Web assembly and accessing it through the System.Web.HttpRuntime.Cache object. Instances are private to each application, and their lifetime is tied to the corresponding application.
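
The Cache API described above can be sketched as follows. This is a minimal illustration, not production code; the key "ProductList" and the LoadProductsFromDb helper are hypothetical names invented for the example:

```csharp
// A minimal sketch of the ASP.NET Cache API, assuming a web application
// (or a WinForms app referencing System.Web). "ProductList" and
// LoadProductsFromDb() are hypothetical names.
using System;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    public static object GetProducts()
    {
        // Try the application-wide cache first
        object products = HttpRuntime.Cache["ProductList"];
        if (products == null)
        {
            products = LoadProductsFromDb(); // the expensive operation we want to avoid
            HttpRuntime.Cache.Insert(
                "ProductList",
                products,
                null,                           // no CacheDependency
                DateTime.UtcNow.AddMinutes(10), // absolute expiration
                Cache.NoSlidingExpiration);
        }
        return products;
    }

    private static object LoadProductsFromDb()
    {
        return new[] { "Chai", "Chang" }; // stand-in for a real database call
    }
}
```

The indexer read plus Insert-on-miss pattern is the idiomatic way to use this cache: every caller pays only a dictionary lookup until the item expires.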

By using the Caching Application Block we can write a consistent form of code to implement caching in any application component, be it the web UI, a Windows service, a WinForm desktop application, or a web service. The Caching Application Block is optimized for performance and is both thread safe and exception safe.

The Caching Application Block works with ASP.NET cache and provides a number of features that are not available to the ASP.NET cache such as:

  • The ability to use a persistent backing store – both isolated storage and database backing store
  • The ability to encrypt a cache item’s data – this works only when using a persistent backing store
  • Multiple methods of setting expiration times – absolute time, sliding time, extended time format, file dependency, or never expires
  • The core settings are described in configuration files and can be changed without recompilation of the project
  • Can be extended to create your own expiration policies and storage mechanisms

To use the Caching Application Block you need to add references to the following assemblies in your project:

  • Microsoft.Practices.EnterpriseLibrary.Common
  • Microsoft.Practices.EnterpriseLibrary.Caching

The following namespaces need to be included in the classes that use the Caching Block:

  • Microsoft.Practices.EnterpriseLibrary.Caching
  • Microsoft.Practices.EnterpriseLibrary.Caching.Expirations
  • Microsoft.Practices.EnterpriseLibrary.Common

If there is a requirement for a persistent backing store then the data access block needs to be included:

  • Microsoft.Practices.EnterpriseLibrary.Data
  • Microsoft.Practices.EnterpriseLibrary.Caching.Database

If there is a requirement to encrypt data in the persistent backing store then the encryption block needs to be included:

  • Microsoft.Practices.EnterpriseLibrary.Security.Cryptography

Configuring the Caching Block

In Memory Cache


         <configSections> 
            <section name="cachingConfiguration" type="Microsoft.Practices.EnterpriseLibrary.Caching.Configuration.CacheManagerSettings, Microsoft.Practices.EnterpriseLibrary.Caching, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" /> 
         </configSections> 

         <cachingConfiguration defaultCacheManager="Prices"> 
            <cacheManagers> 
               <add expirationPollFrequencyInSeconds="60" maximumElementsInCacheBeforeScavenging="10" numberToRemoveWhenScavenging="5" backingStoreName="Null Storage" name="Prices" /> 
            </cacheManagers> 
            <backingStores> 
               <add encryptionProviderName="" type="Microsoft.Practices.EnterpriseLibrary.Caching.BackingStoreImplementations.NullBackingStore, Microsoft.Practices.EnterpriseLibrary.Caching, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="Null Storage" /> 
            </backingStores> 
         </cachingConfiguration> 



Cache Using Backing Store

Use the database backing store provider when your application is deployed on a web farm across multiple computers, or runs in multiple processes on the same machine. To use the database backing store you first need to create the cache database on SQL Server. The script to do this can be found in <Enterprise Library Source Dir>\App Blocks\Src\Caching\Database\Scripts.


         <configSections> 
            <section name="dataConfiguration" type="Microsoft.Practices.EnterpriseLibrary.Data.Configuration.DatabaseSettings, Microsoft.Practices.EnterpriseLibrary.Data, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" /> 
            <section name="cachingConfiguration" type="Microsoft.Practices.EnterpriseLibrary.Caching.Configuration.CacheManagerSettings, Microsoft.Practices.EnterpriseLibrary.Caching, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" /> 
         </configSections> 

         <dataConfiguration defaultDatabase="Northwind" /> 

         <connectionStrings> 
            <add name="CacheDSN" connectionString="Data Source=(local);Initial Catalog=Caching;Integrated Security=True;User Instance=False" providerName="System.Data.SqlClient" /> 
            <add name="Northwind" connectionString="Data Source=(local);Initial Catalog=Northwind;Integrated Security=True" providerName="System.Data.SqlClient" /> 
         </connectionStrings> 

         <cachingConfiguration defaultCacheManager="Customers"> 
            <cacheManagers> 
               <add expirationPollFrequencyInSeconds="60" maximumElementsInCacheBeforeScavenging="11000" numberToRemoveWhenScavenging="10" backingStoreName="DataStorage" name="Customers" /> 
            </cacheManagers> 
            <backingStores> 
               <add databaseInstanceName="CacheDSN" partitionName="MyFirstCacheApp" encryptionProviderName="" type="Microsoft.Practices.EnterpriseLibrary.Caching.Database.DataBackingStore, Microsoft.Practices.EnterpriseLibrary.Caching.Database, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="DataStorage" /> 
            </backingStores> 
         </cachingConfiguration> 



Cache Application Block Class Reference

CacheFactory Class – The CacheFactory class uses the supplied configuration information to determine the type of cache object to construct.

GetCacheManager – The GetCacheManager method returns a CacheManager object determined by the configuration information

ICacheManager myCache = CacheFactory.GetCacheManager(); //uses the default cache specified in configuration 

ICacheManager myCustomerCache = CacheFactory.GetCacheManager("Customers"); //overload returns the named cache manager "Customers" 

CacheManager Class – The CacheManager class acts as the interface between the application and the rest of the Caching Block. It provides all the methods required to manage the application's cached data.

GetData – The GetData method returns the cached object that matches the supplied key. If the item does not exist or has expired, null is returned. The return type is object, so a cast is required:

Customer oCustomer = (Customer)myCustomerCache.GetData("CustomerID"); 

Add – The Add method will add an item to the cache

myCustomerCache.Add("CustomerID", oCustomer); 

myCustomerCache.Add("CustomerID", oCustomer, scavengingPriority, refreshAction, cacheExpirations); 
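
The extended Add overload above takes a scavenging priority, an optional refresh action, and one or more expiration policies. A hedged sketch of how those parameters might be filled in (the 30-minute and 5-minute values are arbitrary choices for illustration):

```csharp
using System;
using Microsoft.Practices.EnterpriseLibrary.Caching;
using Microsoft.Practices.EnterpriseLibrary.Caching.Expirations;

// Expire after 30 minutes in total, or after 5 minutes without being
// accessed, whichever comes first. No refresh action is registered (null).
myCustomerCache.Add(
    "CustomerID",
    oCustomer,
    CacheItemPriority.Normal,          // scavenging priority
    null,                              // ICacheItemRefreshAction callback
    new AbsoluteTime(TimeSpan.FromMinutes(30)),
    new SlidingTime(TimeSpan.FromMinutes(5)));
```

Multiple expiration policies can be combined this way; the item expires as soon as any one of them triggers.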

Contains – The Contains method returns true if the item exists in the cache

bool dataExists = myCustomerCache.Contains("CustomerID"); 

Remove – The Remove method will delete an item from the cache


Flush – The Flush method removes all items from the cache. If an error occurs during the flush the cache is left unchanged
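
Usage of these last two methods is a one-liner each:

```csharp
// Remove a single item by key, then clear the whole cache manager.
myCustomerCache.Remove("CustomerID"); // deletes just this entry
myCustomerCache.Flush();              // removes every item in this cache manager
```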


Monitoring Your Cache Performance

Monitoring your cache usage and performance can help you understand whether your cache is performing as expected and helps you to fine tune your cache solution. You can use the Windows performance monitor application (Perfmon) to view and analyze your cache performance data when it is not delivering the expected performance.

To monitor cache performance

  • Monitor the Cache Insert and Cache Retrieve Times under different cache loads (for example, number of items and size of cache) to identify where your performance problem is coming from. These two performance counters should be as low as possible for your cache to be more responsive to the application. You should note that the cache insert time and retrieve time should be constant regardless of the number of items in cache.
  • Check your Cache Hit/Miss ratio. A cache hit occurs when you request an item from the cache and that item is available and returned to you. A cache miss occurs when you request an item from the cache and that item is not available. If this is low, it indicates that items are rarely in cache when you need them. Possible causes for this include:
    • Your cache loading technique is not effective.
    • Your maximum allowed cache size is too small, causing frequent scavenging operations, which results in cached items being removed to free up memory.
  • Check your Cache Turnover rate. The cache turnover rate refers to the number of insertions and deletions of items from the cache per second. If this is high, it indicates that items are inserted and removed from cache at a high rate. Possible causes for this include:
    • Your maximum allowed cache size is too small, causing frequent scavenging operations which result in cached items being removed to free up memory.
    • Faulty application design, resulting in improper use of the cache.
  • Additionally, you can monitor the Cache Entries and Cache Size counters. Although the Cache Entries counter does not provide enough information about your cache performance on its own, it can be combined with other counters to provide valuable information.

Regular monitoring of your cache should highlight any changes in data use and any bottlenecks that these might introduce. This is the main management task associated with the post-deployment phase of using a caching system.

Synchronizing Caches in a Server Farm

A common problem for distributed application developers is how to synchronize cached data between all servers in the farm. Generally speaking, if you have a situation in which your cache needs to be synchronized across your server farm, it almost always means that your original design is faulty. You should design your application with clustering in mind and avoid such situations in the first place.

You can configure the Caching Application Block to share the backing store between servers in a web farm. All machines in the farm can use the same cache instance and partition and can read/write to the store. But the in-memory version of the cache is always unique to each server in the farm.

However, if you have one of those rare situations where such synchronization is absolutely required, you should use file dependencies to invalidate the cache when the information in the main data store changes.

To create file dependencies for cache synchronization

  • Create a database trigger that is activated when a record in your data store is changed.
  • Implement this trigger to create an empty file in the file system to be used for notification. This file should be placed either on the computer running SQL Server, a Storage Area Network (SAN), or another central server.
  • Use Application Center replication services to activate a service that copies the file from the central server to all disks in the server farm.
  • Make the creation of the file on each server trigger a dependency event to expire the cached item in the ASP.NET cache on each of the servers in the farm.
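
For the last step, the Caching Application Block's own FileDependency expiration can play the role of the cache dependency; a sketch, where the notification-file path is a hypothetical example:

```csharp
using Microsoft.Practices.EnterpriseLibrary.Caching;
using Microsoft.Practices.EnterpriseLibrary.Caching.Expirations;

// The cached item is expired as soon as the replicated notification file
// changes on this server. The path below is a hypothetical example.
myCustomerCache.Add(
    "CustomerID",
    oCustomer,
    CacheItemPriority.Normal,
    null, // no refresh action
    new FileDependency(@"C:\CacheNotifications\customers.txt"));
```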

NOTE: Because replicating a file across the server farm takes time, this approach is inefficient in cases where the cached data changes every few seconds.

The Caching Application Block should not be used if:

  • The ASP.NET cache provides all the caching functionality that the application requires.
  • Security is an issue. While the persistent cache allows data to be encrypted, there is no support for encrypting the in-memory cache. If a malicious user gains access to the system, they can potentially retrieve cached data. Do not store sensitive information such as passwords and credit card numbers in the cache if this is a concern for the application.
  • Multiple applications need to share the cache, or the cache and the application need to reside on separate systems.
  • The data is transactional in nature. The cache should ideally store data that is expensive to create or transport and that is at least semi-static; it is generally not a good idea to cache transactional data.


    ASP.NET Session State and Velocity Provider

    June 10, 2008 1 comment

    ASP.NET session state supports several different storage options for session data. Below is a quick summary of the different modes of session state and their pros and cons:

    • InProc
      • Stores session state in memory on the web server.
      • Fastest, but the more session data, the more memory is consumed on the web server, and that can affect performance.
      • Cannot be used with web farms.
      • Session state will be lost if the worker process (aspnet_wp.exe) recycles, or if the AppDomain restarts.
    • StateServer
      • Stores session state in a separate process called the ASP.NET state service. This ensures that session state is preserved if the web application is restarted and also makes session state available to multiple web servers in a web farm.
      • Objects are serialized to an out of process memory store. The cost of serialization/deserialization can affect performance if you’re storing lots of objects.
      • Solves the session state loss problem of InProc mode. Allows a web farm to store session state on a central server.
      • Single point of failure at the State Server.
      • For session state to be maintained across different web servers in the web farm, the Application Path of the website (For example \LM\W3SVC\2) in the IIS Metabase should be identical in all the web servers in the web farm.
    • SQLServer
      • Stores session state in a SQL Server database. This ensures that session state is preserved if the web application is restarted and also makes session state available to multiple web servers in a web farm.
      • Solves the session state loss problem of InProc mode. Allows a web farm to store session state on a central server.
      • Session state data can survive a SQL server restart, and you can also take advantage of SQL server failover cluster, after you’ve followed instructions in KB 311029.
      • For session state to be maintained across different web servers in the web farm, the Application Path of the website (For example \LM\W3SVC\2) in the IIS Metabase should be identical in all the web servers in the web farm.

    The best scalability is provided by the SQLServer session store, but it comes with a slight performance cost. The best performance is provided by the InProc session store, but it can't really be used in any production web app because of its scalability issues.

    What if you could have the scalability of the SQLServer session store with the performance of InProc / StateServer session state? "Velocity" is a distributed caching product that gives the .NET application platform support for developing highly performant, scalable, and highly available applications. It allows any type of data (CLR objects, XML documents, or binary data) to be cached. Velocity fuses large numbers of cache nodes in a cluster into a single unified cache and provides transparent access to cache items from any client connected to the cluster.

    Velocity provides a session store provider that lets you store your ASP.NET Session object in the Velocity cache. This enables non-sticky routing, allowing your application to scale.

    Hooking up Velocity in your ASP.NET web app is easy. All it takes is some web.config entries:

    1. Configure session state provider element in your app’s web.config file:

    <sessionState mode="Custom" customProvider="SessionStoreProvider"> 
       <providers> 
          <add name="SessionStoreProvider" type="System.Data.Caching.SessionStoreProvider, ClientLibrary"/> 
       </providers> 
    </sessionState>

    2. Configure Velocity client configuration elements in web.config:

    <configSections> 
       <section name="dcacheClient" type="System.Configuration.IgnoreSectionHandler" allowLocation="true" allowDefinition="Everywhere"/> 
    </configSections> 
    <dcacheClient deployment="simple" localCache="false"> 
       <hosts> 
          <!-- List of hosts --> 
          <host name="<serviceHostName>" cachePort="22233" cacheHostName="DistributedCacheService"/> 
       </hosts> 
    </dcacheClient>

    Note: "serviceHostName" is the hostname where the distributed cache service is running, and 22233 is the default cache port. Configure these two parameters appropriately.

    3. Add references to, or copy into the \bin folder, the following DLLs that the session store provider uses: CacheBaseLibrary.dll, FabricCommon.dll, ClientLibrary.dll, CASBase.dll, CASClient.dll.

    And that's about it; your web application is ready to run with the Velocity SessionStoreProvider. A couple of points to note:


    • For the Velocity session state provider to work, the cache cluster should be up and running. To learn how to deploy a Velocity distributed cache cluster, refer to the product documentation.
    • Because the provider stores session objects in the out-of-process distributed cache service, objects that you put in Session must be serializable.
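
    The serialization requirement above means any type stored in Session needs the [Serializable] attribute (or an ISerializable implementation); a minimal sketch with a hypothetical ShoppingCart type:

    ```csharp
    using System;
    using System.Collections.Generic;

    [Serializable]               // required: Velocity serializes the object out of process
    public class ShoppingCart    // hypothetical session type
    {
        public List<string> Items = new List<string>();
    }

    // In a page or controller:
    // Session["Cart"] = new ShoppingCart(); // serialized into the Velocity cache
    ```

    Without the attribute, storing the object in a Velocity-backed session fails at serialization time rather than at compile time, so it is worth checking every type you put in Session.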

    For more info, refer to the product documentation or the Velocity blog on MSDN: How to Use Session State Provider (Microsoft project code named Velocity).