Inversion of Control Cached Proxy Pattern

For a developer, there are few things more satisfying than the “aha” moment of finding exactly where a performance bottleneck sits.

When part of an application is not performing well, we often turn to cache as a solution. We grab the nearest caching library and wrap the under-performing code in a cache check. Mission accomplished with only a single if-statement. We then bask in the genius of our problem-solving skills. “It’s such a simple solution,” we proclaim!

So we sprinkle this caching code (the if-statement) all around our solution like little bits of delicious grated cheese on a pizza. Then one day we hear reports of intermittent, hard-to-reproduce performance problems, stale content and settings, and even disappearing values. Like grated cheese on a pizza, once a cache is baked in, it is very hard to unpick.

In this article we present a caching pattern which avoids these common pitfalls by decoupling the cache from the under-performing code using inversion of control. The examples are given in C#; however, the technique is universally applicable.

 

A Common Cache Pattern

One of the most common ways to implement caching is to wrap under-performing code in an if-statement. This technique means that you are essentially baking the cache into the solution, and it can be very hard (and messy) to disable this cache if you need to diagnose issues.

Let’s start with an initial problem! The following is some code which has a performance bottleneck.

public MyModel GetById(int id)
{
    var value = GetValue(id); // do something expensive

    return value;
}

A common approach is to wrap the expensive part of the method in a cache check. Here is the same method with caching added; it will probably look very familiar.

public MyModel GetById(int id)
{
    var cacheKey = "myCachePrefix-" + id;

    var value = _cache.Get(cacheKey) as MyModel;
    if (value == null)
    {
        value = GetValue(id); // do something expensive

        _cache.Add(cacheKey, value, 60);
    }

    return value;
}

While many people may go to production with the code above, the more astute and cautious will want to move that hardcoded 60-second value to an AppSetting so they can tweak the cache time easily (perhaps in production).

So then you have something like this:

public MyModel GetById(int id)
{
    var cacheKey = "myCachePrefix-" + id;

    var value = _cache.Get(cacheKey) as MyModel;
    if (value == null)
    {
        value = GetValue(id); // do something expensive

        int cacheTime = int.Parse(ConfigurationManager.AppSettings["myCacheTime"]);
        _cache.Add(cacheKey, value, cacheTime);
    }

    return value;
}

To clean this up, you may decide to pass a function into the cache, so that the cache library will call it automatically. This is a nice way to remove that if-statement from your code.

public MyModel GetById(int id)
{
    var cacheKey = "myCachePrefix-" + id;
    int cacheTime = int.Parse(ConfigurationManager.AppSettings["myCacheTime"]);

    return _cache.Get(cacheKey, () =>
    {
        return GetValue(id);
    }, cacheTime) as MyModel;
}
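
As an aside, here is a minimal sketch of what such a delegate-accepting Get method could look like, assuming System.Runtime.Caching.MemoryCache underneath. Your caching library will have its own implementation; this is only to illustrate that the factory function is invoked on a cache miss.

public object Get(string key, Func<object> factory, int cacheTimeSeconds)
{
    // requires a reference to System.Runtime.Caching
    var value = MemoryCache.Default.Get(key);
    if (value == null)
    {
        // cache miss: invoke the factory and store the result
        value = factory();
        if (value != null)
        {
            MemoryCache.Default.Set(key, value, DateTimeOffset.UtcNow.AddSeconds(cacheTimeSeconds));
        }
    }

    return value;
}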

Some Issues

The first observation is that this code arguably no longer follows SOLID principles, in particular single responsibility: the method doesn’t just get the value any more, it also caches it. The second observation is that the only way to disable the cache is to set the cache time to zero. Arguably, this is not even really “disabling” the cache, because the cache class is still being used.

When diagnosing issues the questions you should ask are:

  • Where have I used this caching pattern? Multiple classes & methods?
  • How can I easily disable the cache?
  • If I disable the cache, will it disable all the cache everywhere? Is that useful?
  • Can I easily disable the cache for a single service?

I hope you are beginning to see that the ability to granularly disable cache is as important as the caching itself.

 

Inversion of Control Cached Proxy Pattern

The following caching pattern uses Dependency Injection.

I like to call this the Inversion of Control Cached Proxy Pattern. This is just a fancy name for using interfaces and dependency injection for what they are actually meant for. We will be focusing on caching an entire service.

 

SOLID Service

First let’s introduce an interface…

public interface IMyService
{
    MyModel GetById(int id);
}

Here is the service:

public class MyService : IMyService
{
    public MyService()
    {
        // nothing to see here
    }

    public MyModel GetById(int id)
    {
        var value = GetValue(id); // do something expensive

        return value;
    }
}

Notice that this service implements our original method without any caching. This service remains clean and understandable.

 

Cached Proxy

The following code introduces a proxy, MyServiceCachedProxy. Notice that its constructor parameters include the concrete MyService class and an ICache caching object.

public class MyServiceCachedProxy : IMyService
{
    MyService _myService;
    ICache _cache;
    int _cacheTime;

    public MyServiceCachedProxy(MyService myService, ICache cache)
    {
        _myService = myService;
        _cache = cache;
        _cacheTime = int.Parse(ConfigurationManager.AppSettings["myCacheTime"]);
    }

    public MyModel GetById(int id)
    {
        var cacheKey = $"{typeof(MyServiceCachedProxy)}_GetById_{id}";

        return _cache.Get(cacheKey, () =>
        {
            return _myService.GetById(id);
        }, _cacheTime) as MyModel;
    }
}

Notice that each method in this class simply calls the corresponding method on the MyService class, passing that call through the cache library as a function to be cached.
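
For reference, this is the shape of the ICache abstraction these examples assume. The member names are taken from the snippets in this article; the parameter names and ordering are assumptions, and your own caching wrapper will likely look slightly different.

public interface ICache
{
    // simple get/add used in the earlier if-statement examples
    object Get(string key);
    void Add(string key, object value, int cacheTimeSeconds);

    // delegate-based get used by the cached proxy; the factory is only invoked on a cache miss
    object Get(string key, Func<object> factory, int cacheTimeSeconds);

    // regex-based clearing, used later in this article
    void ClearByRegex(string pattern);
}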

 

Advantages

One great advantage of splitting IMyService into two implementations is that the MyService class remains truly SOLID. Caching happens only in the MyServiceCachedProxy class, leaving MyService as it was originally written.

The code is clean and understandable, and this caching pattern allows great flexibility.

 

To Register Or Not – A Swappable Proxy

At first glance this code looks very similar to the original cache example; however, this strategy has a powerful distinction: the cached proxy class can be swapped out at runtime with the original MyService, and vice versa.

Consider a scenario (e.g. during development) where you want to disable the cache in your service(s). The IoC Cached Proxy allows you to add an application setting which you can check at the startup of your application (see below).

public void RegisterServices(IContainer container)
{
    if (ConfigurationManager.AppSettings["MyServiceCache:Enabled"] == "true")
    {
        // register the concrete service so it can be injected into the proxy,
        // then expose the proxy as the IMyService implementation
        container.Register<MyService, MyService>();
        container.Register<IMyService, MyServiceCachedProxy>();
    }
    else
    {
        // cache disabled: resolve IMyService straight to the uncached service
        container.Register<IMyService, MyService>();
    }
}

Therefore the most powerful feature of the IoC Cached Proxy is that you don’t have to register it at all!

With the registration above, you now have control over your cache. Enabling or disabling a service’s cache is done by setting a flag in your web.config, which lets you enable caching for individual services. Moreover, if we need to performance-profile a specific service, we can disable the cache for just that service while leaving the cache of other services enabled.
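
For completeness, the appSettings entries assumed by the snippets above might look something like this (the key names are simply the ones used in the examples):

<appSettings>
  <!-- toggles the cached proxy registration for this service -->
  <add key="MyServiceCache:Enabled" value="true" />
  <!-- cache duration in seconds, read by the proxy's constructor -->
  <add key="myCacheTime" value="60" />
</appSettings>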

 

Clearing Cache

If you need to clear the cache programmatically, the IoC Cached Proxy pattern makes it super flexible! You will notice that MyServiceCachedProxy uses cache keys such as:

var cacheKey = $"{typeof(MyServiceCachedProxy)}_GetById_{id}";

Cache keys are in the form “ClassName_MethodName_MethodParameter”.

This convention allows us to clear cache with multiple levels of granularity.

We may clear the cache by:

  • Class MyServiceCachedProxy
  • Method GetById
  • A single cached entity by ID

We can use regular expressions to clear cache programmatically. In the following example we clear all of the cache entries registered by MyServiceCachedProxy.

_cache.ClearByRegex($"{typeof(MyServiceCachedProxy)}_(.*)");
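
The other two levels of granularity follow the same convention. These examples assume the same regex-based API; the exact patterns depend on how your keys are built:

// clear every cached result of the GetById method
_cache.ClearByRegex($"{typeof(MyServiceCachedProxy)}_GetById_(.*)");

// clear the cached entry for one specific id (assuming a local variable id)
_cache.ClearByRegex($"{typeof(MyServiceCachedProxy)}_GetById_{id}");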

Note: if there are multiple method parameters, they should all be concatenated and appended to the key. Conversely, if a method has no parameters, the cache key does not require that suffix.
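
As an illustration of that convention, a hypothetical key-builder helper inside the proxy could look like this (BuildCacheKey is not part of the pattern, just a convenience):

private static string BuildCacheKey(string methodName, params object[] parameters)
{
    // ClassName_MethodName, with each parameter appended as a suffix
    var key = $"{typeof(MyServiceCachedProxy)}_{methodName}";

    return parameters.Length == 0
        ? key
        : key + "_" + string.Join("_", parameters);
}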

 

What About Passthrough?

There are scenarios where you will not want to cache some methods in your services. The way to accomplish this is for the cached proxy to pass the call through to the service without doing any caching.

In the following example, we show a method called GetSomethingElse(), which calls the corresponding method in the MyService class.

public MyModel GetSomethingElse(int id)
{
    // no caching: just pass the call straight through to the underlying service
    return _myService.GetSomethingElse(id);
}

Final Thoughts

I hope this article has given you some food for thought when it comes to your caching implementations. Cache can be your best friend or your worst enemy. The common pattern of sprinkling if-statements around your code can be a source of pain, as the cache can actually get in the way, causing confusion when debugging performance issues.

My recommendation is to always aim to write good, fast code first, and only use caching as a last resort. Most importantly, if you are going to use cache, ensure that you can turn it off and that you have granular control over it. By using the Inversion of Control Cached Proxy pattern you get to write your code in isolation, then only introduce caching when you need to. The pattern allows the cached proxy to be injected at runtime, promotes SOLID principles, and gives you granular control over cache clearing.

Cache me on Twitter: @anthonydotnet
Website: anthonydotnet.blogspot.com