Caching Web Requests

The prevailing wisdom is to cache content whenever possible. Indeed, that is one of the concepts behind an HTTP GET: the response can be cached across machines. However, caching can also be used as a crutch to avoid addressing performance bottlenecks downstream.

For example, I recently ran across an issue in which retrieval of an assessment (a program evaluation taken after reading some nursing literature) was buckling under load tests. Since we had a deadline coming up, one of my suggestions was to cache the questions, as they do not change that often. In reality, however, this would just have kicked the can down the road and left the inefficient code alone. What ended up happening was that we bit the bullet, refactored the code that was causing the bottleneck, and solved the problem without caching.

One of caching’s main downfalls, of course, is that you should not cache content that changes often. Doing so causes many headaches and bugs simply because the content has changed while the old version is still being served from the cache. There is also the matter of cache lifetime: when using IIS, if the app pool recycles, boom, there goes your cache. Another common bug arises when a null in the cache (due to an app pool recycle, for example) is not accounted for, and you get an unhandled exception in your application. As such, a web application should always check for a null value when reading an item from the cache and have an immediate strategy for retrieving the item from the nonvolatile data store.
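A minimal sketch of that check-then-fall-back approach, assuming a .NET application using System.Runtime.Caching.MemoryCache; the assessment questions, the cache key format, and the loadFromDatabase delegate standing in for the nonvolatile data store are all hypothetical:

```csharp
using System;
using System.Runtime.Caching;

public class QuestionRepository
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Hypothetical loader that hits the database on a cache miss.
    private readonly Func<string, string> _loadFromDatabase;

    public QuestionRepository(Func<string, string> loadFromDatabase)
    {
        _loadFromDatabase = loadFromDatabase;
    }

    public string GetQuestions(string assessmentId)
    {
        string cacheKey = "questions:" + assessmentId;

        // Always check for null: an app pool recycle wipes the cache.
        var cached = Cache.Get(cacheKey) as string;
        if (cached != null)
        {
            return cached;
        }

        // Fall back to the nonvolatile store and repopulate the cache.
        string fresh = _loadFromDatabase(assessmentId);
        Cache.Set(cacheKey, fresh, new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(30)
        });
        return fresh;
    }
}
```

The point of the null check is that the caller never needs to know whether the app pool has recycled; a miss simply costs one trip to the database before the item is cached again.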