The Power of Caching in JavaScript

Christopher T.

January 18th, 2022


As our applications grow larger, so does the need for performance, and caching is one of the most effective ways to achieve it. Loading web pages costs time and resources; when we reduce both, our applications become faster and more responsive, which in turn improves the experience for our users.

User experience is huge, and by leveraging caching you can achieve concrete wins, such as faster page loads that improve your search result rankings.

In this post we will go over powerful ways caching can improve both how you think about performance and how you write efficient code.

Caching Can Be Used to "Revive" Your Application In Tough Times

When your site gets indexed into Google's search results, Googlebot takes a snapshot each time it crawls your website, keeping a "backup" of it. This is called the Google Cache.

For example, if your website was hacked or taken down for some reason, Google compares what your website currently looks like to what it last looked like. If your website responds with a 404, Google can serve your website's last "working" state from its cache, effectively saving your website from pushing users to bounce somewhere else.

We can learn from this by applying similar techniques, like caching our app's resources to provide an offline experience when the user loses their internet connection.

Offline Experience

In the modern world offline experiences for users have become more important than ever.

For example, when users have airplane mode turned on, we can provide them with an offline experience so they can still use our apps.

We can leverage caching for this. One way is to save the last used state of your DOM by registering a listener as shown below:

window.addEventListener('beforeunload', function (evt) {
  // Snapshot the current DOM state right before the page unloads
  const lastState = {
    html: document.getElementById('root').innerHTML,
    origin: location.origin,
    pathname: location.pathname,
    scrollPos: {
      x: window.scrollX,
      y: window.scrollY,
    },
  }
  localStorage.setItem('_last_', JSON.stringify(lastState))
})

window.addEventListener('load', function (evt) {
  const _last_ = localStorage.getItem('_last_')

  if (_last_) {
    try {
      const lastState = JSON.parse(_last_)
      if (lastState.origin === location.origin) {
        document.getElementById('root').innerHTML = lastState.html
        // Restore their scroll position
        window.scrollTo({
          behavior: 'smooth',
          top: lastState.scrollPos.y,
          left: lastState.scrollPos.x,
        })
      }
    } catch (error) {
      // The saved snapshot was corrupt -- discard it
      localStorage.removeItem('_last_')
    }
  }
})

This is only a rough sketch to get you started; a production solution would need to be more robust and secure.

The great thing about JavaScript is that it can tap into separate technologies without you ever having to learn a new language.

We can leverage service workers (which live in a context separate from the DOM) to pre-cache resources and intercept requests so that pages receive faster response times. And because the work happens on a separate thread, it doesn't block the main UI thread.

There are many other caching strategies you can implement with service workers to provide a more robust experience for your users.

In case you aren't familiar with workers, here is a short example of what the syntax looks like when working with them:

For dedicated workers:

window.addEventListener('load', function (evt) {
  // Fires up our background dedicated worker
  const worker = new Worker('./myWorker.js')
})

For service workers:

window.addEventListener('load', function (evt) {
  // Fires up our service worker registration
  if ('serviceWorker' in navigator) {
    navigator.serviceWorker
      .register('./serviceWorker.js') // path to your worker file
      .then((registration) => {
        console.log('Service worker registered!', registration)
      })
  }
})

For shared workers:

window.addEventListener('load', function (evt) {
  // Fires up our shared worker
  const sharedWorker = new SharedWorker('mySharedWorker.js')
})

Workers run in a separate context from the page, so their code needs to live in its own files:


// Inside the service worker's file
self.addEventListener('fetch', function (fetchEvent) {
  console.log(`Received fetch event from the front end!`, fetchEvent)
})


// Inside the dedicated worker's file (myWorker.js)
self.addEventListener('message', function (evt) {
  console.log(`Received message from the front end!`, evt)
})

You can learn more about workers in the MDN Web Workers documentation.

Some great examples of offline experiences are Spotify and Netflix. Most of the time I hardly notice when my phone has no signal while listening to Spotify, thanks to its caching.

"Faking" Good Performance

The power of caching shines very well in "faking" things. The word "faking" has a pretty bad connotation when it's used in the real world, but in the context of the DOM it's a positive! The number one (arguably) goal of web apps is to make our users happy, and by faking good performance you actually help users feel they're in good hands.

Here are some ways caching can help fake good performance for users:

1. Serving placeholders while images load

Users shouldn't be forced to wait for images to load in order to feel at home on our pages. By substituting a large image with a lightweight placeholder (such as a traced SVG) while the real one finishes loading, we give the user a smooth experience right away. We can either cache the actual image so it loads instantly the next time the user visits, or cache its dimensions, which lets us instantly display something like a silhouette at the correct size while it loads.

If you aren't familiar with what a traced SVG is: it's a simplified, single-color outline of the original image that can be rendered instantly in its place while the full image loads.
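To illustrate the dimensions-caching idea, here is a minimal sketch (the storage key scheme and helper names are illustrative, not from any particular library): after an image loads once, we remember its natural size so that on the next visit we can reserve a correctly-sized placeholder box before the image arrives.

```javascript
// Sketch: remember an image's natural dimensions after its first load so
// that repeat visits can show a correctly-sized placeholder immediately.
// The `img-dims:` key prefix and function names here are hypothetical.

function placeholderStyleFor(src, cache) {
  // `cache` is any string -> string store (e.g. window.localStorage)
  const saved = cache.getItem(`img-dims:${src}`)
  if (!saved) return null
  const { width, height } = JSON.parse(saved)
  // A fixed aspect-ratio box prevents layout shift while the image loads
  return { aspectRatio: `${width} / ${height}`, background: '#eee' }
}

function rememberDimensions(img, cache) {
  // Call this from the image's `load` event handler
  cache.setItem(
    `img-dims:${img.src}`,
    JSON.stringify({ width: img.naturalWidth, height: img.naturalHeight })
  )
}
```

In the browser you would call `rememberDimensions(img, localStorage)` inside the image's `load` listener, and apply the returned style object to a wrapper element before setting the real `src`.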


2. Lazy loading

Lazy loading is a popular term in web development. It is a useful strategy that requests data only when the client actually needs it. Caching can help speed up lazy loading if we cache the resources ahead of the time they will be requested. This is an important concept because a lot of the data in web apps is almost never requested by the client. Lazy loading helps you think about what to cache so that it only gets consumed when asked for. You have most likely interacted with a component on a web page making use of this concept.

For example, any web page rendering some form of pagination is an implementation of this approach.
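As a sketch of that idea, here is a hypothetical paginated cache: each page of data is fetched only the first time it is requested, then served from memory afterwards. `loadPage` stands in for whatever request function you actually use (axios, fetch, etc.).

```javascript
// Sketch: a paginated list that only fetches a page when it is first
// requested, then serves it from an in-memory cache on repeat requests.

function createPaginatedCache(loadPage) {
  const pages = new Map()

  return async function getPage(pageNumber) {
    if (pages.has(pageNumber)) {
      return pages.get(pageNumber) // cache hit: no request made
    }
    const data = await loadPage(pageNumber)
    pages.set(pageNumber, data)
    return data
  }
}
```

Wiring `getPage` to a pagination control's next/previous buttons gives you lazy loading with instant back-navigation for free.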

3. Reusing request responses

Earlier we talked about service workers, and it's worth mentioning that this is where they truly shine in faking a great, performant app for your users.

At the company I work for, I was astonished at the positive feedback we received after implementing cached responses via service workers. Regular users (by which I mean people who are not developers) have no idea that the pages behind the "instant responses" they're getting are simply being picked up from an earlier browser session, because our background workers cached their visits.

Another important takeaway is that when we serve cached responses we also skip the extra network requests (assuming the content doesn't change often), which is a plus for saving bandwidth and resources.
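The decision a service worker makes here is commonly called a cache-first strategy. Here is a rough sketch, with the Cache API modeled as injected `cache.match`/`cache.put` functions so the logic stands on its own; in a real service worker the cache would come from `caches.open()` and `fetchFn` would be the global `fetch`.

```javascript
// Sketch of the "cache-first" strategy a service worker can apply in its
// fetch handler: answer from the cache when possible, otherwise hit the
// network once and store the result for next time.

async function cacheFirst(request, cache, fetchFn) {
  const cached = await cache.match(request)
  if (cached) {
    // Serve the stored response instantly and skip the network round trip
    return cached
  }
  const response = await fetchFn(request)
  // Store a copy so the next identical request is answered from the cache
  await cache.put(request, response)
  return response
}
```

Inside a service worker this would typically be wrapped in a `fetch` listener via `event.respondWith(...)`.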

Faster functions

We can make our functions process data faster over time by caching their results, which can effectively become a major performance boost in our applications.

As engineers it is an absolute must to consider which portions of our data should be cached. It's also important to know which caching strategy is best used depending on the situation.

In particular we should consider:

  1. When to update or invalidate data from our cache. It is not uncommon in web development to mishandle caching when it comes to maintaining an up-to-date cache.

  2. Saving frequently accessed data that we can retrieve for use later when it is unnecessary to perform time-consuming or expensive operations. Users are important, so we also need to consider updating the cache periodically so that they can see the latest information.

For example, when we have a function responsible for retrieving and returning data, we can use a strategy called memoization to pull the data from a cache if it was already fetched before:

const axios = require('axios')

let fetchCount = 0

function memoize(fn) {
  const cache = {}

  return async function (key, ...args) {
    if (cache[key]) {
      return cache[key]
    }
    return (cache[key] = await fn(...args))
  }
}

const fetchDogs = memoize(function (page) {
  if (typeof page !== 'number') page = 1
  return axios
    .get(`https://example.com/api/dogs?page=${page}`) // hypothetical endpoint -- the original URL was omitted
    .then((response) => {
      fetchCount++
      return response.data
    })
    .catch((err) => {
      throw err
    })
})

async function start() {
  try {
    let dogs = await fetchDogs(`dogs_page_11`, 11)
    return dogs
  } catch (error) {
    const err = error instanceof Error ? error : new Error(String(error))
    console.error(`Error: ${err.message}`, err)
  }
}

If we spam this function, it performs the fetch only the first time and returns the cached data instantly on every subsequent call:

start()
  .then(() => start())
  .then(() => start())
  .then(() => start())
  .then(() => start())
  .then(() => start())
  .then(() => start())
  .then(() => start())
  .then(() => start())
  .then(() => start())
  .then((dogs) => {
    console.log(`Fetched dogs`, dogs)
    console.log(`Fetched ${fetchCount} times`)
  })

This strategy is a basic form of the cache-aside pattern; once you add an invalidation strategy, such as a TTL, it starts to resemble the full, real-world version.
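To address the invalidation side, here is a minimal sketch of the memoize helper above extended with a time-to-live (the helper name and API are illustrative): each cached entry records when it expires, and an expired entry triggers a fresh computation.

```javascript
// Sketch: memoization with a TTL so cached entries invalidate themselves.
// An entry is served only while `Date.now()` is before its expiry time.

function memoizeWithTTL(fn, ttlMs) {
  const cache = new Map()

  return function (key, ...args) {
    const entry = cache.get(key)
    if (entry && Date.now() < entry.expiresAt) {
      return entry.value // fresh cache hit
    }
    // Missing or expired: recompute and reset the expiry clock
    const value = fn(...args)
    cache.set(key, { value, expiresAt: Date.now() + ttlMs })
    return value
  }
}
```

Choosing `ttlMs` is the trade-off discussed earlier: longer TTLs save more work but risk serving stale data, while shorter TTLs keep users up to date at the cost of more recomputation.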

Real world scenarios

  • Gatsby caches builds as JSON objects which get picked up on subsequent builds to make the build process quicker

  • SWR takes advantage of an HTTP cache invalidation strategy called stale-while-revalidate to provide quick responses while keeping the data up to date

  • The Chrome V8 engine caches compiled code in three stages (cold, warm, and hot runs), which is fast and efficient


And that concludes this post! I hope you found it valuable, and look out for more in the future!



© jsmanifest 2023