How we Increased Search Traffic by 20x in 4 Months with the Next.js App Router


By Adam Fortuna

22 min read

Get ready for a highly technical blog post. 😂 If you’re not here for code, you might want to check out our Trending Books page, or The 2023 Year in Books.

This post will focus on the technical decisions we’ve made on Hardcover over the last year – which are starting to pay huge dividends! The traffic from Google speaks for itself.

Google Search Engine Traffic over the last 16 months

The blue line is clicks from search traffic, while the purple line is the number of times Hardcover was shown to users (with the left axis measuring clicks).

When we launched Hardcover in 2021, it was the first React.js (and Next.js) application I’d ever worked on. I’ve shipped a ton of code in jQuery, Backbone.js, Ember.js, Vue.js, and Angular.js, but this was my first time working in React. I made all the typical mistakes that new React developers make, many of which I’ll go over in this post.

At first these mistakes didn’t prevent Google from indexing and surfacing Hardcover in search results. However, the more we added, the slower the experience got. I wasn’t aware of dynamic component loading (using lazy or dynamic). That meant that when you loaded any page on Hardcover you’d get a TON of code that wasn’t needed to render the page you were seeing.

The result of all of this additional code was that our Google Pagespeed took a nosedive. Performance and Best Practices were in the red, with scores of around 40 (out of 100).

In August we launched a major revamp of the entire site. I became more than a little obsessed with performance. When we finally launched on August 15, 2023, our Google Pagespeed was nearly perfect!

Google Pagespeed… soooo close to perfect!

For the homepage, the download size was reduced from 500k to 80k. That’s a lot less JavaScript to download, but also a lot less to run.

This update focused not just on UX, but on making Hardcover fast. I’m aiming for 100% PageSpeed on Google, no layout shift, and an instant initial page load with as much cached on CDNs close to the user as possible.

Why is this Important?

Speed plays a huge role in how well any site can do. Besides making a better experience for users, Google and other search engines weigh speed heavily when deciding where to rank your site.

Earlier this year we saw a troubling trend: our search clicks dropped significantly. It turns out our PageSpeed score had declined due to some technical changes I unknowingly made, combined with some changes to Google’s ranking algorithm.

Hardcover SEO traffic

How much speed plays into search engine results is a complex topic. As the Reddit discussion of this post pointed out, speeding up the site likely wasn’t the only thing that improved our search traffic. We also restructured some pages with new data, which could have improved their content and structure. And we’ve continued to gain referring domains, which Google treats as a vote of confidence and which also helps with search.

SEO is a complex topic, and it’s notoriously difficult to understand how any single cause translates into an effect. Still, there does seem to be a correlation between traffic and page size in both directions. In the months before the App Router update, I added more complexity and JavaScript to the site, and in that time our search traffic went down, even as our referring domains went up. There are other factors besides page speed at play here.

This correlation on the way down made me realize: maybe we need to pay more attention to our site speed. That started me on a quest: how fast can we get Hardcover to load?

Our Tech Stack: A Quick Overview

Before we get into the changes we’ve made for performance, you need a little context. Hardcover isn’t just a Next.js app. We’re a Next.js, Ruby on Rails, Hasura, Typesense, Sidekiq, Postgres, Loops, Google Cloud app. 😅

Our Next.js site is responsible for everything the user sees but almost none of the API. Basically it’s the view layer of this entire system.

OK, with that out of the way, here’s how we sped things up.

1. Render It All On the Server

When you request Hardcover’s homepage (and, increasingly, other pages across the site), what you see is rendered entirely on the server by the Next.js 14 App Router.

That wasn’t always the case. Up until August of this year, we were using the Pages router – usually without any server props.

The initial idea was that the site would be 100% statically generated, with all data fetched from the API on the client. This would let us easily transition to mobile apps built with React Native using the same setup.

Once we realized that Capacitor.js could wrap our website, that advantage became meaningless. We could just develop a website and wrap it with Capacitor. We released mobile apps on Android and iOS in March of 2023 and have focused on building a solid experience on both ever since.

Here’s what a typical page request looked like before our most recent update:

  1. User: Loads a page, say The Way of Kings.
  2. Hardcover: Sends the same HTML for the book page to the client.
  3. User: Requests their API token from the Hardcover API.
  4. User: Requests info about their current user (to show their avatar in navigation).
  5. User: Requests info about The Way of Kings.
  6. User: Requests info about their status for this book.

In this case the Next.js app isn’t doing much. The Book Page HTML/JS sent down to the user was the same for every page, then on the client side we’d make API requests to get the data to show. It worked, but it meant a bunch of API calls before the user could see anything.

If you load this page today when not logged in, you’ll see there are zero API requests. Everything you see is sent in the initial HTML by the server! Here’s the new flow for this.

  1. User: Loads a page, say The Way of Kings.
  2. Next.js: Processes this page in two parts.
    • For the layout and wrapper of the page, Next.js determines whether the user is logged in and shows a different header if they are.
      • The layout also includes the user’s API token (either a guest token or one tied to their account).
    • For this route, Next.js caches all network requests on this page for an hour, returning the same HTML for the book page to every request during that time.
  3. User’s Browser: Reads the initial HTML (which may include their avatar if they’re logged in, otherwise a login link).
    • Guest: Nothing more to do! The initial HTML from the server has everything.
    • Logged in: The returned HTML contains sections that only show up when logged in (your status for this book, friends’ activity, similar readers, match percentage, etc.).

Since the book route is cached for an hour, it further speeds everything up. Currently this is cached using export const revalidate = 3600; for the route, however we’d like to fully cache the entire route.
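Conceptually, revalidate puts a time-based cache in front of the rendered page. Here’s a rough plain-JavaScript sketch of the idea (a simplification, not Next.js internals: real ISR also serves the stale page while re-rendering in the background):

```javascript
// Rough sketch of time-based revalidation (NOT Next.js's actual internals).
// The cached render is reused until it's older than ttlSeconds; the next
// request after that triggers a fresh render.
function createRevalidatingCache(render, ttlSeconds, now = Date.now) {
  let cached = null; // { html, renderedAt }
  return function get() {
    const fresh = cached && now() - cached.renderedAt < ttlSeconds * 1000;
    if (!fresh) {
      cached = { html: render(), renderedAt: now() };
    }
    return cached.html;
  };
}

// With a one-hour TTL, every request in the same hour gets identical HTML.
let renders = 0;
const getBookPage = createRevalidatingCache(() => `render #${++renders}`, 3600);
```

With this model, a CDN or the framework only pays the render cost once per hour per route, no matter how many requests come in.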

Even though this page is generated on the server, it includes a number of client components using Islands Architecture (more on this later).

Now the end user needs 4 fewer API requests to render this page (!). It also means that Google and other search engines have 4 fewer points of failure.

What this helps with: Cumulative Layout Shift, Largest Contentful Paint, Avoid large layout shifts, Minimize main-thread work, Reduce JavaScript execution time, Avoid long main-thread tasks.

You might be asking: “But there’s dynamic data on this page! How can it be cached?” There are a few solutions for that.

2. Fetch Server Side, Hydrate Client Side

If you’re logged into Hardcover, you’ll see your avatar in the top right of every page. Some of the navigation links are also dynamic based on your username (like /@adam, /@adam/books and /@adam/lists).

We could render this server side for logged in users and that would work. We could even do a full page reload ( window.location = window.location.href ) when people change their avatar or their username.

We were initially doing that, but there was a problem with Capacitor. If you set window.location from Capacitor, it wouldn’t reload the page; it would exit the app and open the current page in a web browser. That solution was out.

So how do we start the page with these links but also allow them to be changed and load them without a full page reload?

The solution came in the form of a new feature of the Apollo Client library (the library we use to fetch data) called useFragment. Solving this took me WEEKS of trial and error, but I’m happy with the result.

Our solution starts in our layout file. Here’s what that template looks like. Notice the <CurrentUserLoader /> which is doing a lot of work.

app/layout.tsx

<html>
  <body>
    <Providers>
      <CurrentUserLoader />

      <Nav />
      {children}
      <Footer />

      <SharedPlaceholders />
      <BackgroundManager />
    </Providers>
  </body>
</html>

components/background/CurrentUserLoader.tsx

import { Suspense } from "react";
import { loadCurrentSession } from "queries/users/loadCurrentSession";
import CurrentUserClientLoader from "./CurrentUserClientLoader";

// Loads everything about the logged in user on the client side
export default async function CurrentUserLoader() {
  const { session, user } = await loadCurrentSession();

  return (
    <Suspense>
      <CurrentUserClientLoader session={session} user={user} />
    </Suspense>
  );
}

Up to this point everything has happened entirely on the server.

This last file (CurrentUserLoader.tsx) has one responsibility: loading the current user and passing it to a client component. loadCurrentSession (not shown) gets the user’s info from their cookie and hits our GraphQL API for all the data needed for the user.

This includes their username and avatar, but also their status on every book they’ve ever read. More on why we need that later.

This is passed into the CurrentUserClientLoader component, the bridge between the server side and the client side. This file does a lot.

components/background/CurrentUserClientLoader.tsx

"use client";

import { Suspense, lazy, useEffect, useRef } from "react";
import { useDispatch } from "react-redux";
import { currentUserActions } from "features/currentUser/currentUserSlice";
import { UserType } from "types";
import { HardcoverSession } from "app/(api)/api/auth/[...nextauth]/options";
import { bootstrapUserByUserId } from "queries/users/bootstrapUserById";
import { getClient } from "lib/apollo/client";

const NotificationsUpdater = lazy(() => import("./NotificationsUpdater"));
const CurrentUserClientManager = lazy(
  () => import("./CurrentUserClientManager")
);

// Loads everything about the logged in user on the client side
interface Props {
  session: HardcoverSession;
  user?: UserType;
}
export default function CurrentUserClientLoader({ session, user }: Props) {
  const initialized = useRef(false); // Prevents duplicate loading for some reason
  const loaded = useRef(false);
  const dispatch = useDispatch();

  // This will load all bootstrapped data into Apollo's fragment cache
  // Side note:
  //   I'd love to get rid of this and hand off the server cache
  //   to the client cache, but that's not currently possible.
  function loadFragmentCache() {
    getClient().writeQuery({
      query: bootstrapUserByUserId,
      data: { user },
      variables: {
        userId: user.id,
      },
    });
  }

  // Set the session and user in Redux
  useEffect(() => {
    if (!initialized?.current) {
      initialized.current = true;
      if (user) {
        loadFragmentCache();
      }

      dispatch(currentUserActions.setSession(session));
      dispatch(currentUserActions.setInitialUser(user as UserType));
      loaded.current = true;
    }
  }, []);

  if (!loaded) {
    return false;
  }

  return (
    <Suspense>
      <CurrentUserClientManager />
      <NotificationsUpdater />
    </Suspense>
  );
}

In this file, we’ve handed our data from the server over to the client. This handles three important steps:

  • Load the data about the user into the Apollo Cache
  • Load the current user into Redux
  • Load a client component which will keep Apollo Cache and Redux in sync.

There’s a lot going on here, but those are the important bits. We defer as much of this as we can using Suspense so that the initial page load isn’t blocked and we can load more important JavaScript while this is running. It also means that the CurrentUserClientManager and NotificationsUpdater won’t be downloaded unless the user is logged in.

The last piece (code shown next) is the client component which will keep Redux in sync with Apollo’s cache. This means that when a user changes their username or avatar, we’ll update it here.

There are a bunch of places where a user makes changes to their user info. We considered trying to update it at each of those places. Having it here in one place makes it less likely we’ll miss one and throw the entire user state off.

The “magic” of this is the useFragment call. Because we already set the cache in the previous component, this call will fetch that fragment without needing to make an API call.

However, if you’re using the site and you log in, we’ll use this to make that initial call and fill the cache. It’s incredibly fast, without even needing a page reload.

components/background/CurrentUserClientManager.tsx

"use client";

import { useEffect, useRef } from "react";
import { useDispatch, useSelector } from "react-redux";
import { useFragment, useQuery } from "@apollo/client";
import {
  getReloadUser,
  getTokenSelector,
  getUserId,
} from "features/currentUser/currentUserSelector";
import { useCurrentSession } from "hooks/useCurrentSession";
import { currentUserActions } from "features/currentUser/currentUserSlice";
import OwnerFragmentCompiled from "queries/users/fragments/OwnerFragmentCompiled";
import { UserType } from "types";
import { bootstrapUserByUserId } from "queries/users/bootstrapUserById";

// Loads everything about the logged in user on the client side
export default function CurrentUserClientManager() {
  const dispatch = useDispatch();
  const userId = useSelector(getUserId);
  const token = useSelector(getTokenSelector);
  const { resetSession } = useCurrentSession();
  const refreshing = useSelector(getReloadUser);
  const startedRefresh = useRef(false);

  useEffect(() => {
    if (refreshing) {
      startedRefresh.current = true;
    }
    if (!refreshing && startedRefresh.current) {
      startedRefresh.current = false;
    }
  }, [refreshing]);

  const { data: currentUserData, complete } = useFragment({
    fragment: OwnerFragmentCompiled,
    fragmentName: "OwnerFragment",
    from: {
      __typename: "users",
      id: userId || 0,
    },
  });

  const { loading } = useQuery(bootstrapUserByUserId, {
    fetchPolicy: "cache-and-network",
    skip: !userId || (complete && !refreshing),
    variables: {
      userId,
    },
  });

  // Reset the session if the user logs out or logs back in
  useEffect(() => {
    if (loading) {
      return;
    }

    const currentUser = {
      ...currentUserData,
      notificationsCount: currentUserData.notifications?.aggregate?.count,
    };

    if (complete && userId !== currentUser?.id) {
      resetSession();
    } else if (token) {
      // Done loading user, or user cache changed
      if (userId && currentUser?.id) {
        dispatch(currentUserActions.setUser(currentUser as UserType));
      }
      // No current user, done loading
      if (!userId) {
        dispatch(currentUserActions.setUser(null));
      }
    }
  }, [token, userId, currentUserData, startedRefresh?.current]);

  return false;
}

I can’t claim this is the best way to handle this scenario, but it’s the best one we’ve found. It has an added bonus too: saving state for every book you’ve ever read (that’ll be important for #3 next).

What this helps with: Cumulative Layout Shift, Largest Contentful Paint, Avoid large layout shifts, Minimize main-thread work, Reduce JavaScript execution time, Avoid long main-thread tasks.

3. Bootstrap the Most Important Data

The core feature of Hardcover is letting readers track which books they’ve read and want to read. This is our killer feature, and we wanted to make it as good as possible.

Behind the scenes we have a table in our PostgreSQL database called user_books. This table has user_id, book_id and status_id columns. We show your status, or a gray button if you haven’t interacted with this book before.

There’s a lot to this little feature.

For starters, if you see two of these buttons for the same book, they should stay in sync.

The buttons on the left correspond to the same books on the right.

But the biggest question was “where do we load this data?”. Originally we’d load a reader’s status for a book in the same query that loaded data about the book. That worked when we were loading everything on the client side. Now that we’re loading that data on the server side, the same approach would make the page impossible to cache.

The solution is a technique called bootstrapping data. In that initial user load, we also load their status for every book. Even for readers with 10,000 books saved, this takes less than 100ms, since each row is just three integers.
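To get a feel for why the payload stays small, here’s a sketch of the shape (hypothetical, not Hardcover’s actual API response) and how you’d index it client-side for O(1) lookups:

```javascript
// Hypothetical bootstrap payload: one { userId, bookId, statusId } triple
// per saved book. Three small integers per row compresses very well.
const bootstrap = [
  { userId: 1, bookId: 101, statusId: 3 }, // e.g. "read"
  { userId: 1, bookId: 102, statusId: 1 }, // e.g. "want to read"
];

// Index by a composite "userId:bookId" key so any book button on the page
// can look up its status instantly, without another API call.
function indexByUserBook(entries) {
  const index = new Map();
  for (const { userId, bookId, statusId } of entries) {
    index.set(`${userId}:${bookId}`, statusId);
  }
  return index;
}

const statuses = indexByUserBook(bootstrap);
statuses.get("1:101"); // → 3
statuses.get("1:999"); // → undefined (never interacted with)
```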

src/queries/user/fragments/OwnerFragment.ts

export default `
  fragment UserBookStatusFragment on user_books {
    bookId:book_id
    userId:user_id
    statusId:status_id
  }

  fragment OwnerFragment on users {
    id
    cachedImage:cached_image
    name
    username
    flair
    pro
    librarianRoles:librarian_roles

    user_books {
      ...UserBookStatusFragment
    }
  }
`;

Next, we need to let Apollo know to look up a book’s status not by id, but by a composite key of userId and bookId. This lets us load any user’s statusId for any bookId.

src/lib/apollo/cache.ts

import { NextSSRInMemoryCache } from "@apollo/experimental-nextjs-app-support/ssr";
import { createFragmentRegistry } from "@apollo/client/cache";
import fragments from "queries/fragments";

const cache = new NextSSRInMemoryCache({
  fragments: createFragmentRegistry(fragments),
  typePolicies: {
    user_books: {
      keyFields: ["userId", "bookId"],
    },
  },
});

export default cache;

We do this in a bunch of places across Hardcover. The same approach works for “liking” something: if you like something on one page, or we load your like on one page, we can mark it as liked on another.
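A hypothetical likes table would get the same composite-key treatment (table and column names here are assumed, not our actual schema):

```javascript
// Sketch: with a typePolicy like this, Apollo caches each like under the
// pair (userId, likeableId), so a like loaded on one page is readable
// from the cache anywhere else it appears.
const likeTypePolicies = {
  user_likes: {
    keyFields: ["userId", "likeableId"],
  },
};
```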

This comes together in the Book Button itself, which takes a bookId and shows a button reflecting the current user’s status (or a placeholder if they’re not logged in).

We call this button everywhere. Since it’s shown everywhere, it’ll sometimes render before the user’s session has even loaded into Redux. We account for that with a loading state that quickly updates.

components/BookButton/index.tsx

"use client";

import { useFragment } from "@apollo/client";
import { useSelector } from "react-redux";
import UserBookStatusFragmentCompiled from "queries/userBooks/fragments/UserBookStatusFragmentCompiled";
import { userLoadingState } from "features/currentUser/currentUserSelector";
import { useEffect, useState } from "react";
import { UserBookStatusType } from "types/UserBookType";
import StatusButton from "./StatusButton";

export type BookButtonSizeType = "sm" | "md";
export interface Props {
  bookId: number;
  referrerUserId?: number;
  size?: BookButtonSizeType;
  onClick?: any;
}

export default function BookButton({
  bookId,
  referrerUserId,
  size = "md",
  onClick,
}: Props) {
  const { currentUserId, userLoaded } = useSelector(userLoadingState);
  const [userBook, setUserBook] = useState<UserBookStatusType>({
    userId: currentUserId,
    bookId,
    statusId: 0,
  });

  const { data, complete } = useFragment({
    fragment: UserBookStatusFragmentCompiled,
    fragmentName: "UserBookStatusFragment",
    from: {
      __typename: "user_books",
      bookId,
      userId: currentUserId || 0,
    },
  });

  useEffect(() => {
    setUserBook(
      data?.userId
        ? data
        : {
            userId: currentUserId,
            bookId,
            statusId: 0,
          }
    );
  }, [currentUserId, data, complete, bookId]);

  return (
    <StatusButton
      loading={!userLoaded}
      userBook={userBook}
      size={size}
      referrerUserId={referrerUserId}
      onClick={onClick}
    />
  );
}

Side note: we have a concept of a referrer for a book. This allows readers to see which books they’re actually influencing other people to be excited about.

Of all the parts of Hardcover, this button and the drawer that shows up when you click on it are my favorite feature. It lets us do things that wouldn’t have been performant otherwise, like loading your status for each book in search results and showing your status on book covers (with a small green checkmark) throughout the site.

This is possible because your status for each book is already loaded!

On the book page, we show all the other books in this series. This code is generated on the server, then hydrated on the client. The BookCover component will check your status on a book and add a small green checkmark for books you’ve read.

Doing this without the fragment cache would’ve been a nightmare. I know because we have to do something similar for Match Percentage (our 0%–100% score for how much we think you’ll like each book). Luckily we show that in fewer places, so it’s easier to manage.

You can use this same technique in your own applications! If there’s data that you show across the site, consider bootstrapping it on load.

What this helps with: Reduce unused JavaScript, Minimize main-thread work, Reduce JavaScript execution time.

4. Create Ghost Components

One library we use everywhere is HeadlessUI. We use Menu and Popover for our dropdowns, Combobox for autocomplete, Dialog for our search Modal and more.

But loading this library on every page adds another 50kb or more of JavaScript and additional code that needs to be compiled on every page request. That might not seem like a lot, but it was enough to knock 10 points off our Mobile Google PageSpeed Score.

When you load a page now, we don’t download HeadlessUI until you interact with a component that needs it. Here’s how this looks to the user:

Watch for the spinner on click

Notice the loading indicator for a moment. That pause is your browser downloading all the JS needed to expand the menu.

This might not seem like a lot, but it adds up!

Each of these components works like this. We show a basic button initially, but if they click on it we load and show the full version. Here’s how that works for the Explore dropdown shown above.

components/nav/DesktopNav/DesktopLinks/ExploreMenuLink.tsx

"use client";

import { Suspense, lazy, useState } from "react";
import ExploreLinkMenuInactive from "./ExploreLinkMenuInactive";

const ExploreLinkMenuActive = lazy(() => import("./ExploreLinkMenuActive"));

export default function ExploreLink() {
  const [clicked, setClicked] = useState(false);

  // Don't load Headless UI and the entire dropdown component unless it's clicked on
  if (clicked) {
    return (
      <Suspense fallback={<ExploreLinkMenuInactive loading />}>
        <ExploreLinkMenuActive />
      </Suspense>
    );
  }

  return <ExploreLinkMenuInactive onClick={() => setClicked(true)} />;
}

In addition to less JavaScript, this is also less HTML sent down to the browser. That allows the browser to parse the page faster and reduce the dreaded “Avoid an excessive DOM size” performance issue.

Now, you might be thinking “but that’s a client component!”. That’s actually OK! The initial HTML that’s sent down to the user will include the rendered HTML of the ExploreLinkMenuInactive button. After the JavaScript of the page is downloaded, the buttons will become clickable.

This prioritizes what the user sees before what they can do.
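Stripped of React, the ghost-component pattern boils down to a tiny state machine: a cheap placeholder until the first click, then a download, then the real component. A plain-JavaScript sketch (the names are mine, not our actual code):

```javascript
// "inactive": cheap placeholder rendered in the initial HTML.
// "loading": first click started downloading the real component's chunk.
// "active": chunk loaded; render the full component from now on.
function createGhost(loadChunk) {
  let state = "inactive";
  return {
    get state() {
      return state;
    },
    click() {
      if (state !== "inactive") return Promise.resolve();
      state = "loading"; // this is when the user sees the brief spinner
      return loadChunk().then(() => {
        state = "active";
      });
    },
  };
}
```

In React the same transitions fall out of useState plus lazy and Suspense, as in the ExploreLink component above.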

What this helps with: Reduce unused JavaScript, Minimize main-thread work, Reduce JavaScript execution time, Avoid an excessive DOM size.

5. Optimize Font Loading

I overlooked this one for a while, but the solution was embarrassingly simple.

We use two fonts on Hardcover: Inter (sans serif from Google Fonts) and New Spirit (serif from Adobe).

Originally we’d load our global.css file which would load another CSS file from Adobe, which would then load the fonts.

Google has a name for this problem: “Avoid chaining critical requests”. In order for a page to load, we needed to wait for 4 chained requests to complete!

Next.js to the rescue! It has a built-in solution for this exact problem: next/font. It does all the work of loading these fonts and exposing each one as a CSS variable, which we attach to the root element. We can then use those variables in our TailwindCSS configuration.

app/layout.tsx

import "styles/globals.css";

import { Inter } from "next/font/google";
import localFont from "next/font/local";


const inter = Inter({
  subsets: ["latin"],
  weight: ["400", "600", "700"],
  display: "swap",
  variable: "--font-sans",
});

const newSpirit = localFont({
  src: [
    {
      path: "../../public/fonts/new-spirit-400.woff2",
      weight: "400",
      style: "normal",
    },
    {
      path: "../../public/fonts/new-spirit-600.woff2",
      weight: "600",
      style: "normal",
    },
  ],
  variable: "--font-serif",
});

export default async function RootLayout({children}) {
  return (
    <html className={`antialiased ${inter.variable} ${newSpirit.variable}`}>
     ...
    </html>
  );
}

We can configure Tailwind to use these fonts via their variables. That way the class font-serif is shorthand for New Spirit and font-sans for Inter.

tailwind.config.js

const defaultTheme = require("tailwindcss/defaultTheme");

module.exports = {
  theme: {
    extend: {
      fontFamily: {
        sans: [
          "var(--font-sans)",
          "Inter var",
          ...defaultTheme.fontFamily.sans,
        ],
        serif: ["var(--font-serif)", "new-spirit, palatino, serif"],
      },
    },
  },
};

next/font does two other things that are just cool. First, it adds preload tags at the TOP of your HTML’s head.

Rendered HTML

<link rel="preload" href="/_next/static/media/2a0c0940792da67a-s.p.woff2" as="font" crossorigin="" type="font/woff2">

This means your browser will start loading these fonts immediately, not at the end of a 4-chain series.

Second, Next.js adds the CSS for these fonts to the first CSS file that’s loaded. By the time that CSS is parsed, the fonts have already started preloading, so the browser can use them as soon as the CSS is read. The fonts load so fast now that I don’t even notice the font swap.

What this helps with: Avoid chaining critical requests, Largest Contentful Paint element, Total Blocking Time.

6. Remove Excess Providers

Providers in React are components you can wrap your entire application in. Their functionality is accessible from any component nested inside them, however deep.

In the initial version of Hardcover we abused this concept. We had a dozen providers, and would add them willy-nilly when we needed something. Whenever any of them re-rendered, the entire page would re-render. Sometimes that would even cause the page to be unusable.

In our migration from client side rendering to server side rendering, we narrowed down our providers to just three:

  • BugSnag – which we use for error tracking.
  • Apollo – our network layer and network caching.
  • Redux – our global state manager.

I’ve even considered replacing Redux with Zustand, but so far we haven’t needed to. We barely use Redux at all, aside from the current user, the state of the Book Drawer, and the state of the UI (e.g. is the Search Modal open?).

If there’s one place you should focus your attention, it’s your providers. According to my performance analysis (that’s next), this was one of the biggest areas where we needed (and still need) work.

Side note: I’ve played with the idea of dropping Apollo’s automatic setup and using its manual configuration instead. However, this comment made it clear Apollo is doing a lot more work than I realized, on both the server side and the client side.

What this helps with: Reduce unused JavaScript, Minimize main-thread work, Reduce JavaScript execution time.

7. Profile Your Application’s Performance with Chrome Developer Tools

If you’re like me, eventually you’ll run into the dreaded “Minimize main-thread work” diagnostic from Google Pagespeed. This is one of the harder ones to reduce.

Creating Ghost components will help to some extent, but you’ll likely want to do more.

One feature of Chrome I’d recommend learning is how to profile the performance of your application.

You can do this by navigating to the page you want to check, opening Chrome Developer Tools, then clicking the “Start profiling and reload page” button (the second button in the top left, which looks like a reload/refresh icon).

After a few seconds, once the page has completely reloaded, you can click stop.

Next, you’ll see an insanely detailed look at how your application runs.

If you’ve never looked at this before, it can be quite intimidating. The X-axis here is time: the longer the bar, the longer that function takes to execute. Each “box” is a function being called.

What’s great about this view is that the Y-axis shows which functions each function calls. You can dig down and see which functions of your application are taking the most time to complete.

The parts to look for first are sections with a red warning overlay (like the image above). That overlay indicates that Chrome considers the section a “long task”.
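Beyond DevTools, you can also catch long tasks at runtime with a PerformanceObserver. A small browser-oriented sketch (the longtask entry type only exists in browsers, so this no-ops elsewhere):

```javascript
// Report main-thread tasks over 50ms, the same threshold Chrome uses for
// its "long task" warnings. Returns the observer, or null if unsupported.
function observeLongTasks(report) {
  if (typeof PerformanceObserver === "undefined") return null;
  try {
    const observer = new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        report(`Long task: ${Math.round(entry.duration)}ms`);
      }
    });
    observer.observe({ entryTypes: ["longtask"] }); // browser-only entry type
    return observer;
  } catch {
    return null; // e.g. Node, or older browsers without "longtask" support
  }
}
```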

These long tasks are the ideal place to start when improving your performance. When I did this exercise on Hardcover, three areas stood out as having the longest runtime:

  • Providers – Hardcover had a BUNCH of providers. That included theme (dark vs light mode), error catching, the current user, next-auth, Apollo Client, Redux, and more.
  • Headless UI Components – We used these in the navigation as well as other parts of each page.
  • FontAwesome Icons – Icons we show everywhere.

Each of these has its own solution, all very specific to our application.

For providers, we narrowed our requirements down to just three (error catching, Apollo, Redux). Everything else we moved into a BackgroundProcesses component which is loaded last in the layout.

That file handles work asynchronously without slowing down page rendering. That includes theme management, mobile management, saving the referrer, Plausible Analytics, preloading resources, and more.

For Headless UI Components, we switched to using Ghost Components (#4 described above). This cut the render time from 50ms to 12ms while reducing the downloaded JavaScript by more than 50kb.

Performance for a Ghost Component

For FontAwesome icons, I went a little overboard. I couldn’t figure out a good solution (I’m curious for feedback on this one), so I ended up copying all the FontAwesome icons we use into our repository, loading them as SVGs, and passing them into a new custom component. Now there’s no overhead from the FontAwesome library, and each SVG is included in the HTML sent down.
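The resulting icon approach is roughly this idea (a sketch with a stand-in path; the real version uses the SVG path data copied from FontAwesome):

```javascript
// Sketch: icons live in the repo as raw SVG path data, rendered inline so
// no icon library ships to the client. The path below is just a square,
// standing in for real FontAwesome path data.
const ICON_PATHS = {
  square: "M0 0h16v16H0z",
};

function iconSvg(name, size = 16) {
  const d = ICON_PATHS[name];
  if (!d) throw new Error(`Unknown icon: ${name}`);
  return `<svg width="${size}" height="${size}" viewBox="0 0 16 16" fill="currentColor"><path d="${d}"/></svg>`;
}
```

Since each icon is plain markup, it renders in the server HTML with zero client JavaScript.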

Next Steps for SEO and Performance

There’s still more we need to improve. One of the biggest issues is that our Lists load everything on the client side. We’re working to restructure those to render server side as well. I’m excited about that switch, as it’ll let us add more sorting and filtering options too.

Once that fix is in, our main focus will turn from performance to content strategy. Check back in a year for an update on how that’s going. 😂
