WordPress term archives for your Astro site

In our last post, we talked about embracing new technologies without having to completely abandon existing infrastructure. For us, finding ways to embrace fast, flexible, and modern frontends while not having to abandon longstanding and well-loved WordPress content management structures for some of our clients has been an enjoyable endeavor.

A little on page templates.

In many ways, creating page templates for Astro projects is very similar to WordPress; as similar as a PHP template can be to an Astro template, that is.

But there are some key differences to consider…

In WordPress, we have a file structure within the theme that maps to the content being called. Your files and file structure don’t dictate the structure of your site; they just respond to a route that has been created and called by the application, WordPress. So, for example, if we have a taxonomy we’ve created for “topics”, the application will create an archive for topics. Creating a file called archive-topic.php doesn’t tell the system to create that route, only to use that template when that route is called. If that file didn’t exist, the system would look for a generic archive.php template and render the content with it.

With Astro, you have a little more control (some would read that as responsibility) over creating the structure of the site. We can create folders, dynamic routes, nesting, etc. to tell the application what the structure of the site will be and give it templates for building the content we define. In every file where content will be created, we add a getStaticPaths function that tells the application which content to build with that template so it can generate those pages.
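For example, a single dynamic route in Astro might look something like this; a minimal sketch where the route path and markup are illustrative, and the “resources” collection matches the one used later in this post.

---
// src/pages/resources/[slug].astro (a minimal dynamic route with getStaticPaths)
import { getCollection } from "astro:content";

export async function getStaticPaths() {
  const resources = await getCollection("resources");
  // One static page per resource, with the entry passed along as a prop.
  return resources.map((post) => ({
    params: { slug: post.data.slug },
    props: { post },
  }));
}

const { post } = Astro.props;
---
<h1>{post.data.title}</h1>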

And now for the archives.

So, when we use an application like Astro to create a frontend with our WordPress content, we have to do a little more work to create the term archives that are automatically created within WordPress.

  1. Create the file structure for routing the paths for the archives. When we add files here, we can take advantage of Astro’s conventions for creating dynamic routes as well as paginating them (pages > topic > [topic].astro & pages > topic > [topic] > [page].astro).
  2. Within our [topic].astro file, we’ll create a typical getStaticPaths build-out where we call all of the possible slugs (using a collection) for that route. However, rather than building out a page of content associated with that topic here (which you very well could within this template), I like to rewrite to the first page of our paginated archive response: return Astro.rewrite(`/topic/${topic}/1`); (see the sketch after this list).
  3. I would then use our [topic] > [page].astro to build out the paginated archives for each topic that we have. For this, we’ll pull both the topics and the collection of possible results (posts) into our getStaticPaths method in that file. We check the posts to see if they are associated with the current, dynamic topic and include them in the return if they are.
  4. At this point, we have all of the posts, but we haven’t separated them into chunks to display as paginated results. To do this, we use the paginate function on the return to chunk them and have Astro create the associated numbered pages.
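Here’s roughly what that [topic].astro file could look like; a sketch only, using the same “topics” collection that appears in the getStaticPaths further down.

---
// src/pages/topic/[topic].astro (a sketch of step 2 above)
import { getCollection } from "astro:content";

export async function getStaticPaths() {
  // One route per topic slug; no props needed since we immediately rewrite.
  const topics = await getCollection("topics");
  return topics.map((topic) => ({ params: { topic: topic.data.slug } }));
}

const { topic } = Astro.params;

// Rather than rendering topic content here, serve the first page of the
// paginated archive for this topic.
return Astro.rewrite(`/topic/${topic}/1`);
---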

Here is a breakdown of our paginated archive paths using this approach.

import { getCollection } from "astro:content";

export async function getStaticPaths({ paginate }) {

  const topics = await getCollection("topics");
  const resources = await getCollection("resources");

  return topics.flatMap((topic) => {

    // Keep only the resources tagged with the current topic.
    const filteredPosts = resources.filter((post) => {

      const topicNodes = Array.isArray(post.data.topics?.nodes)
        ? post.data.topics.nodes
        : [];

      const slugs = topicNodes
        .map((node) => node?.slug || "")
        .filter(Boolean);

      return slugs.includes(topic.data.slug);

    });

    // Chunk the filtered posts into pages of 16 and let Astro create the numbered routes.
    return paginate(filteredPosts, {
      params: { topic: topic.data.slug },
      props: {
        topic: topic.data.slug,
        name: topic.data.name,
      },
      pageSize: 16,
    });
  });
}
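In the body of [page].astro, the paginate() call above provides a page prop alongside the topic and name props we set. Here’s a minimal sketch of rendering it; the markup and the /resources/ URL pattern are illustrative.

---
// src/pages/topic/[topic]/[page].astro (template body below the getStaticPaths above)
const { page, name } = Astro.props;
---
<h1>{name}</h1>
<ul>
  {page.data.map((post) => (
    <li><a href={`/resources/${post.data.slug}`}>{post.data.title}</a></li>
  ))}
</ul>
{page.url.prev && <a href={page.url.prev}>Previous</a>}
{page.url.next && <a href={page.url.next}>Next</a>}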

Curious about other templates when switching to a headless approach? Drop us a line and we’d be happy to chat.

Get faster build times for content updates on your large content Astro site

When building headless, there are generally two approaches: Static Site Generation (SSG) and Server-Side Rendering (SSR). The essential difference between these two approaches is when the static content is generated. With SSG, all of the site content is turned into static files at the time the site is built. With SSR, pages are rendered as they’re requested – once a page has been requested (and cached), it can be served as a static file to future visitors of that page.

Each approach has its own drawbacks and benefits, but I love SSG for our headless websites. It makes for the most performant sites, and with a framework like Astro we can capitalize on some of the benefits of SSR that aren’t usually available with SSG (e.g., islands).
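For reference, the rendering mode is a single Astro config option, and islands are opted in per component with a client directive; the component name below is purely illustrative.

// astro.config.mjs ("static" is Astro's default output, shown explicitly here)
import { defineConfig } from "astro/config";

export default defineConfig({
  output: "static", // switch to "server" for fully on-demand (SSR) rendering
});

In a page, something like <TopicFilter client:load /> then hydrates as an interactive island while the rest of the page ships as static HTML.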

The biggest drawback with SSG is that the entire site has to be built at build time. If you have a site with a couple dozen pages and a few hundred blog posts, SSG builds wonderfully. However, a site with hundreds of pages, thousands of posts, and thousands of terms can take forever to build. Almost an hour in some of our test cases. These builds are not only slow, they can also cause performance issues on the server providing the content (WordPress in most of our cases) or memory issues on the server building the site (Netlify in most of our cases).

We can do better than that!

When we need to do a complete site build, we do have some limited options for improving performance. We can batch our requests and cache our source endpoints (WPGraphQL Smart Cache), as well as utilize Astro’s data store in the content loader.

But this can still be a drag for content editors who want a quick publish experience from their CMS. Getting a build that fully utilizes our cached data store in Astro while updating it with just the latest published/modified content is critical.

Getting incremental with Astro

So, for our objectives we need to:

  • have a way to trigger a build from content updates within WordPress
  • have builds triggered there tell our code that this is a content update
  • only update modified content within our data store
  • keep the ability for full data rebuilds when code updates trigger a deploy

Triggering builds within WordPress

We can trigger builds within Netlify via a webhook. When content is updated, you can fire that POST request to your webhook from the save_post action within WordPress. You could also use a plugin such as JAMstack Deployments to quickly configure your build hooks and build image URLs.

For the build hooks, Netlify allows us to append query parameters to modify some of their default behavior. For builds triggered via WordPress, I append the trigger_title parameter, which updates how that deploy is displayed in our Netlify dashboard and is also accessible within our Astro files for the content loader. So, the build hook we trigger specifically from WordPress would look like:

https://api.netlify.com/build_hooks/${hook_id}?trigger_title=wp-content-sync

When that hook is triggered, we can access the value as INCOMING_HOOK_TITLE via Astro’s import.meta.env:

const INCOMING_HOOK_TITLE = import.meta.env.INCOMING_HOOK_TITLE
  ? import.meta.env.INCOMING_HOOK_TITLE
  : "";

With this, we can conditionally modify our GraphQL requests to call just the last modified posts and not iterate our request to get all of the posts (like we would for a full site build).

// Recursively page through the WPGraphQL posts. Content-sync builds only need
// the first batch (the most recently modified posts); full builds fetch everything.
const postLoader = async (after = null, results = []) => {
  let hasNextPage = true;

  const query = `
    query GetPosts($first: Int!, $after: String, $orderby: PostObjectsConnectionOrderbyEnum! ) {
      posts( where: {orderby: {field: $orderby, order: DESC}}, first: $first, after: $after ) {
        pageInfo {
          hasNextPage
          endCursor
        }
        nodes {
          id
          title
          slug
          # ...additional post fields
        }
      }
    }`;

  const variables = {
    first: 2, // batch size per request; tune for your content volume
    after: after,
    // Content-sync builds pull the most recently modified posts first;
    // full builds simply page through everything by publish date.
    orderby: INCOMING_HOOK_TITLE === "wp-content-sync" ? "MODIFIED" : "DATE",
  };

  const url = "https://example.com/graphql";

  try {
    const response = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ query, variables }),
    });

    const result = await response.json();

    const posts = result.data.posts.nodes;
    const pageInfo = result.data.posts.pageInfo;

    hasNextPage = pageInfo.hasNextPage;
    after = pageInfo.endCursor;

    results.push(...posts);

    // Full builds keep paginating until every post has been fetched;
    // content-sync builds stop after the first (most recently modified) batch.
    if (INCOMING_HOOK_TITLE !== "wp-content-sync" && hasNextPage) {
      return postLoader(after, results);
    }

    return results;
  } catch (error) {
    console.error("Error fetching posts:", error);
    throw error;
  }
};
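To close the loop on the data store: here’s a rough sketch (not our exact implementation) of a content loader that keeps the cached store on content-sync builds and only upserts what postLoader returns, while clearing and rebuilding everything on a normal deploy. The loader name, the module path for postLoader, and the trimmed-down schema are illustrative.

// src/content.config.ts (a sketch using Astro's Content Loader API)
import { defineCollection, z } from "astro:content";
import { postLoader } from "./loaders/post-loader"; // hypothetical module exporting the function above

const INCOMING_HOOK_TITLE = import.meta.env.INCOMING_HOOK_TITLE || "";

const resources = defineCollection({
  loader: {
    name: "wp-resources",
    load: async ({ store, parseData, logger }) => {
      // Full builds start from a clean slate; content-sync builds keep the
      // cached entries and only overwrite what changed.
      if (INCOMING_HOOK_TITLE !== "wp-content-sync") {
        store.clear();
      }

      const posts = await postLoader();
      logger.info(`Upserting ${posts.length} post(s) into the data store`);

      for (const post of posts) {
        const data = await parseData({ id: post.slug, data: post });
        store.set({ id: post.slug, data });
      }
    },
  },
  // Schema trimmed to the fields queried above.
  schema: z.object({
    id: z.string(),
    title: z.string(),
    slug: z.string(),
  }),
});

export const collections = { resources };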

Pulling only this latest content, we can bring the build time for a site that normally takes 30+ minutes to build, with thousands of pieces of content plus archives, down to just a couple of minutes, while still maintaining full site builds when pushing code updates.