I rebuilt Paper Giant's SEO infrastructure in 3 days with Claude Code

I spent three days with Claude Code fixing the SEO on the Paper Giant website. The site is Next.js 15 with a Contentful headless CMS. It worked fine, but nobody had touched the site in years. It had no structured data, most pages were missing meta descriptions, there was a sitemap full of dead pages, and no real form tracking to speak of.

An annual analytics review made it clear organic traffic had dropped a lot, partly because AI Overviews are eating into click-through rates. So I blocked out three days and worked through it.

Day 1: Structured data, meta descriptions, sitemap

I added JSON-LD structured data to every page type — Organization and WebSite globally, Article on case studies, NewsArticle on news, Service on service pages, BreadcrumbList on sectors, CollectionPage on listings. The annoying part was dateModified — Contentful doesn’t expose _updatedAt by default, so I had to pipe it through the data transformation layer.
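The pattern is one small builder per schema type, rendered into a script tag from a server component. A minimal sketch of the Article case — the entry shape here (fields.title, fields.slug, sys.createdAt, sys.updatedAt) is an assumption about the transformation layer, not the site's actual types:

```typescript
// Hypothetical entry shape after the Contentful transformation layer
type CaseStudy = {
  fields: { title: string; slug: string };
  sys: { createdAt: string; updatedAt: string };
};

// Build the Article JSON-LD object for a case study page
export function articleJsonLd(entry: CaseStudy, baseUrl: string) {
  return {
    "@context": "https://schema.org",
    "@type": "Article",
    headline: entry.fields.title,
    url: `${baseUrl}/case-studies/${entry.fields.slug}`,
    datePublished: entry.sys.createdAt,
    dateModified: entry.sys.updatedAt, // the value piped through from Contentful
  };
}

// Rendered from a Next.js server component:
// <script
//   type="application/ld+json"
//   dangerouslySetInnerHTML={{ __html: JSON.stringify(articleJsonLd(entry, siteUrl)) }}
// />
```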

Most pages had no meta description, so Google was picking whatever it wanted. I wrote a helper that pulls the first meaningful sentence from each page’s content and truncates on word boundaries. Case study descriptions include the sector and services. Not perfect, but much better than empty.
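The truncation part of the helper might look something like this — a sketch, with the 155-character limit as an assumption; the real helper also folds in sector and services for case studies:

```typescript
// Collapse whitespace, then truncate on a word boundary at ~155 chars
export function metaDescription(text: string, maxLen = 155): string {
  const clean = text.replace(/\s+/g, " ").trim();
  if (clean.length <= maxLen) return clean;
  const cut = clean.slice(0, maxLen + 1);
  const lastSpace = cut.lastIndexOf(" ");
  // Fall back to a hard cut only if there's no space to break on
  return cut.slice(0, lastSpace > 0 ? lastSpace : maxLen).trimEnd() + "…";
}
```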

The sitemap had hundreds of URLs. A lot of them were pages for people who’d left, or thin pages that were basically just links with a sentence of commentary. I removed the dead ones, added noindex,follow to the thin ones, pulled those out of the sitemap, and split the rest into per-type sitemaps. Cut it by more than half.
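A sketch of the per-type piece: a pure builder whose XML a route handler per content type can serve, with dead and noindexed pages filtered out before it's called. The function name and handler path are hypothetical:

```typescript
// Emit one content type's sitemap, e.g. from app/sitemaps/case-studies.xml/route.ts
export function sitemapXml(urls: { loc: string; lastmod?: string }[]): string {
  const entries = urls
    .map(
      (u) =>
        `  <url><loc>${u.loc}</loc>${u.lastmod ? `<lastmod>${u.lastmod}</lastmod>` : ""}</url>`
    )
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${entries}\n</urlset>`
  );
}
```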

The site had also gone through a substantial IA redesign that changed service URLs. That's worth a separate post, but in short: we went from about 16 top-level services to 5, and the new set feels much cleaner to me, capturing the scope and breadth of what we can do.

I added permanent redirects from the old slugs to the new ones based on which old URLs still had organic traffic in GA4.
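In Next.js these live in the config's redirects() hook. A sketch — the slugs below are placeholders, not the real mapping:

```typescript
// next.config.ts sketch (Next.js 15 can load a TypeScript config)
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  async redirects() {
    return [
      // permanent: true issues a 308, which search engines treat like a 301
      {
        source: "/services/old-slug",      // placeholder: an old IA slug with GA4 traffic
        destination: "/services/new-slug", // placeholder: its new top-level home
        permanent: true,
      },
    ];
  },
};

export default nextConfig;
```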

Day 2: Performance, analytics, content strategy

Over the course of the sprint, I found a bunch of code inefficiencies. One case study page was fetching the entire Contentful space — 1,000 items, ~700KB — just to find 3 related case studies. I replaced it with a filtered query, and the page payload went from 700KB to 20KB.
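With the Contentful SDK the filtered version is just a narrower getEntries call. A sketch of the query — the content type and field ids ("caseStudy", "sector", "slug") are assumptions about this site's model:

```typescript
// Build the query for "3 related case studies in the same sector",
// then hand it to the Contentful SDK's client.getEntries(...)
export function relatedQuery(sectorId: string, currentSlug: string) {
  return {
    content_type: "caseStudy",
    "fields.sector.sys.id": sectorId, // same sector, via the reference field
    "fields.slug[ne]": currentSlug,   // exclude the page we're on
    select: ["fields.title", "fields.slug"], // only the fields the cards need
    limit: 3,
  };
}

// usage:
// const { items } = await client.getEntries(relatedQuery(sectorId, slug));
```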

I also found a caching bug: Vercel was caching 404 API responses with the same headers as 200s. So when new content was published in Contentful, some CDN edges would still serve the cached 404. Pages would randomly work or not depending on which edge you hit, unless I completely redeployed the website. The fix was to not set cache headers on 404 responses.
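The shape of the fix, sketched as a small wrapper around the API response — the header values are illustrative, not the site's actual TTLs:

```typescript
// Only attach CDN cache headers to successful responses, so a transient
// 404 (content not yet published) is never pinned at the edge
export function withCache(res: Response): Response {
  if (res.status === 200) {
    res.headers.set("Cache-Control", "public, s-maxage=300, stale-while-revalidate=600");
  }
  return res;
}
```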

On analytics, I ran a proper GA4 and Search Console audit for the first time (again, using Claude Code with direct analytics access via the CLI). We set up Core Web Vitals reporting, form event tracking, and key event configuration so conversions actually show up in reports. As part of this, I built an automated monthly reporting skill that pulls from GA4, filters spam, and generates a markdown report with key insights.

For content strategy, the main finding was that the site ranked fine for branded searches but had almost no visibility for the kinds of queries potential clients use. So we documented content pillars, topic lists with keyword difficulty, and a publishing schedule.

Day 3: Reporting automation

I built a reporting tool that pulls from multiple GA4 dimensions, filters out bot traffic, computes period-over-period changes, and writes a full report. I've never had this much insight from website analytics before. I can now run a slash command in Claude Code and get an incredibly detailed breakdown of whatever period I like, within minutes.
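Most of the tool is plumbing around GA4's Data API (one runReport per period), but the comparison logic at its core is small. A sketch of the period-over-period piece:

```typescript
// Format the change between two periods' totals, e.g. sessions or key events
export function periodChange(current: number, previous: number): string {
  if (previous === 0) return current === 0 ? "0%" : "new"; // avoid divide-by-zero
  const pct = ((current - previous) / previous) * 100;
  return `${pct >= 0 ? "+" : ""}${pct.toFixed(1)}%`;
}
```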

Summary

In all, we made about 30 commits, producing about 6,000 lines of code across 50-odd files. The sitemap is half the size it used to be, and every page has structured data and a meta description. We’re tracking web vitals, form submissions, and search queries. The worst API call went from 700KB to 20KB, and a caching bug that caused intermittent 404s is fixed.

The useful thing about working this way was having analytics data to make decisions against — which pages had traffic, which had engagement, which had neither — and being able to verify each change immediately. Every structured data change went through Google’s Rich Results Test. Every redirect was checked. Every noindex was based on actual traffic numbers.

It’ll take a few weeks to see the SEO changes reflected in Search Console, and a few months for the content strategy to have any effect. But the plumbing is done.