A content audit used to take me half a day. Now it takes 90 seconds.
I’m not being loose with the numbers. The analysis part — pulling performance data, identifying patterns, flagging problem pages, prioritizing what to fix first — genuinely runs in under two minutes when Claude Code has access to your live data.
The old way: export GSC data to a spreadsheet, cross-reference it with Ahrefs metrics, write formulas to surface the patterns, build a priority matrix, format it for the client. Three to four hours, minimum, for a site with a few hundred pages.
The new way is what this post is about. I’ll show you exactly what I type, what Claude does with it, and what the output looks like — so you can run the same audit on your own site today.
What You Need First
This workflow uses the MCP integrations covered in the previous post — Google Search Console via Composio and Ahrefs. If you haven’t set those up yet, that post walks through the full setup. The whole thing takes about 15 minutes.
Once those are connected, you’re ready. Open Claude Code in any directory — though if you have a client CLAUDE.md set up, open it there so Claude already knows the client context. Start with the audit prompt below.
The Audit: Start to Finish
Step 1: Pull the Performance Baseline
The first thing I do is get a complete picture of how the site is performing in search. I ask Claude to pull 90 days of GSC data — long enough to smooth out weekly fluctuations, short enough to reflect current reality.
Pull all pages from Google Search Console for [domain] over the last 90 days.
For each page I want: URL, total impressions, total clicks, average CTR, and average position.
Sort by impressions descending. Show me the top 50.
Claude calls the GSC MCP, fetches the data, and returns a structured table in about 10–15 seconds. At this point you have a live snapshot of your top 50 pages — no CSV export, no spreadsheet setup.
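If you ever want to sanity-check the numbers Claude returns, the transformation it's doing is simple to reproduce. Here's a rough Python sketch (the helper name and row shape are illustrative; rows mimic the aggregated page-level response the GSC API returns):

```python
def top_pages(rows, limit=50):
    """Sort GSC page rows by impressions and format the baseline table.

    Each row mimics a GSC Search Analytics API entry: the page URL in
    `keys`, plus clicks, impressions, ctr (as a 0-1 fraction), and
    average position, already aggregated over the date range.
    """
    table = []
    for r in sorted(rows, key=lambda r: r["impressions"], reverse=True)[:limit]:
        table.append({
            "url": r["keys"][0],
            "impressions": r["impressions"],
            "clicks": r["clicks"],
            "ctr_pct": round(r["ctr"] * 100, 2),   # convert fraction to %
            "position": round(r["position"], 1),
        })
    return table
```

The point isn't to run this yourself — Claude handles it — but knowing the shape of the data makes the later prompts easier to adapt.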
Step 2: Find the Quick Wins
This is where most audits get valuable fast. Pages with high impressions but low CTR are ranking for relevant queries but failing to earn the click — usually a title or meta description problem. These are the fastest wins on any site.
From the data you just pulled, identify all pages where:
- Impressions are above 500 in the last 90 days
- CTR is below 2%
- Average position is between 1 and 15
Rank them by impressions. For each one, tell me the current CTR and what CTR I'd expect at that position based on typical benchmarks.
Claude cross-references the data against standard CTR curves by position and flags the pages where you’re most underperforming relative to your ranking. These are the pages worth rewriting titles and meta descriptions for first — you’re already ranking, you just need more clicks.
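The filtering logic behind this prompt is straightforward. A minimal Python sketch, assuming illustrative benchmark CTR values (real curves vary a lot by niche and SERP features, so treat the numbers as placeholders):

```python
# Illustrative CTR benchmarks (%) by average position -- not real data.
BENCHMARK_CTR = {1: 28.0, 2: 15.0, 3: 10.0, 4: 7.0, 5: 5.0,
                 6: 4.0, 7: 3.0, 8: 2.5, 9: 2.0, 10: 1.8}

def ctr_quick_wins(pages, min_impressions=500, max_ctr=2.0, max_position=15):
    """Flag pages with plenty of impressions but CTR below threshold --
    the likely title/meta-description problems."""
    wins = []
    for p in pages:
        pos = round(p["position"])
        if p["impressions"] < min_impressions or p["ctr_pct"] >= max_ctr:
            continue
        if not (1 <= pos <= max_position):
            continue
        expected = BENCHMARK_CTR.get(pos, 1.5)  # flat tail past position 10
        wins.append({**p, "expected_ctr_pct": expected,
                     "gap_pct": round(expected - p["ctr_pct"], 2)})
    return sorted(wins, key=lambda w: w["impressions"], reverse=True)
```

The `gap_pct` column is what makes the output actionable: a page at position 3 with 1.2% CTR against a ~10% benchmark is a bigger opportunity than one that's only slightly under.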
Step 3: Find the Pages in Freefall
Next I look for pages that have dropped significantly. A page losing traffic fast is usually being outcompeted, hit by an algorithm update, or suffering a technical issue. All three need attention before the page bottoms out.
Now compare performance for the same pages: last 90 days versus the prior 90 days.
Show me any pages where clicks have dropped more than 25% period over period.
List them with the absolute click numbers for each period and the percentage change.
If you have Ahrefs connected, follow immediately with:
For each of those declining pages, check Ahrefs to see if there's been any significant change in referring domains over the same period.
Combining click drop data with backlink data often tells you immediately whether you’re dealing with a content quality issue (backlinks stable, rankings dropped) or a link-based issue (backlinks lost, rankings followed).
Step 4: Flag Cannibalization
Keyword cannibalization — two or more pages on your site competing for the same query — is one of the most common and most underdiagnosed problems in content marketing. It’s also tedious to find manually. Claude makes this fast:
Using the GSC data, identify any queries where two or more of my pages appear in the same results.
Group them by query and show me which URLs are competing.
Flag any cases where the competing pages are ranking within 5 positions of each other.
Claude will surface any overlap patterns in the data. A page ranking at position 4 and another at position 6 for the same query is a clear consolidation opportunity — combine them into one stronger page.
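Conceptually, the cannibalization check is a group-by on the query dimension. A minimal sketch of the logic (row shape is illustrative, mirroring query-plus-page GSC rows):

```python
from collections import defaultdict

def cannibalization(rows, proximity=5):
    """Group query-level rows by query; flag queries where two or more
    URLs rank, marking close-ranked pairs as consolidation candidates."""
    by_query = defaultdict(list)
    for r in rows:  # each row: query, url, average position
        by_query[r["query"]].append((r["url"], r["position"]))
    flags = {}
    for query, urls in by_query.items():
        if len(urls) < 2:
            continue  # only one page ranks: no cannibalization
        positions = sorted(p for _, p in urls)
        close = (positions[1] - positions[0]) <= proximity
        flags[query] = {"urls": sorted(urls, key=lambda u: u[1]),
                        "close_ranking": close}
    return flags
```

The `close_ranking` flag is the part worth acting on first: two pages five positions apart are splitting authority that one consolidated page could keep.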
Step 5: Build the Priority Action List
By this point Claude has all the data it needs. The final step is asking it to synthesize everything into a ranked action list — the specific things to fix, in the order that will move the needle most:
Based on everything you've pulled — the CTR underperformers, the declining pages, and the cannibalization flags — give me a prioritized action list.
I want it organized into three tiers:
1. Quick wins (under 2 hours of work, highest expected impact)
2. Medium-term fixes (require more substantial content work)
3. Monitor only (issues to track but not act on yet)
For each item include: the URL, the problem, and the recommended action.
What comes back is a clean, tiered action list that would have taken hours to build manually. It’s specific to your data, not a generic SEO checklist.
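If you want to understand how the tiering shakes out, a simplified version of the triage heuristic looks like this (the issue labels and tier rules are illustrative; Claude weighs expected impact from the actual data rather than fixed rules):

```python
def tier_actions(items):
    """Assign flagged items to tiers with a simple heuristic:
    title/meta rewrites are quick wins, content rewrites and
    consolidations are medium-term, everything else is monitor-only."""
    tiers = {"quick_wins": [], "medium_term": [], "monitor": []}
    for item in items:
        if item["issue"] == "low_ctr":
            tiers["quick_wins"].append(item)    # under 2 hours: new title/meta
        elif item["issue"] in ("traffic_drop", "cannibalization"):
            tiers["medium_term"].append(item)   # substantial content work
        else:
            tiers["monitor"].append(item)       # track, don't act yet
    return tiers
```

In practice the tier boundaries are fuzzier than this — a CTR fix on a page that's also cannibalizing needs the consolidation decision first — which is exactly why you ask Claude to synthesize rather than apply rules blindly.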
Writing the Output to Notion
If you have Notion MCP connected, you can push the entire audit output directly into your workspace in one step:
Create a new page in my [database name] Notion database titled "Content Audit — [Domain] — [Month Year]".
Include the full priority action list, the CTR underperformer table, the declining pages table, and the cannibalization findings.
Format it with clear H2 sections for each category.
The audit lives in Notion, searchable and shareable, rather than buried in a chat window. For agency owners, this means the deliverable is client-ready the moment the session ends.
What This Doesn’t Replace
To be direct: Claude Code is doing the analysis layer, not the editorial judgment layer. It can tell you a page is underperforming on CTR — it can’t tell you whether the page’s topic is still worth pursuing, whether your brand positioning has shifted, or whether a competitor has permanently claimed that territory with a piece you’ll never outrank.
Use the audit output as a starting point, not a final answer. The 90 seconds gets you to a prioritized list you can interrogate. The judgment calls are still yours.
That said: getting from raw data to a prioritized action list in under two minutes — rather than half a day — changes what’s possible. You can run this audit monthly instead of quarterly. You can run it on a client’s site before a pitch, not just after you land them. The speed changes the workflow, not just the tools.
Frequently Asked Questions
What is an AI content audit and how is it different from a traditional one?
An AI content audit uses a tool like Claude Code — connected to live data sources like Google Search Console and Ahrefs — to pull performance data, identify patterns, and generate a prioritized action list automatically. A traditional content audit requires manually exporting data, building spreadsheets, and writing analysis by hand. The output is similar; the time investment is not. A traditional audit for a 200-page site takes three to four hours. The AI-assisted version takes under five minutes once the MCP connections are in place.
Do I need coding experience to run a content audit with Claude Code?
No. The workflow in this post uses plain-language prompts — the same way you’d ask a question to a colleague. Claude handles the data retrieval and analysis. The only technical step is the one-time MCP setup covered in the previous post, which involves running two terminal commands per tool. If you can copy and paste, you can do the setup.
How accurate is the content audit data Claude pulls from Google Search Console?
The data is pulled directly from the GSC API via the Composio MCP server — it’s the same data you’d see in your GSC dashboard, not an estimate or approximation. Position data in GSC is averaged across all queries and devices, which means it can differ from what you see when you search manually. That’s a GSC limitation, not a Claude one. For auditing purposes the data is accurate enough to identify patterns and prioritize action.
Can Claude Code do a content audit without Ahrefs?
Yes. The GSC-only version of this audit — CTR underperformers, declining pages, cannibalization — is fully functional without Ahrefs. Ahrefs adds the backlink layer, which is useful for diagnosing why a page is declining but not required for identifying that it is. If you don’t have an Ahrefs subscription, skip Step 3’s follow-up prompt and focus on the GSC-only analysis.
How often should I run a content audit with Claude Code?
Monthly is now realistic, given how fast the process runs. Quarterly was the previous standard because of how long manual audits took. Monthly audits let you catch declining pages before they bottom out and act on quick wins while they’re still quick. For client sites, a monthly audit also gives you a concrete deliverable that justifies the retainer.
What’s the difference between a content audit and a site audit?
A content audit focuses on performance: which pages are getting traffic, which aren’t, which have CTR problems, which are cannibalizing each other. A site audit focuses on technical health: crawlability, broken links, page speed, structured data, index coverage. They’re complementary. This post covers the content audit side. Claude Code with Ahrefs’ Site Audit MCP tools can also run a technical audit — that’s worth a separate post.
What’s Next
The next post goes deeper into custom Claude skills — reusable commands you build once and use across every client, every campaign, every audit. Five specific skills every SEO practitioner should have set up before anything else.
One more angle worth reading: if the content audit reveals you’re losing ground to AI-generated answers in search — queries where ChatGPT or Claude is eating your traffic instead of a competitor — this guide on getting mentioned in AI tools covers exactly what to do about it.
If you want the full content audit prompt pack — the exact prompts from this post, formatted and ready to paste, plus a Notion template for the output — it’s included in The AI Marketing Stack.

