SEO + AI Lab

Screaming Frog + Claude Code MCP: Automated SEO Audits at Scale

Claudio Novaglio

Why auditing sites manually is obsolete

Technical SEO audits are tedious. Crawl the site, find broken links, missing meta tags, slow pages, redirect chains. Then manually catalog the issues. Then send the list to a developer and hope everything gets fixed.

Screaming Frog + Claude Code MCP turns this into an automated loop: crawl → identify issues → propose fixes → execute → verify.

What is the MCP protocol?

MCP (Model Context Protocol) is an open standard that lets AI agents interact with external tools. Think of it as a bridge between Claude (the AI) and your local software.

Without MCP:

You run Screaming Frog, export a CSV, manually read it, write a to-do list, ask a developer to fix it.

With MCP:

Claude connects directly to Screaming Frog, reads the crawl data, identifies issues, writes code or instructions to fix them, and verifies the fix. No manual export or handoff.

MCP unlocks agentic workflows. Claude doesn't just chat; it takes action.

8 MCP tools for SEO automation

Screaming Frog exposes these tools via MCP:

| Tool | What it does | Use case |
| --- | --- | --- |
| crawl_site | Start a background crawl; returns crawl_id | Audit site structure, find technical issues |
| crawl_status | Check progress of a running crawl | Monitor crawl without blocking |
| export_crawl | Load crawl data and export to CSV | Get structured data for analysis |
| read_crawl_data | Query exported CSV; filter, sort, paginate | Extract specific issue types (broken links, redirects) |
| list_crawls | Show all saved crawls in SF's database | Track audit history, manage storage |
| delete_crawl | Remove a crawl from storage | Free up disk space |
| sf_check | Verify SF is installed and report version | Confirm MCP connection |
| storage_summary | Show disk usage of all crawls | Monitor SF database growth |

A real audit workflow with Claude Code

Here's how it works in practice:

Step 1: Start the crawl

You tell Claude: "Audit my site for SEO issues."

Claude uses crawl_site to start a full crawl of your domain. Screaming Frog runs in the background. Claude gets a crawl_id and moves on—no waiting.
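Under the hood, this is an MCP tool call. Here's a minimal sketch of the JSON-RPC 2.0 request shape the MCP spec defines for invoking a tool like crawl_site (the argument name `url` is an assumption; the actual schema comes from the Screaming Frog MCP server's tool definitions):

```python
import json

# Hypothetical tools/call request for starting a crawl via MCP.
# "tools/call" with {"name", "arguments"} params is the MCP spec's shape;
# the "url" argument is an illustrative assumption.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "crawl_site",
        "arguments": {"url": "https://example.com"},
    },
}

print(json.dumps(request, indent=2))
```

The server would respond with a result containing the crawl_id that the rest of the workflow references.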

Step 2: Wait for completion

Claude checks crawl_status periodically until the crawl is done. When finished, Claude moves to the next step.
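The polling logic is simple. A minimal sketch, with a stub standing in for the crawl_status tool (the status fields `state` and `progress` are assumptions, not Screaming Frog's documented response shape):

```python
import time

def wait_for_crawl(check_status, crawl_id, interval=0.0, max_checks=100):
    """Poll a crawl_status-style callable until the crawl reports completion.

    check_status stands in for the MCP crawl_status tool; here it is assumed
    to return a dict like {"state": "running", "progress": 40}.
    """
    for _ in range(max_checks):
        status = check_status(crawl_id)
        if status.get("state") == "done":
            return status
        time.sleep(interval)
    raise TimeoutError(f"crawl {crawl_id} did not finish in time")

# Stub simulating three successive status checks:
_states = iter([
    {"state": "running", "progress": 40},
    {"state": "running", "progress": 80},
    {"state": "done", "progress": 100},
])
result = wait_for_crawl(lambda cid: next(_states), crawl_id="abc123")
print(result["state"])  # done
```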

Step 3: Export and analyze

Claude uses export_crawl to pull the crawl data. It requests key reports:

  1. Response Codes: Find 404s, 500s, redirects
  2. Titles and Meta Descriptions: Check SEO metadata
  3. Internal Links: Identify broken internal links and redirect chains
  4. H1 and H2 tags: Find missing or duplicate headings
  5. Images: Find missing alt text

Step 4: Identify issues

Claude reads the CSV data using read_crawl_data and categorizes issues:

  1. 15 broken internal links (404s)
  2. 8 pages with missing meta descriptions
  3. 3 redirect chains (bad for SEO)
  4. 12 images with no alt text
  5. 5 pages slower than 3 seconds
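The categorization above is mechanical once the CSV is loaded. A sketch with inline sample data (the column names are illustrative, not Screaming Frog's exact export headers):

```python
import csv
import io
from collections import Counter

# Tiny inline sample standing in for a read_crawl_data export.
crawl_csv = """url,status_code,meta_description,load_time
/a,404,,1.2
/b,200,,4.1
/c,200,Good description,0.9
"""

issues = Counter()
for row in csv.DictReader(io.StringIO(crawl_csv)):
    if row["status_code"] == "404":
        issues["broken_link"] += 1
    if row["status_code"] == "200" and not row["meta_description"]:
        issues["missing_meta"] += 1
    if float(row["load_time"]) > 3.0:  # the 3-second threshold from the audit
        issues["slow_page"] += 1

print(dict(issues))  # {'broken_link': 1, 'missing_meta': 1, 'slow_page': 1}
```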

Step 5: Propose fixes

For each issue type, Claude writes a fix:

  1. Broken links: "Redirect /old-page/ → /new-page/ (uses /new-page/ slug)"
  2. Missing meta: "Add description: '[keyword] | [benefit] | Claudio Novaglio'"
  3. Redirect chains: "Change /a → /b → /c to /a → /c (saves redirect hop)"
  4. Missing alt text: "Alt: '[specific description of image content]'"
  5. Slow pages: "Compress images, remove unused CSS, defer JavaScript"
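For the broken links and redirect chains, the proposed fix can be emitted as deployable redirect rules. A sketch that generates nginx-style `rewrite` directives from a URL mapping (the mapping and the choice of nginx syntax are illustrative; the same idea works for .htaccess or framework redirect configs):

```python
# Broken/old URL -> replacement URL. Illustrative mapping.
fixes = {
    "/old-page/": "/new-page/",
    "/a": "/c",  # collapse the /a -> /b -> /c chain to a single hop
}

# Emit one permanent (301) redirect per entry, nginx rewrite syntax.
rules = [f"rewrite ^{old}$ {new} permanent;" for old, new in fixes.items()]
print("\n".join(rules))
```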

Step 6: Verify

After fixes are deployed, Claude re-crawls using crawl_site and compares the new crawl to the old one. Issues should be gone.
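Comparing two crawls reduces to set arithmetic on the affected URLs. A sketch with made-up data:

```python
# 404 URLs from the two crawls (illustrative data).
before = {"/old-page/", "/a", "/missing-img"}  # crawl 1
after = {"/missing-img"}                        # crawl 2, post-deploy

fixed = before - after        # resolved by the deploy
still_broken = after & before  # survived the deploy
new_breaks = after - before    # regressions introduced since crawl 1

print(sorted(fixed))         # ['/a', '/old-page/']
print(sorted(still_broken))  # ['/missing-img']
```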

From audit to correction: the feedback loop

The magic is automation with verification:

  1. Run crawl 1 → Report issues
  2. Fix issues (developer deploys changes)
  3. Run crawl 2 → Verify fixes
  4. If issues remain → Propose new fixes
  5. Repeat until clean

Without MCP, each loop requires human handoff. With MCP, the loop is instant and continuous.

Scheduling audits

You can set Claude to run audits on a schedule—weekly or monthly:

  1. Crawl runs automatically every Monday at 8 AM.
  2. Claude analyzes the data.
  3. If new issues are found, Claude alerts you (or proposes fixes).
  4. You stay aware of SEO health without thinking about it.
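The scheduling piece is ordinary calendar math. A sketch of computing the next "Monday at 8 AM" slot with the standard library (how you then trigger the crawl — cron, a long-running loop, a CI job — is up to your setup):

```python
from datetime import datetime, timedelta

def next_monday_8am(now):
    """Return the next Monday 08:00 strictly after `now` (Monday == weekday 0)."""
    days_ahead = (0 - now.weekday()) % 7
    candidate = (now + timedelta(days=days_ahead)).replace(
        hour=8, minute=0, second=0, microsecond=0)
    if candidate <= now:  # already past 8 AM on a Monday
        candidate += timedelta(days=7)
    return candidate

# Wednesday noon -> the following Monday morning.
print(next_monday_8am(datetime(2024, 1, 3, 12, 0)))  # 2024-01-08 08:00:00
```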

Who this is for

  1. Agencies auditing multiple client sites.
  2. In-house teams running continuous SEO checks.
  3. Developers integrating SEO monitoring into their pipeline.
  4. Anyone tired of manual crawl analysis.

Prerequisites

  1. Screaming Frog SEO Spider (commercial license; $249/year).
  2. Claude Code or Claude API access.
  3. MCP server running on your machine (installed with Claude Code).
  4. Basic understanding of technical SEO issues.

Common questions

Can Claude fix the issues automatically?

Claude can propose fixes and write code/instructions, but deploying changes requires a developer. For coding tasks (wrong redirects, missing meta tags in templates), Claude can write the fix. For content changes (alt text, headings), you need manual input.

How many sites can I audit?

Unlimited. Each crawl is stored in SF's database. You can audit 100 sites and query them all with MCP. Disk space is the only constraint.

Is this better than hiring an SEO agency?

Not instead of, but alongside. Agencies provide strategy and creativity. MCP provides execution and monitoring. Use both.

Can it work with Next.js or single-page apps?

Screaming Frog crawls rendered HTML, so SPAs work if they server-render or use prerendering. For client-side rendering, you may miss issues. Plan accordingly.

The future: scheduled audits with notifications

Imagine: every week, Claude crawls your site, identifies new issues, and sends you a summary:

"New issues this week: 2 broken redirects, 1 missing meta description. I've drafted fixes; approve to deploy."

This is the direction SEO automation is heading.

The bottom line

Screaming Frog + Claude Code MCP turns SEO audits from a tedious chore into an automated feedback loop. You get continuous monitoring, AI-powered diagnosis, and instant alerts. For agencies and in-house teams, this saves days of manual work per year.

Frequently Asked Questions

What is MCP?

MCP (Model Context Protocol) is a standard that lets AI agents interact with external tools. With MCP, Claude connects directly to Screaming Frog, reads crawl data, and takes action—no manual export/import needed.

Can Claude fix the issues automatically?

Claude can propose and write fixes, but deployment requires a developer for code changes or manual work for content. For example, Claude can write a redirect rule or rewrite a title tag—you deploy it.

Can audits run on a schedule?

Yes. Set Claude to crawl weekly or monthly on a schedule. It analyzes the data, flags new issues, and alerts you. No manual trigger needed.

Do I need a paid Screaming Frog license?

Yes. The free tier is limited. For serious auditing and MCP integration, the commercial license ($249/year) is required.

Can Claude prioritize issues by severity?

Yes. Claude can analyze severity: a 404 internal link is critical (blocks crawling), while missing alt text is medium (affects accessibility). Use severity levels to prioritize.

Does it work with single-page apps?

Yes, with caveats. Screaming Frog crawls rendered HTML. If your SPA server-renders or uses static prerendering, yes. For fully client-rendered SPAs, you may miss content.

Does this replace an SEO agency?

No. Agencies provide strategy, creativity, and competitive analysis. MCP provides execution and monitoring. Use both for maximum impact.

About the author

Claudio Novaglio

SEO Specialist, AI Specialist, and Data Analyst with over 10 years of experience in digital marketing. I work with companies and professionals in Brescia and throughout Italy to grow organic visibility, optimize advertising campaigns, and build data-driven measurement systems. Specialized in technical SEO, local SEO, Google Analytics 4, and integrating artificial intelligence into marketing processes.

Want to improve your online results?

Let's talk about your project. The first consultation is free, no commitment.