GUIDE 02 – AI-POWERED SEO SERIES
Run a Technical SEO Audit with AI
Feed your Google Search Console data, crawl reports, and sitemaps to Claude or ChatGPT. Get a prioritized fix list ranked by impact vs. effort – in minutes, not weeks.
Why Most SEO Audits Are a Waste of Time
The standard SEO audit goes like this: an agency pulls 200 pages of data from Screaming Frog, dumps it into a spreadsheet, highlights everything in red, and presents it as a “roadmap.” The client gets overwhelmed. Nothing gets fixed. Three months later the contract is over.
The problem is not the audit itself. It is the lack of prioritization. A technical audit that treats a missing alt tag with the same urgency as a crawlability issue blocking 40% of your pages is worse than useless – it is a distraction. AI fixes this by analyzing the data and ranking every issue by actual business impact.
Practitioner’s Note
“When I used to receive SEO pitches at Times Internet and Future Group, the moment an agency started their pitch with ‘here is your site audit,’ I would pause the pitch and ask them to leave the room. I already know something is wrong with my website – that is why you are here. Tell me what you know about my category, my audience, and how you plan to fix it. Audit is a byproduct, not a starting point.” – Apurv Singh, Dream SEO Masterclass Session 1
What a Real Technical SEO Audit Actually Checks
A proper technical audit covers six categories. Most SEOs only check the first two. The remaining four are where the real differentiation lies.
| Audit Category | What It Covers | Impact Level | Guide Link |
|---|---|---|---|
| 1. Crawlability & Indexing | Can Google find and store your pages? Robots.txt, sitemap, crawl budget, orphan pages | Critical | This guide |
| 2. Site Architecture | URL structure, breadcrumbs, click depth, internal linking hierarchy | Critical | This guide |
| 3. Page Speed & CWV | LCP, CLS, INP, TTFB, render-blocking resources | High | CWV Guide |
| 4. Structured Data | Schema markup, rich results eligibility, JSON-LD validation | High | Schema Guide |
| 5. Mobile Usability | Responsive design, tap targets, viewport config, mobile-first indexing | High | This guide |
| 6. Security & HTTPS | SSL certificate, mixed content, HTTPS enforcement, security headers | High | This guide |
Step 1: Gather Your Data (5 Minutes)
Before feeding anything to AI, collect these four data sources. You only need the free ones to get started.
Source 1: Google Search Console
Go to the Pages report under Indexing (this replaced the older Coverage report). Export the full list of indexed vs. not-indexed URLs along with the not-indexed reasons, and note any manual actions under Security & Manual Actions.
Cost: Free – Time: 2 minutes
Source 2: Crawl Stats (GSC)
Go to Settings then Crawl Stats. Screenshot the total crawl requests, response time, and file type breakdown (HTML percentage).
Cost: Free – Time: 1 minute
Source 3: Your Robots.txt
Visit yoursite.com/robots.txt. Copy the entire file contents. This tells you what pages are blocked and whether your sitemap is declared.
Cost: Free – Time: 30 seconds
Source 4: Screaming Frog (Optional)
Run a crawl and export the full results CSV. This gives you every technical issue across your entire site in one file. Free up to 500 URLs.
Cost: Free (500 URLs) or $259/year – Time: 5-15 minutes depending on site size
Step 2: Feed the Data to AI
Now give everything to Claude or ChatGPT with this master audit prompt. The key is providing context about your business, not just raw data.
Master Audit Prompt
I need a complete technical SEO audit of my website. Here is the data:
Website: {URL}
Platform: {WordPress/Shopify/Custom}
Total pages on site: {number}
Business type: {e-commerce/SaaS/local service/publisher}
Top 5 revenue-generating pages: {list URLs}
DATA ATTACHED:
– Google Search Console indexed/not-indexed report (CSV)
– Crawl stats screenshot
– Robots.txt file contents
– Screaming Frog export (CSV) (if available)
Please:
1. Identify every technical issue, grouped by category (crawlability, architecture, speed, structured data, mobile, security)
2. Rate each issue as Critical, High, Medium, or Low priority
3. For Critical and High issues, provide the exact fix
4. Create an implementation timeline: what to fix in Week 1, Week 2, Month 1
5. Flag which fixes need a developer vs. what I can do via plugins or CMS settings
6. Identify any pages that should be indexed but are not, and vice versa
Step 3: Crawlability & Indexing Audit
This is the most critical category. If Google cannot find and store your pages, nothing else matters. No amount of content optimization or link building will help a page that is not indexed.
| What to Check | Where to Find It | Red Flag | AI Can Fix? |
|---|---|---|---|
| Indexed vs total pages | GSC Pages report | Fewer than 50% of pages indexed | Diagnosis yes, fix needs dev |
| Crawl requests trend | GSC Settings – Crawl Stats | Declining month over month | Diagnosis yes, root cause analysis |
| HTML crawl share | GSC Crawl Stats – By file type | HTML less than 10% of crawl requests | Diagnosis yes |
| Robots.txt blocks | yoursite.com/robots.txt | Important pages or GPTBot blocked | Yes – generates fix |
| Sitemap presence | Robots.txt or yoursite.com/sitemap.xml | No sitemap, or sitemap not in robots.txt | Yes – generates sitemap |
| Orphan pages | Screaming Frog or manual check | Important pages with zero internal links | Yes – generates link map |
| Duplicate content | Screaming Frog or GSC non-indexed reasons | Same content on multiple URLs | Yes – identifies and suggests canonicals |
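The first row of the table, indexed vs. total pages, can be checked straight from the GSC export. A minimal sketch, assuming columns named `URL` and `Indexing` (real export headers vary by language and report version; adjust to your file):

```python
import csv
import io

def index_coverage(csv_text):
    """Compute the indexed ratio from a GSC Pages export.

    Assumes columns named 'URL' and 'Indexing' (values like
    'Indexed' / 'Not indexed') -- adjust to match your export.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    total = len(rows)
    indexed = sum(1 for r in rows if r["Indexing"].lower() == "indexed")
    ratio = indexed / total if total else 0.0
    return {
        "total": total,
        "indexed": indexed,
        "ratio": round(ratio, 2),
        "red_flag": ratio < 0.5,  # under 50% indexed is the table's red flag
    }

sample = """URL,Indexing
https://example.com/,Indexed
https://example.com/about,Indexed
https://example.com/old-page,Not indexed
https://example.com/dupe,Not indexed
https://example.com/tag/x,Not indexed
"""
print(index_coverage(sample))  # 2 of 5 indexed -> red_flag True
```

The red-flag threshold mirrors the table above; healthy sites often sit well above it, so treat it as a floor, not a target.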
The Crawl Stats Diagnostic
“If a website has 10,000 pages and crawl requests are only 7,500 – and that includes duplicate requests to the same URLs – something is off. Google is not crawling enough. When I see this number declining month over month, it is almost always one of two things: copy-paste AI content that Google has stopped caring about, or duplicate content where the same answer appears on 10 different pages. Start your diagnosis from the crawl stats, not from the content.” – Apurv Singh, Dream SEO Masterclass Session 3
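The ratio in that quote can be scripted as a quick sanity check. The thresholds below are illustrative working assumptions drawn from the example, not official Google guidance:

```python
def crawl_health(total_pages, monthly_crawl_requests):
    """Rough crawl-demand check based on the diagnostic above.

    Crawl requests include repeat hits to the same URLs, so if they
    don't comfortably exceed your page count, Google is under-crawling.
    Thresholds are illustrative, not official Google guidance.
    """
    ratio = monthly_crawl_requests / total_pages
    if ratio < 1.0:
        return "under-crawled: investigate duplicate or thin content"
    if ratio < 2.0:
        return "borderline: watch the trend month over month"
    return "healthy crawl demand"

print(crawl_health(10_000, 7_500))  # the 10,000-page example from the quote
```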
Step 4: Robots.txt & Sitemap Audit
Your robots.txt file is the gatekeeper. It tells search engines what not to crawl, points them to your sitemap, and can block specific bots. Most websites either have a blank robots.txt (missing opportunity) or one with accidental blocks (causing damage).
Pages to Block (Disallow)
☑ Cart and checkout pages
☑ User account and login pages
☑ Internal search results pages
☑ Admin panels and staging URLs
☑ Thank-you and order confirmation pages
☑ URL parameter variations (filters, sorting)
Never Block These
☐ Homepage, category pages, product pages
☐ Blog posts and content pages
☐ CSS and JavaScript files (Google needs them to render pages)
☐ Googlebot, Bingbot, GPTBot, PerplexityBot
☐ Your XML sitemap
☐ Image directories (unless specifically needed)
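Putting the two lists together, a starter robots.txt for a typical e-commerce site might look like the sketch below. The paths are illustrative; match them to your platform's actual URL patterns before deploying:

```
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /account/
Disallow: /search/
Disallow: /*?sort=
Disallow: /*?filter=

# Explicitly allow the AI crawlers you want visibility in
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

Sitemap: https://www.yoursite.com/sitemap.xml
```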
Sitemap checklist: Your XML sitemap should only contain pages that are live (200 status), not redirected, not blocked by robots.txt, and not set to noindex. It should not contain cart pages, parameter URLs, 404 pages, or out-of-stock product pages. AI can audit your entire sitemap against these rules in seconds.
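Here is a minimal sketch of that sitemap audit in Python. The status map would normally come from a crawl (a Screaming Frog export or a fetch loop); it is passed in directly so the example runs offline, and the `audit_sitemap` helper and sample URLs are assumptions for illustration:

```python
import re
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def audit_sitemap(sitemap_xml, status_by_url, disallow_patterns):
    """Flag sitemap URLs that break the rules above: non-200 status
    or a match against a robots.txt disallow pattern."""
    urls = [loc.text.strip() for loc in
            ET.fromstring(sitemap_xml).findall(".//sm:loc", NS)]
    problems = []
    for url in urls:
        status = status_by_url.get(url)
        if status != 200:
            problems.append((url, f"status {status}, expected 200"))
        if any(re.search(p, url) for p in disallow_patterns):
            problems.append((url, "blocked by robots.txt"))
    return problems

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/old-page</loc></url>
  <url><loc>https://example.com/cart/</loc></url>
</urlset>"""

statuses = {"https://example.com/": 200,
            "https://example.com/old-page": 404,
            "https://example.com/cart/": 200}

for url, issue in audit_sitemap(sitemap, statuses, [r"/cart/"]):
    print(url, "->", issue)
```

The same two checks (status and robots conflicts) cover most real-world sitemap problems; noindex detection would additionally require fetching each page's meta tags or headers.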
Copy-Paste Prompt
Here is my robots.txt file:
— paste full robots.txt —
And here is my XML sitemap (or list of URLs in it):
— paste sitemap URLs —
My website is a {business type} on {platform}. Please:
1. Identify any important pages or bots being blocked that should not be
2. Identify any pages in the sitemap that should not be there (404s, redirects, parameter URLs, noindex pages)
3. Check if GPTBot, PerplexityBot, and ClaudeBot are allowed
4. Recommend the optimal robots.txt configuration
5. Suggest a parent-child sitemap structure if I have more than 5 content categories
Step 5: Site Architecture Audit
Your site architecture determines how easily Google can navigate from one page to another. The rule of thumb: no important page should be more than 3 clicks from the homepage. If a page is buried deep, Google will either take too long to find it or skip it entirely.
| Architecture Issue | Why It Hurts | How AI Helps |
|---|---|---|
| Click depth over 3 | Google deprioritizes deep pages. Users bounce faster. | Analyzes URL structure and suggests flattening |
| No breadcrumbs | Google cannot understand page hierarchy. Loses breadcrumb rich results. | Generates BreadcrumbList schema + HTML |
| Broken internal links | Crawl budget wasted on 404s. Link equity lost. | Scans link list and identifies all broken URLs |
| Redirect chains | A redirects to B, B redirects to C. Slows crawling, loses equity. | Identifies chains and generates direct redirect map |
| Inconsistent URL patterns | Mixed formats (/Product vs /product vs /products/) confuse crawlers | Flags inconsistencies and suggests canonical strategy |
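Click depth and orphan detection both fall out of one breadth-first search over the internal link graph. A minimal sketch, assuming you have already extracted a page-to-links map from a crawl (the URLs below are made up):

```python
from collections import deque

def click_depths(homepage, links):
    """BFS over the internal link graph: depth = clicks from homepage.

    `links` maps each page to the pages it links to. Pages missing
    from the result are orphans (unreachable via internal links).
    """
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

links = {
    "/": ["/shoes/", "/about/"],
    "/shoes/": ["/shoes/running/"],
    "/shoes/running/": ["/shoes/running/model-x/"],
    "/orphan-page/": [],   # exists in the crawl, but no page links TO it
}
depths = click_depths("/", links)
too_deep = [p for p, d in depths.items() if d > 3]   # the 3-click rule
orphans = [p for p in links if p not in depths]
print(depths)
print(orphans)   # ['/orphan-page/']
```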
Step 6: Mobile & Security Quick Check
These two categories are fast to audit but expensive to ignore. Google uses mobile-first indexing (it crawls the mobile version of your site, not desktop). And HTTPS is an official ranking factor.
Mobile Checklist
☑ Viewport meta tag present
☑ Text readable without zooming
☑ Tap targets at least 48x48px with adequate spacing
☑ No horizontal scrolling required
☑ Images resize responsively
☑ No Flash or unsupported plugins
Test: Lighthouse in Chrome DevTools (Google retired its standalone Mobile-Friendly Test tool in late 2023)
Security Checklist
☑ HTTPS active on all pages
☑ SSL certificate not expired
☑ No mixed content (HTTP resources on HTTPS pages)
☑ HTTP automatically redirects to HTTPS
☑ Security headers present (HSTS, X-Frame-Options)
☑ No Google Search Console security issues flagged
Test: SSL Labs test (ssllabs.com/ssltest)
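Mixed content is easy to scan for yourself. A minimal sketch using a regex over src/href attributes (a real HTML parser is more robust, and the example page is made up):

```python
import re

def find_mixed_content(html):
    """Find hard-coded http:// resources in an HTTPS page's HTML.

    Matches src/href attributes only; https:// URLs don't match
    because the pattern requires the literal 'http://'.
    """
    return re.findall(r'(?:src|href)=["\'](http://[^"\']+)', html)

page = '''<html>
  <link rel="stylesheet" href="https://example.com/app.css">
  <img src="http://example.com/banner.jpg">
  <script src="http://cdn.example.com/old.js"></script>
</html>'''

print(find_mixed_content(page))
```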
Critical Warning: SSL Expiry
“Always keep a tab on when your SSL certificate is expiring. We have had this happen twice in 12 years. The SSL expired, HTTPS became HTTP, and Googlebot visited in that window. The organic traffic drop was instant and painful. Set a calendar reminder 30 days before your SSL expiry date.” – Apurv Singh, Dream SEO Masterclass Session 5
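That reminder can also be automated with Python's standard ssl module. `days_until_expiry` and `live_check` are hypothetical helpers for illustration; the live check needs network access, so the printed example runs on a hard-coded date:

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(not_after, now=None):
    """Days left on a cert, given the 'notAfter' string format
    that ssl.getpeercert() returns (e.g. 'Jan 21 00:00:00 2025 GMT')."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    now = now or datetime.now(timezone.utc)
    return (expires - now).days

def live_check(hostname):
    """Fetch the real certificate over TLS (needs network access)."""
    ctx = ssl.create_default_context()
    with ctx.wrap_socket(socket.socket(), server_hostname=hostname) as s:
        s.settimeout(5)
        s.connect((hostname, 443))
        return days_until_expiry(s.getpeercert()["notAfter"])

# Offline example: a cert expiring 20 days from a fixed "now"
now = datetime(2025, 1, 1, tzinfo=timezone.utc)
print(days_until_expiry("Jan 21 00:00:00 2025 GMT", now))  # 20 -> renew soon
```

Run something like `live_check("yoursite.com")` on a schedule and alert when the result drops below 30, matching the 30-day reminder above.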
Step 7: Prioritize Fixes with the Impact-Effort Matrix
Once AI identifies all the issues, you need a system for deciding what to fix first. Use this matrix – it is the same prioritization framework used in our consulting work across 30+ brands.
| Priority | Impact | Effort | Examples |
|---|---|---|---|
| Week 1: Critical | Blocks indexing or causes major traffic loss | Low to medium | Robots.txt blocking key pages, SSL expired, noindex on product pages, sitemap missing |
| Week 2: High | Reduces ranking potential or user experience | Medium | Duplicate content, redirect chains, missing schema, Core Web Vitals failures |
| Month 1: Medium | Improves quality but not urgent | Medium to high | Internal linking gaps, image optimization, breadcrumb implementation, mobile tap target issues |
| Ongoing: Low | Nice to have, marginal impact | Low | Missing alt tags on decorative images, minor HTML validation errors, canonical tag refinements |
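The matrix can be encoded as a simple bucketing function. The numeric impact/effort scores and thresholds below are a judgment call mirroring the table, not a formula from any tool:

```python
def priority_bucket(impact, effort):
    """Map an issue to a fix window per the matrix above.

    impact/effort are 1 (low) to 3 (high); the thresholds mirror
    the table's tiers and are a working assumption, not a standard.
    """
    if impact == 3 and effort <= 2:
        return "Week 1: Critical"
    if impact == 3 or (impact == 2 and effort == 2):
        return "Week 2: High"
    if impact == 2:
        return "Month 1: Medium"
    return "Ongoing: Low"

issues = [
    ("robots.txt blocks product pages", 3, 1),
    ("redirect chains", 3, 2),
    ("internal linking gaps", 2, 3),
    ("alt tags on decorative images", 1, 1),
]
for name, impact, effort in sorted(issues, key=lambda i: -i[1]):
    print(priority_bucket(impact, effort), "-", name)
```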
Prioritization Prompt
Here is the full list of technical SEO issues found on my website:
— paste issue list —
My top business priority is {getting more organic sales / increasing traffic / fixing ranking drops / preparing for AI search}.
My team capacity is {developer X hours per week, content writer Y hours per week}.
Please create a prioritized implementation plan with:
1. Week 1 fixes (critical, blocks indexing or causes traffic loss)
2. Week 2 fixes (high impact, improves rankings)
3. Month 1 fixes (medium, quality improvements)
4. Ongoing maintenance items
For each fix, specify: who does it (developer / SEO / content), estimated time, and exact steps.
Step 8: Set Up Ongoing Monitoring
A one-time audit is not enough. Technical issues resurface constantly from plugin updates, new content, theme changes, and server configuration changes. Build these checks into your routine:
| Check | Frequency | Tool | What to Look For |
|---|---|---|---|
| Indexed page count | Weekly | Google Search Console | Sudden drops in indexed pages |
| Crawl request trend | Weekly | GSC Crawl Stats | Declining crawl requests (early warning signal) |
| Core Web Vitals | Monthly | GSC CWV report | Regressions from plugin or theme updates |
| SSL certificate expiry | Quarterly (set reminder) | Hosting panel or SSL Labs | Renew at least 30 days before expiry |
| Full site crawl | Monthly | Screaming Frog | New broken links, redirect issues, orphan pages |
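The weekly indexed-count check reduces to one comparison. A sketch that flags a sudden week-over-week drop in indexed pages; the 10% threshold is a working assumption to tune per site:

```python
def index_drop_alert(history, threshold=0.10):
    """Flag a sudden drop in indexed pages week over week.

    `history` is a list of weekly indexed-page counts, oldest first;
    a drop larger than `threshold` (default 10%) trips the alert.
    """
    if len(history) < 2:
        return False
    prev, curr = history[-2], history[-1]
    return prev > 0 and (prev - curr) / prev > threshold

print(index_drop_alert([980, 1001, 995, 820]))  # True: roughly an 18% drop
print(index_drop_alert([980, 1001, 995, 990]))  # False: normal fluctuation
```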
Master the Complete Technical SEO Framework
This guide covers the audit methodology. For the full 5-pillar SEO system including the SEO TAM Graph, keyword prioritization, contextual signaling, and trust-building frameworks, explore the Dream SEO Masterclass.
Frequently Asked Questions
How often should I run a technical SEO audit?
A comprehensive audit should be done quarterly. Weekly monitoring of key metrics (indexed pages, crawl stats, CWV) catches issues before they become problems. After any major site change like a redesign, migration, or large content update, run a focused audit immediately.
Can AI find issues that tools like Screaming Frog miss?
They serve different purposes. Screaming Frog is better at finding raw technical issues at scale (every broken link, every missing tag). AI is better at prioritizing those issues by business impact, diagnosing root causes, and generating the specific fix. The best workflow uses both: Screaming Frog for discovery, AI for analysis and action planning.
What is the single most important thing to check first?
Your indexed page count versus your total page count. If you have 1,000 pages on your site and only 300 are indexed by Google, that is your starting point. Everything else – content, backlinks, on-page optimization – is irrelevant for the 700 pages Google has not even stored in its index.
Do I need paid tools for a proper technical audit?
No. Google Search Console (free) covers the most critical data: indexing status, crawl stats, Core Web Vitals, and manual actions. Screaming Frog is free for up to 500 URLs. Claude and ChatGPT free tiers can analyze the exported data. Paid tools like Semrush and Ahrefs add competitor analysis and historical trending, but are not required for a solid audit.
My organic traffic is dropping. Is it a technical issue?
Often yes, and the first place to look is the Crawl Stats report in Google Search Console. If crawl requests are declining, Google is deprioritizing your site. The two most common causes are AI-generated copy-paste content and duplicate content across multiple pages. But traffic drops can also come from algorithm updates, lost backlinks, or increased competition, so always check multiple signals.
Apurv Singh
Growth Architect – HQ Digital
12+ years in digital marketing. Built SEO for a global top-10 traffic website and multiple marketplace platforms. Currently consulting for brands across India, UAE, US, and Europe – including Fortune 500 conglomerates, Reliance Brands, and D2C companies in fashion, jewelry, health, and real estate. TEDx speaker. 300K+ followers across Instagram and YouTube.