Technical SEO Checklist: 15 Issues I Find on Every Client Website
Most websites lose organic traffic not because of weak content or missing backlinks but because of technical problems that prevent Google from crawling and indexing pages correctly. After auditing hundreds of websites over 14 years, I have identified 15 technical SEO issues that appear on nearly every site I review. Some cost businesses thousands in lost traffic. Most are fixable within a week.
This checklist covers the exact issues I look for during a technical SEO audit, organized from the most common to the most damaging. Each item includes what the problem is, why it matters, and how to fix it.
1. Missing or Duplicate Title Tags
Title tags remain the single most influential on-page ranking element, yet roughly 60% of the sites I audit have pages with missing, duplicate, or poorly written titles that waste ranking potential on every search result impression. Google uses the title tag as the primary clickable headline in search results, and when it is missing or duplicated across multiple pages, the search engine either generates its own version or struggles to differentiate between pages.
Impact: Pages without unique title tags compete against each other in search results, splitting click-through rates and confusing Google about which page to rank for a given query.
How to fix: Run a Screaming Frog crawl and export the "Page Titles" report. Filter for missing, duplicate, and over-length titles. Write unique titles under 60 characters for every indexable page, placing the primary keyword near the beginning.
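A compliant title looks like the example below; the business name and keyword are hypothetical placeholders, not a template to copy:

```html
<head>
  <!-- Unique per page, primary keyword first, under 60 characters -->
  <title>Emergency Plumbing Repair in Irvine | Acme Plumbing</title>
</head>
```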
2. Slow Page Speed From Unoptimized Images
Uncompressed images are the number one cause of slow page load times on the sites I audit, adding 2-5 seconds of load time on average and directly harming both Core Web Vitals scores and user engagement metrics that Google uses as ranking signals. According to Google's Core Web Vitals documentation, Largest Contentful Paint should occur within 2.5 seconds.
Impact: Every additional second of load time reduces conversions by approximately 7%. Google has confirmed page experience as a ranking factor, and slow pages receive less crawl budget.
How to fix: Convert all images to WebP or AVIF format. Implement lazy loading for images below the fold. Set explicit width and height attributes to prevent layout shifts. Use a CDN for image delivery. Run PageSpeed Insights on your top 10 landing pages and address every image-related recommendation.
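Put together, those image fixes can look like the markup below; the CDN hostname and file names are placeholders:

```html
<!-- WebP served from a CDN, with explicit dimensions to prevent layout shift.
     loading="lazy" is appropriate only below the fold; never lazy-load the
     image that is your Largest Contentful Paint element. -->
<img src="https://cdn.example.com/img/case-study-chart.webp"
     alt="Organic traffic growth after the technical fixes"
     width="1200" height="675"
     loading="lazy">
```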
3. Broken Internal Links and 404 Errors
Broken internal links waste crawl budget and create dead ends for both users and search engine bots, yet I find an average of 15-30 broken links per site during audits, often caused by deleted pages, changed URLs, or typos in manual link placement. Every broken link is a missed opportunity to pass authority between pages.
Impact: Broken links prevent link equity from flowing through your site architecture. Users who hit 404 pages almost always abandon the session, and every dead end wastes crawl budget that Google could be spending on pages that actually exist.
How to fix: Crawl your site with Screaming Frog and filter for "Client Error (4xx)" responses. Set up 301 redirects for any deleted pages that had backlinks or traffic. Fix internal links pointing to non-existent URLs. Check Google Search Console's "Pages" report for crawl errors monthly.
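On an Apache server (the common setup for WordPress sites), a single 301 for a deleted page that still has backlinks might look like this in .htaccess; both paths are hypothetical:

```apache
# Send the deleted page (and its link equity) to the closest live replacement
Redirect 301 /old-pricing-page/ /pricing/
```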
4. Missing or Incorrect Canonical Tags
Canonical tags tell Google which version of a page is the "original" when duplicate or near-duplicate content exists, but misconfigured canonicals are present on roughly half the sites I audit, causing Google to index the wrong version of a page or ignore the canonical entirely. This is especially common on e-commerce sites with filtered product pages and WordPress sites with multiple URL parameters.
Impact: When Google indexes the wrong URL variant, you split ranking signals between multiple versions of the same page. In severe cases, the page you want ranked gets deindexed entirely.
How to fix: Every indexable page should have a self-referencing canonical tag. Canonicals should use the same protocol (https) and www/non-www version as your primary domain. Check that paginated pages, filtered URLs, and parameter-based URLs point their canonical to the correct parent page.
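In the HTML head, the two cases look like this; the domain and paths are examples:

```html
<!-- On https://www.example.com/products/blue-widget/ itself: self-referencing -->
<link rel="canonical" href="https://www.example.com/products/blue-widget/">

<!-- On a filtered variant such as /products/blue-widget/?color=navy:
     the same tag, pointing back to the clean parent URL -->
<link rel="canonical" href="https://www.example.com/products/blue-widget/">
```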
5. No XML Sitemap or Outdated Sitemap
XML sitemaps serve as a direct communication channel to Google about which pages exist on your site and when they were last updated, but I regularly find sites with no sitemap at all, sitemaps containing noindexed or redirected URLs, or sitemaps that have not been updated in months. Google's sitemap documentation is clear that sitemaps help discover new and updated content faster.
Impact: Without an accurate sitemap, new pages may take weeks longer to get indexed. Sitemaps containing 404 pages or redirects waste crawl budget.
How to fix: Generate a dynamic XML sitemap that automatically updates when content changes. Remove any noindexed, redirected, or 404 URLs from the sitemap. Submit the sitemap through Google Search Console. Keep sitemap files under 50,000 URLs and 50MB.
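A minimal, valid sitemap entry looks like this; the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only canonical, indexable URLs that return 200 belong here -->
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
</urlset>
```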
6. Missing HTTPS or Mixed Content Warnings
HTTPS has been a confirmed Google ranking signal since 2014, but I still find sites serving mixed content where the page loads over HTTPS while some resources like images, scripts, or stylesheets load over HTTP, triggering browser security warnings that destroy user trust and cause measurable drops in engagement metrics.
Impact: Mixed content warnings cause browsers to display "Not Secure" labels. Users see this and leave. Google may also prefer the HTTPS version of competing pages over yours if your implementation is incomplete.
How to fix: Install a valid SSL certificate. Update all internal links and resource references to use HTTPS. Set up a 301 redirect from HTTP to HTTPS. Use the Content-Security-Policy header to block mixed content. Check for mixed content warnings in Chrome DevTools Console.
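On Apache, the redirect and the mixed-content cleanup can be sketched as below; adjust for your server (nginx uses a `return 301` block instead):

```apache
# Force HTTPS with a single 301 hop
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Tell browsers to upgrade any leftover http:// resource references
Header set Content-Security-Policy "upgrade-insecure-requests"
```

The `upgrade-insecure-requests` directive is the current CSP mechanism for this; the older `block-all-mixed-content` directive is deprecated.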
7. Poor Mobile Usability
Google uses mobile-first indexing for all websites, meaning the mobile version of your site is the version Google evaluates for ranking, yet I find mobile usability issues on over 70% of the sites I audit, including text too small to read, clickable elements too close together, and content wider than the screen viewport.
Impact: Mobile usability issues directly affect your rankings because Google indexes and evaluates the mobile version first. If your mobile experience is poor, your desktop rankings will also suffer.
How to fix: Audit pages with Lighthouse's mobile audit in Chrome DevTools (Google retired the standalone Mobile-Friendly Test and the Search Console Mobile Usability report in late 2023). Ensure tap targets are at least 48px with 8px spacing between them. Use a responsive viewport meta tag. Monitor the Core Web Vitals report in Google Search Console and fix all flagged issues.
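The responsive viewport tag, for reference, is one line in the head:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```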
8. Thin Content Pages With No Search Value
Thin content pages with fewer than 200 words, duplicated boilerplate, or no unique value dilute your site's overall quality score in Google's evaluation and waste crawl budget that could be spent on your most important pages. I typically find 20-40% of a site's indexed pages fall into this category, including tag archives, author pages with one post, and placeholder pages.
Impact: Google's Helpful Content system evaluates your site holistically. A large percentage of thin pages signals low overall quality, which can suppress rankings for your valuable pages too.
How to fix: Audit all indexed pages in Google Search Console. Noindex pages that provide no search value (tag pages, thin archives, placeholder content). Consolidate similar thin pages into comprehensive resources. Either expand thin pages to provide genuine value or remove them entirely.
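To noindex a thin page while still letting crawlers follow its links, add this to the page's head (most SEO plugins expose a setting that outputs the same tag):

```html
<!-- Drop the page from the index but keep passing link equity through it -->
<meta name="robots" content="noindex, follow">
```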
9. Missing Structured Data
Structured data markup helps Google understand the content and context of your pages, enabling rich results like review stars, FAQ dropdowns, and product information directly in search results, yet most sites I audit have either no structured data or only basic organization schema that misses the highest-impact opportunities. Google's structured data documentation lists dozens of supported types.
Impact: Pages with rich results earn significantly higher click-through rates. FAQ schema can double your SERP real estate. Local business schema improves visibility in map and local results.
How to fix: Implement JSON-LD structured data for your business type (LocalBusiness, Organization). Add Article schema to blog posts, FAQPage schema to FAQ sections, and Product schema to product pages. Validate all markup with Google's Rich Results Test.
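A minimal LocalBusiness block looks like the sketch below; every value is a placeholder to swap for your real business details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Plumbing",
  "url": "https://www.example.com/",
  "telephone": "+1-949-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Irvine",
    "addressRegion": "CA",
    "postalCode": "92602",
    "addressCountry": "US"
  }
}
</script>
```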
10. Crawl Budget Waste From Parameter URLs
URL parameters from tracking codes, session IDs, sorting options, and filter combinations can multiply a site's apparent page count by 10x or more, causing Google to waste crawl budget on thousands of near-duplicate URLs instead of your actual content pages. I have seen sites with 500 real pages generate 50,000 parameterized URLs in Google's index.
Impact: Wasted crawl budget means new and updated content gets indexed more slowly. Parameterized duplicates can also cause keyword cannibalization when Google indexes the wrong URL variant for a query.
How to fix: Identify all URL parameters through Screaming Frog and Google Search Console. Use the robots.txt file to block crawling of non-essential parameter combinations. Add canonical tags pointing parameterized URLs to their clean versions. Implement URL parameter handling at the server level where possible.
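A robots.txt block for throwaway parameters might look like this; the parameter names are examples from a typical e-commerce setup:

```text
User-agent: *
# Block crawl-wasting sort and session parameters wherever they appear
Disallow: /*?*sort=
Disallow: /*?*sessionid=
```

Keep in mind that Google cannot see a canonical tag on a URL it is blocked from crawling, so reserve robots.txt blocks for parameters you never want fetched and rely on canonicals for the rest.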
11. Missing Alt Text on Images
Alt text provides Google with textual context for images, enables image search visibility, and serves as essential accessibility information for screen readers, but the average site I audit has 40-60% of images missing alt attributes entirely, forfeiting both ranking signals and compliance with accessibility standards.
Impact: Images without alt text cannot rank in Google Image Search, which drives 20-30% of all Google searches. Missing alt text also creates accessibility violations that can have legal implications under the ADA.
How to fix: Run a Screaming Frog crawl and export the "Images" report filtered for missing alt text. Write descriptive alt text for every content image, incorporating relevant keywords naturally. Decorative images should use empty alt attributes (alt="") rather than no alt attribute at all.
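At scale, you can also scan exported HTML yourself. The sketch below uses Python's standard-library HTML parser to list images that have no alt attribute at all, while correctly leaving decorative images with an empty alt="" alone:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> tag that lacks an alt attribute entirely."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_names = {name for name, _ in attrs}
            # An empty alt="" (decorative image) is fine; no alt at all is not.
            if "alt" not in attr_names:
                self.missing.append(dict(attrs).get("src", "(no src)"))

sample = """
<img src="/hero.webp" alt="Technician auditing a website">
<img src="/divider.png" alt="">
<img src="/team-photo.jpg">
"""

checker = MissingAltChecker()
checker.feed(sample)
print(checker.missing)  # → ['/team-photo.jpg']
```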
12. Incorrect Robots.txt Configuration
The robots.txt file controls which parts of your site search engines can crawl, and misconfigured rules can accidentally block important pages, CSS files, or JavaScript resources that Google needs to render and understand your content. I have found sites accidentally blocking their entire /wp-content/ folder, preventing Google from seeing images and styles.
Impact: Blocked CSS and JavaScript files prevent Google from rendering your pages correctly, which affects how it evaluates content and layout. Blocked content pages obviously cannot rank at all.
How to fix: Review your robots.txt file at yourdomain.com/robots.txt. Use Google Search Console's robots.txt report to confirm which version Google last fetched and whether it parsed correctly (the standalone robots.txt Tester has been retired). Ensure you are not blocking CSS, JavaScript, or image directories. Only block URLs that genuinely should not be crawled, such as admin pages, internal search results, and cart/checkout pages.
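A safe WordPress-style configuration, for comparison, blocks only genuinely private areas while leaving rendering resources crawlable (the paths are typical examples):

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /cart/
Disallow: /checkout/
Disallow: /?s=

Sitemap: https://www.example.com/sitemap.xml
```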
13. Redirect Chains and Loops
Redirect chains occur when URL A redirects to URL B, which redirects to URL C, forcing both users and search engines through multiple hops before reaching the final destination, and redirect loops occur when redirects create a circular path that never resolves. I find redirect chains of 3+ hops on most sites that have undergone URL structure changes or domain migrations.
Impact: Each redirect hop adds latency for users and crawlers, and long chains delay how quickly ranking signals consolidate on the final URL. Google's documentation states that Googlebot follows at most 10 redirect hops before abandoning a chain, and in practice long chains get crawled less reliably. Redirect loops make the destination page completely inaccessible.
How to fix: Crawl your site with Screaming Frog and filter the redirect report for chains and loops. Update all redirects to point directly to the final destination URL in a single hop. Update internal links to point directly to the final URL rather than relying on redirects.
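Collapsing a chain means rewriting every legacy rule to target the final URL directly; in .htaccess form, with hypothetical paths:

```apache
# Before (a chain): /old/ -> /newer/ -> /final/
# After: every legacy URL points straight at the destination in one hop
Redirect 301 /old/ /final/
Redirect 301 /newer/ /final/
```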
14. Missing or Poorly Implemented Hreflang Tags
For sites targeting multiple languages or regions, hreflang tags tell Google which version of a page to serve to users in each locale, but implementation errors are extremely common and cause Google to serve the wrong language version to searchers, resulting in high bounce rates and lost conversions in international markets.
Impact: Incorrect hreflang implementation can cause the English version of your site to appear for Spanish-speaking searchers and vice versa. Google may also treat multilingual pages as duplicates if hreflang is missing, suppressing one version entirely.
How to fix: If your site targets multiple languages or regions, implement hreflang tags in the HTML head, HTTP headers, or XML sitemap. Every hreflang set must include a self-referencing tag, and adding an x-default tag for unmatched locales is strongly recommended. Validate implementation with Ahrefs Site Audit or Screaming Frog's hreflang report. If your site only targets one language and region, you do not need hreflang tags.
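An HTML-head implementation for a two-locale site could look like this; the URLs are placeholders, and the identical set must appear on every version it lists:

```html
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/pricing/">
<link rel="alternate" hreflang="es-es" href="https://www.example.com/es-es/precios/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/pricing/">
```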
15. No Internal Linking Strategy
Internal links distribute authority throughout your site and help Google understand topic relationships between pages, but most sites I audit have no deliberate internal linking strategy, resulting in "orphan" pages with zero internal links and important pages buried 4+ clicks from the homepage. The sites that rank best in competitive niches consistently have well-structured internal linking that guides both crawlers and users through logical content paths.
Impact: Pages with more internal links receive more crawl attention and rank higher. Orphan pages may never get indexed. A flat site architecture where important pages are reachable within 3 clicks consistently outperforms deep, siloed structures.
How to fix: Map your site's content hierarchy. Ensure every page is linked from at least 2-3 other relevant pages. Add contextual internal links within body content, not just navigation menus. Use descriptive anchor text that includes target keywords naturally. Audit internal links with Screaming Frog's "Inlinks" report.
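A contextual link with descriptive anchor text, as opposed to a bare "click here," looks like this in body copy (the target path is an example):

```html
<p>Slow load times usually trace back to
   <a href="/blog/image-optimization/">unoptimized images</a>,
   which you can often fix in an afternoon.</p>
```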
How to Prioritize Your Technical SEO Fixes
Not all 15 issues carry equal weight. When I deliver a technical SEO audit to a client, I prioritize fixes based on a combination of impact and implementation difficulty. Here is how I rank the urgency.
| Priority | Issues | Typical Fix Time |
|---|---|---|
| Critical (fix this week) | Robots.txt blocking content, redirect loops, missing HTTPS | 1-2 days |
| High (fix within 2 weeks) | Title tags, broken links, canonicals, page speed | 3-5 days |
| Medium (fix within 1 month) | Structured data, sitemap, mobile usability, alt text | 1-2 weeks |
| Ongoing | Thin content, internal linking, crawl budget, hreflang | Continuous |
Tools I Use for Technical SEO Audits
A thorough technical audit requires multiple tools because no single tool catches everything. These are the tools I use on every audit, and each serves a specific purpose in the diagnostic process.
- Screaming Frog SEO Spider: The foundation of every technical audit. Crawls your entire site and identifies broken links, redirect chains, missing titles, duplicate content, and dozens of other issues in a single pass.
- Google Search Console: Shows you exactly how Google sees your site, including indexing errors, mobile usability issues, Core Web Vitals data, and which pages are actually in Google's index versus which ones you think are indexed.
- PageSpeed Insights: Provides both lab data and real-user field data for Core Web Vitals. The field data section is what Google actually uses for ranking, so that is what I prioritize.
- Ahrefs Site Audit: Catches issues that Screaming Frog misses, particularly around content quality, internal link distribution, and keyword cannibalization.
If you are a small business owner without access to these tools, start with Google Search Console. It is free and provides the most critical data about how Google interacts with your site.
Why Technical SEO Matters More in the Age of AI Search
With Google AI Overviews and other AI-powered search features now appearing in more than half of all searches, technical SEO has become more important, not less. If your SEO results have plateaued despite ongoing work, technical issues are often the reason; I cover this in depth in my post on why SEO stops delivering results. AI systems pull information from pages they can crawl, render, and understand. If your pages have crawl errors, slow load times, or missing structured data, AI systems will source their answers from your competitors instead.
I have observed this pattern across my client portfolio: sites with strong technical foundations are cited by AI Overviews, ChatGPT, and Perplexity at significantly higher rates than technically broken competitors in the same niche. Clean technical SEO is now a prerequisite for AI search visibility.
Frequently Asked Questions
How often should I run a technical SEO audit?
I recommend a comprehensive technical SEO audit at least once per quarter, with automated monitoring through Google Search Console running continuously. Major site changes like redesigns, CMS migrations, or significant content additions should trigger an immediate audit. Most of the 15 issues on this checklist can reappear after routine site updates, so quarterly checks catch problems before they cause measurable traffic loss.
Can I do a technical SEO audit myself, or do I need to hire a professional?
You can identify many of the issues on this checklist using free tools like Google Search Console and the free version of Screaming Frog (limited to 500 URLs). However, interpreting the data and prioritizing fixes requires experience. I have seen business owners spend weeks fixing low-priority issues while critical crawl errors continued to suppress their rankings. If your site generates meaningful revenue, the ROI of a professional audit typically pays for itself within the first month of implementing fixes.
How long does it take to see results after fixing technical SEO issues?
Critical fixes like resolving robots.txt blocks or redirect loops can produce ranking improvements within days of Google recrawling the affected pages. Most technical fixes show measurable results within 2-4 weeks. Some improvements, particularly around crawl budget optimization and internal linking restructuring, accumulate over 2-3 months as Google recrawls and reevaluates your entire site.
Is technical SEO more important than content or backlinks?
Technical SEO is the foundation that determines whether your content and backlinks can produce results. The best content in the world will not rank if Google cannot crawl and index it. I think of it this way: content and backlinks are the fuel, but technical SEO is the engine. Without a working engine, the fuel does nothing. In practice, I always fix technical issues first because they create the conditions for content and link building to have maximum impact.
Dmytro Verzhykovskyi
SEO and digital marketing consultant in Irvine, California. 14+ years of experience. Gold Winner, Best SEO Professional, ECDMA Global Awards 2025. Google Partner. About Dmytro