Developer’s Guide to Fixing Common SEO Issues

When most people think of SEO, they picture keywords, blog posts, and marketing strategies. But here’s the thing: a huge part of SEO success actually comes down to code. The way a site is built directly affects how search engines crawl, index, and rank it.

Developers play just as big a role in SEO as marketers do.

The challenge is that many common SEO issues aren’t obvious while you’re writing code. They creep in through missing tags, bloated scripts, slow load times, or overlooked accessibility features. On the surface, the site might "work," but behind the scenes, those details can quietly hurt performance in search results.

Missing or Misused Meta Tags

Meta Title and Meta Description tags are the first things people see in search results. A strong title helps search engines understand what a page is about, while a clear description encourages users to click. Skip these tags, or duplicate them across multiple pages, and you miss out on two major opportunities: relevance and clickthroughs.

Developers often forget to set unique <title> and <meta name="description"> tags, especially when working with dynamic pages. Other times, the same title and description are reused across the site. Both issues make it harder for search engines to know which page to prioritize, and they create a poor user experience on the results page.

Fix

Always define unique, descriptive tags for each page. Keep titles concise and descriptions clear, using natural language instead of keyword stuffing. A simple example:

<head>
  <title>Handmade Ceramic Coffee Mugs | Willow & Clay Studio</title>
  <meta name="description" content="Discover artisan ceramic coffee mugs crafted by Willow & Clay Studio. Each piece is unique, durable, and designed to make your morning ritual special.">
</head>

Tips

  • If you’re working with a CMS or framework, set up server-side templating to generate meta tags dynamically. For example, a blog system can pull the post title and summary into <title> and <meta> tags automatically, ensuring every page has unique metadata.
  • It’s worth noting that Google doesn’t always use your provided meta description verbatim. Since around 2018, Google has increasingly rewritten titles and descriptions in search results based on what it thinks best matches the query. That doesn’t mean you should ignore them; well-written tags still guide how your content is understood and often influence what Google chooses to display.
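The templating tip above can be sketched as a small render function. This is a minimal illustration, not tied to any particular CMS; the escapeHtml and renderHead helpers and the post object shape are assumptions for the example:

```javascript
// Escape HTML-special characters so titles and summaries can't break the markup.
function escapeHtml(s) {
  return s
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

// Build a unique <head> block from a post's own title and summary.
function renderHead(post) {
  return [
    "<head>",
    `  <title>${escapeHtml(post.title)} | Example Blog</title>`,
    `  <meta name="description" content="${escapeHtml(post.summary)}">`,
    "</head>",
  ].join("\n");
}

const head = renderHead({
  title: "Handmade Ceramic Coffee Mugs",
  summary: "Artisan mugs crafted in small batches.",
});
console.log(head);
```

Because every page's metadata comes from its own data, duplicates can't slip in as the site grows.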

Resource: Google’s guide on meta tags

Improper Heading Structure

Headings aren’t just for visual styling; they define the content hierarchy of a page. Search engines use them to understand what’s most important, and users rely on them to skim and navigate content. A well-structured heading system makes pages more accessible, more readable, and easier for search engines to parse.

Developers sometimes use multiple <h1> tags on a page for styling, or they skip heading levels entirely (e.g., jumping from <h1> straight to <h4>). This creates confusion for both search engines and assistive technologies like screen readers. It may not break the site, but it does weaken SEO and accessibility.

Fix

Think of headings like an outline. The <h1> is the main title, <h2>s are section headers, and <h3>s are subsections. Stick to one <h1> per page and use lower levels logically to show structure, not just to adjust font size.

<h1>Wildlife Photography Tours</h1>
<h2>African Safari Adventures</h2>
<h3>Guided Serengeti Sunrise Shoot</h3>

Tips

  • If you need a smaller font size but the section is still an <h2> in meaning, use CSS to style it rather than skipping to <h4>. Keep headings semantic: structure first, style second.
  • A page should have only one <h1>, but you can have multiple <h2> through <h6> headings as long as they follow the hierarchy in order.
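A quick way to sanity-check an outline is to scan the sequence of heading levels for skipped steps. This findSkippedLevels function is a hypothetical helper for illustration, not a standard API:

```javascript
// Flag skipped heading levels in a page outline.
// Input is the sequence of heading levels as they appear in the document.
function findSkippedLevels(levels) {
  const problems = [];
  for (let i = 1; i < levels.length; i++) {
    // Going deeper by more than one level (e.g. h1 -> h4) skips the hierarchy.
    if (levels[i] > levels[i - 1] + 1) {
      problems.push(`h${levels[i - 1]} -> h${levels[i]} skips a level`);
    }
  }
  return problems;
}

console.log(findSkippedLevels([1, 2, 3, 2, 3])); // well-ordered outline: []
console.log(findSkippedLevels([1, 4]));          // ["h1 -> h4 skips a level"]
```

Note that stepping back up (from <h3> to a new <h2>) is fine; only skipping downward levels breaks the outline.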

Slow Load Times

Page speed isn’t just a technical metric; it directly affects both SEO rankings and user experience. Search engines prioritize faster sites, and users are quick to leave if a page takes more than a few seconds to load. A slow site means higher bounce rates, fewer conversions, and lost visibility in search results.

Slowdowns often creep in through large, uncompressed images, render-blocking scripts that delay content from loading, or bloated CSS files full of unused styles. These issues might not be obvious during development, but they add up fast once the site goes live.

Fix

Start with the biggest offenders: images, scripts, and CSS. Use lazy loading for images that aren’t immediately visible, defer non-critical scripts, and minify your CSS and JavaScript to reduce file size. Here’s a simple example of lazy loading:

<img src="/hero.jpg" alt="Hero Banner" loading="lazy" width="1200" height="600">

Other performance boosters include serving images in next-gen formats (like WebP), enabling caching, and using a CDN to deliver files closer to your users.
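Deferring a non-critical script is often a one-attribute change. The file paths here are placeholders:

```html
<!-- Deferred: downloads in parallel with parsing, executes after the HTML is parsed -->
<script src="/js/analytics.js" defer></script>
```

With defer, scripts also execute in document order, which keeps dependencies between files intact.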

Tip

  • Run your site through tools like Lighthouse or PageSpeed Insights to identify bottlenecks. Even small fixes, like moving a script to the footer or compressing one oversized image, can make a noticeable difference.

Poorly Structured URLs

URLs are signals for both search engines and users. A clean, descriptive URL helps search engines understand what a page is about, while also making links easier to read and click. Think of it as a built-in label for your content.

It’s common to see sites using query strings like /page?id=123 or generic slugs like /services/page1. While these work functionally, they don’t give search engines or users any context. Worse, overly long or keyword-stuffed URLs can confuse crawlers and look untrustworthy to people.

Fix

Use short, descriptive slugs that reflect the content of the page. Keep them consistent, human-readable, and lowercase. For example:

/tours/northern-lights-expedition

This structure is clear, easy to remember, and more likely to get clicks in search results.

Tips

  • Use hyphens (-) instead of underscores (_) to separate words.
  • Keep URLs as short as possible while still being descriptive.
  • Avoid repeating the same keyword multiple times; once is enough.
  • Stick with a logical hierarchy (e.g., /blog/seo-tips instead of /blog/article?id=99).
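The slug rules above can be automated so editors never hand-type URLs. This slugify function is an illustrative helper (it assumes ASCII titles; real sites may also need to transliterate accented characters):

```javascript
// Turn a page title into a short, lowercase, hyphenated slug.
function slugify(title) {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, "") // drop punctuation
    .trim()
    .replace(/\s+/g, "-");        // collapse whitespace into single hyphens
}

console.log(slugify("Northern Lights Expedition!")); // → "northern-lights-expedition"
```

Generating slugs in one place also keeps them consistent across the site, so the same title never produces two competing URLs.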

Missing Alt Attributes on Images

Alt text serves two important purposes: it makes websites accessible to people using screen readers, and it gives search engines context about what an image represents. Without it, both accessibility and SEO take a hit. A well-written alt attribute can improve user experience and help images show up in search results.

Developers sometimes skip alt text entirely, leaving images blank. On the flip side, some try to "game" SEO by stuffing alt attributes with keywords (e.g., alt="avocado toast avocado breakfast avocado recipes avocado sandwich healthy avocado toast recipe"). Both approaches backfire: one excludes users, and the other looks spammy to search engines.

Fix

Always include concise, descriptive alt text that explains the image in plain language. Think of it as how you’d describe the image to someone who can’t see it:

<img src="/team-photo.jpg" alt="Our web development team at work">

Tips

  • If the image is purely decorative (like a background flourish), use an empty alt="" so screen readers can skip it.
  • Keep alt text short but meaningful, usually under 125 characters.
  • Focus on describing the image, not cramming in extra keywords.
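The decorative-image tip looks like this in practice (the file path is a placeholder):

```html
<!-- Decorative flourish: empty alt tells screen readers to skip it -->
<img src="/images/divider-swirl.svg" alt="">
```

Note that an empty alt="" is different from a missing alt attribute: the empty value is an explicit signal to skip, while a missing attribute leaves assistive technology guessing.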

Ignoring Mobile Optimization

Google now uses mobile-first indexing, which means it evaluates the mobile version of your site before the desktop version. If your site doesn’t work well on smaller screens, your rankings can suffer, even if the desktop site looks perfect. Beyond search, poor mobile usability frustrates users and drives them away.

Developers sometimes build sites with a desktop-first mindset, leaving mobile as an afterthought. The result? Fixed-width layouts that don’t scale, navigation that’s impossible to tap, or content that gets cut off on smaller screens. All of these create friction for users and signal to search engines that the site isn’t mobile-friendly.

Fix

Design with responsiveness in mind from the start. Use flexible grids, scalable images, and media queries to adjust layouts at different breakpoints. For example, stacking the navigation into a column on smaller screens can improve usability:

@media (max-width: 768px) {
  .nav {
    flex-direction: column;
  }
}

Tips

  • Test your site on actual devices, not just resized browser windows.
  • Make sure buttons and links have enough padding to be easily tapped.
  • Avoid pop-ups or elements that block content on mobile.
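Responsive CSS only works if mobile browsers know not to render the page at desktop width, so make sure the viewport meta tag is in your <head>:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Without it, media queries like the one above may never trigger, because the browser reports a virtual desktop-sized viewport.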

Overlooking Canonicalization

Search engines don’t like confusion. If the same content appears at multiple URLs, they may not know which version to index or rank. This is called duplicate content, and it can dilute your SEO signals, splitting link equity across multiple variations of the same page.

It’s easy to end up with several versions of a page without realizing it. For example:

  • https://example.com/page
  • https://example.com/page/
  • https://example.com/page?ref=123

All three technically serve the same content, but search engines may treat them as separate pages. Without guidance, you risk competing with yourself in the rankings.

Fix

Use a canonical tag to tell search engines which version of a page is the "official" one. Place the tag inside the <head> section of your HTML:

<link rel="canonical" href="https://example.com/page">

This signals to search engines that all variations should consolidate their authority to the canonical version.

Tips

  • Always pick a preferred URL format (with or without trailing slash, www vs. non-www).
  • Make sure canonical tags point to the correct page; canonicals that accidentally reference the wrong URL are a common mistake.
  • Combine with proper redirects (301s) to strengthen consistency.
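Those 301s are usually handled at the server level. A sketch for nginx, assuming the bare domain over HTTPS is the preferred version (adapt the domain and listen directives to your setup):

```nginx
# Permanently redirect the www variant to the canonical bare domain
server {
    listen 80;
    server_name www.example.com;
    return 301 https://example.com$request_uri;
}
```

The $request_uri variable preserves the path and query string, so deep links redirect to their exact canonical counterparts.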

JavaScript Rendering Issues

Modern frameworks like React, Vue, and Angular rely heavily on JavaScript to render content. While this creates dynamic, app-like experiences, it can also cause problems for search engines. If critical content only appears after JavaScript runs, crawlers may miss it, leading to incomplete indexing or weaker rankings.

A common pitfall is relying entirely on client-side rendering. The page initially loads with minimal HTML, and all the meaningful content is injected later via JavaScript. Users eventually see the content, but search engines may struggle to render or index it properly.

Fix

Implement server-side rendering (SSR) or use prerendering to ensure search engines can access the full content immediately. SSR frameworks like Next.js (for React) or Nuxt.js (for Vue) render pages on the server before sending them to the browser, giving crawlers fully-formed HTML to work with.

Example workflow with SSR (conceptual, React/Next.js):

// pages/index.js (Next.js)
export async function getServerSideProps() {
  const data = await fetch('https://api.example.com/posts').then(res => res.json());
  return { props: { data } };
}

export default function Home({ data }) {
  return (
    <main>
      <h1>Latest Posts</h1>
      <ul>
        {data.map(post => <li key={post.id}>{post.title}</li>)}
      </ul>
    </main>
  );
}

Here, the HTML is pre-rendered on the server so both users and crawlers see the full content immediately.

Tip

  • If full SSR isn’t practical, prerendering tools (like Prerender.io) can generate static snapshots of your pages for crawlers. Always test how your content is being indexed using Google’s "Inspect URL" tool in Search Console.

Resource: Google’s guide on JavaScript SEO

Forgetting XML Sitemaps and Robots.txt

Two small files, sitemap.xml and robots.txt, have an outsized impact on SEO. An XML sitemap acts like a directory, giving search engines a roadmap of all the important pages on your site. Without it, crawlers may overlook content that isn’t well linked internally. On the other hand, robots.txt tells search engines where not to go. Without a proper setup, bots might crawl (and index) things like staging environments, admin dashboards, or duplicate URLs that shouldn’t appear in search results.

Common mistakes:

  • Skipping the sitemap entirely, leaving crawlers to "figure it out."
  • Forgetting to update the sitemap when new pages are added.
  • Overly aggressive robots.txt rules that accidentally block important sections of the site.

Fix

Generate and maintain an XML sitemap that lists all relevant, indexable pages. Submit it in Google Search Console for faster discovery. Use robots.txt to block unnecessary areas (like /admin/ or /cart/) while ensuring key sections remain crawlable.

Example robots.txt:

User-agent: *
Disallow: /admin/
Disallow: /checkout/
Allow: /
Sitemap: https://example.com/sitemap.xml
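And a minimal sitemap.xml to go with it (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/blog/seo-tips</loc>
  </url>
</urlset>
```

Only <loc> is required per URL; optional elements like <lastmod> can help crawlers prioritize recently updated pages.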

Tip

  • Automate sitemap generation in your build pipeline (many frameworks and CMSs have plugins for this) and review robots.txt regularly to make sure you’re not unintentionally blocking search engines from crawling key content.

Neglecting Structured Data (Schema Markup)

Search engines don’t just crawl pages for text, they also look for context. Structured data, usually implemented through Schema.org markup, helps search engines understand what a page is really about. Whether it’s a product, an event, an article, or a recipe, schema makes content machine-readable. The bonus? It can also enable rich snippets in search results, like star ratings, pricing, event dates, or FAQs, which improve visibility and clickthrough rates.

Many developers skip structured data because the site "looks fine" without it. But without schema, search engines miss valuable context, and pages may not qualify for enhanced search features. Sometimes developers try to add schema but choose the wrong type or fail validation, which means search engines ignore it.

Fix

Use JSON-LD (JavaScript Object Notation for Linked Data) to add structured data in a clean, non-intrusive way. Here’s a simple example for a product page:

<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "Product",
  "name": "Handmade Ceramic Coffee Mug",
  "image": "https://example.com/images/mug.jpg",
  "description": "A handcrafted ceramic mug perfect for your morning coffee ritual.",
  "sku": "MUG-123",
  "offers": {
    "@type": "Offer",
    "url": "https://example.com/products/mug-123",
    "priceCurrency": "USD",
    "price": "24.99",
    "availability": "https://schema.org/InStock"
  }
}
</script>

Tips

  • Add schema for your most important content types: articles, products, events, reviews, FAQs.
  • Use Google’s Rich Results Test to validate your markup.
  • Even simple schema (like Article or Product) can give your site a visibility boost.

Hard-Coding Text into Images

When text is baked into an image, like a banner with a headline or a button with a call-to-action, search engines can’t read it. That means valuable keywords and context are invisible for SEO. It also creates accessibility issues, since screen readers can’t interpret the text. The result: your content looks fine to users but may as well not exist to search engines.

A designer provides a nice-looking graphic with text already embedded, and it gets dropped straight into the site. Or developers use an image for a call-to-action instead of actual text. While it may match the design perfectly, none of that content contributes to search visibility.

Fix

Use real HTML text layered on top of background images with CSS. That way, search engines and screen readers can understand the content, and the design still looks polished.

<div class="hero">
  <h1>Your Next Adventure Starts Here</h1>
  <p>Book guided tours across the world with local experts.</p>
  <a href="/tours" class="btn">Explore Tours</a>
</div>

.hero {
  background-image: url('hero-banner.jpg');
  background-size: cover;
  text-align: center;
  color: white;
  padding: 100px 20px;
}

Here, the banner headline and CTA are real text, styled to appear as part of the hero image.

Tip

  • Use background images for design, not for delivering content.
  • If an image must contain text (like a logo), make sure the text meaning is repeated in the surrounding HTML or in alt attributes.

Ignoring Internationalization (hreflang tags)

If your site targets audiences in multiple languages or regions, search engines need a way to know which version of a page to serve. Without proper signals, the same content in different languages or country-specific variations can look like duplicate content. This can lead to ranking confusion, with the wrong version showing up for users in the wrong place.

Developers often launch translated or regional sites without adding hreflang tags. For example, a Canadian English page (/en-ca/) and a US English page (/en-us/) may compete against each other, instead of being clearly distinguished.

Fix

Add hreflang annotations in the <head> of your HTML (or in your sitemap) to point to alternate versions of a page. This tells Google which version to show based on a user’s language or location.

Example:

<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/page" />
<link rel="alternate" hreflang="en-ca" href="https://example.com/en-ca/page" />
<link rel="alternate" hreflang="fr-ca" href="https://example.com/fr-ca/page" />

Tips

  • Always include a self-referencing hreflang tag (each page points to itself as well as its alternates).
  • Use ISO language and region codes (e.g., fr-ca for Canadian French).
  • Double-check your implementation with an hreflang validation tool (Google Search Console’s old International Targeting report has been retired).
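You can also add an x-default entry as a fallback for visitors who don’t match any listed language or region:

```html
<link rel="alternate" hreflang="x-default" href="https://example.com/page" />
```

This is commonly pointed at a language-selector page or your primary version, so unmatched users still land somewhere sensible.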

Not Handling 404s Gracefully

Every website will eventually have broken links: maybe a page gets deleted, a URL changes, or someone mistypes a link. When this happens, users land on a 404 (page not found) error. A poorly handled 404 frustrates visitors and can cause them to leave immediately. For search engines, lots of unresolved 404s also waste crawl budget and can weaken overall site authority.

A common mistake is leaving the default server 404 page in place: usually a plain "Not Found" message with no way for the user to recover. Worse still is letting old URLs simply die without redirecting them, abandoning link equity and confusing crawlers.

Fix

Create a custom 404 page that matches your site’s branding and includes helpful navigation back to important sections (home, categories, search). Set up 301 redirects for old or outdated URLs that still have traffic or backlinks, pointing them to the most relevant new page.

Example of a simple, helpful 404 page in HTML:

<div class="error-page">
  <h1>Oops! Page not found</h1>
  <p>Sorry, the page you’re looking for doesn’t exist. Try one of these instead:</p>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/blog">Blog</a></li>
    <li><a href="/services">Services</a></li>
  </ul>
</div>

Tip

  • Don’t just send all missing pages to the homepage; it confuses users and search engines. Redirect only when there’s a clear replacement; otherwise, let the custom 404 handle it gracefully.

Skipping Accessibility Checks

Accessibility isn’t just about compliance; it’s about building sites that work for everyone. The bonus is that many accessibility best practices also improve SEO. Semantic HTML helps both screen readers and search engine crawlers. Clear alt text, ARIA labels, and proper heading structures make content easier to interpret. Even something as simple as ensuring buttons are keyboard-friendly can improve usability and search visibility.

Developers sometimes assume accessibility is only relevant for government or large enterprise sites, or they leave it for "later." The result is missing labels, poor colour contrast, or navigation that doesn’t work without a mouse. These issues frustrate users, exclude audiences, and can hurt search performance.

Fix

Focus on implementing basic accessibility fixes that also benefit SEO. Use semantic elements (<header>, <main>, <nav>, <footer>) instead of generic <div>s where possible. Add aria-label attributes for interactive elements when context isn’t clear, and make sure all functionality (menus, forms, buttons) can be used with just a keyboard.

Example:

<button aria-label="Open site menu">
  <svg>...</svg>
</button>

Tip

  • Run automated accessibility tests as part of your QA process. Tools like Google Lighthouse can catch common issues early, saving time and making accessibility part of your workflow rather than an afterthought.

Over-Optimized Anchor Text Looks Spammy

Internal linking is great for SEO, but only if it looks natural. When every internal link uses the exact same keyword phrase, search engines may see it as manipulation instead of helpful navigation. This can weaken the impact of your links and even make your site look spammy.

Repeating the same anchor text everywhere, like:

<a href="/hot-sauce">organic hot sauce</a>

If that phrase appears 20 times across the site, it stops looking like a natural link and starts looking like keyword stuffing.

Fix

Mix up your anchor text to reflect context. Use variations, branded terms, or natural language around the link. For example:

<a href="/hot-sauce">organic hot sauce</a>  
<a href="/hot-sauce">our small-batch hot sauce</a>  
<a href="/hot-sauce">browse artisan hot sauces</a>  
<a href="/hot-sauce">learn more about our hot sauce collection</a>  

Tip

  • Think of internal links as part of the user experience, not just SEO. If it reads well for people, it usually works well for search engines too.

CSS display:none vs. visibility:hidden

Both display:none and visibility:hidden hide elements on a page, but they behave differently for users and crawlers. Hidden content isn’t visible to people, but search engines can still index it, as long as it’s hidden for good UX reasons (like tabbed content, accordions, or dropdown menus).

Some developers (or old-school SEOs) try to hide keyword-stuffed content using CSS, hoping search engines will index it while users don’t see it. That’s considered deceptive hidden text under Google’s spam policies and can hurt rankings.

Fix

Use hidden content only when it improves usability, not to sneak in extra keywords. For example, tabbed interfaces often rely on display:none, and that’s perfectly fine:

.tab-content {
  display: none;
}
.tab-content.active {
  display: block;
}

Or using visibility:hidden:

.hidden-text {
  visibility: hidden; /* element takes up space but isn't visible */
}

Tips

  • display:none removes the element from the layout, while visibility:hidden keeps the space reserved.
  • If you’re using hidden elements, make sure the content is genuinely useful and accessible (e.g., reachable via keyboard navigation). Google is fine indexing it when it’s part of a better UX, but not if it’s just SEO trickery.

Building SEO Into Your Code From the Start

None of these problems are flashy, but they all have a direct impact on how well a site performs in search. SEO isn’t something to bolt on after launch; it’s part of writing good code. By treating SEO as a core part of the build process, you’ll create websites that not only look great and function well but also get found.

Run a quick site audit on your current projects. Look for the issues we walked through and apply the simple fixes. A few minutes of cleanup now can save you (and your clients) major headaches later.

Need a Helping Hand with Your Project?

Whether you need continuous support through our Flexible Retainer Plans or a custom quote, we're dedicated to delivering services that align perfectly with your business goals.
