seoClarity has been performing site audits for many years, and it can be frustrating to come across the same technical SEO issues over and over again.

This post outlines the most common technical SEO problems that seoClarity has encountered while doing hundreds of site audits over the years. Hopefully, our solutions help you when you come across these issues on your site.

When we talk about “technical SEO,” we’re referring to updates to a website and/or server that have a direct or indirect impact on search engine crawling, indexation and, ultimately, ranking. Technical SEO includes components like page elements, HTTP header responses, XML sitemaps, 301 redirects, metadata, etc. Technical SEO does NOT include analytics, keyword research, backlink profile development or social media strategies.

These eight common technical SEO issues are often overlooked, yet they are straightforward to fix and crucial to boosting your search visibility and SEO success.

Download Our Guide to the Basics of Technical SEO


1. No HTTPS Security

Site security with HTTPS is more important than ever. In October 2017, Google began rolling out a “not secure” warning in Chrome every time a web user lands on an HTTP site.

To check if your site is served over HTTPS, simply type your domain name into Google Chrome. If you see the “secure” message (pictured below), your site is secure.

Site secure

However, if your site is not secure, typing your domain name into Google Chrome will display a gray background or, even worse, a red background with a “not secure” warning. This could cause users to immediately navigate away from your site.

not secure

How To Fix It:

  • To convert your site to HTTPS, you need an SSL certificate from a Certificate Authority.
  • Once you purchase and install your certificate, your site can be served over HTTPS. Be sure to also 301 redirect every HTTP URL to its HTTPS equivalent so visitors and search engines land on the secure version (a quick way to verify this is sketched below).
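If you want to verify the switch from the command line rather than in Chrome, here is a minimal sketch in Python, assuming the requests library is installed; “example.com” is a placeholder for your own domain. It confirms that the certificate validates (requests rejects invalid certificates by default) and that plain HTTP requests redirect to HTTPS:

    import requests

    DOMAIN = "example.com"  # placeholder: replace with your own domain

    # verify=True is the default, so this call fails if the SSL certificate is
    # invalid or expired; allow_redirects follows any HTTP-to-HTTPS redirect.
    response = requests.get(f"http://{DOMAIN}/", allow_redirects=True, timeout=10)

    print("Final URL after redirects:", response.url)

    if response.url.startswith("https://"):
        print("OK: HTTP traffic is redirected to HTTPS.")
    else:
        print("Warning: the site is still served over plain HTTP.")

If the script prints a warning, ask your developer to add a server-level 301 redirect from HTTP to HTTPS.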

2. Site Isn’t Indexed Correctly

When you search for your brand name in Google, does your website show up in the search results? If the answer is no, there might be an issue with your indexation. As far as Google is concerned, if your pages aren’t indexed, they don’t exist—and they certainly won’t be found on the search engines.

Not sure how to check? Here’s how…

  • Type the following into Google’s search bar: “site:yoursitename.com” and instantly view the count of indexed pages for your site.

site indexation

How To Fix It:

  • If your site isn’t indexed at all, you can begin by submitting your site to Google through Google Search Console.
  • If your site is indexed but there are many MORE results than expected, dig deeper for hacked-site spam or for old versions of the site that remain indexed because redirects to your updated site were never put in place.
  • If your site is indexed but you see far FEWER results than expected, audit the indexed content and compare it against the pages you want to rank. If you’re not sure why the content isn’t ranking, check Google’s Webmaster Guidelines to ensure your site content is compliant.
  • If the results are different from what you expected in any way, verify that your important website pages are not blocked by your robots.txt file (see #4 on this list) and that you haven’t mistakenly implemented a NOINDEX meta tag (see #5 on this list). The sketch below covers two more blockers you can check in seconds.
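Beyond robots.txt and the meta robots tag, two indexation blockers are easy to miss because they never show up in the page source: a non-200 HTTP status and a noindex directive sent in the X-Robots-Tag response header. This is a minimal sketch, assuming Python with the requests library and a placeholder URL, that checks both for a page you expect to rank:

    import requests

    URL = "https://www.example.com/important-page/"  # placeholder: a page you expect to rank

    response = requests.get(URL, timeout=10)

    # Google only keeps pages in its index that resolve with a 200 status code.
    print("HTTP status:", response.status_code)

    # A noindex directive can also be sent as an HTTP response header,
    # where it is invisible in the page source.
    print("X-Robots-Tag:", response.headers.get("X-Robots-Tag", "not set"))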

3. No XML Sitemaps

XML sitemaps help Google search bots understand more about your site pages, so they can effectively and intelligently crawl your site.

Not sure how to check? Here’s how…

Type your domain name into your browser’s address bar and add “/sitemap.xml” to the end, as pictured below.

sitemap

If your website has a sitemap, you will see a plain XML page listing your site’s URLs.

How To Fix It:

  • If your website doesn’t have a sitemap (and you end up on a 404 page), you can create one yourself or hire a web developer to create one for you. The easiest option is to use an XML sitemap generating tool; if you have a WordPress site, the Yoast SEO plugin can generate XML sitemaps for you automatically. Once the sitemap is live, submit it in Google Search Console so Google knows where to find it. (A minimal hand-rolled example is sketched below.)
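For a small site, the sitemap format itself is simple enough to generate with a short script. This is a minimal sketch in Python; the URL list is a placeholder, and a real sitemap should list every page you want crawled:

    # Minimal XML sitemap generator (sketch only).
    URLS = [
        "https://www.example.com/",           # placeholder URLs: replace with
        "https://www.example.com/products/",  # the pages you want crawled
        "https://www.example.com/blog/",
    ]

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in URLS:
        lines.append(f"  <url><loc>{url}</loc></url>")
    lines.append("</urlset>")

    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write("\n".join(lines))

    print(f"Wrote sitemap.xml with {len(URLS)} URLs")

Upload the resulting file to your site’s root (yoursite.com/sitemap.xml) and submit it in Google Search Console.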

4. Missing or Incorrect Robots.txt

A missing robots.txt file is a big red flag, but did you also know that an improperly configured robots.txt file can destroy your organic site traffic?

Not sure how to check? Here’s how…

To determine whether your robots.txt file is configured incorrectly, type your website URL into your browser with a “/robots.txt” suffix. If you get a result that reads "User-agent: * Disallow: /", you have an issue: that combination tells every search bot not to crawl any page on your site.

robots.txt

Photo credit: https://en.wikipedia.org/wiki/Robots_exclusion_standard

How To Fix It:

  • If you see “Disallow: /”, immediately talk to your developer. There could be a good reason it’s set up that way, or it may be an oversight.
  • If you have a complex robots.txt file, as many e-commerce sites do, review it line by line with your developer to make sure it’s correct. The sketch below shows a quick way to test whether specific URLs are blocked.
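You can also test how a robots.txt file actually behaves instead of reading it by eye. Python’s standard library includes a robots.txt parser that answers “can this bot fetch this URL?” directly; in this minimal sketch, the domain and test URL are placeholders:

    from urllib.robotparser import RobotFileParser

    DOMAIN = "https://www.example.com"           # placeholder domain
    TEST_URL = f"{DOMAIN}/products/blue-widget"  # placeholder: a page that should be crawlable

    parser = RobotFileParser()
    parser.set_url(f"{DOMAIN}/robots.txt")
    parser.read()  # fetches and parses the live robots.txt file

    if parser.can_fetch("Googlebot", TEST_URL):
        print("Googlebot is allowed to crawl", TEST_URL)
    else:
        print("Warning: robots.txt blocks Googlebot from", TEST_URL)

Run it against a handful of URLs from each important section of your site; any unexpected block is worth raising with your developer.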

5. Meta Robots NOINDEX Set

When the NOINDEX tag is configured appropriately, it tells search engines not to add certain lower-priority pages (for example, the later pages of a paginated blog category) to their index. When configured incorrectly, however, NOINDEX can do immense damage to your search visibility by removing every page that carries the tag from Google’s index.

It’s common to NOINDEX large numbers of pages while a website is in development, but once the site goes live, it’s imperative to remove the tag. Do not blindly trust that it was removed; if it’s left in place, it will destroy your site’s visibility in search.

Not sure how to check? Here’s how…

  • Right-click on your site’s main pages and select View Page Source. Use the Find command to search the source code for “NOINDEX” or “NOFOLLOW” tags such as:
    • <meta name="robots" content="NOINDEX, NOFOLLOW">
  • If you don’t want to spot check, use Clarity Audits, seoClarity's site audit tool, to scan your entire site.

How To Fix It:

  • If you see any “NOINDEX” or “NOFOLLOW” in your source code, check with your web developer as they may have included it for specific reasons.
  • If there’s no known reason, have your developer change it to read <meta name="robots" content="INDEX, FOLLOW"> or remove the tag altogether; search engines treat pages without a robots meta tag as index, follow by default. (A quick way to scan several pages at once is sketched below.)
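If you would rather script the spot check than open the source of every template by hand, a short sketch like the following flags any page whose meta robots tag contains NOINDEX or NOFOLLOW. It assumes Python with the requests library, a placeholder list of your main pages, and the common name-before-content attribute order in the tag:

    import re
    import requests

    # Placeholder list: swap in your site's main template pages.
    PAGES = [
        "https://www.example.com/",
        "https://www.example.com/category/widgets/",
        "https://www.example.com/blog/",
    ]

    # Simple pattern for the usual <meta name="robots" content="..."> form.
    META_ROBOTS = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
        re.IGNORECASE,
    )

    for url in PAGES:
        html = requests.get(url, timeout=10).text
        match = META_ROBOTS.search(html)
        directives = match.group(1).lower() if match else ""
        if "noindex" in directives or "nofollow" in directives:
            print(f"CHECK  {url}  meta robots: {directives}")
        else:
            print(f"OK     {url}")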

6. Slow Page Speed

If your site doesn’t load quickly (typically within 3 seconds), your users will go elsewhere. Site speed matters both to your users’ experience and to Google.

Not sure how to check? Here’s how…

  • Use Google PageSpeed Insights to detect specific speed problems with your site. (Be sure to check desktop as well as mobile performance.)
  • If you don’t want to spot check, use seoClarity Page Speed insights to scan your site.

How To Fix It:

  • The solutions to site speed issues vary from simple to complex. Common fixes include optimizing and compressing images, improving browser caching, reducing server response time and minifying JavaScript. (A rough way to measure server response time is sketched below.)
  • Speak with your web developer to ensure the best possible solution for your site's particular page speed issues.
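PageSpeed Insights gives the full diagnostic picture, but one of the factors above, server response time, is easy to track on your own. This is a minimal sketch assuming Python with the requests library and placeholder URLs; it measures only time to first response, not full page rendering:

    import requests

    # Placeholder URLs: swap in the templates you care about (home, category, product).
    PAGES = [
        "https://www.example.com/",
        "https://www.example.com/category/widgets/",
    ]

    for url in PAGES:
        response = requests.get(url, timeout=30)
        # response.elapsed is the time from sending the request until the
        # response headers arrive - a rough proxy for server response time.
        seconds = response.elapsed.total_seconds()
        print(f"{url}: {seconds:.2f}s, {len(response.content):,} bytes downloaded")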

7. Multiple Versions of the Homepage

Remember when you discovered “yourwebsite.com” and “www.yourwebsite.com” go to the same place? While this is convenient, it also means Google may be indexing multiple URL versions, diluting your site's visibility in search.

How To Fix It:

  • First, check whether the different versions of your URL all redirect to one standard URL. This includes the HTTPS and HTTP versions, as well as variants like “www.yourwebsite.com/home.html.” Check each possible combination (the sketch after this list automates that check). Another way is to run a “site:yoursitename.com” search to determine which pages are indexed and whether they stem from multiple URL versions.
  • If you discover multiple indexed versions, you’ll need to set up 301 redirects or have your developer set them up for you. You should also set your canonical domain in Google Search Console.
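Checking every combination by hand is tedious, so here is a minimal sketch that does it for you, assuming Python with the requests library; “example.com” and “/home.html” are placeholders for your own domain and any legacy homepage path. Every variant that returns a page should end up at the same final URL:

    import requests

    DOMAIN = "example.com"  # placeholder: replace with your own domain

    # Common homepage variants that should all 301 to one canonical URL.
    VARIANTS = [
        f"http://{DOMAIN}/",
        f"http://www.{DOMAIN}/",
        f"https://{DOMAIN}/",
        f"https://www.{DOMAIN}/",
        f"https://www.{DOMAIN}/home.html",  # placeholder legacy homepage path
    ]

    final_urls = set()
    for url in VARIANTS:
        response = requests.get(url, allow_redirects=True, timeout=10)
        print(f"{url}  ->  {response.url}  (status {response.status_code})")
        # A 404 on an old variant is fine; the problem is several variants
        # all returning 200 at different final URLs.
        if response.status_code == 200:
            final_urls.add(response.url)

    if len(final_urls) <= 1:
        print("OK: a single homepage version is live.")
    else:
        print("Warning: multiple homepage versions are live:", final_urls)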

8. Incorrect Rel=Canonical

Rel=canonical is particularly important for all sites with duplicate or very similar content (especially e-commerce sites). Dynamically rendered pages (like a category page of blog posts or products) can look like duplicate content to Google search bots. The rel=canonical tag tells search engines which “original” page is of primary importance (hence: canonical)—similar to URL canonicalization.

How To Fix It:

  • This one also requires you to spot check your source code; a quick way to do that is sketched below. Fixes vary depending on your content structure and web platform. (Here’s Google’s Guide to Rel=Canonical.) If you need assistance, reach out to your web developer.
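One quick way to spot check is to pull the canonical tag out of a page and confirm it points where you expect. This is a minimal sketch assuming Python with the requests library, a placeholder URL and the common rel-before-href attribute order in the tag:

    import re
    import requests

    URL = "https://www.example.com/category/widgets/?page=2"  # placeholder page to check

    html = requests.get(URL, timeout=10).text

    # Simple pattern for the usual <link rel="canonical" href="..."> form.
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        html,
        re.IGNORECASE,
    )

    if match:
        print("Canonical URL:", match.group(1))
    else:
        print("No rel=canonical tag found on", URL)

If the canonical on a paginated or filtered page points somewhere unexpected, or is missing entirely, that is the conversation to have with your developer.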

Working through the eight common technical SEO fixes in this post is one of the fastest ways to improve your SERP visibility, and it can have a hugely positive impact on the user experience of your site.