An SEO audit sets the stage for your SEO efforts. It reveals quick wins that drive results and a path to success the organization can rally behind.

Yet, so many SEOs still struggle to audit a site to deliver immediate results or instill long-term confidence in the SEO program.

To help overcome the issue, I created a simple, easy-to-follow SEO audit checklist that delivers real results. In it, I listed every audit check you should perform to drive search performance forward. Simply click the button below and follow this guide to get started* on your website SEO audit. 

DOWNLOAD CHECKLIST

 *A couple of notes before we get started: 

  1. The advice you’ll read below comes from my 10 years of experience in assisting enterprise-level SEOs in driving results from the search engines. No fluff or theory-based information included.
  2. If this checklist seems exhaustive, that's because it is! We want you to have the full scope of an SEO site audit, not just bits and pieces. If you want to audit only certain elements of your site, I suggest the following:
  • User Experience Audit
  • On-Page Audit
  • Technical Site Audit

 

What Is an SEO Audit and Why Does It Matter?

An SEO site audit uncovers the tactical and technical issues to prioritize in order to improve search positioning.

Why conduct one? I can list at least five reasons.

  • Site audits help identify missed SEO opportunities and fix poorly executed SEO implementations.
  • They surface core issues with the site and let you inspect the website's technical framework and infrastructure.
  • A website audit lets you re-evaluate the effectiveness of your website in terms of lead generation and conversions.
  • Regular SEO audits and website crawls track your progress in eliminating issues.
  • Site audits help refocus SEO efforts on users first and search engines second.

Assessing both the content and technical aspects of your website will open up opportunities to drastically improve the traffic and conversions your website generates and prioritize your SEO strategy.

 

What Do You Need to Audit a Site and Act on the Findings?

For enterprises and larger sites, it’s helpful to have an enterprise SEO platform or a set of SEO analysis tools at your disposal. A platform can serve as an all-in-one SEO audit tool if it can crawl your site, give you the flexibility to design custom crawls, return insights quickly, and point out opportunities to improve content and the search experience. It also supports the content and external linking tactics I outline in the final steps.

You’ll also want to set up Google Search Console and analytics (e.g., Google Analytics) to monitor the impact that fixing site issues has on performance. These are invaluable free SEO tools to help with your site audit. I also include other free tools inline within each audit section to help bridge the gap if you are only using free SEO tools or other point solutions.

Finally, you need a team ready to take in the findings and prioritize the recommendations. Once you execute the audit and identify key takeaways, you’ll likely need a developer, content writer, PR team, and other associated team members. Together you should understand the site’s tech stack, the capabilities of the CMS, and the available scope of the development teams to implement your findings. In the end, these team members will often know the last-mile ways to fix the issues found in the audit in the most sensible and impactful way.

You’ll need to show that the takeaways represent real opportunity that will pay off in order to get these groups interested in the audit. This will motivate your team to take on SEO projects and build a culture of SEO excellence in your organization. seoClarity users can leverage Traffic Potential and other projection tools to estimate the impact of making the improvements found in the audit.  

 

How to Execute This SEO Audit

Work through each step in order. Give each step a grade of Pass, Fail, or Needs Improvement.

Fail issues are a big deal - they are the huge wins that will pay off if fixed. Pass means the site meets the standards outlined by Google and the area isn’t holding back SEO performance. Needs Improvement is for areas where the issue isn’t necessarily a detriment, but you’re being outpaced by the competition.

Through each step you’re gauging how well the site performs in that audit area. A naturally prioritized list of issues will then emerge to help improve your organic traffic. SEO roadmaps, projects, team member roles, lunch-and-learn summits, goals, execution plans, new tests, and ultimately successes all come from auditing these areas. In the end, you’ll prioritize the Fail issues first, followed by the most promising Needs Improvement issues, based on your time and resources.

And, if you want to follow along for quick SEO audits, refer to the chart at the beginning of this guide to know which areas to direct your focus.

Let's get started! Open the checklist from the button below and follow along for this thorough site audit checklist that's sure to drive better search visibility and ROI for your business. 

DOWNLOAD CHECKLIST

 

The Ultimate SEO Site Audit Checklist

#1. Robots.txt file

A search engine bot views the robots.txt file before crawling a site. It gives directives on how to crawl (or not crawl) the website, which makes it a good first step in the audit. For one, it contains instructions about folders or pages to omit, as well as other critical instructions. As a good practice, it should also link to the XML sitemap so the bot can find a list of the most important URLs.

How to Audit

You can view the file manually by going to mydomain.com/robots.txt (replace “mydomain” with your site’s URL, of course). Look for commands that might be limiting or even preventing site crawling.

If you have access to an SEO tool that can crawl the site, let it loose and be sure to set the user agent to follow the instructions given to Googlebot. This way, if you’re blocking Googlebot via the robots.txt file, the crawl data will reflect that: blocked URLs are reported as disallowed by robots.txt rather than returning a “200 OK” status and the page information.

Google Search Console historically reports URLs where Googlebot is being blocked. seoClarity users can find this in the advanced settings in a Clarity Audit crawl.

Robots.txt Recommendation

In order to be found, a robots.txt file must be placed in a website’s top-level directory. The file is case sensitive and must be named “robots.txt” (not Robots.txt, robots.TXT, or otherwise).

Some user agents (robots) may choose to ignore your robots.txt file. This is especially common with more nefarious crawlers like malware robots or email address scrapers.

The /robots.txt file is publicly available: just add /robots.txt to the end of any root domain to see that website’s directives (if that site has a robots.txt file!). This means that anyone can see what pages you do or don’t want to be crawled, so don’t use them to hide private user information.

Each subdomain on a root domain uses separate robots.txt files. This means that both blog.example.com and example.com should have their own robots.txt files (at blog.example.com/robots.txt and example.com/robots.txt).

It’s generally a best practice to indicate the location of any sitemaps associated with this domain at the bottom of the robots.txt file. Here’s an example: https://www.seoclarity.net/robots.txt.

Common Mistakes with Robots.txt

Auditing the robots.txt file sometimes offers the lowest of all low-hanging fruit in SEO: the directive often found on development sites that blocks Google from crawling the entire domain (example below). This code is sometimes left in place after the site goes live (or carried over from a site in development), which will continue to prevent the site from performing in SEO, so it’s quite a find!

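For reference, the culprit usually looks like this minimal sketch - a robots.txt (often on a staging or development host) that blocks every crawler from the entire site:

User-agent: *
Disallow: /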

Other mistakes include:

  • Including crawl-delay directives to slow bots down and save server resources for real users - Googlebot ignores this directive, so there is no need to include it.
  • Missing a general user-agent directive, or not using separate robots.txt files for each subdomain and protocol - each subdomain and protocol on a domain requires its own robots.txt file.
  • Linking to the XML sitemap with a relative path - it must be absolute. For example, “/sitemap.xml” would not be respected, but “https://www.example.com/sitemap.xml” would be.
  • Assuming that blocking a URL in robots.txt removes it from the index or prevents it from being indexed. It doesn’t: you can spot these URLs in the Google index because, instead of a description, Google displays a message saying it has no information about the URL because it is blocked. Adding a meta robots noindex tag and removing the block will allow Google to de-index the page in this case. Many sites block quite a bit for security or tactical reasons; I give this a pass here if it’s done consistently, though it’s not my preference.
  • Not starting a disallow rule with a slash when you mean to block a sub-folder. For example, the rule below is correct for that use case:

Disallow: /something

This rule is correct because it disallows every URL whose path starts with /something, e.g. www.example.com/something.

The mistake would be an overly broad wildcard pattern:

Disallow: *something

This rule would disallow every URL that contains “something,” including URLs you don’t mean to block, e.g. www.example.com/stuff/a-big-something.
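Putting the recommendations together, a minimal, well-formed robots.txt might look like the sketch below (example.com and /private-folder/ are placeholders): disallow rules start with a slash, and the sitemap reference uses an absolute URL.

User-agent: *
Disallow: /private-folder/

Sitemap: https://www.example.com/sitemap.xml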

#2. XML Sitemap

A sitemap contains a list of all pages on the site. Search engines use it to find URLs directly without having to discover them through the site’s structure.

How to Audit

Your sitemap should reside in the root folder on the server. The most common place to find it directly is at mydomain.com/sitemap.xml or linked to/from the robots.txt file. Otherwise the CMS may show the URL if there is one.

Crawl the sitemap URLs to make sure they are free of errors, redirects, and non-canonical URLs (i.e., URLs that have a canonical tag pointing to another URL). Submit your XML sitemaps in Google Search Console and investigate any URLs that are not indexed. They’ll likely be an error, a redirect, or a non-canonical URL.


XML Sitemap Recommendation

Enable your CMS to automatically maintain the site’s XML sitemaps. This way, every new piece of content published will be added to the file automatically.  If your CMS does not do this you can utilize crawl tools such as seoClarity to create an XML sitemap from crawling the front end of the website.

The XML document must follow Sitemap protocol. Search engines expect a specific format for Sitemaps; if your file doesn't conform to this format, they might not process it correctly.
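For reference, a minimal sitemap that follows the Sitemap protocol looks roughly like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/category/red-hats</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>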

Make sure you submit the sitemap to Google Search Console to ensure the code is valid and correct any errors. Remove all URL redirects and blocked URLs from the file and remove URLs with canonical tags pointing to another URL. You can add detail to your XML sitemap such as images, hreflang tags, and videos too.

Sitemaps should contain a maximum of 50,000 URLs. If you have more, create a sitemap index that links to multiple sitemaps. It’s generally useful to have one sitemap per page type on your site - for example, one for product pages, one for blog pages, and one for category pages. Submit and monitor each in Google Search Console. I also recommend regular crawls of each sitemap to monitor for errors.
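If you split sitemaps by page type as suggested above, a sitemap index tying them together might look like this sketch (file names are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-blog.xml</loc></sitemap>
</sitemapindex>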

Common Mistakes with XML Sitemaps

  • Containing URLs that redirect, return an error (404), canonical to a different URL, or are blocked. Make sure all sitemap URLs return a 200 OK status.
  • Through site updates, the XML sitemap may also become detached from the CMS, so new pages cease to be included and old pages are never removed.

 

#3. HTTPS/SSL Encryption

SSL encryption establishes a secure connection between the browser and the server. Google Chrome marks secure sites (those having an active SSL certificate) with a padlock image in the address bar. It also warns users when they try to access an insecure site.

Most importantly, though, Google also uses the HTTPS encryption as a ranking signal.

How to Audit

Visit the site in Chrome and look at the address bar. Look for the padlock icon to determine whether or not your site uses an SSL connection. You can also test your SSL encryption at ssllabs.com/ssltest/ to ensure it is valid.

HTTPS Recommendation

Do it. For many years this lived in the “nice to have” bucket of SEO, but Google continues to push it as the web standard. With the final push from Google making it a ranking signal, you’re in laggard territory leaving your website on a non-secure server. Ensure that any visit to a non-HTTPS URL automatically rewrites (and 301 redirects) to the HTTPS version (e.g., if you type in http://www.yoursite.com/ it 301 redirects to https://www.yoursite.com). You can test the status code of your pages with crawl tools such as webconfs.
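The exact implementation depends on your server or CDN. As one hedged illustration, a minimal sketch assuming an Apache server with mod_rewrite enabled (adapt for nginx, IIS, or your CDN) might be:

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]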

Common Mistakes with HTTPS

Common mistakes include not moving all assets to HTTPS. For example: image files, CSS files, and JavaScript files being hosted on the old http locations. Everything should be on a secure server to get a pass here.

The other common mistake we have seen is not updating canonical tags to the https URLs. If you have a separate mobile website, then both the alternate and canonical tags should use https URLs. Other variations include http URLs left in the sitemap.xml of an https site, and internal links on the https site that lead to http pages. seoClarity users can run a crawl and leverage the Parent Page report to find all instances of internally linked http URLs so they can be updated to the new https versions.

#4. Mobile Friendliness

More than half of web searches now come from mobile. You should really think of the mobile presentation of your website as your website - one that also happens to work on desktop if someone tries. At this stage of the audit you’re checking that the basic mobile-friendly aspects are in place.

How to Audit

Select your most important templates, for example a category page, product page, and blog post. Test them with the Google Mobile Friendly Tool. Prioritize issues reported for the development team to fix.


Mobile Friendly Recommendation

The mobile experience should be considered throughout the design. The best way to ensure being mobile friendly is to have a responsive web design site on one URL, though Google will still serve m. sites to searchers. The most important mobile friendly aspects are the page loading in less than two seconds, having readable text, and having fast loading images. Read more from Google on how to be Mobile Friendly.

Common Mistakes with Mobile Friendly Aspects

There are many ways to go wrong in mobile SEO. The most common issues arising today come from limiting the mobile experience compared to the desktop too much. Give mobile users a full experience, not just the parts of the desktop site that work OK on mobile.

Other mobile design aspects that come up are:

  • Tap targets being too close together
  • Content presented wider than the viewable screen (viewport not configured).
  • For separate mobile site:
    • Missing annotations (aka “switchboard tags”: a rel=”alternate” tag on the desktop URL and a rel=”canonical” tag on the mobile URL to connect the two as equivalents - see the snippet after this list)
    • A non 1-to-1 ratio between the mobile page and the corresponding desktop page
    • Faulty redirects – make sure that desktop pages don’t inadvertently redirect to a single, unrelated mobile page.
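For reference, a minimal sketch of the viewport tag and the switchboard annotations described above (example.com and m.example.com are placeholder hostnames; your breakpoint may differ):

<!-- Viewport configuration (any mobile-friendly page) -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- Separate mobile site: on the desktop page www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page">

<!-- Separate mobile site: on the mobile page m.example.com/page -->
<link rel="canonical" href="https://www.example.com/page">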

#5. Page Speed

Page speed is an SEO’s best friend. It’s one of the most critical factors affecting a site’s visibility in Google and it's one of the few “factors” everyone can agree impacts the experience of the site.

Studies have shown increased page speed impacts conversions. It’s often first among the “it’s boring but it really works!” takeaways from SEO practitioner discussions. As a result, optimizing page speed often delivers instant results to a company’s organic presence and sets the tone for improving the search experience.

How to Audit

Use Google’s Page Speed Insights tool to evaluate key templates on the site. This data is also within the Google Lighthouse data found at web.dev.

seoClarity’s Page Speed Analysis gives a handy point of view by combining all of these issues across the site to prioritize the impact. It also lets you track your page speed score month over month, making it easier to monitor and evaluate your progress.

Page Speed Recommendation

Try to have the page load in less than two seconds. This seems to be the threshold in most studies, and about how long the average web user today can be expected to wait. Get a handle on your top detriments to improve page speed.

Page Speed Common Mistakes

Common issues that lead to poor page speed include: bloated images and code, too many render-blocking resources, missing cache policies that force the browser to re-download assets on every visit instead of caching them, and images and CSS that could be delayed until needed but instead load as render-blocking assets. For example, you don’t need to load that huge image in a menu flyout that only appears when the user hovers over it.
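Two small illustrations of the fixes (file names are hypothetical; browser support and your build setup may vary): native lazy-loading for images that aren’t needed immediately, and deferring non-critical scripts so they don’t block rendering.

<img src="/images/menu-flyout-banner.jpg" alt="Seasonal sale banner" loading="lazy">
<script src="/js/analytics.js" defer></script>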


#6. SEO Tags in the Head Section

A few important <head> section tags help Google index the site properly. These exalted tags include: meta title, meta description, canonical, and hreflang for international sites.

Without these tags, Google is forced to make assumptions about where to pull content from to create the listing (title and description), which page among duplicates should be shown to users (canonical tag), and whom to show it to (hreflang).
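As a reference point, a page’s <head> with these tags in place might look like the sketch below (URLs, text, and hreflang values are placeholders):

<head>
  <title>Red Chicago Bulls Hats | Example Store</title>
  <meta name="description" content="Shop officially licensed red Chicago Bulls hats with free shipping and easy returns.">
  <link rel="canonical" href="https://www.example.com/hats/chicago-bulls/red">
  <link rel="alternate" hreflang="en-us" href="https://www.example.com/hats/chicago-bulls/red">
  <link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/hats/chicago-bulls/red">
</head>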

How to Audit

Install the seoClarity Chrome Plugin Spark, or manually inspect the code on key landing pages via Inspect Element in Chrome to spot these tags. Assess whether key SEO tags are present in the <head> section. The Spark plugin will display the data if it’s properly coded.

An seoClarity Clarity Audit crawl will also collect these issues. The report will show any potential problems with these tags (i.e., values that are duplicated, too long, or missing).

SEO Tags Recommendation

Utilize and configure each tag properly for every page on the site. Later, in the “On-Page Optimization” section, we dive into how to optimize these elements; at this stage, it’s about whether they are present and valid on the site.

Common Mistakes with SEO Tags

The most common mistakes are around duplication without the proper canonical or noindex in place to show that the duplication is intentional - for example, a multi-page article where every page has the same title tag but is missing the code to show that page 1 (or the “view all” version) should be where Google lands people.

Some sites overlook including the canonical tag on every page, which causes duplication issues once tracking parameters are added to URLs.

These tags are also sometimes found outside the <head> section of the page, which is the same as them not being there at all.

Another common mistake is having the values in an SEO tag change between the page’s raw HTML (seen in the View Source of the page) and the DOM (seen in Chrome’s Inspect Element, i.e., the HTML rendered after JavaScript execution). The values in these SEO tags should remain the same whether or not the bot executed JavaScript. If they differ - for example, if unique title tags only appear after JavaScript runs - Google may display the plain HTML version in search results instead of the optimized post-execution version.

#7. Crawling

For the search engines to index and rank a site, they need to crawl its pages first. Google, for example, releases a bot to crawl a site by executing internal links. Errors, broken pages, overuse of JavaScript, or complex site architecture might derail the bot from accessing critical content or use up the available crawl budget trying to figure out your site.

How to Audit

Use an SEO crawler to imitate the path taken by the search engine’s bot. Look for reports of crawl issues caused by unnecessary URLs, broken links, redirect chains or incorrect canonical configurations. Google Search Console also surfaces crawling errors it has found.

Crawling Recommendation

Remove any hindrances to crawling the site. Limit the use of parameters so bots don’t have to work out whether or how they impact the content. Use links thoughtfully in the content to contextually tie the site together. Design the menus and hierarchy in a logical way so the relationships can be seen from this perspective as well.

Common Mistakes with Crawling

Several hurdles are possible with crawling, including broken links, internal redirects, and “spider traps” where Google is caught crawling link after link without finding anything new (for example, a calendar with links to each new month into infinity). Links into these dead ends should carry the “nofollow” attribute to communicate this so Google can go down a different path.
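As a small illustration, a link into a known dead end (the infinite calendar in this hypothetical) could be marked up like this:

<a href="/calendar/2031-01" rel="nofollow">Next month</a>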

Other common mistakes have to do with a lack of links. For example, a mobile site that doesn’t include the header menu effectively turns deeper pages into orphans. Category pages using “infinite scroll” sometimes find that some pages don’t get crawled or indexed because Google doesn’t see a link path to them - they sit below the initially loaded, visible area, and Google doesn’t scroll. Extra code can push these links into the HTML, or you can add them directly. Find more about crawling errors and how to fix them.

#8. Rendering

Rendering relates to the way Google sees and displays your page as a document - both the content and the code. You’ll stay safe here by sticking to progressive enhancement principles, where the content and core functionality of the site are accessible even from a text-only browser. Then, as users execute CSS and JavaScript files, the content remains fully available to the way Google renders, or views, the page. For many years Google did not execute JavaScript, but now it does - so be sure Google can access all the files needed to create the page just like any other user, i.e., remove any blocking of these files.

How to Audit

Run a site crawl with JavaScript enabled to render pages exactly as they would appear in the browser. In doing so, you’ll evaluate issues Google might encounter when rendering your pages with their JavaScript crawling capabilities.

Another way to check how well your page renders for Google is to view the cached version of a few important page templates. You can see this after searching for the page and clicking the option next to the URL to view the cached version. Google tends to store only the HTML of websites in the cache, as opposed to the JavaScript-executed version. If properly implemented, the page should still render all important content and SEO elements in that HTML state.

Additionally, the Mobile Friendly Testing tool is known to behave like Google’s headless browser, executing the page’s JavaScript and rendering the page. This is a great SEO tool to test whether anything is stopping Google from accessing the content.

Rendering Recommendation

Utilize progressive enhancement so that SEO elements and content render in HTML. Find more details from Google. Other methods such as initial static rendering and pre-rendering can get the job done too as a second option to progressive enhancement. Read more on JavaScript SEO core principles and familiarize yourself with the SEO technical renaissance with content and JavaScript over the past several years to help discuss with developers.

Common Mistakes with Rendering

Putting content behind a user interaction. Google will render the DOM with its headless browser, but it will not perform a JavaScript click that reveals more content. A quick way to know if Google can find your content is to search for the exact text directly in Google and see if it appears on the search results page, highlighted in bold to show it matched your query.

#9. Indexing

The index is where Google stores information about pages it has crawled. It’s also where it selects the content to rank for a particular search query. Google is rapidly expanding, culling, and updating its “index” of the Internet.

How to Audit

A quick search in Google for “site:domain.com” will show indexed pages. You can dig into these and likely find duplicate content and other “over-indexed” pages too. If you skip to the last page of results, there is typically a message that Google has omitted entries very similar to those already shown; click it to reveal the pages Google found but consolidated because it considers them duplicates.

At this point in the audit you know you’re free of “crawl issues.” So to move on from here you want to make sure Google has found your content and has it indexed, ready to show it to searchers. Indexing is the outcome of crawling and rendering your site properly, with the proper tags, and unique and valuable enough that Google thinks it’s worthwhile. You’re indexed by Google if you can see your URL or site by searching it in Google.

To get a bigger picture, Google Search Console provides indexation levels. It recently added an Excluded section as well. Some pages on the Internet aren’t worth indexing, and here Google will show you where it has decided that’s the case so you can work on making those pages better.

In seoClarity’s Site Health report, you’ll find the indexability data showing:

  • The total number of indexable pages on the domain,
  • The number of errors as well as the reasons for them, and
  • A complete list of indexable pages to help you access potentially problematic assets, evaluate and correct.

 

Finally, you can review bot activity in the server logs to see how Googlebot is crawling your site, which may explain how your site is being indexed. In seoClarity, you can filter the results by response codes to identify page errors instantly.

You may find that some pages are not getting crawled simply because nothing links to them. Improve internal links continuously to ensure the bot can reach pages deep in the site’s architecture. For non-indexed pages, check whether the content is too thin to warrant indexation.

Indexation Recommendation

The number of pages indexed after searching “site:domain.com” should ring true when you consider your site. If you’re a 100 page site and have a million pages indexed there’s something off, same thing if you're a million page site with 100 pages indexed.

Analytics is helpful here too. I like to think that every page indexed by Google is there to meet a demand. On a great website, the number of pages indexed should be pretty close to the number of organic landing pages seen in analytics over the course of a year.

Common Mistakes with Indexation

Indexation problems typically signal upstream issues with crawling and rendering. If the site is still under-indexed, then its meta tags and/or content may need to be updated to improve the unique value offered by each page so Google deems it worthwhile to index.

The most common mistake, though, is using the robots noindex tag inadvertently. More than once this tag has mistakenly been added to a prized landing page after a development release. Google obliges and removes the URL, while the SEO team submits an urgent ticket to get the noindex tag removed and the page re-indexed. seoClarity users leverage Page Clarity, which checks the URL daily, to push an email alert if this tag is found (hopefully before Googlebot finds it).
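The tag in question is a single line in the <head>, which is why it slips through releases so easily - this is the pattern to alert on:

<meta name="robots" content="noindex">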

Faceted navigation (Step 13) intersects with this step as well. Faceted navigation creates an exponential number of pages for Google to index. If these are not created thoughtfully, many duplicate URLs can be generated, causing your site to be over-indexed and diluting the impact of your most important URLs.

#10. On-Page Optimization

This is where the SEO analysis switches hats slightly, from technical-minded to content-minded. Your site is in the game; now let’s think about how it’s being played. In particular, how well is it optimized for relevant keywords? Key areas to audit are how the meta tags, header tags, and body copy are being used to create a great search experience for the target keyword topics.

Before evaluating on-page SEO, conduct thorough keyword research so that you know what phrases various content assets target.

How to Audit

You can do it in a couple of ways.

Evaluate the key landing pages for on-page issues manually. I admit, though, that assessing whether a keyword is present in H1 or H2 headings or the meta description across hundreds or even thousands of pages would soon prove too cumbersome. That’s why seoClarity offers a better option.

Crawl your content to find pages on the site featuring specific words or phrases. By doing so, you will identify common problems on pages - wrong keywords or phrases featured in headings, for example. Or audit your content at a granular level.

But you can use this capability to do so much more.

Let’s say that you want to find all pages with video on them to audit their on-page optimization. Simply look for instances of words such as “video,” “YouTube” or “short clip” to access every page featuring a video.

On-Page Optimization Recommendations

  • Title Tags - approximately 70 characters, use target keyword  
  • Meta Descriptions - approximately 150 characters. Wow the searcher into clicking through to the site; assure them you have their answer. Find more on writing a meta description. I recommend owning the book “Words that Sell” to give you an idea of how to make your listings stand out in search results.
  • Headers - Specifically, the H1 tag should typically be a shortened version of the title tag, around two to three words. H2 tags should be used if they follow the format of the page. H3s and beyond are of less importance, but if used they should be relevant and appear in order.
  • Body Content - Created to improve the search experience. Write with authority and help solve the searcher’s problems using the target keywords. Find the target keywords for the page and write to them with authority. seoClarity users can leverage Research Grid for this by finding the keywords where the URLs ranking for the target keyword are also ranking. (See the sketch after this list for how these elements fit together.)
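Pulling those recommendations together, here is a rough sketch of the on-page elements for a hypothetical page targeting “trail running shoes” (all values are illustrative, not prescriptive):

<title>Trail Running Shoes for Every Terrain | Example Store</title>
<meta name="description" content="Compare trail running shoes by grip, cushioning, and weight - plus sizing tips to help you pick the right pair.">
<!-- ...page body... -->
<h1>Trail Running Shoes</h1>
<h2>How to Choose Trail Running Shoes</h2>
<h2>Top-Rated Trail Running Shoes This Year</h2>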

At this stage of the audit, the goal is to spot a few quick SEO tweaks on pages where the keywords are ranking in positions 3-8. By doing this you can gain wins and establish an ongoing workflow for improving these elements for the search experience.

Common Mistakes with On Page Optimization

There are a few common issues with these elements.

  • Blank meta description tags may be the most common if SEO is being overlooked.
  • Generic title tags are a semi-normal occurrence for example just the product name or category name (e.g. “Hotels” or “Solutions”).
  • Keyword stuffing in the page title. That’s old-school SEO that many sites used to employ: 2-3 target keywords strung together, as opposed to the brief, clear title using keywords that is best for searchers.
  • Multiple H1 tags. This has become more common with HTML5 and is “approved” by Google, but I find multiple H1 tags confusing to searchers and to the organization of the page, and I still consider it a mistake when I find it. For searchers, the page has one primary topic, and that should be the H1.
  • Not using keywords at all in the headers and using marketing language instead is pretty common. Searchers don’t care about your brand message; they want to know if this page is where they can find answers.
  • With meta title and meta description writing, not using active or unique language is a common mistake. These should read more like PPC ads than the back-end developer labels they often end up as.
  • Broken images and broken links in the body copy. A crawl of the site will reveal these elements.

#11. Relevance

Consider the information gaps of the person coming in from outside the site. Do you offer all the information and context needed to choose the best available product or service for their needs, beyond stating the keyword?

Do you help them learn about the product or service and make a well-informed buying decision? Can they see what is unique about your offering or information? This is relevance.

How to Audit

Review the top ranking sites for your keyword and determine why Google thinks they are so special. You’ll find sites giving extra context. Information that is outside the target keywords but helpful to the search experience.

Providing relevance means considering the contextual aspects of UX optimization - simply focusing on the visitor. Search engines collect data to understand user behavior. If they see someone bouncing back to the search results after hitting your site, they just know the person bounced; they don’t really know why. They see the bounce and, perhaps, factor that into how well you should rank.

This means websites that do a better job of meeting the needs of searchers have a better chance of landing on the first page of the search results. Your job as the SEO is to determine what’s causing this behavior and then figure out ways to provide a better result for the visitor overall.

To get this right you must consider the user's search intent. For example, if you’re selling a grill, stating how many “burgers” it holds might be more important to the user than the surface area in inches. This is a better connection to their intent of buying a grill that will feed their family.

seoClarity users can leverage Content Fusion again here. Take another look at semantically related keywords - and see how they can help improve the relevance of your content.

Relevance Recommendation

Create a great search experience. This will be unique to every site and includes good content and site layout/structure. By reviewing the top sites, and your own, projects such as content updates, new content ideas, and new tools and experiences to bring to your visitors will likely emerge. Here are more ideas to get you started on improving the searcher experience by improving relevance.

Common Mistakes with Relevance

Common mistakes with content include keyword stuffing or not considering the searcher experience. Lack of semantically related keywords in body content is also very common when adding relevance to the site.

It’s easy to speak to your target keyword and its synonyms but it takes an extra effort and really thinking about your searcher to get this step right and compete among the top ranking sites.

#12. Structured Data

Structured data found at schema.org provides webmasters and SEOs with the ability to give semantic context to elements of the code. Google, in turn, uses this information to enrich search listings. It’s how Google can show a company’s telephone number, reviews and star ratings, event info and much more on results pages. This helps attract the user’s attention and boost the organic click-through rate.

However, for Schema.org to work, it must be properly executed in the code. As part of the audit process, I recommend reviewing the markup for potential issues and errors, and making a plan for the ideal schemas to use for the most important templates.

How to Audit

Use the Google Structured Data tool to evaluate your schema markup. Grab a few important pages and enter them in.

Google Search Console now also reports on potential issues with Schema. You’ll find the report under the Enhancements section (it shows if you have markup added to the site) and it shows errors, warnings and the number of valid URLs in total.

Schema.org Recommendation

Prioritize schemas that will generate rich snippets first, such as Breadcrumb and Video markup. This way the markup has the benefit of providing information to Google and impacting how your listing looks on the results page. Check out the Rich Snippet Gallery from Google to see which rich snippets would make sense for your site. Product markup is great for e-commerce sites to achieve star ratings and reviews. Other examples include Event schema, which earns extra space on the search results so searchers can go directly to your event page.

Also, review managed keywords for universal ranking types to note which competitors might be using Schema to enrich their listings already to help you prioritize.

Install using the JSON-LD method, as this is now recommended by Google and is a much easier ask for developers than coding the schema.org markup inline (i.e., the Microdata method).
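As an illustration, minimal Product markup in JSON-LD might look like the sketch below (values are placeholders; check Google’s documentation for the required and recommended properties for your rich result type):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Red Chicago Bulls Hat",
  "image": "https://www.example.com/images/red-bulls-hat.jpg",
  "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.6", "reviewCount": "128" },
  "offers": { "@type": "Offer", "price": "29.99", "priceCurrency": "USD", "availability": "https://schema.org/InStock" }
}
</script>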

Common Mistakes with Schema.org

Not including all of the essential data, such as an “offer” under a product, is typical. Google’s Structured Data Tool will flag these issues as “required” or “recommended.”

Sometimes developers will not include all the information in the markup, or will include it incompletely - such as an author’s first name getting cut off or an incomplete product name. After checking a URL with the Google Structured Data Tool, take a moment to read through the values to make sure everything is complete. Find more on implementing schema.

Don’t abuse structured data markup. Google is much more aware of manipulation of Structured Data these days and will happily apply a manual action if they feel you are spamming them.

The many ways you can trigger a manual action from Google include:

  • Marking up content invisible to users
  • Marking up irrelevant or misleading content
  • Applying markup site-wide that should only be on key pages


#13. Faceted Navigation

Faceted navigation helps e-commerce sites expand their reach by strategically creating sub-categories at scale. For example, if you’re selling Chicago Bulls hats, you may have red, black, and white ones. Most people search “Chicago Bulls hats,” and you have that search experience covered with your category page. With smartly applied faceted navigation, your site can create a page for “White Chicago Bulls Hats,” “Black Chicago Bulls Hats,” and “Red Chicago Bulls Hats.” You likely already allow a user to filter products like this. Faceted navigation is what allows Google to index those filtered pages, so the searcher who knows the color they want can skip that step in their conversion funnel and land directly on your Red Chicago Bulls Hats page.

However, faceted navigation can generate problems, particularly if each filter creates a new URL on the site without matching any demand. Google might pick those up, leading the search engine to crawl unnecessary URLs and index duplicate content. The ability to filter down to the specific thing you want is obviously helpful to users, but you don’t want the useless filter pages indexed by Google.

How to Audit

Search different filter and product category combinations in Google to see if your pages show up in the index. Check to see if you’re creating too many pages by finding the URL pattern your site uses to create them, e.g. searching “site:wayfair.com inurl:color=” in Google.

Faceted Navigation Recommendation

This may be the single thing separating the top sites from the pack. The top sites utilize faceted navigation brilliantly. Only pages that align with search volume are offered to be indexed and all of the key On-Page elements mentioned above update to the long-tail target. Work with your e-commerce platform to find the right way to implement as many of them have a solution for faceted navigation.

You can also review 3rd party tools such as Searchdex, YourAmigo, and Bloomreach to learn more about their offerings to help create this setup on your site and provide a great long-tail search experience for your searchers.

The most important element is that the on-page elements update so the “faceted navigation URL” stands on its own: a unique URL, title tag, description tag, and H1 tag for that page, as if it were another top-level category page on your site.

Common Mistakes with Faceted Navigation

Because every possible combination of facets typically creates (at least) one unique URL, faceted navigation can create a few problems for SEO:

  • It creates a lot of duplicate content, which is bad for various reasons.
  • It eats up valuable crawl budget and can send Google incorrect signals.
  • It dilutes link equity and passes equity to pages that we don’t even want indexed.

We would recommend the following solution: Category, subcategory, and sub-subcategory pages should remain discoverable and indexable. For each category page, only allow versions with 1 facet selected to be indexed. On pages that have one or more facets selected, all facet links become “nofollow” links.

On pages that have two or more facets selected, a “noindex” tag is added as well in case Google does crawl these pages (perhaps from a direct external link).

Determine which facets could have an SEO benefit (for example, “color” and “brand”) and whitelist them. Essentially, throw them back in the index for SEO purposes.
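Translated into tags, that recommended setup might look like this sketch (URLs and parameter names are hypothetical, and your platform’s implementation will differ):

<!-- One whitelisted facet selected (e.g. color): indexable and standing on its own -->
<link rel="canonical" href="https://www.example.com/hats/chicago-bulls?color=red">

<!-- Facet links on already-faceted pages are nofollowed -->
<a href="/hats/chicago-bulls?color=red&brand=acme" rel="nofollow">Acme</a>

<!-- Two or more facets selected: kept out of the index -->
<meta name="robots" content="noindex">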

Ensure your canonical tags and robots tags are set up appropriately.

For example, a common mistake is having the canonical tag on a faceted URL you want indexed point to the unfaceted version (e.g. the primary category page) - a setup that is only valid if you don’t want Google to index the faceted URL.

Another mistake is not creating a crawl path for Google to find these URLs, perhaps because the sort/filter functionality sits entirely behind JavaScript execution. Also watch for multiple versions of the same faceted URL - for example, “White Chicago Bulls Hats” and “White Hats - Chicago Bulls.” If your site can create the same filtered product list at more than one URL, be sure the canonical tag on each points to a single version for your SEO targeting.

Finally, not including these pages in an XML sitemap is a mistake because it misses a chance to tell Google you really mean for these URLs to be indexed and shown to searchers. A crawl of your site will reveal most issues as well, because it’ll surface the URLs and whether the proper tags are in place to match the search demand.

#14. Accessibility

A website should support people with physical, cognitive, or technology impairments. Google promotes these principles within its developer recommendations. For SEO, accessibility issues are a combination of the rendering and relevance issues laid out above. Google describes accessibility as meaning “that the site's content is available, and its functionality can be operated, by literally anyone.”

These elements help those users, but they’re also great for everyone - for example, a distinct color contrast ratio, which all users appreciate. Finally, accessibility, and Google’s championing of its principles, gives SEOs license to comment on usability in the design, such as ensuring that the “user's focus is directed to new content added to the page” and that the content is “understandable.”

How to Audit

The Google Lighthouse Plugin (web.dev) does a great job of outlining accessibility issues. It will flag items in the code such as missing alt text on images or names on clickable buttons. It will look for descriptive text on links (no “click here”). These elements help bring the site to life for those using screen readers. It’s also not hard to see how a site built to these standards would help Google understand the web at scale.

Be on the lookout specifically for generic or missing link names, missing alt attributes, and headers that skip order (e.g. an H2 without an H1). This is a great opportunity to teach the team about these issues and create an accessibility standard for the site that improves the search experience for all users and is highly encouraged by the gatekeeper of the web, Google.
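As two small illustrations of those basics (file names and text are hypothetical):

<img src="/images/red-bulls-hat.jpg" alt="Red Chicago Bulls snapback hat">
<a href="/hat-size-guide">View the hat size guide</a> (instead of “click here”)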

If you use a crawler like seoClarity’s Clarity Audit, you’ll get notifications on those issues in the site crawl report as well.

Accessibility Recommendation

Strive for the basics such as alt tags, relevant names on links and buttons, and headers in logical order. These to me are the accessibility issues that have a logical crossover with SEO because they are likely also used by Google to consume the website.

Beyond that look to prioritize accessibility in design and help foster a culture of a website available to everyone that is designed to be simple and easy. This category is also a great catchall for something that makes a poor search experience.

Some portion of Google’s intake on websites comes from Quality Raters, who are instructed to imagine searching for a term with a specific intent, landing on a given website, and evaluating it for quality. SEOs typically have this eye as well, and if something jumps out that gives a critically poor search experience, then it can be added here within an SEO audit - for example, content hidden behind multiple tabs. Content that is poorly written, poorly translated, or outdated counts. Sites with a poor site structure, confusing redirects, or that lock you into a geo-experience are accessibility issues.

#15. Authority Content

Authority content is content that can only be written by the brand. Beyond using the target keywords (on-page optimization) and making it great for searchers (relevance), authority content showcases why you should be trusted to be giving the information.

This is where you really give the target content a hard look - look for original facts and research. Great content is genuine and written with care. Remember that the reader arrived after doing a search. They have a problem to solve, something to learn, or a fear to squash. Does your content answer the call? Does it calm their fear?

This is also the step where you evaluate the topic clusters for SEO, surrounding your content with blog posts and resources. These assets can target new terms in their own right and give you a chance to showcase authority on a subject and interlink the content.

How to Audit

A shortcut is to evaluate how the site targets “awareness” keywords. To see these, filter Google Search Console queries for words like “how” (seoClarity users can do this with Search Analytics). Evaluate rankings and performance, looking for low-hanging fruit (pages ranking between positions 11-40 that need only a little push to appear on page one).

This is also a good step to find content gaps. These are topics you could be targeting but missed in your site’s plan. A great way to find these is to execute a Wisdom of the Crowds workflow. Find the keywords where three competitors are in prominent position but you are not. These are likely easy wins for you if you simply showed up and created the content.

You can also do keyword research to determine the opportunities at different intent stages related to the target keyword. For example if you’re selling running shoes, an article on “how to choose the best running shoes” and “tips for running a marathon” are topics that can expand your authority for the primary terms (running shoes), and help move searchers down the funnel, without leaving your site.

Authority Content Recommendation

Produce content about your target keyword as if you were a media outlet assigned to cover it. Cover your “beat” with content on new innovations, best practices, and the biggest mistakes. Interact with your community to gain feedback on your content. Share it on social media. Update it with new information.

seoClarity users can leverage tools such as Content Ideas that will pull in the top People Also Ask and other sources to generate content ideas. Find more on creating content for SEO.

#16. Off-Page

Off-page analysis is a look at everything happening off the website that is impacting SEO, i.e. external links. The quality and quantity of relevant websites sharing and linking to your content is a good sign that your content is worthwhile. It’s the original Google innovation and currency of the web.

How to Audit

Grab your top two competitors along with your own site. Run them through a backlink network such as Majestic to see the top linking sites. seoClarity users have access to the Majestic link network through their subscription.

Review the top links pointing to your competitors. Break down where they came from, e.g. sponsorship, resource creation, media mentions, viral content. Be on the lookout for links related to your most important keywords. Do the same for your site to understand where your links are coming from. Also look at your analytics to see your referring sites - remember that the best links are the ones that drive traffic because it is a path on the Internet that real people actually found useful to travel. Google’s innovations around off-page analysis will correspond with links that get traffic because they indicate quality.

After this review of backlinks, you’ll have some specific targets and a good understanding of what drives linking in your industry.

Off-Page Analysis Recommendation

For your own link profile, look for at least 75% of your links pointing to a page other than your home page. Look for diverse anchor text that shows natural linking to many sources, i.e. if all your links are to your home page with your brand as the anchor text you’re probably only getting low-quality directory listing links.

Develop a plan to grow your referral traffic through strategic linking. Participate in industry forums and websites. Fix broken links. Redirect content to maintain the link path for users.

From here my favorite link building tactics are the Skyscraper and Content Outreach methods. Skyscraper is great if you find something specifically valued in your industry that is being linked to. These can be things like guides, annual reports on industry trends, a tool or calculator, or a survey or study for example. Create a better version of the asset (i.e. the tallest “skyscraper” in the “city”) and then outreach to all the websites that were linking to the competition’s content to let them know you just took things up a level and they may want to link to the best.

Content outreach works to take your authority content and then share it with people in your industry because it’s so great. As outlined by Ahrefs in this post on link building, you should take the time to provide value because “people with a large audience need a constant flow of awesome content to cater to their fans, so they will be grateful if you show them something of value.”

After auditing the backlinks you can develop a plan on how to incorporate these tactics to best your competitors.

Common Mistakes with Off-Page

The biggest mistake SEOs make with off-page tactics is doing outreach without researching the specific value they can bring to the person they’re contacting. Skipping the competitive research, which can reveal invaluable nuggets about what drives their off-page success, is another misfire. Any participation in black- or gray-hat link building, which died with the Penguin update in 2013, is a no-no, too. 

Conclusion

You made it! Once you work through these 16 steps, you should have a full list of tactics - ranging from quick wins to mid- and long-term strategies - to improve any website.

Prioritize your site audit tasks in a grid by impact and level of effort. Do the high-impact, low-effort items first and work from there. Show progress where the work is paying off. For the areas you scored “Needs Improvement,” determine the appropriate standard operating procedures and workflows to train your team and set a new standard. Over time these areas will turn into strengths as SEO performance improves.

Not sure if you’ve remembered it all? Grab this free site audit checklist now to guide you through each step of the process.