Advanced technical SEO is one of the most closely guarded skill sets among SEOs. While many articles have been written on the topic, very few dive deep into the subject and explore it fully.

In this article, I'll show you how to take your techniques from routine to advanced and peel back the curtain on the often mysterious field of advanced SEO.

The advanced techniques below are based on years of experience working alongside expert SEO and digital marketing practitioners at today's top brands, and reflect the approaches they use most often.

Note: If you're looking to get started with your first site audit, or you're new to site audits altogether, I suggest you download our free, 16-step SEO audit checklist, complete with easy-to-follow, bucketed tasks for SEOs.


We will also review how to use our enterprise SEO platform to address indexation, accessibility, search intent, and crawl errors, and highlight some of the more advanced techniques you can add to your technical audits.

(Note, for this walk-through, I’m going to use our suite of technical SEO capabilities within seoClarity.) 

#1. Indexation

Setting Up Recurring Crawls

Websites rarely stay static for long: new pages are added frequently, technical functionality changes, and other updates increase the potential for new errors and issues.

Recurring crawls help you get ahead of errors and monitor progress. They also create a baseline against which to report organic traffic to executive teams and to watch for any new technical errors.

I always recommend that clients decide on a fixed audit schedule before diving into the audit data.

This way, they can monitor the progress before and after each SEO audit.

If your organization adds new features or updates the site regularly, set up the audit to coincide with the development schedule. Otherwise, schedule crawls to happen weekly, bi-weekly, or monthly depending on your team's internal development change-rate.

In our guide to crawling enterprise sites, we offer advice on things like crawl depth and parameters so as not to slow down overall site speed. 

First Crawl

The first crawl is important because it sets the benchmark to evaluate the effect of any changes and updates you’ve made to the site.

crawl1.png 

Also, the benchmark data will help you in these other ways:

  • Demonstrating to your executive team the progress you’re making with improving the site
  • Pinpointing ongoing development issues to forward to the respective teams
  • Establishing an initial performance benchmark so you can set up alerts to notify your team of any changes

XML Sitemap Crawls

When you add a new page to your site, you ideally want search engines to find and index it quickly. One way to aid that effort is to use an XML sitemap and submit it to Google and Bing.

When launching the crawler, schedule a separate audit to assess your XML sitemaps.

While the first crawl helps surface errors across the entire site, crawling the XML sitemap verifies that every URL it contains is fresh and valid, which in turn helps search engines identify and index your most important pages.

If any sitemap URLs return a server error (a non-2xx status), this process lets you identify them quickly.

It’s important that your XML sitemap is up to date and contains all relevant URLs. Identifying errors in your XML sitemaps can also surface roadblocks in design and internal linking that need to be resolved.
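As a rough illustration of what a sitemap crawl checks, the sketch below (Python, standard library only) fetches an XML sitemap, extracts its URLs, and reports any that do not return a 2xx status. The sitemap URL is a placeholder, and a production crawler would add rate limiting, retries, and sitemap-index handling.

    import urllib.request
    import urllib.error
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder sitemap
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def sitemap_urls(sitemap_url):
        """Fetch the sitemap and return the URLs listed in its <loc> elements."""
        with urllib.request.urlopen(sitemap_url) as resp:
            tree = ET.fromstring(resp.read())
        return [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]

    def status_of(url):
        """Return the HTTP status code for a URL (0 for connection failures)."""
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.status
        except urllib.error.HTTPError as err:
            return err.code
        except urllib.error.URLError:
            return 0

    for url in sitemap_urls(SITEMAP_URL):
        code = status_of(url)
        if not 200 <= code < 300:
            print(f"{code}\t{url}")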

crawl2-1.png

In particular, focus on identifying these issues in the sitemap crawl:

  • URLs returning a status code other than 200
  • Canonical mismatch URLs (pages whose canonical tag points to a different URL)
  • A mismatch between the number of pages crawled and the number of pages on the site
  • Indexation issues, particularly the number of pages in the sitemap vs. the number of pages in Google’s index. Previously you could get a fairly accurate count using a “site:domain” search (see the SERP image below), but over the years Google has hidden the actual number of indexed pages. One solution is to review the number of indexed URLs reported in Google's Index Status report. Advanced users can also reference the article by Paul Shapiro on using the R scripting language to get a more accurate read on how much of your site is currently indexed. You can also combine this reporting with Bot Clarity data.

crawl3-1.png

Diagnosis Crawls

At seoClarity we also have the ability to crawl your site for specific, more advanced technical issues. Diagnosis crawls help you identify issues such as content imported from external sources or third-party resources that fail to load or are inaccessible.

There are many use cases, and each will be specific to your site.

  • seoClarity's built-in crawler allows users to crawl by XPath, CSS selector, or a specific DIV_ID or DIV_CLASS, which is useful for practitioners who need to diagnose specific errors. For example, one of our clients launched new content across their entire site and loaded it via a content delivery network (CDN). As the content was loaded from the CDN into the site pages, it became clear there was an issue, and rankings were subsequently affected. The additional block of content was identifiable by a specific <div> tag, which allowed us to create a crawl, identify which pages contained the <div>, and cross-reference the pages that did not (see the sketch after this list for a simplified version of this check).

  • The key to getting value from the crawler is to ensure that the development team identifies the relevant attributes as new items are added to the site. As you perform an audit, zero in on the items that have recently launched and confirm they follow best practices.
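To make the idea concrete, here is a minimal sketch of the kind of check such a diagnosis crawl performs: it fetches a list of URLs and flags whether each page's HTML contains a given div class. The URLs and the "promo-content" class name are hypothetical, and this stands in for the platform's built-in XPath/CSS crawl rather than reproducing it.

    from html.parser import HTMLParser
    import urllib.request

    TARGET_CLASS = "promo-content"  # hypothetical class marking the CDN-loaded block
    PAGES = [
        "https://www.example.com/page-a",  # sample URLs to audit
        "https://www.example.com/page-b",
    ]

    class DivClassFinder(HTMLParser):
        """Detects whether any <div> on the page carries the target class."""
        def __init__(self):
            super().__init__()
            self.found = False

        def handle_starttag(self, tag, attrs):
            if tag == "div":
                classes = (dict(attrs).get("class") or "").split()
                if TARGET_CLASS in classes:
                    self.found = True

    for url in PAGES:
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        finder = DivClassFinder()
        finder.feed(html)
        print(("HAS " if finder.found else "MISSING ") + TARGET_CLASS + "\t" + url)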


CrawlProjectSetUp-1

For example, if your site or a page is not being indexed, the most common culprits are the meta robots tag on the page or the improper use of Disallow in the robots.txt file.

Our crawler has a setting to obey the robots.txt file, which reveals whether pages are being blocked by mistake.

If a page or directory on the site is disallowed, it will appear after a Disallow: line in the robots.txt file. For example, if I have disallowed my landing page folder (/lp/) in my robots.txt file, this prevents search engines from crawling (and, in practice, indexing) any pages residing in that directory.
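A quick way to verify this outside of a full crawl is Python's built-in robots.txt parser. The sketch below, with a placeholder domain and sample paths, reports whether Googlebot is allowed to fetch each URL according to the live robots.txt file.

    from urllib.robotparser import RobotFileParser

    # Placeholder domain and paths; swap in your own site and the URLs you care about.
    robots = RobotFileParser("https://www.example.com/robots.txt")
    robots.read()

    for path in ["/lp/spring-sale", "/products/widget", "/blog/technical-seo"]:
        url = "https://www.example.com" + path
        allowed = robots.can_fetch("Googlebot", url)
        print(("allowed" if allowed else "BLOCKED") + "\t" + url)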

We also collect the robots meta tag to identify issues such as a page that carries a noindex tag but is meant to be indexed.

For example, you can tell search engines that the links on an entire page should not be followed, which helps concentrate your crawl budget on the most important pages you want indexed.

We have had clients accidentally de-index entire sections of their sites by adding a noindex tag to content. Crawling for these tags, along with setting up alerts, means our clients are notified in time to keep content they want to appear in the index.

The two directives SEOs check most often are noindex/index and nofollow/follow. Index, follow, the implied default, tells search engine robots to index the information on the page and to follow its links.

Noindex, nofollow tells search engine robots NOT to index the information on the page and NOT to follow its links.
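Here is a minimal sketch of how you might spot these directives yourself: it parses a page's <meta name="robots"> tag with Python's standard library and flags noindex or nofollow. The URL is a placeholder, and note that directives sent via the X-Robots-Tag HTTP header would need a separate check.

    from html.parser import HTMLParser
    import urllib.request

    class RobotsMetaParser(HTMLParser):
        """Collects the content of any meta robots tag on the page."""
        def __init__(self):
            super().__init__()
            self.directives = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
                self.directives.extend(
                    d.strip().lower() for d in (attrs.get("content") or "").split(",")
                )

    url = "https://www.example.com/category/widgets"  # placeholder URL
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")

    parser = RobotsMetaParser()
    parser.feed(html)
    if "noindex" in parser.directives or "nofollow" in parser.directives:
        print(f"WARNING: {url} carries {parser.directives}")
    else:
        print(f"OK: {url} has no noindex/nofollow meta directives")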

Now that you have your baseline data, have used Clarity Audits to diagnose any issues, and have clean crawls, you will want to assess your site's visibility.
 

#2. Assessing Visibility

One of the most important SEO strategies is making sure your site and the content within it are accessible to your users and deliver the best experience possible (which, in turn, helps ensure search engines can access and index your content).

Even though it might be the most important strategy, it's often the most difficult to execute.

When its crawlers find a webpage, Google renders the content of the page just as a browser does. Our software can imitate Googlebot and analyze pages to identify issues that would prevent it from accessing or indexing them.

If our crawlers cannot access your site, our team will identify the errors causing the block so you can make those pages accessible.
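As a simplified illustration of imitating a search engine bot (not how our crawler is implemented), the sketch below requests the same placeholder URL with a default client and with a Googlebot-style User-Agent and compares the responses. Note that this only approximates Googlebot; servers that verify the bot via reverse DNS will not be fooled by a spoofed User-Agent.

    import urllib.request
    import urllib.error

    URL = "https://www.example.com/"  # placeholder URL
    GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)")

    def fetch(url, user_agent):
        """Return (status code, body length) for a URL fetched with the given User-Agent."""
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.status, len(resp.read())
        except urllib.error.HTTPError as err:
            return err.code, 0

    as_browser = fetch(URL, "Mozilla/5.0")
    as_bot = fetch(URL, GOOGLEBOT_UA)
    print(f"default UA:   status {as_browser[0]}, {as_browser[1]} bytes")
    print(f"googlebot UA: status {as_bot[0]}, {as_bot[1]} bytes")
    if as_browser != as_bot:
        print("Responses differ - the site may be treating bots differently from users.")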

Search visibility changes over time. Using advanced ranking analysis in seoClarity, you can dive deeper into the shifts in your ranking factors over a specific period of time. Integrate your analytics to have the complete picture in one centralized location. 

Diagnose Pages

Advanced practitioners utilize seoClarity to diagnose any issues their site may be having with accessibility and indexation.

The data set shows the number of ranking pages and any decrease in that number over time. You can also filter by a specific URL type to identify which page types are being affected.

This saves time and allows practitioners to identify an issue quickly. 

RankingPages_ResearchGrid

Error Pages

Next, search your site for pages that trigger server error codes or redirects, and evaluate why. If possible, correct those redirects or fix the server issues making pages inaccessible.

SiteHealth_Status

We also recommend reviewing our previous blog posts on performing a site audit to identify additional areas you can quickly monitor.

#3. Evaluating Search Intent

Google has recently published additional data on search intent, and as searches become even more refined, prioritizing intent is more important than ever.

Advanced technical practitioners will want to divide their pages by intent and create pages that specifically fulfill that user intent.

Manually evaluating the SERPs to get a hint of the underlying intent of a query is not scalable.

seoClarity's Content Fusion allows you to check your pages and identify whether the searches they target are "Informational," "Transactional," "Navigational," or "Local."
 Search Intent Content Fusion


Advanced practitioners create page tags and keyword tags for all of their keywords and pages based on intent. This isolates pages that need additional content from pages that should surface products.

Once intent is defined, zero in on optimizing the content around that particular search intent, and quickly flag cases where the wrong URL is displaying for a particular search term.
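Content Fusion handles this classification for you, but a crude keyword-pattern heuristic (sketched below with made-up patterns and sample queries) illustrates the idea of bucketing keywords by intent before tagging pages. A real classifier would rely on SERP features and ranking data rather than string matching.

    # Crude intent heuristic for illustration only; patterns and queries are made up.
    INTENT_PATTERNS = {
        "Transactional": ["buy", "price", "cheap", "deal", "coupon", "order"],
        "Navigational": ["login", "sign in", ".com", "official site"],
        "Local": ["near me", "nearby", "open now"],
        "Informational": ["how to", "what is", "guide", "tips", "why"],
    }

    def classify_intent(keyword):
        """Return the first intent bucket whose pattern appears in the keyword."""
        kw = keyword.lower()
        for intent, patterns in INTENT_PATTERNS.items():
            if any(p in kw for p in patterns):
                return intent
        return "Informational"  # default bucket when nothing matches

    for kw in ["buy running shoes", "how to clean suede", "nike login", "shoe store near me"]:
        print(f"{classify_intent(kw):15} {kw}")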

#4. Eliminating Crawl Budget Issues

Search engine crawlers have a limited time to analyze your site. Any potential crawlability errors will waste that crawl budget and prevent them from accessing some of your content.

Devote part of your audit to eliminating crawler roadblocks to make the most of your available budget. seoClarity offers two distinct capabilities for this.

Bot Clarity monitors how search engine bots crawl your site in the following ways:

  • If there are challenges to crawling your site, Google will store the URLs to revisit.
  • In our experience working with clients, Google will store and revisit a URL approximately 4-5 times to gather information on it.
  • Bot Clarity lets you see the frequency of visits to individual URLs, or to URLs grouped by page type, to pinpoint where an issue lies.

Also check how deep bots crawl your site. If they stop before reaching the lowest levels, update internal linking to shorten the path to the content they typically miss.
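Bot Clarity surfaces this from your log files automatically; as a rough do-it-yourself sketch, the snippet below counts Googlebot requests per URL from a standard combined-format access log. The log path is a placeholder, and verifying that hits genuinely come from Googlebot would additionally require a reverse-DNS check.

    import re
    from collections import Counter

    LOG_PATH = "access.log"  # placeholder: a combined-format web server log
    # Combined log format: IP - - [time] "METHOD /path HTTP/x" status size "referer" "user-agent"
    LINE_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"')

    hits = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            match = LINE_RE.search(line)
            if match and "Googlebot" in match.group("ua"):
                hits[match.group("path")] += 1

    print("Googlebot hits per URL (top 20):")
    for path, count in hits.most_common(20):
        print(f"{count:6d}  {path}")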

BotClarity.png

Similarly, Bot Clarity also allows you to identify spoofed bot activity and block those crawlers from eating up your bandwidth.

Recommended Reading: SEO Crawl Budgets: Monitoring, Optimizing, and Staying Ahead

Clarity Audits can be used to imitate a particular bot and identify exactly which issues prevent it from accessing the entire site within its crawl budget.

Some of the issues this crawl might reveal include:

  • Broken internal links
  • Long redirect chains (a minimal chain-length checker is sketched after this list)
  • API call outs
  • Inaccessible content
  • Errors in robots.txt file blocking bots from accessing specific sections
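For redirect chains in particular, here is a minimal sketch (standard library only, placeholder URL) that follows redirects one hop at a time and reports the length of the chain, so you can spot chains long enough to waste crawl budget.

    import urllib.request
    import urllib.error
    from urllib.parse import urljoin

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        """Stop urllib from auto-following redirects so each hop stays visible."""
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None

    opener = urllib.request.build_opener(NoRedirect)

    def redirect_chain(url, max_hops=10):
        """Return the list of URLs visited before a non-redirect response (or the hop limit)."""
        chain = [url]
        while len(chain) <= max_hops:
            try:
                opener.open(chain[-1]).close()
                break  # 2xx response: the chain ends here
            except urllib.error.HTTPError as err:
                location = err.headers.get("Location")
                if err.code in (301, 302, 303, 307, 308) and location:
                    chain.append(urljoin(chain[-1], location))
                else:
                    break  # 4xx/5xx, or a redirect with no Location header
        return chain

    chain = redirect_chain("http://example.com/old-page")  # placeholder URL
    print(" ->\n".join(chain))
    print(f"{len(chain) - 1} redirect hop(s)")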

#5. Internal Links

I’m often amazed at how little attention SEOs give to internal links.

Internal links offer an incredible opportunity to boost visibility in the search results. Not only do they help search engines crawl the site, but they also let you communicate to the search engines which keywords you want a page to rank for.

Not to mention the benefits internal links bring to the end users!

A site audit tool typically assesses a number of issues with internal links. Clarity Audits, for example, will tell you:

  • What pages have no internal links
  • Which ones have too many of them
  • Whether any of those links point to missing URLs
  • And what other problems occur with those links (e.g., relative references); a minimal link-check sketch follows this list
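As a rough sketch of the kind of check behind these reports, the snippet below extracts the anchor hrefs from a single placeholder page, flags relative references, and counts the internal links the page exposes. A full audit tool does this across the entire crawl and cross-references link targets against their response codes.

    from html.parser import HTMLParser
    from urllib.parse import urlparse, urljoin
    import urllib.request

    PAGE = "https://www.example.com/category/widgets"  # placeholder page to inspect
    SITE_HOST = urlparse(PAGE).netloc

    class LinkExtractor(HTMLParser):
        """Collects the href value of every anchor tag on the page."""
        def __init__(self):
            super().__init__()
            self.hrefs = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.hrefs.append(href)

    with urllib.request.urlopen(PAGE) as resp:
        html = resp.read().decode("utf-8", errors="replace")

    extractor = LinkExtractor()
    extractor.feed(html)

    internal = []
    for href in extractor.hrefs:
        if not href.startswith(("http://", "https://", "//", "#", "mailto:")):
            print(f"relative reference: {href}")
        absolute = urljoin(PAGE, href)
        if urlparse(absolute).netloc == SITE_HOST:
            internal.append(absolute)

    print(f"{len(internal)} internal links found on {PAGE}")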

crawl9.png


You can also crawl internal links to uncover any errors and quickly update them across the site.

With such insight you can:

  • Update any missing or broken internal URLs
  • Replace broken links and improve crawlability
  • Ensure links are transferring value between pages
  • Improve site taxonomy and crawl budget use

#6. Identifying Errors in Ajax or Heavy JavaScript 

How to handle Ajax and JavaScript has changed rapidly over the years. In the past, Google embraced the use of the "_escaped_fragment_" approach for Ajax sites; now, sites still relying on the escaped fragment are reporting that pages are being de-indexed.

As more and more sites move to Ajax or heavy JavaScript, advanced technical practitioners need to spot and diagnose errors as soon as possible. As my colleague Richard Chavez argued in an article about embracing JS in SEO:

“If there’s one programming language you simply must embrace as an SEO, it’s JavaScript.”

Advanced audit software like Clarity Audits that can crawl Ajax and JavaScript helps diagnose these errors. Find out right away whether your site uses pre-rendering and, if so, add the pre-render URLs to your site crawls.

This way, you can evaluate which issues exist in the rendered version versus the raw site.

Similarly, you can use Fetch as Google within Google Search Console to analyze the code with JavaScript rendered, exactly as the search engine sees it.

If certain components are not appearing, creating a crawl can help you analyze why and uncover solutions.
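If you want to spot-check this yourself outside the crawler or Search Console, one common approach (a sketch, not the platform's implementation) is to compare the raw HTML with the JavaScript-rendered DOM. The example below assumes the third-party Playwright package is installed, and the URL and content marker are hypothetical.

    import urllib.request
    from playwright.sync_api import sync_playwright  # third-party: pip install playwright

    URL = "https://www.example.com/product/widget"  # placeholder URL
    MARKER = 'id="product-reviews"'                  # hypothetical JS-injected component

    # Raw HTML: roughly what a non-rendering crawler sees.
    with urllib.request.urlopen(URL) as resp:
        raw_html = resp.read().decode("utf-8", errors="replace")

    # Rendered DOM: after JavaScript has executed in a headless browser.
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(URL, wait_until="networkidle")
        rendered_html = page.content()
        browser.close()

    in_raw, in_rendered = MARKER in raw_html, MARKER in rendered_html
    print(f"marker in raw HTML:     {in_raw}")
    print(f"marker in rendered DOM: {in_rendered}")
    if in_rendered and not in_raw:
        print("Component is injected by JavaScript - confirm search engines can render it.")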

crawl10.png

#7. Incorporate New Techniques in Your Site Audits

The methods above are some of the most-used techniques reported by advanced technical practitioners, but there are always more to learn that will make your ongoing site audit process more robust.

This article offers additional techniques for quickly identifying errors, tracking ongoing search visibility, and knowing which issues to eliminate to maximize your potential in the search engines.

These workflows also provide your customers with the best user experience possible when they visit your site.

If you have additional advanced technical SEO approaches or suggestions, please provide details in the comments below. We'd love to feature them in an upcoming post.