But optimizing dynamic content requires at least a basic understanding of the technology behind it.
Multiple libraries, frameworks, and approaches exist to help developers bring dynamic content to the web. The most popular include jQuery, React, and MooTools, along with data formats like JSON and architectures such as Single Page Apps.
We know that around ten years ago, search engines started crawling and indexing content rendered with JS, although, at the time, their abilities were quite limited.
Ensure Bots Read Your Site Like Your Users
One way to ensure that is to use dedicated tools that crawl the JS on your pages and confirm that no issues prevent bots from rendering your dynamic content.
Our site audit technology, Clarity Audits, for example, can crawl JS on individual pages, allowing you to identify problems that could block search engines from accessing all of your content.
Fact: Page speed affects your page’s rankings.
Just take a look at this graph plotting page load time vs. page position in the SERPs to see how much of an impact it can have on online visibility.
For that reason, you should serve JS files directly in the page source (meaning they are included in the main code that makes up the page, rather than in external files the browser needs to request and load before it can render the page in full).
Also, serving those files in the page source means that Googlebot has fewer assets to fetch, resulting in better use of the available crawl budget.
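To make the difference concrete, here's a minimal sketch comparing an external script with the same logic inlined in the page source (the file path and element IDs are hypothetical):

```html
<!-- External file: the browser must make an extra request for /assets/menu.js
     (hypothetical path) before it can run the script. -->
<script src="/assets/menu.js"></script>

<!-- Inlined in the page source: the same logic ships with the initial HTML,
     so no extra request is needed before the page can render in full. -->
<script>
  document.getElementById("menu-toggle").addEventListener("click", function () {
    document.getElementById("menu").classList.toggle("open");
  });
</script>
```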
How do you know if JS files affect your page speed?
One way is to use tools like seoClarity’s Page Speed to analyze your current page loading time and get the full list of files that could be slowing down your site.
Here’s what such a report looks like, with .js files marked.
In other words, optimizing page speed ensures that most of your content gets included in the snapshot Google takes to crawl and index JS.
Single Page Applications
If your organization uses Single Page Applications (SPAs), then you also need to optimize their JS.
However, with so much of an SPA’s content loaded dynamically on the page, there’s not much for search engine bots to crawl, index, and cache.
One way to overcome this challenge is to render the most crucial on-page SEO elements (e.g. H1 tags, the page title, etc.) as static code. This way, they remain visible to bots, regardless of the dynamically changing content. This approach is commonly referred to as Initial Static Rendering (see the example below).
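Here's a minimal sketch of that idea: the page title and H1 live in the static HTML the server returns, while the rest of the content is still rendered by the SPA bundle (the page title, URL, and file name are placeholders):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Page title rendered statically, so bots see it without executing any JS -->
  <title>Women's Running Shoes | Example Store</title>
</head>
<body>
  <!-- Crucial on-page SEO elements kept in the static HTML -->
  <h1>Women's Running Shoes</h1>

  <!-- The rest of the page (product grid, filters, etc.) is still
       rendered dynamically by the SPA bundle. -->
  <div id="app"></div>
  <script src="/spa-bundle.js"></script>
</body>
</html>
```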
As Keith Horwood, a technical lead at Storefront, points out:
“Separate your concerns! How would you build your website if you just cared about branding and wanted people to download your app? Build that website. Keep your branding and archivable content separate from your application.”
Also, refrain from using JS onclick events to interlink pages on your site.
Most of my advice up to this point has related to load events - scripts that trigger while a page is loading in the browser.
But users can generate events on the page as well. The most common is an onclick event - a dynamic link triggered by JS. And although it might add interactivity to your website, you should avoid using it to interlink pages on your site.
That’s because, even if Google finds those links, it won’t associate them with your site’s navigation, and thus won’t build a proper picture of your site’s structure and content hierarchy.
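To illustrate, here's a simplified sketch of the difference (the URL is hypothetical):

```html
<!-- Avoid: a "link" built with an onclick event. The destination only exists
     inside JS, so Google won't treat it as part of your navigation. -->
<span onclick="window.location.href='/womens-shoes'">Women's Shoes</span>

<!-- Prefer: a standard anchor with an href. Crawlers can follow it and fold it
     into their picture of your site structure - and you can still attach JS to it. -->
<a href="/womens-shoes">Women's Shoes</a>
```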
And in my upcoming article, I’ll dive deeper into SPAs and show you specifically how to optimize them for SEO.