My Content Can’t Be Seen When I Disable JavaScript
This week’s question comes from Thomas, who asks:
I disabled the JavaScript just to check the content of my webpage, but unfortunately I could not see any content except the banner H1 tag.
Will it hurt my SEO? If yes, what are the advisable solutions for this?
This is a great question – it’s something that all SEO professionals need to be aware of.
We spend so much time trying to create interesting, engaging content that it would be heartbreaking to think that it isn’t visible to search engines.
However, given the recent advancements in Google’s ability to render JavaScript content, is that something we still need to be concerned about?
The short answer is yes.
Why JavaScript Can Be A Problem
We know that to ingest a page's information, Googlebot discovers the URL, crawls it, and then parses and indexes the content. For JavaScript-generated content, there is an extra step: the crawler needs to “render” the code. The rendering stage is where JavaScript problems can occur.
JavaScript has to be downloaded and executed before its content can be parsed. This takes more resources than parsing content that is already present in the HTML.
As such, sometimes Google will defer the rendering stage and come back to a page to render it at a later date.
Most websites these days will use some JavaScript – that’s absolutely fine.
However, if your website requires JavaScript to load important content that is crucial to the page, then it might be a risk.
If, for some reason, a search bot does not render the JavaScript on a page, then it will not have any context as to what the page is about.
It is crucial to remember that not every search engine can render JavaScript. This is becoming increasingly important in the era of generative search engines – very few of which render JavaScript.
Diagnosing A Problem
You’ve done the right thing by starting to investigate the effect JavaScript rendering might be having on your site.
Turning off JavaScript and seeing what content remains – and what is still interactive without it – is an important first step.
I suggest going a step further and looking at what is available for search bots to read on a page’s first load. This will help you identify which content is accessible without JavaScript rendering.
Check Google Search Console
First off, use Google Search Console’s URL Inspection tool and look at the rendered HTML. If the content is present in the rendered HTML, then Google should be able to read it.
Check Chrome Browser
You can go to “View Source” in Chrome to see what the raw, pre-rendered HTML looks like. If all of the content is there, you don’t need to worry any further.
If it’s not, use Chrome’s Developer Tools for further diagnostics. The “Elements” tab shows the DOM after JavaScript has executed – if you can see your content there, it is at least renderable, and you are probably OK.
Check The Robots.txt
Sometimes, developers may choose to block specific JavaScript files from being crawled by disallowing them in the robots.txt.
This isn’t necessarily an issue unless those files are needed to render important information.
It’s always worth checking your robots.txt file to see whether any blocked JavaScript files could prevent bots from accessing important content on the page.
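For example, a robots.txt rule like the following (the /assets/js/ path is purely illustrative) would stop crawlers from downloading the scripts that build the page:

```
# Hypothetical example: blocking a scripts directory like this prevents
# bots from fetching the JavaScript needed to render the page's content.
User-agent: *
Disallow: /assets/js/
```

If the files under a blocked path are required to render important content, bots will only ever see the empty shell.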
Next Steps
JavaScript tends to worry a lot of people when it comes to SEO, but it’s a significant part of the modern web – there’s no escaping it.
We need to ensure that our websites use JavaScript in a way that still lets both popular and emerging search engines find and read our content.
You don’t need to panic, but you do need to be diligent.
If you have developer resources at hand, you can work with them to identify the most applicable solution.
Here are some checks you may want to make:
Are We Using Client-Side Rendering Or Server-Side Rendering?
Client-side rendering relies on the visitor’s browser to execute the JavaScript and build the page’s content.
When a page is visited, the server responds by sending the HTML code and the JavaScript files. The browser then downloads those files and generates the content from the JavaScript.
This is in contrast to server-side rendering, where the server builds the content itself and sends the completed HTML to the browser.
In general, server-side rendering is easier for bots, can be a quicker experience for users, and tends to be the default recommendation for SEO.
However, it can be more costly for website owners and, therefore, isn’t always the default choice for developers.
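As a minimal sketch of the difference (the function and field names here are illustrative, not from any particular framework): with server-side rendering, the response already contains the content; with client-side rendering, the response is an empty shell that JavaScript fills in later.

```javascript
// Server-side rendering: the server builds the full HTML before responding,
// so the main content is present in the initial response that any bot receives.
function serverRender(page) {
  return `<html><body><h1>${page.title}</h1><p>${page.body}</p></body></html>`;
}

// Client-side rendering: the server sends an empty shell; the content only
// appears after the browser downloads and executes the JavaScript bundle.
function clientShell() {
  return `<html><body><div id="app"></div><script src="/app.js"></script></body></html>`;
}

const page = { title: "My Article", body: "Important content" };
console.log(serverRender(page).includes("Important content")); // true – a bot sees it immediately
console.log(clientShell().includes("Important content"));      // false – a bot must render JS first
```

This is why server-side rendering is the safer default for SEO: the content survives even if the rendering stage never happens.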
Is Our Main Content Able To Be Rendered Without JavaScript?
The most important content on your page – the main content – should be parseable without JavaScript rendering.
That is always the safest way to ensure that bots can access the content.
Are We Using JavaScript Links?
A further consideration is whether your links can be crawled easily by the search bots.
It’s not always an issue to have links generated through JavaScript. However, there is a risk that bots might not be able to resolve them unless they are contained in a proper HTML anchor element with an href attribute.
Google states it “can’t reliably extract URLs from elements that don’t have an href attribute or other tags that perform as links because of script events.”
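To illustrate (the markup below is a generic example, not taken from any particular site): the first link is crawlable on its own; the other two depend on script execution and may be ignored by bots.

```html
<!-- Crawlable: a standard anchor with an href attribute -->
<a href="/products/widgets">Widgets</a>

<!-- Risky: no anchor, no href – navigation only happens via a script event -->
<span onclick="goTo('/products/widgets')">Widgets</span>

<!-- Also risky: an anchor whose destination is only resolved by JavaScript -->
<a onclick="goTo('/products/widgets')">Widgets</a>
```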
Remember, though, it’s not just Google that you need to be conscious of. It’s always better to err on the side of making your links easy to follow.
In Summary
It is crucial to make sure your content is accessible to bots, now and in the future.
That means that if your website relies heavily on JavaScript to load content, you may struggle to communicate that information to some search engines.
It’s true that Google is much better at rendering JavaScript-heavy sites than it used to be, but the SEO playing field is not just Google.
To make sure your website can perform well in search platforms beyond Google, you may want to change how your website renders content, making sure your main content is in HTML.
Featured Image: Paulo Bobita/Search Engine Journal