It’s difficult for search engines to crawl JS because executing it demands considerable computing capacity. Plain HTML – whether generated server-side with PHP, styled with CSS, or combined with other technologies – can be read directly by a crawler: the source code is delivered in full as soon as the URL is requested.
1. Initial Request: The browser or the search engine bot sends a GET request for the HTML code of the website and its associated assets.
2. DOM rendering: The server delivers the HTML, from which the browser or bot builds the DOM (Document Object Model). The DOM describes how the content of the page is structured and how the individual elements relate to one another. The browser renders this information and makes it visible and usable for the user.
5. Load Event: As soon as all resources, and the JS resources that depend on them, have been loaded, the browser fires the load event and the page is fully loaded.
6. Post Load Events: After the JS page has loaded, further content or functional elements can be loaded, changed, or adapted through user interaction.
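The sequence above can be sketched with the browser's two load milestones. This is a minimal illustration; the `registerLoadHandlers` helper and its log messages are made up for the example, not part of any framework:

```javascript
// Hook into the two load milestones: DOMContentLoaded fires once the
// initial DOM has been parsed; load fires once all dependent resources
// (images, scripts, stylesheets) have also finished loading.
function registerLoadHandlers(doc, win, log = console.log) {
  doc.addEventListener('DOMContentLoaded', () => log('DOM ready'));
  win.addEventListener('load', () => log('page fully loaded'));
}

// In a browser: registerLoadHandlers(document, window);
```

Content that is only injected after the load event – the "post load" phase – is exactly the content a crawler may never get to see.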
Load events and user events can significantly influence your SEO. Below you will learn why.
The timing of the DOMContentLoaded event can be measured with Google Chrome’s Developer Tools:
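The same timing that DevTools visualizes can also be read programmatically from the Navigation Timing API. A rough sketch – the helper name below is invented for this example:

```javascript
// Reads how long it took until the DOMContentLoaded event finished,
// based on a Navigation Timing entry (the data DevTools visualizes).
function domContentLoadedMs(navEntry) {
  return navEntry.domContentLoadedEventEnd - navEntry.startTime;
}

// In a browser:
// const [nav] = performance.getEntriesByType('navigation');
// console.log(domContentLoadedMs(nav), 'ms until DOMContentLoaded');
```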
4. a href and img src: The Googlebot needs links it can follow in order to discover further pages. You should therefore also provide links with href or src attributes in your JS documents.
8. Use a current sitemap: To signal any changes in your JS content to Google, always keep the “lastmod” element in your XML sitemap up to date.
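To make the link tip concrete: a bot can only follow a link that exists as a real href attribute in the markup. The helper below is a hypothetical sketch of generating crawlable markup from JS:

```javascript
// Good: emit a real <a href> that the Googlebot can discover and follow.
function crawlableLink(url, text) {
  return `<a href="${url}">${text}</a>`;
}

// Bad (avoid): markup like <span onclick="location.href='/pricing'">
// carries no href attribute, so the crawler never finds /pricing.
```

For example, `crawlableLink('/pricing', 'Pricing')` produces `<a href="/pricing">Pricing</a>`, which the bot can follow like any static link.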
1. Visual inspection
To get a feel for how a visitor will see the website, you should divide the content of the page into:
2. Check HTML code
Next, you can check meta elements such as the title and the page description. For bots to index these elements, they must be accessible by the load event. Currently, however, only Google can reliably read elements inserted via JS. It is therefore recommended to write the title and meta tags into the static HTML code even on JS sites.
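To illustrate the point: a title or description written only by JS becomes visible no earlier than the load event, which is why the static HTML variant is safer. The `applyMeta` helper below is a hypothetical sketch, not a standard API:

```javascript
// Sets the title and meta description at runtime via JS. Bots that do
// not execute JS (currently most besides Google) never see these values,
// so prefer writing them into the static HTML instead.
function applyMeta(doc, { title, description }) {
  doc.title = title;
  const meta = doc.querySelector('meta[name="description"]');
  if (meta) meta.setAttribute('content', description);
}

// In a browser:
// applyMeta(document, { title: 'Pricing', description: 'Our plans.' });
```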
3. Check rendered HTML
The Google Search Console offers a further opportunity to test your JS pages. A prerequisite is that your domain is registered and verified in the Search Console. In the Search Console, click first on “Crawling” and then on “Fetch as Google.” Finally, enter the URL you want to check and click “Fetch and render.”
Figure 1: Test JS sites in the Google Search Console.
Further things to consider
prerender.io is open-source software that optimizes the rendering of a JS page. The page is cached after rendering and can be served more quickly when a bot accesses it.
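The underlying idea can be sketched as follows. This is an illustrative simplification, not prerender.io’s actual API: known bots are detected via a user-agent pattern and served a cached, pre-rendered HTML snapshot, while regular visitors get the normal JS application:

```javascript
// Hypothetical sketch of the prerendering idea: known bots get a cached
// HTML snapshot; regular visitors get the client-side rendered app.
const BOT_PATTERN = /googlebot|bingbot|baiduspider/i;

function chooseResponse(userAgent, snapshotCache, url, renderApp) {
  if (BOT_PATTERN.test(userAgent) && snapshotCache.has(url)) {
    return snapshotCache.get(url); // pre-rendered, crawlable HTML
  }
  return renderApp(url);           // regular JS application shell
}
```

Because the snapshot is plain HTML, the crawler never has to execute any JS to see the content.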
3. ANGULAR JS
With Angular JS, HTML snapshots can be pre-rendered so that the Googlebot can capture and index JS pages more quickly.
This program likewise renders JS code as HTML and makes it crawlable for Google. The program code is transferred to your server for this purpose. A dedicated dashboard helps you manage which of your JS elements and pages need to be rendered. In addition, the tool creates an XML sitemap containing your JS pages.
With its Search Console, Google helps you check JS elements by rendering individual pages. The tool also shows you possible crawling problems.
Practice makes perfect!
Published on 05/17/2017 by Irina Hey.
Who writes here
Irina Hey is a keynote speaker and an expert in the field of customer acquisition, lead generation, and data-driven marketing. Until April 2018 she worked as Product Owner of Acquisitions and coordinated all strategic marketing activities at Ryte.