1. Initial Request: The browser and the search engine bot send a GET request for the HTML code of the website and its associated assets.
2. DOM rendering: The JS website delivers the DOM (Document Object Model) to the browser or the bot. The DOM describes how the content of the website is structured and how the individual elements on the page relate to each other. The browser renders this information and makes it visible and usable for the user.
5. Load Event: As soon as all resources and the JS resources that depend on them have loaded, the browser fires the load event and the page has finished loading.
6. Post Load Events: After the JS site has loaded, further content or functional elements can be changed or loaded through user interaction.
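The load stages above can also be observed from script. A minimal sketch, assuming a browser-like `document` object (the helper name `onReady` is ours):

```javascript
// Runs `callback` once the DOM has been parsed (DOMContentLoaded).
// If that stage has already passed, the callback runs immediately,
// mirroring how browsers report it via document.readyState.
function onReady(doc, callback) {
  if (doc.readyState === "interactive" || doc.readyState === "complete") {
    callback();
  } else {
    doc.addEventListener("DOMContentLoaded", callback);
  }
}

// In a browser: onReady(document, () => console.log("DOM ready"));
```

The same pattern works for the load event via `window.addEventListener("load", …)`.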
Load events and user events can significantly influence your SEO. Here you learn why.
The timing of DOMContentLoaded can be measured with Google's Chrome Developer Tools.
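Besides the Developer Tools, the same timings can be read in the browser via the Navigation Timing API. A minimal sketch, assuming an entry shaped like `performance.getEntriesByType("navigation")[0]` (the helper name `loadTimings` is ours):

```javascript
// Extracts the DOMContentLoaded and load durations (in milliseconds)
// from a navigation timing entry; all values are relative to the
// start of the navigation (startTime).
function loadTimings(entry) {
  return {
    domContentLoaded: entry.domContentLoadedEventEnd - entry.startTime,
    load: entry.loadEventEnd - entry.startTime,
  };
}

// Example with hand-written values (milliseconds):
const timings = loadTimings({ startTime: 0, domContentLoadedEventEnd: 480, loadEventEnd: 1200 });
// timings.domContentLoaded → 480, timings.load → 1200
```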
4. a href and img src: The Googlebot requires links that it can follow so that it can discover further pages. You should therefore also provide links with href or src attributes in your JS documents.
8. Use a current sitemap: To show Google any changes in the JS content, you should always keep the “lastmod” attribute in your XML sitemap up to date.
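The link tip above comes down to one detail when markup is generated from JavaScript: a plain `href` attribute is followable, a click handler alone is not. A minimal sketch (the helper name is ours):

```javascript
// Crawlable: Googlebot can follow the href attribute.
function crawlableLink(url, text) {
  return `<a href="${url}">${text}</a>`;
}

// Not crawlable: no href, navigation only happens via script:
// <span onclick="router.go('/products')">Products</span>

const link = crawlableLink("/products", "Products");
// link → '<a href="/products">Products</a>'
```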
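For the sitemap tip, `lastmod` is simply an ISO-formatted date on each `<url>` entry. A minimal sketch that generates one such entry (the helper name is ours):

```javascript
// Builds a single sitemap <url> entry with a lastmod date (YYYY-MM-DD),
// so Google can see when the JS-rendered page last changed.
function sitemapEntry(loc, lastModified) {
  const lastmod = lastModified.toISOString().slice(0, 10);
  return `<url><loc>${loc}</loc><lastmod>${lastmod}</lastmod></url>`;
}

const entry = sitemapEntry("https://example.com/", new Date("2019-02-17"));
// entry → '<url><loc>https://example.com/</loc><lastmod>2019-02-17</lastmod></url>'
```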
A JS website audit is primarily a manual inspection of individual elements.
1. Visual inspection
In order to gain a feel for how a visitor will see the website, you should divide the content on the website into:
2. Check HTML code
Then, you can check meta elements such as the title and meta description. For bots to index these elements, they must be accessible by the load event. In general, however, only Google can currently read these elements. It is therefore recommended to write title and meta tags into the HTML code even on JS sites.
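In practice, this means placing the elements directly in the initial HTML rather than injecting them via JavaScript; a minimal sketch (names and content are invented):

```html
<!-- Served in the static HTML, readable by bots without executing JS -->
<head>
  <title>Example Shop – Products</title>
  <meta name="description" content="Overview of all products in the example shop.">
</head>
```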
3. Check rendered HTML
The old version of the Google Search Console offers an additional way to test your JS sites. The new interface does not yet offer a similar function (as of February 2019). A prerequisite for inspecting your website is that your domain has been added and verified in the Search Console. In the Search Console, click first on “Crawling” and then on “Fetch as Google.” Finally, enter the path you want to test and click on “Fetch and render.”
Figure 1: Test JS sites in the Google Search Console.
Further things to consider
prerender.io is open-source software that optimizes the rendering of a JS site. The site is cached after rendering and can be served more quickly when accessed by a bot.
3. ANGULAR JS
With Angular JS, HTML snapshots can be pre-rendered so that the Googlebot can capture and index JS sites more quickly.
With this tool, JS code is likewise rendered as HTML and made crawlable by Google. The program code is transferred to your server for this purpose. A dashboard of your own helps you manage the JS elements and pages that need to be rendered. Moreover, the tool creates an XML sitemap with your JS pages.
With the old version of the Search Console, Google helps you check JS elements by rendering individual pages. The tool also shows you possible crawling problems.
Practice makes perfect!
This article was first published in May 2017 and updated in February 2019.
Published on 02/17/2019 by Olivia Willson.
Olivia left her home town, Cheltenham, to start her degree in German and Music at King’s College London in 2011. She moved to Munich after finishing her degree and has been part of the Marketing Team at Ryte since July 2017, where she is mainly responsible for the English Ryte Magazine and English Wiki.