1. Initial Request: The browser or the search engine bot sends a GET request for the HTML code of the website and its associated assets.
2. DOM rendering: The JS site delivers the DOM (Document Object Model) to the browser or the bot. The DOM describes how the content of the website is structured and how the individual elements on the page relate to one another. The browser renders this information and makes it visible and usable for the user.
5. Load Event: As soon as the resources, and the JS resources that depend on them, have loaded, the browser fires the load event and the page has finished loading.
6. Post-Load Events: After the JS site has loaded, further content or functional elements can be changed or adapted in response to user interaction.
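The load phases above can be observed directly in the browser. A minimal sketch (the handlers are illustrative):

```html
<script>
  // Fires when the DOM has been fully parsed (DOM rendering phase),
  // before all images and stylesheets have necessarily finished loading
  document.addEventListener('DOMContentLoaded', () => {
    console.log('DOM ready - content can be queried and modified');
  });

  // Fires once all dependent resources have loaded (load event phase)
  window.addEventListener('load', () => {
    console.log('Page fully loaded');
  });

  // Post-load user events, e.g. loading further content on click
  document.addEventListener('click', () => {
    // fetch additional content, adapt elements, etc.
  });
</script>
```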
Load events and user events can clearly influence your SEO. Here's why:
The timing of the DOMContentLoaded event can be measured with Google's developer tools (Chrome DevTools):
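Besides the Network panel in DevTools, the same timings can be read from the browser's Navigation Timing API. A sketch that can be embedded in a page (or run in the DevTools console without the script tags):

```html
<script>
  // Navigation Timing Level 2: values are milliseconds
  // relative to the start of the navigation
  const [nav] = performance.getEntriesByType('navigation');
  console.log('DOMContentLoaded after', nav.domContentLoadedEventEnd, 'ms');
  console.log('load event after', nav.loadEventEnd, 'ms');
</script>
```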
4. a href and img src: Googlebot requires links that it can follow in order to discover further pages. You should therefore also provide links with href or src attributes in your JS documents.
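The difference can be sketched in a few lines of markup; Googlebot discovers the first two URLs, but not the third:

```html
<!-- Crawlable: Googlebot follows the href attribute -->
<a href="/products/shoes">Shoes</a>

<!-- Crawlable: the image is discovered via the src attribute -->
<img src="/images/shoes.jpg" alt="Shoes">

<!-- NOT crawlable: no href, navigation happens only in JavaScript -->
<span onclick="location.href='/products/shoes'">Shoes</span>
```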
8. Use a current sitemap: To signal any changes in the JS content to Google, you should always keep the "lastmod" attribute in your XML sitemap up to date.
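A sitemap entry with "lastmod" looks like this (the URL and date are placeholders):

```xml
<url>
  <loc>https://www.example.com/js-page/</loc>
  <!-- Update lastmod whenever the JS-rendered content changes -->
  <lastmod>2020-07-22</lastmod>
</url>
```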
A JS website audit is primarily a manual inspection of individual elements.
1. Visual inspection
To get a feel for how a visitor will see a website, you should divide the content on the website into:
2. Check HTML code
Then, you can check meta elements such as the title and meta description. For bots to index these elements, they must be accessible by the load event. At present, however, only Google can reliably read JS-generated meta elements. It is therefore recommended to write the title and meta tags directly in the HTML code, even on JS sites.
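Writing these elements statically means every bot can read them without executing any JavaScript; a sketch (the values are illustrative):

```html
<head>
  <!-- Written directly in the HTML so bots can read them
       without rendering JavaScript -->
  <title>Product Name – Example Shop</title>
  <meta name="description" content="Short summary of the page content.">
</head>
```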
3. Check rendered HTML
Further things to consider
prerender.io is open-source software that optimizes the rendering of a JS site. The site is cached after rendering and can be served more quickly when a bot accesses it.
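A typical setup routes bot requests through a middleware that serves the cached, prerendered HTML. A sketch assuming a Node.js Express server with the prerender-node middleware installed (the token and paths are placeholders):

```javascript
// Sketch: hand bot requests to the prerender service,
// serve the normal JS app to everyone else
const express = require('express');
const app = express();

// prerender-node detects bot user agents and fetches
// the cached, prerendered HTML for them
app.use(require('prerender-node').set('prerenderToken', 'YOUR_TOKEN'));

app.get('*', (req, res) => {
  res.sendFile('index.html', { root: 'dist' });
});

app.listen(3000);
```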
3. ANGULAR JS
With Angular JS, HTML snapshots can be prerendered so that the Googlebot can crawl and index JS pages more quickly.
With this program, JS code is likewise rendered as HTML and made crawlable by Google. The program code is transferred to your server in the process. Your own dashboard helps you manage which of your JS elements and pages need to be rendered. The tool also creates an XML sitemap of your JS pages.
With the old version of the Search Console, Google helps you check JS elements by rendering individual pages. The tool also shows potential crawling problems.
Practice makes perfect!
This article was first published in May 2017, and updated in July 2020
Published on 07/22/2020 by Olivia Willson.
After studying at King’s College London, Olivia moved to Munich, where she was part of the Ryte team until 2021. She was previously in charge of product marketing and CRO, and also helped out with SEO and content marketing. When she's not working, you can usually find her outside, either running around a track or hiking up a mountain.