The Google Webmaster Guidelines define the actions webmasters can take to make their websites easier to crawl and index. They also include a list of practices that Google considers violations of these guidelines, which may result in the devaluation of a website or even its exclusion from the Google index.
The Google Webmaster Guidelines apply to all websites that Google has added to its index. If you want your website to be displayed permanently in the SERPs, you should strive to comply with these requirements. Google recommends that every new website be submitted to Google. Usually it is sufficient for a site simply to be put online and linked to once, so that the Googlebot can visit and index it. It is also advisable to create an account in Google Webmaster Tools and submit an XML sitemap covering all subpages. It goes without saying that webmasters must ensure that their website and all its subpages remain permanently online in order to avoid exclusion from the SERPs.
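A minimal XML sitemap in the sitemaps.org format might look like the following sketch (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Each <url> entry lists one subpage; <lastmod> is optional. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```

The finished file is typically uploaded to the site's root directory and submitted via Google Webmaster Tools.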
The Google Webmaster Guidelines emphasize that a page should be clearly structured, both in its navigation and in its heading structure. The content should reflect exactly what the website is actually about. Google specifically calls attention to the practice of some webmasters who optimize their content mainly for keywords and neglect its informational value. At the same time, the guidelines state that you should include in your website the keywords you anticipate your visitors will use in their search queries.
Links play a major role as well. Google recommends that every page be reachable through at least one incoming link, while the total number of links on a page should remain at a “reasonable level.” Google specifically points to the work the bot has to perform in following each link: if it encounters too many links on a page, it cannot comb through the website in its entirety. Google also advises webmasters to create a summary page for their visitors that links to all subpages. The term “sitemap” is somewhat confusing in this context, since as Google uses it here in the guidelines it has nothing to do with an XML sitemap. Broken links should also be avoided wherever possible.
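Such a summary page is simply an HTML page that links to every subpage with plain text links; a minimal sketch (all paths and labels are hypothetical):

```html
<!-- An HTML sitemap page for visitors: one text link per subpage. -->
<ul>
  <li><a href="/products/">Products</a></li>
  <li><a href="/services/">Services</a></li>
  <li><a href="/blog/">Blog</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```

Because the links are ordinary text links, both visitors and the Googlebot can use the page to reach every subpage.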
As a general rule, a website should offer all important information in text form, since crawlers currently cannot read the content of images. To make images accessible to the index, ALT attributes should be used.
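In HTML, the ALT attribute supplies that text alternative directly on the image tag; a minimal example (the file name and text are hypothetical):

```html
<!-- The alt text describes the image for crawlers and screen readers. -->
<img src="/images/red-running-shoe.jpg" alt="Red running shoe, side view">
```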
In the technical part of the Google Webmaster Guidelines, the robots.txt file plays an important role. Google explains the advantages of this file and notes that it can be used to prevent certain directories from being crawled. The same file can also be used to manage access for the AdSense crawler. In addition, the robots.txt file helps prevent visitors from being presented with irrelevant content in the SERPs.
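A robots.txt along these lines might look as follows; this is only a sketch, where the domain and directory names are placeholders (Mediapartners-Google is the user agent of the AdSense crawler):

```text
# Block all crawlers from directories that should not appear in the SERPs
User-agent: *
Disallow: /internal/
Disallow: /search-results/

# Allow the AdSense crawler so ads can still be matched to page content
User-agent: Mediapartners-Google
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

The file must be placed at the root of the domain (e.g. /robots.txt) to be found by crawlers.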
Great importance is also placed on the loading time of a website. Since it is a ranking factor, every webmaster should take this aspect seriously and ensure fast access to their website.
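One simple way to check a page's load time from the command line is curl's built-in timing variables; the URL below is a placeholder:

```shell
# Fetch the page, discard the body, and print the total transfer time.
# time_total covers DNS lookup, connection setup, and the full download.
curl -o /dev/null -s -w 'total: %{time_total}s\n' 'https://www.example.com/'
```

This measures only the raw HTML response; full page load time in a browser (images, scripts, rendering) is typically longer.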
In the “Quality Guidelines” section, the Google Webmaster Guidelines give every SEO an opportunity to scrutinize their own methods closely. The section lists all common forms of SERP manipulation. Google also encourages users to report spam sites to its anti-spam team.
Probably the most cited guideline is this:
“Make pages primarily for users, not for search engines.”
A Google webmaster guideline frequently mentioned by Matt Cutts is:
“Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.”
In a nutshell, SEO can be summed up in these two statements. But they do not specify which methods should be used.
Below are some actions from the Google webmaster guidelines that are NOT recommended:
If a website is downgraded by Google or a penalty is imposed, there is usually a prior warning and an indication that the webmaster needs to change something. If a domain has been completely removed from the index, a reconsideration request can be submitted so that the affected website can be rechecked and, ideally, re-indexed.