Slow Death


In search engine optimization, slow death denotes the gradual disappearance of a website from the search results and thus from the index of a search engine. The reasons for a slow death are often not immediately identifiable.

Indicators of a slow death

As the name suggests, slow death is a slow and rather insidious development. An alarming first sign may be a continuous drop in visitors from organic search hits, resulting from poorer rankings for your key search terms. This can be verified with common analytics tools such as Google Analytics. Another possible indication of a slow death is the disappearance of the description texts from the search snippets: Google then no longer displays a description and shows hits for the domain's subpages as a bare URL only.

Suspected reasons for a slow death

So far, the reasons for a slow death remain speculative, since no reliable sources on its causes are known. In many cases, however, problems with duplicate content are suspected. Likewise, an incorrectly created robots.txt file or other blockages can prevent search engine crawlers from reaching pages, so that these are no longer considered relevant. It is also assumed that a slow death can occur when a site has been corrupted or hacked, for example for spam purposes.

How to prevent slow death

The following preventive measures are recommended to minimize the risk of a slow death:

  • Regular site monitoring with suitable tools such as the Google Search Console, in order to respond immediately to any inconsistencies: For example, if webmasters receive a notification that the site has been hacked or a “suspicious links” warning, they should start troubleshooting immediately.
  • Avoidance of duplicate content, for example by using canonical tags: Internal duplicate content can be detected with appropriate web analysis tools. External duplicate content may also result from unauthorized copying of your content. A simple way to check whether other websites have copied your content without authorization is to enter text passages from your website into the Google search bar as an exact match (search query in quotation marks). If the results include websites that have copied your content, the webmasters concerned can be asked to remove or adapt the protected content.
  • Regular checks of the robots.txt file: Webmasters and SEOs should verify that no important directories are excluded from crawling by the Googlebot. The robots.txt file can be checked with the Google Search Console; a minimal programmatic check is sketched after this list.
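
How such a robots.txt check might look programmatically will vary with the tooling at hand. The following Python sketch (the Disallow rule and the URLs are made up for illustration) shows how a single over-broad directive can block important pages for the Googlebot:

    from urllib import robotparser

    # Hypothetical robots.txt content: the single "Disallow: /" rule
    # blocks the entire site for every crawler, including Googlebot.
    broken_robots_lines = [
        "User-agent: *",
        "Disallow: /",
    ]

    # Example pages (made up) that should remain crawlable.
    important_urls = [
        "https://www.example.com/",
        "https://www.example.com/products/",
        "https://www.example.com/blog/",
    ]

    parser = robotparser.RobotFileParser()
    parser.parse(broken_robots_lines)

    for url in important_urls:
        allowed = parser.can_fetch("Googlebot", url)
        print(f"{url} -> {'crawlable' if allowed else 'BLOCKED'}")

Replacing the over-broad rule with a targeted one (for example, disallowing only an internal search directory) makes the same check report the important pages as crawlable again.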

Independently of slow-death prevention, webmasters should ensure that all of their content is kept up to date. This applies both to address data and to texts, which should be adapted whenever circumstances change.

What can you do in case of a slow death?

If you suspect that your site is affected by a slow death, you should act immediately, before the domain vanishes entirely from the SERPs. If the website has already been dropped from the index, it will take tremendous effort to regain its previous positions. Finding the error is the first step. The following questions can help:

  • Are there problems with duplicate content?
  • Are there scrapers (websites that copy and reuse my content) targeting my site?
  • Have I received and read all messages from the Google Search Console?
  • Can a hacker attack be ruled out?
  • Can all pages be easily crawled by Googlebot?
  • Can all pages be indexed, or have canonical tags or the noindex tag been set incorrectly? (A minimal check of both elements is sketched after this list.)
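
Whether a page can be indexed is often decided by two elements in its HTML head: the robots meta tag and the canonical link. The following Python sketch (the page source and URLs are hypothetical) extracts both, so that an accidental noindex or a canonical tag pointing to the wrong URL becomes visible:

    from html.parser import HTMLParser

    class IndexabilityCheck(HTMLParser):
        """Collects the robots meta tag and the canonical link of a page."""

        def __init__(self):
            super().__init__()
            self.robots_meta = None
            self.canonical = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
                self.robots_meta = attrs.get("content")
            if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
                self.canonical = attrs.get("href")

    # Hypothetical page source; in practice it would be fetched for each URL.
    html_source = """
    <html><head>
      <meta name="robots" content="noindex, follow">
      <link rel="canonical" href="https://www.example.com/original-page/">
    </head><body>...</body></html>
    """

    checker = IndexabilityCheck()
    checker.feed(html_source)
    print("robots meta:", checker.robots_meta)  # "noindex" here keeps the page out of the index
    print("canonical:", checker.canonical)      # should point to the preferred URL of this content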

If a Google penalty is found, the webmaster should likewise act immediately. In contrast to a slow death, a reconsideration request can be submitted once all errors have been eliminated. It is also a good idea to submit the relevant webpages to the Google index after the adjustments have been made; the Google Search Console provides a function for this purpose. Moreover, the XML sitemap should be updated and submitted again so that Google can recrawl the pages.
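
What an updated XML sitemap could look like depends on the site. The following Python sketch (URLs and dates are made up) writes a minimal sitemap following the sitemaps.org protocol, which can then be resubmitted in the Sitemaps report of the Google Search Console:

    import datetime
    import xml.etree.ElementTree as ET

    # Hypothetical pages that were adjusted and should be recrawled.
    pages = [
        "https://www.example.com/",
        "https://www.example.com/products/",
        "https://www.example.com/blog/slow-death/",
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    today = datetime.date.today().isoformat()

    for page in pages:
        url_element = ET.SubElement(urlset, "url")
        ET.SubElement(url_element, "loc").text = page
        ET.SubElement(url_element, "lastmod").text = today  # signals the recent changes

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)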