Manual penalties applied as a consequence of violating Google Webmaster Guidelines are, next to Google algorithms such as Panda or Penguin, among the leading causes of a sudden loss of Google traffic. Kaspar Szymanski offers rare insights from the perspective of a former long-term Google Search employee with years of experience both applying and lifting penalties from within Google.
Without exception, manual penalties are applied to sites found to be in violation of Google Webmaster Guidelines. In other words, penalties are the unavoidable consequence of black hat techniques applied in an attempt to get ahead in the SEO game. Whether that happens intentionally, as part of a coordinated and risky business strategy, or is merely an oversight, does not matter. When a "Manual Action" message pops up in Google Search Console (GSC), the decision has already been taken. From that moment onward, all ongoing site development and release plans are on hold, since any improvement introduced while under penalty is likely a wasted effort. The only priority must be to identify the cause of the penalty, address it, and successfully apply for reconsideration, a process that, depending on the scale of the problem and the savviness of the SEO team available, may take anywhere between days and months.
Google is upfront about its reasons for penalizing a website. The GSC warning message highlights the problem, which in the majority of cases comes down to either an on-page or an off-page violation. In recent years, a large share of all penalties issued appear to be caused by off-page violations, mainly triggered by PageRank-passing link building. While that black hat method bears some potential for short-lived gains, it is virtually certain to attract Google's attention and to trigger a review eventually. The respective GSC message refers to "Unnatural Inbound Links", that is, links originating from other sites. Most of the time, the affected site experiences a sharp decline in Google Search visibility, stopping just short of disappearing from the SERPs completely. Penalty recovery in that case starts with a thorough backlink audit, followed by link removal and/or disavowing the links that have become a threat.
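The disavow step above ends in a plain text file uploaded via Google's disavow links tool. A minimal sketch of producing one is shown below; the domain and URL lists are hypothetical placeholders standing in for whatever a backlink audit actually flags.

```python
# Sketch: build a disavow.txt in the format accepted by Google's
# disavow links tool. Lines starting with '#' are comments,
# 'domain:' entries disavow every link from that host, and plain
# URLs disavow a single page. The input lists are hypothetical.

def build_disavow_file(domains, urls, note="Links identified as unnatural"):
    """Return the text of a disavow file."""
    lines = [f"# {note}"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    lines += sorted(set(urls))
    return "\n".join(lines) + "\n"

disavow = build_disavow_file(
    domains=["spammy-directory.example", "paid-links.example"],
    urls=["https://blog.example/sponsored-post"],
)
print(disavow)
```

Disavowing at the `domain:` level is usually preferable to listing individual URLs, since link networks tend to repeat links across many pages of the same host.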
Figure 1: Unnatural backlinks penalties often trigger a steep drop in Google Search visibility.
When it comes to on-page content violations, which Google refers to as either "Spam Problems" or "Major Spam Problems" (occasionally called Pure Spam), the distinction depends on the gravity of the spam issue. Sites flagged as major spam usually consist of scraped or auto-generated content and are often hosted on free hosts and/or throw-away domains. In short, they are beyond salvaging in their current state. A site: operator query in Google Search typically returns zero results. Any attempt to remedy such a site translates into building a new site from scratch.
Figure 2: Major or pure spam penalties trigger a complete removal from Google Search.
In contrast to these truly bad sites (in Google's view), the penalty associated with mere "Spam Problems" indicates some level of decent content, possibly a real brand site, which however also includes bits and pieces that are either very content-lean pages or doorways. Or both. Consequently, the affected site drops substantially in Google Search but does not disappear completely. Usually it retains its branded query traffic but has no way of competing against other sites for non-branded queries. In this case, an in-depth SEO audit, analyzing technical and content signals alike, precedes a cleanup operation that rids the site of the content ballast.
Figure 3: Content spam penalties cause continuous decline in Google Search visibility.
Content spam issues causing a manual penalty may, however, stem from entirely different reasons. That is the case whenever either "Hacked Content" or "User Generated Spam" is detected. The former usually does not cause a drop in Google Search visibility at first. Instead, the site is initially labeled as compromised in the Google SERPs, ensuring that most users, confronted with landing pages clearly highlighted as potentially dangerous to visit, look for an alternative, less risky result. That warning alone is sure to make user signals indicate to Google that the site isn't what users are looking for. Shortly after, the site's rankings are certain to decline steeply, based on the premise that the site fails to live up to user expectations. If neglected, that situation, while recoverable, may require a long time to regain both users' and Google's trust.
Sites that are abused by users due to insufficient oversight end up with a "User Generated Spam" manual penalty. Off-topic "Buy Viagra Here" comments and links, even when nofollowed, are just one example of the kind of transgression that causes Google to take the action outlined. Forums and Wikipedia-type sites, along with sites allowing unsupervised commenting, are more prone to be affected by user spam. The respective manual penalty almost always affects only a granular selection of URLs, which end up ranking poorly or not at all. The initial effect is rarely site-wide. That said, if a site is so overrun with spam that it is hard to identify any added value left, firmer actions, including a complete removal from the index, are not unheard of. Preventing this requires a cleanup operation and ongoing, rigorous oversight.
Figure 4: Compromised websites are labeled with a user warning.
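Ongoing oversight of user-generated content can be partially automated. The sketch below is an illustrative heuristic only: the keyword list and link-count threshold are assumptions for demonstration, not Google's actual spam criteria, and any real moderation pipeline would combine far more signals.

```python
import re

# Illustrative assumption: a tiny blocklist of classic comment-spam terms.
SPAM_TERMS = {"viagra", "casino", "payday loan"}

def looks_like_comment_spam(text, max_links=2):
    """Rough pre-moderation filter for user-submitted comments."""
    lowered = text.lower()
    # Known spam vocabulary is the strongest single signal here.
    if any(term in lowered for term in SPAM_TERMS):
        return True
    # Many outbound links in one comment are another common spam pattern;
    # the threshold of 2 is an arbitrary illustrative choice.
    links = re.findall(r"https?://", lowered)
    return len(links) > max_links

print(looks_like_comment_spam("Buy Viagra Here http://pills.example"))
print(looks_like_comment_spam("Great article, thanks!"))
```

Comments flagged this way would typically be held for human review rather than published, which keeps the granular URL-level damage described above from accumulating.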
Yet another frequently applied penalty does not have any impact on rankings at all. Instead, wherever Google highlights detecting "Spammy structured markup", the SERP real estate previously enjoyed with rich snippets is lost for the foreseeable future. Consequently, CTR, and thereby conversions, are likely to decline shortly thereafter. The structured data implementation may be flawed, failing to live up to Google's standards, or downright deceptive, aiming to window-dress snippets with false claims. In consequence, no more stars, reviews and the like are displayed in Google search results. As with all other manual penalties, this one too can be lifted once the issue has been rectified and some "thinking time", often several months, has passed. However, unlike with the other manual penalties, a swift return into Google's good graces is not a certainty at all.
Figure 5: Once Google has found structured data to be incorrect, rich snippet real estate is lost.
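To make the distinction between flawed and deceptive markup concrete, here is a minimal sketch that emits schema.org Product markup as JSON-LD. The product name and figures are placeholders; the point of "Spammy structured markup" penalties is that every value emitted here must mirror what is actually visible on the page.

```python
import json

def product_jsonld(name, rating_value, review_count):
    """Build a schema.org Product snippet as a JSON-LD string.

    All values are placeholders. In a deceptive implementation the
    rating or review count would be inflated beyond what the page
    visibly shows, which is exactly what the penalty targets.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": rating_value,  # must match visible reviews
            "reviewCount": review_count,  # inflating this is deceptive
        },
    }
    return json.dumps(data, indent=2)

snippet = product_jsonld("Example Widget", "4.2", "37")
print(snippet)
```

The resulting string would be embedded in a `<script type="application/ld+json">` tag; auditing for this penalty means comparing such emitted values against the rendered page content.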
Few manual penalties used by Google seem to impact a site's rankings as lightly, often imperceptibly, as the one for violating the Webmaster Guidelines by selling PageRank-passing links. That also explains why most publishing media sites worldwide seem to maintain their prominent Google Search visibility despite blatantly selling links bearing any kind of commercial anchor text the buyer wishes for. It is a fair assumption that many of these media outlets are marked with an "Unnatural Outbound Links" warning in their respective Google Search Consoles. That this is a risky strategy, given that it draws Google's attention to an obvious willingness to break the rules, requires little explaining. No manual Google penalty should be ignored and left to linger just because it hasn't had any impact on the core business yet. After all, penalties can last for a very long time and can be renewed even when they time out. Google is also free to change its policies at any time, including the scope and impact of penalties, which can further affect already penalized sites.
Resolving a Google manual penalty almost always requires two steps: complete focus on the problem highlighted in Google's Search Console message, and gathering sufficient data to fix the issue. Penalties applied for link building at the scale of millions of links are more painful, because it takes longer to collect a critical volume of backlink data for a risk-level assessment. In that situation the samples GSC provides never offer a complete picture, which is why third-party tools, such as RYTE, must be used.
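Once backlink data from GSC samples and third-party tools has been merged, triage typically starts by grouping links per referring domain. The sketch below is one simplified way to do that: the commercial-keyword list and the threshold are illustrative assumptions, and a real risk assessment weighs many more signals than anchor text alone.

```python
from collections import Counter

# Illustrative assumption: anchor-text keywords that often indicate
# paid, money-keyword-driven links. Not an exhaustive or official list.
COMMERCIAL_HINTS = {"buy", "cheap", "best price", "casino"}

def risky_domains(backlinks, min_hits=2):
    """backlinks: iterable of (source_domain, anchor_text) tuples.

    Returns domains whose anchors repeatedly look commercial, a common
    pattern behind "Unnatural Inbound Links" warnings.
    """
    hits = Counter()
    for domain, anchor in backlinks:
        if any(h in anchor.lower() for h in COMMERCIAL_HINTS):
            hits[domain] += 1
    return {d for d, n in hits.items() if n >= min_hits}

sample = [
    ("blog.example", "great read"),
    ("links.example", "buy cheap widgets"),
    ("links.example", "best price widgets"),
]
print(risky_domains(sample))
```

Domains surfaced this way are candidates for manual review, after which they feed the removal outreach and disavow steps described earlier.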
Server and/or CMS security vulnerabilities that leave the site exposed are likely to be an easy fix, while a thorough cleanup of a compromised website also requires a complete removal of all unauthorized spam content.
Penalties issued for content quality issues, including user-generated spam content, can be resolved equally swiftly, but here again there is no way around auditing the website, including crawling at least a representative portion of it.
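During such a crawl, one useful pass is flagging content-lean pages by visible word count. The following is a rough sketch under stated assumptions: the 150-word threshold is an illustrative choice, not a Google rule, and the naive tag stripping stands in for a proper HTML parser in a real crawl.

```python
import re

def is_thin_page(html, min_words=150):
    """Flag a page whose visible text falls below a word threshold.

    The threshold is an illustrative assumption; thin-content and
    doorway-page judgements in practice need human review.
    """
    # Strip tags naively; a real crawl would use a proper HTML parser.
    text = re.sub(r"<[^>]+>", " ", html)
    words = re.findall(r"\w+", text)
    return len(words) < min_words

page = "<html><body><h1>City Widgets</h1><p>Buy widgets in City.</p></body></html>"
print(is_thin_page(page))
```

Pages flagged by such a pass become the candidate list for the cleanup operation: consolidating, expanding, or removing the content ballast before requesting reconsideration.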
Lastly, it is possible for a site to be afflicted with two or more manual penalties, issued to cover different parts of the site for several Google Webmaster Guideline violations. In that case, every penalty needs to be addressed individually.
Figure 6: One website can be affected by multiple manual penalties
Almost every manual penalty can be lifted within one or two attempts. Besides the cleanup outlined above, the rationale submitted to Google via the Reconsideration Request form is the last crucial step. It is important to be aware that the process on Google's end is a manual one. Experienced Google employees read, review and investigate the information provided. Following simple guidelines, such as using cordial, brief language, is therefore well advised. Providing verifiable facts, e.g. the volume of backlinks disavowed or the number of doorway pages removed, helps the person on the other end make their judgement call swiftly. Lastly, committing to following the Google Webmaster Guidelines going forward, and actually doing so, can be an advantage in borderline cases.
Any other information touching on unrelated topics, such as considering legal action in case of a rejection or pleading for a blind eye, is not useful for getting a reconsideration request approved.
Once the reconsideration request has been submitted, there is nothing that can be done to expedite the process; re-submitting the request does not have any effect. Google does not officially commit to a turnaround time. Past experience indicates that it can take anywhere between several hours and a few weeks.
Published on 09/10/2018 by Kaspar Szymanski.
Kaspar Szymanski is a renowned SEO expert, former senior member of the famed Google Search Quality team and among the select few former Googlers with extensive policy-driving, webspam-hunting and webmaster-outreach expertise. Nowadays, Kaspar applies his skill set to recovering websites from Google penalties and helping clients max out the potential of their websites in search engines.