Former Google Search Quality Strategist Kaspar Szymanski shares his perspective on ownership and responsibility in SEO, along with tips for maintaining healthy rankings.
Webmaster forums are filled with heartbreaking accounts of utterly unexpected SEO disasters, through no fault of the website operator. Sometimes it seems a vicious competitor's negative SEO campaign is wreaking havoc on legitimate, even deserved rankings. Other times it's the search engine, profit-driven and relentless, possibly even intentionally taking top positions away from the one website that, in the mind of its owner, truly deserves them. Occasionally, it appears that all the hard work, the countless efforts expended on PageRank-passing link building, does not command the respect it clearly should. Instead, a penalty is unjustifiably and arbitrarily issued, and keeps preventing the website from reclaiming its “natural” top position.
This only slightly exaggerated gist of countless SEO disaster stories misses a key point. Whatever the reason for a website's failure to live up to expectations in search, it is almost certainly the website operator's fault. Website rankings are the direct consequence of the on- and off-page signal input received by search engines, and it is up to every website operator to make sure their website's signals are as consistent as possible. Search engine ranking failures are avoidable, and they can be turned into search ranking success. This opinion article largely reflects the author's 14 years of search experience, including working for Google Search. SEO is my lasting passion. It can be yours, too. Keep reading.
Search engines hold no grudge against individual websites or their operators. They have no agenda other than keeping their users happy. As far as search engines are concerned, which website or landing page ranks top for any given query is of no consequence, as long as users are satisfied with the results. Search engine rankings are entirely driven by signal input. This is great news for all website operators, including anyone who believes their website should do better in search, because relevant SEO signals can be deliberately improved. That is what data-driven Search Engine Optimization is about.
It isn't difficult to empathize with frustrated website operators. The public venting of their difficulties, however, is often misdirected. To draw a comparison, most drivers wouldn't think to blame someone else if they damaged their car while backing out of a parking lot, or if they took a wrong turn and ended up with their car in a ditch. They are, after all, behind the wheel and in charge. It is possible the vehicle hasn't been maintained well, or that some wearing parts broke, but it is still up to the driver to ensure the car is in good order before starting the engine. While most people will agree with this point of view, that sense of ownership and responsibility is often absent when it comes to website performance. Strangely so, given that many of the websites meant to be visible in search results are commercial and revenue-driven.
The overly dramatized opening lines of this article reflect countless real-life discussions revolving around the topic. They often share a self-centered perspective: rarely are the users, the specific target audience, identified as the group primarily affected by ranking failures. They also convey a notion of entitlement. Search engine visibility and the resulting free traffic are taken for granted. It seems perfectly acceptable for traffic to grow; traffic drops, by contrast, no matter how seasonal or temporary, cause outrage. The reality of search could hardly be further from this notion. Search engine results are in constant flux. Static rankings do not exist; any position is merely a split-second snapshot. Even without any changes to a website's on- or off-page signals, there are countless reasons why SERPs fluctuate naturally. There are, of course, live tests conducted by search engines on limited volumes of queries, attempting to verify new hypotheses. New algorithm releases and updates to existing algorithms have just as much potential to cause fluctuations. Some of these changes may not work as intended and be swiftly rolled back, resulting in yet more SERP turmoil. More prosaically and less dramatically, it is new competitor websites, as well as existing competitors' improved SEO signals, that have the exact same effect. It is, therefore, reasonable to work towards high visibility for relevant queries, accept its fleeting nature, and enjoy success when it comes.
The available indicators are unambiguous: search engines tend to be entirely indifferent towards individual websites. This dispassionate approach is understandable. After all, there is only one top spot for any user query, including all the questions never previously asked. Hence, if a website does not deliver, there are usually countless alternatives. That said, website visibility drops, even those caused by actual penalties, are never triggered by prejudice or bias. Anyone still doubting whether search engines occasionally drop unpopular sites merely needs to think of the despicable political movement or disgusting celebrity they personally disapprove of. Chances are all of the associated websites are still well indexed and ranking for relevant queries. At the same time, it is exactly that impartial indifference which does away with the assumption that search engines and website operators are equal partners in the online business. They are not. Search engines will always be able to find suitable alternative websites to address user queries, while far too many websites are critically dependent on the organic traffic from just one search engine.
While no brick-and-mortar business would find it acceptable to be alarmingly dependent on the whims and caprices of an unpredictable third party that is neither a customer nor a client, many online businesses rely almost exclusively on organic search engine traffic. A dangerous feeling of entitlement seems to creep in when search engine traffic is forthcoming without much effort. This is a foolhardy attitude, the consequences of which can become catastrophic in short order when the torrent of free search engine traffic is reduced to a trickle overnight. No business should be critically dependent on a single sales lead or traffic source.
Diversification is a key survival factor. The question of how to continue operations without high volumes of organic Google traffic is best raised before the situation becomes reality. Next to traffic source diversification, traffic source independence (that is, type-in traffic) is essential. The latter objective is not merely feasible; it is the reward for providing a superb, superior service. Anyone wondering how to achieve that goal may consider how they themselves book flights, book hotels, or shop online. The majority of users directly access their favorite, tried-and-tested websites most of the time. For a reason.
The key to lasting online success isn't just clinching high rankings once. High visibility for relevant queries is the overarching goal, yet rankings are and always will be temporary. That is why they must be seen as a business-boosting factor, but never as the business foundation. The actual business foundation is, first and foremost, the unique selling proposition.
Organic traffic, and in that sense search engine optimization, has the potential to stimulate conversion growth and sales. Hence, it should not be left without close supervision. However, qualified responsibility, rather than enthusiastic self-reliance, is the guiding principle. After all, a responsible car driver who isn't a skilled mechanic may change a flat tire, but might shy away from replacing the car's broken gearbox. In the latter case, specific skills, special tools, and experience are required. Similarly, with a website, no amount of effort expended is likely to compensate for applying outdated or merely presumed best practices. Without the necessary know-how, the net results will be negligible or even negative. This comparison also demonstrates another key feature of search: optimization is an ongoing effort.
Depending on the operator's tech prowess and the website's size and complexity, there are a few steps that can prevent avoidable drops in organic search. When applied, they provide no ranking guarantee; such a guarantee is illusory. The closest thing to a certain level of confidence can be attained with annual defensive SEO audits, conducted much the same way as regular car maintenance cycles. There are, however, both one-off and ongoing efforts that have the potential to maintain, even grow, healthy website rankings:
Google Search Console provides a glimpse of Google's reading of a website's signals. That is why adding the domain property, as well as URL-prefix properties for all primary subdomains and the naked domain, both for HTTP and HTTPS, is paramount to gathering as many of these insights as possible. With that step, suspicion and doubt are often replaced by verified insights.
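To make that property list concrete, the short sketch below (a hypothetical helper, not an official Google tool) enumerates the URL-prefix variants worth verifying for a site with a single `www` subdomain; `gsc_property_variants` and its parameters are illustrative names:

```python
def gsc_property_variants(naked_domain, subdomains=("www",)):
    """Enumerate the URL-prefix properties worth verifying in
    Google Search Console: HTTP and HTTPS versions of the naked
    domain plus each primary subdomain. The domain property, which
    covers all of these at once, is added separately in the UI."""
    hosts = [naked_domain] + [f"{sub}.{naked_domain}" for sub in subdomains]
    return [f"{scheme}://{host}/" for host in hosts for scheme in ("http", "https")]

variants = gsc_property_variants("example.com")
```

For `example.com` with one `www` subdomain this yields four URL-prefix properties; each additional primary subdomain adds two more.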
Anyone operating a website consisting of hundreds or more landing pages is well advised to collect and retain raw web server logs. That data can never be recovered if it isn't recorded, and it can provide tremendously relevant insights when evaluated. As websites grow in volume, crawl budget management becomes a top priority, at which point historic server log data becomes immensely valuable.
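As a minimal sketch of that kind of evaluation, the snippet below tallies which paths Googlebot requested, per HTTP status, from Apache/nginx "combined"-format log lines. The function name is illustrative, and matching on the user-agent string alone is a simplification; in production, verifying crawler IPs via reverse DNS is advisable:

```python
import re
from collections import Counter

# Simplified pattern for the Apache/nginx "combined" log format.
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_crawl_stats(lines):
    """Count (path, status) pairs for requests whose user-agent
    claims to be Googlebot. Reveals crawl budget spent on
    redirects, errors, or unimportant URLs."""
    stats = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            stats[(m.group("path"), m.group("status"))] += 1
    return stats
```

Run over months of retained logs, such a tally quickly shows whether crawl budget is being spent on the landing pages that matter.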
Backlinks are the very fabric of the World Wide Web. Long before PageRank became a Google ranking factor, backlinks helped crawlers discover new content. Today, building backlinks primarily for the purposes of content discovery, searchbot crawl prioritization, user navigation, and converting traffic remains a very important step towards online success. Unlike building PageRank-passing links, it is also in line with Google Webmaster Guidelines.
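The discovery role of links can be illustrated with a toy breadth-first traversal over an in-memory link graph (the function and graph here are invented for illustration): every page reachable through links from a known seed gets found, while a page with no inbound links never does.

```python
from collections import deque

def discover(link_graph, seeds):
    """Breadth-first content discovery: starting from seed URLs,
    follow links until no new URLs appear. link_graph maps each
    URL to the list of URLs it links to."""
    seen = set(seeds)
    queue = deque(seeds)
    while queue:
        url = queue.popleft()
        for target in link_graph.get(url, ()):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

# "/orphan" has no inbound links, so a crawler starting at "/" never finds it.
graph = {
    "/": ["/products", "/blog"],
    "/blog": ["/blog/post-1"],
    "/orphan": [],
}
found = discover(graph, ["/"])
```

The same logic is why internal linking matters: orphaned landing pages depend entirely on sitemaps or external links to be crawled at all.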
All other factors being roughly equal, the faster-loading website wins, simply because users prefer faster-loading websites over slower ones. Speed is a winning factor, which is why steps such as image optimization, browser caching, minifying CSS, JS, and HTML, enabling compression, and keeping redirects to a minimum are among the essential ones.
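Two of those levers, minification and compression, can be sketched in a few lines. The minifier below is a deliberately naive illustration of where the byte savings come from (real builds should use a dedicated minifier), and `gzip_savings` shows the effect of serving a payload with `Content-Encoding: gzip`:

```python
import gzip
import re

def naive_minify(css: str) -> str:
    """Toy CSS minifier: drop comments, tighten whitespace around
    punctuation, collapse remaining runs of whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # strip /* comments */
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)      # no space around { } : ; ,
    return re.sub(r"\s+", " ", css).strip()

def gzip_savings(payload: bytes) -> float:
    """Fraction of bytes saved by gzip compression of a response body."""
    return 1 - len(gzip.compress(payload)) / len(payload)
```

On repetitive markup, the typical case for HTML and CSS, gzip alone routinely removes well over half the bytes on the wire.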
Google likes websites that are popular with users. Maintaining the focus squarely on the interested target audience is one of the most effective optimization techniques. Meeting user expectations throughout their entire experience with the website, starting with the snippet representation, is key. A popular website always stands a chance of prevailing, even in a highly competitive market.
Embracing these five best practices has the potential to prevent disappointing rankings. While applicable to any website, they are far from exhausting the almost limitless possibilities to continuously improve a website's SEO signals. They represent initial steps towards organic visibility growth. Much more advanced methods and techniques can be rigorously applied to outperform and outrank competitors again and again. This is what SEO is about: a never-ending competition for user attention and approval, for connection milliseconds, for citations, for converting traffic links; in short, for excellence in an attempt to put the best foot forward. When done right, it is both enjoyable and immensely rewarding.
Published on Apr 7, 2021 by Kaspar Szymanski