In almost all of my articles, I cause quite a shock with my 50-50 recommendation: unlike the usual practice of dedicating 80% of editing time to new articles and 20% to maintaining the website, I recommend investing at least 50% in nurturing old posts.
With this blazing introduction to republishing, I now have the undivided attention of all participants. And I'll even take it a notch further: it would be better to start with a 20:80 split, i.e. 20% for new articles and 80% for the maintenance of older ones. But most editors find this hard to stick to.
Time and again, I have wondered why editors often find it unsexy to update old articles. And yes, there are plausible reasons for that:
Firstly, because working on old material is not that much fun: Republishing is more like rumination.
Secondly, I admit that it feels more rewarding if, at the end of the day, you have written two new articles about interesting topics rather than having spent the entire day updating an old advisory article. But this is precisely the crux of the two content strategies: in the same amount of time, you can improve far more content, and you will be working on existing material, most of which already counts as a successful asset. In the end, you will certainly be more successful – and success is definitely sexy, right?
Back to the definition: what is republishing? As the term suggests, it means updating an already published article and finding ways to draw attention back to it.
The mantra of content republishing is: "Making the best even better". But before we go into the details, we should first clarify where and when republishing makes sense. Roughly speaking: whenever you have good content that is no longer "fresh" enough. This is almost always the case for blogs with advisory content, shops with guides, and all technical information. And you will always find plenty of new information about such topics on the Web. Right?
Figure 1: Checking the historical rankings using OnPage.org
Regularly monitoring the rankings pays off: A quick update of the article can help it get back to the top in case it drops positions...
The core of republishing is to make a good article even better, more up-to-date, and probably more detailed. That is basically what Wikipedians do with their articles every day, which explains why it is no coincidence that Wikipedia ranks at the top of Google when you search for "Syria" or "Katrina"... The only difference is that Wikipedians constantly strive to update their articles, whereas we only do it when we think it's worth it.
Figure 2: The Wikipedia article for "Syria" ranks at the top
No wonder Wikipedia always ranks so well, even for news topics: its articles are well detailed AND up-to-date – the result of constant republishing.
Before going into WHAT should be done, let’s first look at WHEN and WHICH posts should be re-edited. I differentiate between three key areas:
Google Analytics clearly shows which of your articles bring in the most readers. Courtesy to your users demands that you keep this content current, so check these pages regularly to make sure they are still up-to-date.
The ten key pillars of your articles' visibility:
You must identify your most important keywords and the associated URLs. Even if Google likes these pages today, you have to expect that to change at some point. Therefore, make sure you regularly check whether the pages are still up-to-date.
Clearly popular content:
You should also analyze pages that have many shares and links from other important websites. If a specific article was well received when you first posted it on Facebook, why not give it a decent re-edit?
Set the amount of time you want to devote to the republishing. For instance, if you have 20 to 30 important webpages (which is quite a lot), you can certainly manage to re-edit them all within a month.
Figure 3: Social visibility analysis using Searchmetrics
Facebook users clearly love the article about the analysis of image traffic. You should therefore check in a couple of months whether this is still the case.
If Google adds a new function to the Search Console, we will definitely want to cover it in our SEO book. But do we really have to write a new article? We usually try to avoid posting a separate news article and instead update the original article about the Search Console. This may not be very exciting – but it is certainly useful: it adds detail to the article and makes it more up-to-date. And we have noted that Google (and users) like it, too.
However, if you still prefer to write a new article – maybe because the changes are simply too massive – you should link this news with the original article and make sure to also mention the changes there. I know this sounds like a lot of work – and editors are not exactly thrilled about this.
And by the way: If you were to measure our SEO book based on these tips, you would probably want to blow my brains out. I know. I swear I’ll improve...
It is advisable to regularly (every week) search for keyword-URL combinations whose articles are worth updating. Such keywords include:
1. Threshold keywords:
In simple SEO terms, you should look at keyword-URL combinations that have newly entered positions 11 to 20, since this is where updating pays off the most. Such a ranking shows that you have really gained ground with the keyword – but you still get few or no clicks, since users rarely navigate to the second page of search results. Google likes your article, and this alone is reason enough to make it even better. Maybe that will be enough to reach the first page of search results.
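To make this concrete, here is a minimal sketch in Python of how you might filter a ranking export for threshold keywords. The data structure and the example rankings are hypothetical; in practice you would pull them from your SEO tool or the Search Console.

```python
# Hypothetical ranking export: (keyword, url, position) tuples,
# e.g. as exported from a ranking monitor or the Search Console.
rankings = [
    ("seo checklist", "/blog/seo-checklist", 14),
    ("content audit", "/blog/content-audit", 3),
    ("republishing", "/blog/republishing", 19),
]

# Threshold keywords: newly ranked on page two (positions 11-20),
# where a content update may be enough to push the URL onto page one.
threshold = [(kw, url, pos) for kw, url, pos in rankings if 11 <= pos <= 20]

for kw, url, pos in threshold:
    print(f"{kw} -> {url} (position {pos})")
```

The filter deliberately ignores page-one rankings; those URLs get clicks already and are covered by the loser-keyword check below.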
2. Loser keywords:
The same applies to keywords for which, let's say, you ranked in position 4 the previous week but have now dropped to position 12. In most cases, this is Google's way of indicating that the content has reached its expiry date – in other words: please update it. In my experience, this is also the most important and most profitable signal for republishing. You wouldn't be doing anything wrong if you only paid attention to these loser keywords!
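Detecting such drops is easy to automate. The following sketch compares two hypothetical weekly ranking snapshots and flags every keyword-URL pair that lost five or more positions; the five-position cutoff is my own assumption, not a fixed rule.

```python
# Hypothetical weekly ranking snapshots: {(keyword, url): position}.
last_week = {("search console guide", "/blog/search-console"): 4,
             ("tf idf", "/blog/tf-idf"): 9}
this_week = {("search console guide", "/blog/search-console"): 12,
             ("tf idf", "/blog/tf-idf"): 8}

# A drop of 5+ positions (assumed threshold) suggests the content is stale.
DROP = 5
losers = [(key, old, new)
          for key, old in last_week.items()
          if (new := this_week.get(key)) is not None and new - old >= DROP]

for (keyword, url), old, new in losers:
    print(f"{keyword}: dropped {old} -> {new} ({url})")
```

Run weekly, this gives you a short, prioritized republishing to-do list instead of a gut feeling.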
3. Articles with a poor CTR:
The Google Search Console shows you the click rate of your URLs in relation to their position for each day. It is not good if you are in position 3 but only have a click rate of 5% – a URL in position 8 with a click rate of 20% is doing better. In such a case, the URL in position 3 needs some optimization.
And not just any sort of optimization: Here, it’s all about the click rate in the search results. Therefore, make sure you pay special attention to the title and description.
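One way to spot such URLs is to compare the actual click rate against a rough expectation for each position. The expected CTR values below are illustrative assumptions, not official figures, and the "well below expectation" cutoff is equally a judgment call:

```python
# Rough expected CTR per organic position (assumed illustrative values).
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.08, 5: 0.07,
                6: 0.05, 7: 0.04, 8: 0.03, 9: 0.025, 10: 0.02}

def needs_snippet_work(position, clicks, impressions):
    """Flag a URL whose CTR is well below what its position suggests."""
    ctr = clicks / impressions
    expected = EXPECTED_CTR.get(position, 0.01)
    return ctr < 0.6 * expected  # "well below" = under 60% of expectation

# Position 3 with a 5% click rate should be flagged for title/description work:
print(needs_snippet_work(3, 50, 1000))   # True
# Position 8 with a 20% click rate is fine:
print(needs_snippet_work(8, 200, 1000))  # False
```

Flagged URLs need snippet work first – title and description – rather than a full content rewrite.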
Figure 4: Identify winner and loser keywords using OnPage.org
But be careful: if you optimize threshold and loser keywords indiscriminately, you can fall into a trap. Suppose a URL ranks for one strong keyword and one weak keyword, and the weak keyword lands on your loser list. You should not necessarily start focusing your text on that keyword: this could hurt your strong keyword.
Nonetheless: An overall update of the article is always a good idea...
So let’s assume we have an article, probably even a topic or keyword for which it should rank, and a little time. How do we proceed?
Figure 5: TF*IDF analysis on OnPage.org
Oh, dear! Words like "term weighting", "text optimization", and "document body" are missing in my TF*IDF post. Probably explains the poor ranking...
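For readers who want to reproduce such an analysis themselves, here is a bare-bones TF*IDF calculation in Python. It is only a sketch: real tools like OnPage.org add stemming, stop-word handling, and a comparison against the top-ranking competitor pages.

```python
import math
from collections import Counter

def tf_idf(term, doc, corpus):
    """Plain TF*IDF: how characteristic is `term` for `doc` within `corpus`?
    Minimal sketch without stemming or stop-word removal."""
    words = doc.lower().split()
    tf = Counter(words)[term] / len(words)           # term frequency in doc
    df = sum(1 for d in corpus if term in d.lower().split())
    idf = math.log((1 + len(corpus)) / (1 + df))     # smoothed inverse doc freq
    return tf * idf

# Hypothetical mini-corpus of three article texts:
corpus = [
    "term weighting and text optimization in the document body",
    "how to bake bread at home",
    "image traffic analysis for online shops",
]
# "weighting" only occurs in the first text, so it scores high there:
print(tf_idf("weighting", corpus[0], corpus))
```

Terms that score high on competitor pages but near zero in your own article are exactly the vocabulary gaps a republishing pass should close.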
This sounds like a lot of work. But try and adhere to these nine points: Even if you choose the path with the TF*IDF analysis and internal linking, it will cost you just a fraction of the time you would need to create a new article. I believe you can republish up to five articles in the time you would need to write two new articles. And, hey, it’s totally worth it!
But doesn't Google prefer new articles? No! Why? Particularly in SEO, 500 very good pages are much better than 5,000 half-baked pages, since 500 top-10 positions are worth infinitely more than 5,000 average positions (20 - 30) on Google – even at the same level of visibility...
And it really makes sense: Your existing pages will keep getting better, more and more detailed. They will always be better search targets and will provide more detailed answers to user questions. Isn’t that great?
Published on 07/04/2016 by Eric Kubitz.
... aka "Contentman" is founder and head of CONTENTmanufaktur. He also lectures in SEO at two colleges, offers various SEO and copywriting workshops, and is a frequent speaker at conferences (for example SEOkomm, SMX, SEO-Day). The experienced journalist writes mostly for contentman.de and has compiled his knowledge in a training video for SEO beginners.