Ryte’s new A/B Testing provides you with a data-driven approach to SEO. Test whatever changes you think might affect search performance, and carry out tests to see if they really make a difference.
A/B testing is a great way to easily improve the performance of individual webpages - particularly when you have some optimization ideas in mind but are not 100% sure which measures will actually have a positive impact on performance. Usually, A/B Testing tools are designed to help improve conversion rates of landing pages, but what about evaluating the effect of website changes or improvements on Google? That’s where Ryte’s brand new SEO A/B testing feature can help!
A/B testing (sometimes also referred to as “split testing”) compares two versions of a webpage against each other to determine which one performs better, and is often used to evaluate how and whether certain changes to a landing page lead to a better conversion rate. A/B testing specifically for SEO helps to determine how changes to website content impact search performance metrics such as clicks, impressions, CTR, and position.
Benchmarking which changes to your webpages had an impact on search performance is usually difficult. Although it is not known exactly how the Google algorithm works, experienced SEOs assume that more than 200 different factors influence rankings on Google. The best way to figure out what drives rankings for your content is, of course, trial and error. While A/B tests are a common approach to optimizing CTR in paid advertising channels (such as display ads, retargeting banners, or click baiting), they have - until now - not been a common technique in SEO.
With Ryte’s new A/B testing feature, SEOs can leave their trial and error testing methods behind and easily see how changes to websites impact search performance!
Ryte’s SEO A/B testing feature is available under Search Engine Optimisation, and helps you see whether changes to your website have an impact on search performance metrics such as clicks, impressions, CTR, and position. If you do not yet have access to the feature and are interested in using it, please reach out to us.
You can compare the performance of two segments: one containing URLs where you applied certain changes to the content or page structure, and a "control group" of URLs that you did not change. By comparing the performance of the former group with that of the latter since the date your changes went live, you can see directly whether your changes had an impact!
With the new SEO A/B Testing feature, you can set up your A/B test in a few easy steps:
1. Click on "+ New test" in the top right-hand corner.
2. Create a hypothesis to help you carry out the test in the best way possible - the feature provides guidance on how to do this. For example: "By adding 'free shipping' to the meta titles and descriptions for X pages, we will increase clicks to the website."
3. Set up a test group - consisting of pages that were adjusted - and a control group. You can create the test and control groups as a segment - either by creating a new segment or using an existing one.
4. Enter the start date when the changes were deployed.
5. We now provide recommendations for adjusting your test to help you gain significant results. For example, if we detect duplicate URLs in both groups, this information will be displayed. Adjust the set-up based on the recommendations, then start your test!
You will then see your test results prominently in the report, as well as detailed insights into clicks, impressions, CTR, and position for both the control and test group.
The report also shows the expected average value of the test group after the test was started - i.e. what it would have been had no changes been introduced. This is shown by the grey dotted line (for example in figure 4).
The expected average value for the test group is derived from the development of the control group (the segment of pages you did not apply any changes to). For example, if the average value of the control group increases by 10% after the test start date, we anticipate that the same development would have happened in the test group if you had never applied the changes. We then compare this expected average value of the test group to the actual average value we have measured, and determine whether and to what extent your changes affected the metric you wanted to improve.
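The counterfactual logic above can be sketched in a few lines. This is a simplified illustration of the idea, not Ryte's actual implementation; the function name and figures are assumptions for the example.

```python
def expected_test_value(test_baseline, control_baseline, control_current):
    """Scale the test group's pre-test average by the control group's
    relative change since the test started (the counterfactual estimate)."""
    control_change = control_current / control_baseline  # e.g. 1.10 = +10%
    return test_baseline * control_change

# Example: the control group's average daily clicks rose 10% (200 -> 220)
# after the start date. If the test group averaged 500 clicks before the
# change, we would have expected ~550 clicks without the change.
expected = expected_test_value(test_baseline=500,
                               control_baseline=200,
                               control_current=220)
print(expected)  # 550.0

# If we actually measured 600 clicks, the uplift beyond the expected
# trend is attributed to the change:
actual = 600
uplift = (actual - expected) / expected
print(round(uplift, 3))  # 0.091 -> roughly +9%
```

The key design point is that the test group is never compared to its own past in isolation; it is compared to what the control group suggests would have happened anyway, which filters out seasonality and sitewide trends.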
Once the test has been running long enough to provide significant results, the test is stopped.
The report also shows a summary of your results so you can easily understand how your changes had an impact (if any):
Test duration and statistically significant results
We implemented a prediction of how long an individual test has to run in order to produce valid results - meaning you don’t need to ask yourself how long the test should run. Once enough data has been collected, we compute whether the test results are statistically significant, and show the significance in an easy-to-digest way in the test summary. This means you can tell whether the test really produced positive results, as well as how likely it is that those results were actually driven by the changes you applied to the pages in the test group.
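To make the significance idea concrete, here is one common way such a check can work for a CTR metric: a two-proportion z-test comparing the control and test groups. This is a generic statistical sketch under assumed example numbers, not a description of Ryte's actual method.

```python
import math

def ctr_significance(clicks_a, impressions_a, clicks_b, impressions_b):
    """Two-sided p-value for the difference in CTR between a control
    group (a) and a test group (b), via a two-proportion z-test."""
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    # Pooled CTR under the null hypothesis that both groups are equal
    pooled = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / impressions_a + 1 / impressions_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Control: 400 clicks / 10,000 impressions (CTR 4.0%)
# Test:    480 clicks / 10,000 impressions (CTR 4.8%)
p = ctr_significance(400, 10_000, 480, 10_000)
print(p < 0.05)  # True: significant at the 5% level
```

A small p-value means the observed CTR difference is unlikely to be random noise, which is exactly what "statistically significant" conveys in the test summary.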
Optimize meta titles and descriptions
Do you have some ideas for optimizing your meta titles and descriptions? Maybe adding the year to the title could make a difference by showing that your content is up to date. Try adding "free shipping" to entice users, or work out whether questions perform better than simple statements.
Get inspiration for Google Ads
If you test out different meta titles and descriptions for your organic content and see what works best, you can use this as inspiration for Google Ads. That way, you can find out what works best before investing significant amounts into Google Ads.
Carry out data driven tests to report easily to superiors and clients
Are you tired of not being able to prove with concrete data that your ideas work? When you want to try out new ideas - whether it’s optimizing meta titles or descriptions, or adding more visuals to pages - you can now use data to justify the changes you’re making.
Set up A/B tests to get more insights into Google’s ranking factors!
If you’re really curious about investigating ranking factors, A/B testing is the ideal solution. Gather your ideas on what you think really impacts rankings or clicks, and carry out tests to gain more insights into factors that can affect success in the SERPs.
If I start an AB test today but put the start date a few weeks or months ago, when will I get test result data?
If GSC has been connected since that day, in most cases there will already be a result, but it depends on the metric and how strong the effect is. We check the effect size (the increase) up to today, use that to predict the sample size that would have been needed at the time of the change, and check whether the collected data exceeds that necessary sample size. If it does, we go back day by day to find the first date on which we had enough data.
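The search for the earliest date with enough data amounts to finding the first day on which the cumulative sample met the required size. A minimal sketch of that logic, assuming per-day sample counts and a precomputed required sample size (both illustrative; this is not Ryte's code):

```python
from datetime import date, timedelta

def first_day_with_enough_data(start, daily_samples, required_sample_size):
    """daily_samples: per-day sample counts from `start` onward.
    Returns the first date at which the cumulative sample meets the
    required size, or None if it never does."""
    cumulative = 0
    for offset, samples in enumerate(daily_samples):
        cumulative += samples
        if cumulative >= required_sample_size:
            return start + timedelta(days=offset)
    return None

# Example: a test back-dated to 1 April with ~1,000 impressions per day
# and a predicted requirement of 4,500 samples reaches significance on
# the fifth day.
start = date(2021, 4, 1)
print(first_day_with_enough_data(start, [1000] * 14, 4500))  # 2021-04-05
```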
Can the test run for more than 7 days if the results are still not significant?
Tests can take longer than 7 days, especially for clicks and impressions. For CTR and position, in most cases the test will finish within 7 days.
There are endless ways of making the most of the A/B Testing feature. A few examples where you might want to carry out a test include testing different product descriptions, testing different meta titles and descriptions, or seeing whether adding videos or images makes a difference. Whatever change you make, you can set up an A/B test under Search Engine Optimisation to see whether it had an impact.
Watch our webinar recording
Published on 05/19/2021 by Olivia Willson.
Who writes here
After studying at King’s College London, Olivia moved to Munich, where she was part of the Ryte team until 2021. She was in charge of product marketing and CRO, and also helped out with SEO and content marketing. When she's not working, you can usually find her outside, either running around a track or hiking up a mountain.