Your content strategy is no place for guesswork, gut feeling, or intuition. After all, your hunch about that particular headline, CTA or even the time of posting could be way off. And what does that mean for your content? Poor results, whether that’s low returning traffic, diminished conversions, fewer people sharing your content, or something else.
But how can you tell if your content resonates with your audience enough that they engage with it, share it, click on your CTA and so on?
This article will teach you how to A/B test your content strategy.
How to A/B Test Your Content Strategy
You can test virtually any element of your content, from headlines to content type and length, opt-ins, and CTAs. But before we get into that, you need to understand what A/B (or split) testing actually is.
- Pick one variable to test
A/B testing is a type of conversion rate optimization (CRO) testing in which you pit two versions of the same piece of content against each other, changing only one variable between them. All other variables must remain the same. If you’re testing more than one variable, that’s called multivariate testing.

For instance, you could test different calls to action, such as “sign up for a free trial” vs. “subscribe for a free trial”, move your CTA to the sidebar in one version and to the bottom in the other, or change the alignment (left vs. right) of your in-post images.
- Determine the “control” and “challenger” and split them equally
In each of these cases, one version of content would be the “control” version, while the other is the “challenger”. The two should be evenly split, 50/50 between a predetermined number of your visitors. In other words, if you’re testing on 100 site visitors, you need to show the control version to 50 of them and the challenger version to the other 50.
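In practice, the 50/50 split is usually done by hashing a visitor identifier so each visitor is assigned a variant deterministically and sees the same version on every visit. Here is a minimal sketch of that idea; the function and experiment name are hypothetical, not part of any specific tool:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "cta-test") -> str:
    """Deterministically assign a visitor to 'control' or 'challenger'.

    Hashing the visitor ID (salted with the experiment name) guarantees
    the same visitor always sees the same version, and the assignment
    works out to roughly 50/50 across all visitors.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "control" if int(digest, 16) % 2 == 0 else "challenger"

# The same visitor always lands in the same bucket:
assert assign_variant("visitor-42") == assign_variant("visitor-42")
```

Salting with the experiment name means the same visitor can land in different buckets across different experiments, which keeps your tests independent of one another.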
- Decide how long to run the test
You also need to figure out how long to run the test. For example, one hour might be fine if you’re testing two different headlines of the same article, but for something like an opt-in or CTA, or an entire web page, you’ll probably have to leave the test to run a bit longer, like a week.
Stopping the test prematurely, or simply not running it long enough, can give you a skewed picture. It’s important to let the test run its course so it produces useful data from a significant sample size. A decision based on 100 tested visitors is far less reliable than one based on 1,000.
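One way to decide how long to run the test is to estimate the required sample size up front. A rough sketch, using the standard two-proportion formula (the baseline conversion rate and the lift you want to detect are assumptions you supply, not values from this article):

```python
import math

def min_sample_size(baseline_rate: float, min_detectable_lift: float,
                    z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Approximate visitors needed PER VARIANT for a two-proportion test.

    z_alpha=1.96 corresponds to 95% confidence; z_beta=0.84 to 80% power.
    min_detectable_lift is the absolute improvement you want to detect.
    """
    p = baseline_rate
    delta = min_detectable_lift
    # Sum of variances of the two conversion rates being compared
    pooled = p * (1 - p) + (p + delta) * (1 - p - delta)
    n = ((z_alpha + z_beta) ** 2) * pooled / delta ** 2
    return math.ceil(n)

# E.g. a 5% baseline conversion rate, detecting a 1-point lift (5% -> 6%):
print(min_sample_size(0.05, 0.01))  # -> 8146 visitors per variant
```

Then run the test until each variant has collected at least that many visitors; with low traffic, that may take weeks rather than days.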
- Choose the metric to follow and how meaningful the result needs to be
Before you run the test, identify your goal and the primary metric you will focus on. Decide where you want that metric to be when the test ends, and measure your results by how close (or far) you got to that goal.

Setting the goal up front keeps the test objective: you won’t be tempted to pick whichever metric happens to favor one version after the fact.
Speaking of results, think about how significant they have to be to validate one version over the other. Aim for a confidence level of at least 95%, or even higher whenever possible. Think about it this way: would you rather bet on a team you’re 20% sure will win the next game, or one you’re 95% sure about?
- Declare the winner (if any) and take action
The A/B test can have two outcomes. Either you’ll find a clear winner and kill the losing variant, or neither version will prove significantly better, meaning the tested variable has little to no effect on the results. In the latter case you have two options: call it a day and carry on with the original version, or run another split test with different variations.
Okay, now that you hopefully understand how to conduct such a test, let’s take a look at which content elements you should A/B test.
What Content Elements Should You Test?
- Headlines

Headlines are the first (and often the last) thing someone will see of your content. You want them to draw the reader in, not be like those billboards people drive past without a glance.
OkDork analyzed 1 million blog post headlines and found that the most popular words/phrases in highly shared headlines (over 1,000 shares) include “list post” (11.10%), “you/your” (6.74%) and “free/giveaway” (3.60%).
In addition, headlines with a higher emotional marketing value (EMV) are shared much more than those with lower EMV. You can test your headline’s EMV with the Advanced Marketing Institute’s Emotional Marketing Value Headline Analyzer. Another great tool for determining whether your headline has high share potential is CoSchedule’s Headline Analyzer. I highly recommend using both.
- Images (and other visuals)
An image is worth a thousand words. After all, a Facebook post with an image gets more than twice the engagement of one without, and tweets with images are retweeted 150% more than tweets without them (source).
That’s nothing to scoff at, but just having an image doesn’t necessarily mean your post will get shared at all. There are plenty of blog posts that contain images that never get shared or get shared very little. A good A/B test can tell you what you’re doing wrong.
For instance, you might try a different type of image altogether. Perhaps a graphic or a screenshot would work better than a regular photo? Another thing to test is the placement of the image: is left alignment better than right, or should you center it? What about the number of images you use per post? You also need to consider the image-to-text ratio in your content, especially when sending out emails. Informz recommends that 80% of your email be text and 20% images.
- Call to Action (CTA)

The call to action is another content element you need to split test to find the best variation. A very small change here can make a huge difference. For example, Fab, an online community for buying/selling apparel, accessories and collectibles, increased its CTR by 49% by adding “Add to Cart” text to its CTA button.
You can test any number of things when it comes to the CTA, including text vs. no text, color, images, where on the page to place the button, and so on.
- Signup Form
Having an email list is hugely beneficial in content marketing, which is why most websites place a signup form somewhere on their pages. Unfortunately, most signup forms sit ignored and forgotten, never to be filled in.
There can be a number of reasons for this. Maybe the message doesn’t “click” or fails to persuade the prospect. Or perhaps it lacks the social proof your ideal prospect would respond to. To find out what actually works, put your email signup form to the A/B test.
- Posting Schedule and Frequency
A lot of blogs stick to a certain posting schedule, and that’s a good thing: users prefer blogs that publish regularly over those that do it haphazardly. But you may fall into the trap of sticking to a schedule that doesn’t work. Are you sure it’s best to publish on Mondays rather than Wednesdays? Should you be avoiding Fridays? Try posting on different days and see what happens.
The same goes for posting frequency. If you’re posting once per week, try doing it twice per week. Perhaps this will expose your content to more people. On the other hand, posting too often might overwhelm your reader base, or you might not be able to keep up the quality.
We test things, and get tested ourselves, every day. It’s how we decide to buy this car and not that one, watch this movie and not another, and so on. The same goes for your content. A small change can make all the difference, but you won’t know until you put it to the test. That’s what A/B testing is for.
I hope this post gave you a good idea of how to A/B test your content strategy and that you’ll be able to apply it to your own content. If you found the article useful, don’t forget to share it on Facebook, LinkedIn, Twitter or Google+.