Over recent years, the increased ability to measure the success of marketing efforts has changed the way we, as marketers, do our jobs. With many traditional advertising methods, measurement is handled by outside entities such as Nielsen ratings, ad-retention exercises, and similar techniques; these measure samples of a population, but they are still just samples. In today's digital marketing scene, success can be measured in actual numbers through click-through rates.
In addition, advertising research can now be conducted online, using click-through rates to determine the effectiveness of an advertisement. The method I'm talking about here is called A/B testing.
A/B testing follows the same concept as a scientific study. You have a control group that views your normal website or advertisement while you simultaneously expose a test group to an alternative version of your media that has been altered in one specific way. By measuring which version is more successful at accomplishing the goal you have identified, you can decide which version to use for the entire population.
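To make that split concrete, here is a minimal sketch in Python of how a site might divide visitors between the two versions. The visitor-ID format and function name are illustrative assumptions, not any specific tool's API:

```python
import hashlib

def assign_variant(visitor_id: str, split: float = 0.5) -> str:
    """Deterministically assign a visitor to 'A' (control) or 'B' (variant).

    Hashing the visitor ID, rather than flipping a coin on every visit,
    guarantees the same visitor always sees the same version.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 10_000 / 10_000  # value in [0, 1)
    return "A" if bucket < split else "B"

# The same ID always maps to the same variant.
assert assign_variant("visitor-42") == assign_variant("visitor-42")
```

Because the assignment is a pure function of the ID, no server-side state is needed to keep a visitor's experience consistent across sessions.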
What’s great about this form of testing is that it can be done live. There is no need to wait for trials to finish before deciding whether to roll out a change; you simply adapt as you discover which stimulus works in a given circumstance. This method ends up saving you a lot of time and money on marketing research.
Tips for Success
To make sure that your results are as representative as possible, here are some things to think about:
- Make sure that you are only testing unique visitors. You don’t want to confuse your regular visitors by changing up formats on them unless you have determined that it is the most effective means of achieving your goals.
- Trust the data. Your results may seem to go against what you believe is the best approach, but keep in mind that there is a reason why one version was more popular than the other, and it is risky to ignore that data.
- Make sure to only test two versions at a time, and make sure those versions are tested simultaneously to ensure the accuracy of the data.
- Don’t draw conclusions too early. Sometimes it takes a while to see a statistically significant difference in a test. Be patient; your results will be more accurate.
- Remember to keep experimenting. Once your first experiment is completed, come up with a new one and run with it. An iterative process to experimentation will ensure that your media continually improves, and by improving, you will begin to achieve your goals more efficiently.
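On the "statistically significant difference" point above: a common way to check it is a two-proportion z-test, which asks whether the gap between the two conversion rates is bigger than chance alone would explain. Here is a minimal sketch using only the Python standard library; the conversion counts are made up for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is B's conversion rate significantly
    different from A's? Returns the z statistic."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under "no difference"
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: 200/5000 sign-ups for A vs. 260/5000 for B.
z = two_proportion_z(200, 5000, 260, 5000)
print(abs(z) > 1.96)  # True: significant at the 95% confidence level
```

If |z| stays below 1.96, the difference could easily be noise, which is exactly why the tip above says to keep the test running rather than call a winner early.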
How to Get Started
There are currently a number of great tools out there to help you A/B test effectively.
Interesting Examples of A/B Test Results
Let’s take a look at some interesting examples of A/B testing that challenge our usual way of thinking.
Believe it or not, Version B increased the sign-up rate by 38% because even though it is bulkier, the headline is shorter and the sub-header bolds only the most important information.
Adding social proof to Version B helped this company increase its sales by 34%. However, sometimes simplicity can be more effective: in another example, a company selling insurance showed competitor pricing alongside consumer ratings, and this use of social proof ended up confusing customers who only wanted to compare the two on price.
This case is interesting because you would think that a certification of privacy would help conversion rates. However, people tend to associate those seals with payments, and even though they didn’t have to pay anything, they were less willing to sign up. The result was a 12.6% increase in sign-ups through Version B.
This is a great example of how far color can go. It is much harder to read Version A’s “Get it Now” button: the light green makes the white lettering difficult to read, which led to Version B getting a 14.5% increase in conversions.
(These examples were done by Which Test Won)
As you can see, some winning choices are counter-intuitive, and it is important to remember that data speaks louder than a gut feeling in many cases. My advice is to trust the data when it comes to A/B testing. Now go out and try A/B testing on your business, and see what kinds of results you get in achieving your goals.
Thanks for reading!