A/B testing (sometimes also called split testing) is comparing two versions of an ad creative to see which one performs better. You compare the two variants (let's call them A and B) by showing them to similar visitors at the same time. The one with the better conversion rate wins!
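If you're curious what "a better conversion rate" means in practice, here is a minimal Python sketch that compares two variants with a standard two-proportion z-test. The visitor and conversion counts are made-up illustration values, not real campaign data:

```python
# Compare two ad variants by conversion rate using a two-proportion
# z-test, built from the standard library only.
from math import sqrt, erf

def conversion_rate(conversions: int, visitors: int) -> float:
    """Conversions divided by visitors."""
    return conversions / visitors

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference in rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal CDF, expressed with erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: A converted 120 of 2,400 visitors,
# B converted 156 of 2,380.
z, p = two_proportion_z_test(120, 2400, 156, 2380)
print(f"A: {conversion_rate(120, 2400):.2%}  B: {conversion_rate(156, 2380):.2%}")
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 suggests a real difference
```

A small p-value here tells you the gap between A and B is unlikely to be random noise, which is what "performs better" should mean before you commit to a winner.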

Well-thought-out A/B testing can make a huge difference in the effectiveness of your advertising efforts. A/B testing allows you to narrow down the most effective elements of your message and then combine them to make your marketing more profitable and successful.

The first step is creating a list of things you'd like to test. It's important to decide early on what you'll be testing and what you'll consider a success. This can include the placement of the call to action, the specific language and imagery used in the ad, or the call to action itself: you might compare two different headlines, or a "Learn More" button against a "Get Involved" button.

Bearing in mind that this is a process, it's quite common to carry out multiple A/B tests before making a final decision or final change. This data-driven decision-making process often throws up surprises. Split tests must always run simultaneously to account for variations in timing. We don't recommend testing one variation on one day of the week and the other the next day, because you can't account for variables that may change between those two days. Always split the traffic that sees your variations at the same time, as in the sketch below.
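One common way to split traffic at the same moment, rather than by day, is to bucket each visitor deterministically by a hash of their ID. This is an illustrative sketch only: the visitor_id value is a hypothetical example, not a field every ad platform exposes.

```python
# Split live traffic 50/50 between two variants as visitors arrive,
# assuming some stable per-visitor identifier is available.
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically assign a visitor to "A" or "B".

    Hashing the visitor ID keeps the split roughly 50/50 on average
    while showing each visitor the same variant on repeat visits.
    """
    digest = hashlib.sha256(visitor_id.encode()).digest()
    return "A" if digest[0] % 2 == 0 else "B"

# Every visitor is bucketed the moment they arrive, so both variants
# run simultaneously rather than on different days.
for vid in ["visitor-001", "visitor-002", "visitor-003"]:
    print(vid, "->", assign_variant(vid))
```

Because assignment is deterministic, a returning visitor never flips between variants mid-test, which would muddy your conversion numbers.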

It's also worth noting that A/B testing is not an overnight project. You may want to run tests for anywhere from a few days to a few weeks, depending on the length of your campaign. Most importantly, you should only run one test at a time for the most accurate results.
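As a rough illustration of why tests need that long, here's a back-of-the-envelope sample-size sketch. The baseline and target conversion rates are made-up assumptions, and 1.96 and 0.84 are the usual z-values for 95% confidence and 80% power:

```python
# Estimate how many visitors each variant needs before a difference
# of a given size becomes detectable, using the standard
# two-proportion sample-size approximation.
def sample_size_per_variant(p_base: float, p_target: float,
                            z_alpha: float = 1.96,   # 95% confidence
                            z_power: float = 0.84) -> int:  # 80% power
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    n = (z_alpha + z_power) ** 2 * variance / (p_base - p_target) ** 2
    return int(n) + 1

# Hypothetical: 5% baseline conversion, hoping to detect a lift to 6%.
print(sample_size_per_variant(0.05, 0.06))  # roughly 8,000 visitors per variant
```

With modest daily traffic, accumulating thousands of visitors per variant simply takes days or weeks, which is why cutting a test short usually means deciding on noise.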
