Building a website or designing creative for display ads means making decisions about everything from the specific wording of section headers and calls to action to the colour and placement of buttons and other features. But have you ever wondered how your choices translate into customer leads and conversions, or whether wording or design tweaks could drive even better business results?
Scientifically testing this out may seem tricky at first, given the number of other possible variables that play a role in consumer behaviour. Let’s say you want to test whether potential customers are more likely to click through to your website if the call to action in a display ad tells people to click for a “free quote” vs. an “expert quote.” You make the change from “free” to “expert” and see 25 per cent more requests in the two weeks that follow, compared to the same period of time with the previous wording.
At first it may seem potential customers are more responsive to the word “expert.” But how do you know some other factor isn’t at play? For example, let’s say the business in question is a home heating company and two days after they launched the new ad, temperatures outside dropped from pleasant fall conditions to near freezing. How much of the increase in clicks can be attributed to the change in weather, and how much is because of the new wording? At a glance, it’s impossible to say.
Enter A/B testing.
A/B testing means comparing two versions of the same web page, display ad, or search ad to see which performs better. Visitors are randomly shown one version or the other, allowing you to measure and compare how they interact with each. You can test time spent on a website, click-throughs, forms completed, or any other desired action. The key, however, is that both versions run at exactly the same time. This lets you rule out other possible factors as the reason for any difference in behaviour.
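For the technically curious, the random-but-consistent split described above can be sketched in a few lines of Python. This is an illustrative sketch only, not how any particular testing tool works: the experiment name and the hashing approach are assumptions, and real platforms also handle sample sizes, persistence, and reporting for you.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "cta-wording") -> str:
    """Deterministic 50/50 split: the same visitor always sees the same version.

    Hashing the visitor ID (rather than flipping a coin on each visit) keeps
    the experience consistent for returning visitors while still splitting
    traffic roughly evenly between the two versions.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def conversion_rate(conversions: int, visitors: int) -> float:
    """Share of visitors who took the desired action (click, form, etc.)."""
    return conversions / visitors if visitors else 0.0
```

Because both versions are served to randomly assigned visitors over the same period, weather, seasonality, and other outside factors affect each group equally.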
Back to the heating example: if the company had used A/B testing, they would have found that while one out of every eight people, or 12.5 per cent, who saw the "expert" ad clicked, one out of every five, or 20 per cent, of those offered a free quote clicked through. Without proper testing, it would have been easy to credit the new wording for the increased clicks when people were probably just really cold! Moving forward, the company can confidently run the "free quote" ads, knowing that customers really do respond better to that wording.
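The comparison above can also be checked for statistical significance, to confirm the gap between 12.5 per cent and 20 per cent isn't just chance. The sketch below uses a standard two-proportion z-test with hypothetical sample sizes (1,000 impressions per version, which the article does not specify); testing platforms typically run a check like this for you.

```python
from math import sqrt, erfc

def two_proportion_z(clicks_a: int, n_a: int, clicks_b: int, n_b: int):
    """Two-sided z-test comparing conversion rates of versions A and B."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)      # rate under "no difference"
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))                   # two-sided p-value
    return z, p_value

# Hypothetical volumes: 125/1000 clicks on "expert", 200/1000 on "free quote".
z, p = two_proportion_z(125, 1000, 200, 1000)
```

A small p-value (conventionally below 0.05) means the "free quote" wording's advantage is very unlikely to be a coincidence.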
What kind of items can you test? On your website, you can use A/B testing software to see how customers respond to everything from where you position your call-to-action buttons to section names and more. For search engine marketing, you may want to test different keyword buys or vary the wording and calls to action on ads served for the same searches. The same goes for the creative on digital display ads. And if you're using Yellow Pages Search Engine Marketing or Smart Digital Display ads, the process is extra simple: Yellow Pages creates the ads and manages the campaign on your behalf, while your business benefits from new customer insights.
Start getting better results
A/B testing allows you to zero in on how different creative or deployment strategies affect your website and advertising performance. Yellow Pages lets you benefit from the enhanced customer insights and results while managing the complexities of testing on your behalf.