How to increase ad revenue by A/B testing placements

We’re serious advocates of A/B testing. A single ad placement can make a huge difference to your ad revenue and user experience, so if you have any uncertainty about which ad placements will perform best on your site, then proper testing is the only way forward.

However, it’s vital to run the right tests in the right way, so that you collect meaningful data from which you can draw sound conclusions. Here are the top tips from our revenue optimization team, who conduct A/B tests for publishers and breathe data on a daily basis.

DO A/B test units that are directly comparable. For example, if you want to test in-content units on your article pages, then you could test an inline placement on half the traffic and an in-view placement on the other half of the traffic.
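One common way to split traffic in half is to assign each visitor to a variant deterministically, by hashing their visitor ID. This is a minimal sketch, not our actual implementation; the function and variant names are illustrative:

```python
import hashlib

def assign_variant(visitor_id: str, test_name: str = "in-content-test") -> str:
    """Deterministically bucket a visitor into variant A or B.

    Hashing the visitor ID (salted with a test name) means the same
    visitor always sees the same placement across page views, while
    traffic splits roughly 50/50 overall.
    """
    digest = hashlib.sha256(f"{test_name}:{visitor_id}".encode()).hexdigest()
    return "inline" if int(digest, 16) % 2 == 0 else "in-view"

# The same visitor always lands in the same bucket:
assert assign_variant("visitor-123") == assign_variant("visitor-123")
```

A deterministic split like this avoids contaminating the data with visitors who bounce between variants, which cookie-free random assignment would cause.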

DO keep other variables constant. By controlling the test environment, it will be clear what’s causing any difference in the results. So if you want to test different sized units, for example a 300×250 against a 300×600, then you would use the same type of placement in the same position on the page, and serve it in the same situations.

DO decide ahead of time how much data you want to collect before you analyze the results. Will you run the test for a predefined number of days or weeks, or until you have a statistically significant number of ad impressions?
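If you stop when the result is statistically significant, you need a concrete test to check against. Below is a minimal sketch of one standard approach, a two-proportion z-test on click-through rate; the numbers are illustrative, and for revenue-per-impression comparisons you would use a different test:

```python
from math import sqrt, erfc

def two_proportion_p_value(clicks_a: int, imps_a: int,
                           clicks_b: int, imps_b: int) -> float:
    """Two-sided z-test: is the CTR difference between two placements real?"""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    return erfc(abs(z) / sqrt(2))  # two-sided p-value

# Example: 600 clicks on 100k impressions vs 500 clicks on 100k impressions.
p = two_proportion_p_value(600, 100_000, 500, 100_000)
# Keep collecting data until p drops below your chosen threshold (e.g. 0.05).
```

The key discipline is picking the threshold and metric before the test starts; checking the p-value repeatedly and stopping the moment it dips under 0.05 inflates the false-positive rate.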

DO decide in advance what you would define as a successful test. In other words, what is your criterion for choosing which placement to serve 100% of the time? Are you looking for the placement that generates the highest ad revenue, or is your priority to find the one that least impacts the user experience?

DON’T conduct random, arbitrary A/B tests, because these won’t give you interesting conclusions. You won’t learn much from testing a sticky footer on mobile against an inline placement on all devices. The inline placement would likely generate more ad revenue than the sticky footer, but it’s not a meaningful, like-for-like comparison.

DON’T go overboard and test too much at once because different combinations of ad placements can affect the performance of each other. We recommend running one A/B test at any given time. Two tests can also work if the placements are in different locations on your web pages and not visible together.

DO contact us here if you have any questions at all or are interested in testing out the effect of new ad placements on your website! If you are an existing publisher, you can get in touch directly with your revenue optimization manager, who will be happy to help.