
When in Doubt, Experiment - How A/B Testing Can Lead to Unexpected Success

The FieldTest platform was built to be a full-featured ad service, but sometimes we find that our clients need a little push to use it to its full potential. Often, clients come to a campaign with a preconceived set of ideas about who their audience is and what appeals to them. This makes sense: if you’ve built your brand from the ground up, you probably have a solid understanding of who has purchased your products and who you designed them for. Still, by letting this be the end of the equation you risk leaving money on the table, money that could be picked up fairly easily with some creative and targeting experimentation.

How do you pick up this extra money? With a process called A/B testing. This week we will walk you through the process of A/B testing and the steps required to use your learnings to mine unexpected successes from your test campaigns. These can be some of the most powerful tools in your arsenal for expanding your customer base and bringing new customers to your site, often ones you may never have expected. Let’s get started.



What is A/B Testing?

A/B testing is a process within the FieldTest Platform where you build out two similar ad campaigns with one crucial difference and compare the performance of the two. This difference can be any of a number of variables in your ad campaigns; the important thing is to watch and compare the results and use the learnings to inform your future campaigns. These tests can provide crucial insight not only into the performance of potential ad groups, creative directions, and budgeting priorities, but they can also shine a light on your evergreen campaigns and provide a good benchmark for your standard ad campaigns’ performance.

For example, say you run a hiking shoe company. You may have already decided that your key audiences are hikers who regularly invest in quality shoes. Makes perfect sense, and you probably already have an ad campaign set up to approach these intenders. This should not be the end of the road for your brand! From here you can experiment with exciting new approaches to advertising: Try some new headlines or new images and target them to your current buyers. Try targeting some new intenders and develop new creatives to appeal to them, along with a custom landing page to drive them to convert. Or simply experiment with new ad formats such as custom display banners and video ads. All of these experiments can then be compared against each other to determine new directions for your ad budget, with your evergreen campaigns serving as a general bellwether of performance.
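
FieldTest measures how each variant performs for you, but if you like to sanity-check the comparison yourself, the math underneath is straightforward: compare each variant’s conversion rate and ask whether the gap is bigger than random noise would explain. Below is a minimal Python sketch of that idea using a basic two-proportion z-test; the variant names and numbers are made up for illustration, and this is not FieldTest’s own reporting code.

```python
import math

def conversion_rate(conversions, impressions):
    """Simple conversion rate: conversions per impression."""
    return conversions / impressions

def two_proportion_z(conv_a, imp_a, conv_b, imp_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / imp_a, conv_b / imp_b
    pooled = (conv_a + conv_b) / (imp_a + imp_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imp_a + 1 / imp_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical numbers for two hiking-shoe ad variants.
variant_a = {"impressions": 40_000, "conversions": 220}  # current headline
variant_b = {"impressions": 40_000, "conversions": 278}  # new headline

z, p = two_proportion_z(variant_a["conversions"], variant_a["impressions"],
                        variant_b["conversions"], variant_b["impressions"])

print(f"A: {conversion_rate(variant_a['conversions'], variant_a['impressions']):.3%}")
print(f"B: {conversion_rate(variant_b['conversions'], variant_b['impressions']):.3%}")
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p-value means the gap is unlikely to be noise
```

A small p-value here simply means the difference between the two conversion rates is unlikely to be chance, which is the signal you want to see before shifting budget toward the winner.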

Isn’t it better to play it safe?  

It’s easy to call it a day once your evergreen campaigns are set and turning a positive ROAS. However, doing so leaves many opportunities to uncover new customers untouched. One recent client who upgraded from Platform to Platform Plus was a CBD brand whose campaign was targeted entirely to Los Angeles, New York, and Chicago, the largest DMAs in the country. Their logic? The largest cities are likely the most friendly to CBD products and therefore the most likely to buy them. This makes sense and served as a solid foundation for a great campaign, but we knew that if we tried something new and experimental we could deliver even better results.

We used a portion of their monthly ad budget to run a small nationwide test campaign, and the results were surprising. Around 25% of the sales from this nationwide campaign came from small suburban towns in Texas, one of the last places this client would have thought to target. By taking this data point and using it to inform our next moves, we were able to help them set up a very successful campaign targeting small suburban towns in more conservative parts of the country, increasing total ROAS considerably. Lessons like this are extremely valuable for brands seeking to achieve scale, and they are not possible without devoting a portion of your budget to trying new things and experimenting.
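
The analysis behind a finding like that doesn’t need to be complicated. As a rough, hypothetical illustration (the export format and field names below are invented, not FieldTest’s), you can group a test campaign’s conversions by market and look at each market’s share of sales:

```python
from collections import Counter

# Hypothetical export of conversions from a nationwide test campaign.
# Each record carries the DMA (media market) the sale came from.
test_conversions = [
    {"order_id": 1, "dma": "Los Angeles", "revenue": 68.0},
    {"order_id": 2, "dma": "Waco-Temple-Bryan", "revenue": 54.0},
    {"order_id": 3, "dma": "New York", "revenue": 72.0},
    # ... the rest of the export
]

def sales_share_by_dma(conversions):
    """Return each DMA's share of total conversions, largest first."""
    counts = Counter(row["dma"] for row in conversions)
    total = sum(counts.values())
    return sorted(((dma, n / total) for dma, n in counts.items()),
                  key=lambda pair: pair[1], reverse=True)

for dma, share in sales_share_by_dma(test_conversions)[:10]:
    print(f"{dma}: {share:.1%} of test-campaign sales")
```

Markets that punch far above their expected weight in a breakdown like this are exactly the surprises worth building the next campaign around.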


So what’s the right test budget for me?


Budget is an important piece of this puzzle. If your starting FieldTest budget is $1,000, you probably don’t want to A/B test across two dozen different creatives, since that will dilute the data you get back and make it far less valuable from an optimization and learning perspective. For more modest budgets ($500-$3K), we suggest using no more than four sets of creative, with each set having an A/B testing component (for a total of eight creatives for the campaign).

For budgets of $3K+, you have more room to experiment. Here you might even want to run an A/B/C test with a third variant if you are torn between three types of messaging and want to figure out which one is working. Still, even with larger budgets it helps to err on the conservative end of how many creatives you end up using. Remember: you are going wider with your audience to see which customers are responding best. The more creatives you add into the mix, the more complexity you introduce into your campaign, and the weaker the results you get back will be.
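
If it helps to see that rule of thumb spelled out, here is a tiny sketch that maps a budget to a suggested creative plan. The exact thresholds and the under-$500 case are assumptions for illustration, echoing the guidance above rather than a hard FieldTest rule:

```python
def suggested_creative_plan(budget_usd):
    """Rough rule of thumb from this post, expressed as code.
    Treat the numbers as a starting point, not a hard rule."""
    if budget_usd < 500:
        # Assumption for illustration: below the modest range, stay very lean.
        return {"creative_sets": 2, "variants_per_set": 2}
    if budget_usd <= 3_000:
        # Modest budgets: up to four sets, each A/B tested (eight creatives total).
        return {"creative_sets": 4, "variants_per_set": 2}
    # Larger budgets: room for an A/B/C test, but still err on the conservative side.
    return {"creative_sets": 4, "variants_per_set": 3}

plan = suggested_creative_plan(1_000)
print(plan, "->", plan["creative_sets"] * plan["variants_per_set"], "creatives in total")
```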

Ultimately, as long as you are putting thought into how much budget you are spending versus how many creatives you choose to run, you will be in great shape.


Experiment, test new ideas, and succeed

One of the greatest benefits of digital advertising is the sheer volume of consumers available to advertisers willing to reach them. While many marketers intuitively aim their advertising only at a select audience, truly savvy marketers know the scale of opportunity in reaching out to new audiences and constantly testing new ideas. By always trying new things you keep your customer base fresh, diverse, and always looking to you for the next exciting development and intriguing new product. It also stands to increase your total footprint and continually grow your customer base and ROAS.

Reach out below to find out how FieldTest can help you think up new creative concepts to A/B test and start bringing in exciting new customers now!