The single most honest (and infuriating) answer to nearly every marketing question is, “it depends.”

That’s because there’s no one right answer when it comes to marketing. What works for one business won’t necessarily work for another.

Customers and audiences may differ. Even within your own audience, factors such as region and age can affect how someone will react to your email marketing campaigns.

Many variables come into play with every email campaign, and we’re often left asking ourselves whether we’re doing the best we can. Is your email marketing connecting with your subscribers, customers and leads?

What Happens in Vegas…

For the past couple of years, we’ve been lucky enough to attend the MarketingSherpa Summit in Las Vegas. We learned a great many things, but one point that especially stood out was from MECLABS founder and managing director, Flint McGlaughlin:

“There are no such things as expert marketers. There are only marketers with experience.”

You may think you have all the answers, but really, we’re all playing guessing games. Granted, some of us make more educated guesses than others.

At the end of the day, we all have a unique offering and we need to find a way to connect on a personal level with our core customers.

One True Answer

One key pattern emerged at the MarketingSherpa event: an answer more accurate than “it depends.”

The answer is testing.

From small businesses to international corporations, nobody found the answers they were seeking without testing.

Tests ranged from signup forms to email copy, and even to the number of steps a conversion process needed to establish trust.

The bottom line is that testing is king.

Consumer Reports Boosts Donations with A/B Testing

At MarketingSherpa Summit 2016, we got to participate in what Austin McCraw, Senior Director of Content Production at MECLABS, called “the largest collaborative A/B test on the planet” for Consumer Reports.

The goal was to increase donations to Consumer Reports by testing, and hopefully improving, the email campaigns they send to solicit them.

First, the crowd was polled on various value propositions. The first set of tests was run based on the crowd’s favorites.

Next, we voted on various email treatments for the campaign. Tests were run on each. Throughout the testing, we learned some things that did not work … and ultimately what did.

That’s right. By the end of the two-day event, we had helped Consumer Reports increase revenue per donation by 32%!

Increase Open Rates with A/B Testing

The Apollo Education Group is the parent company of the University of Phoenix, among others. They had been unable to run A/B tests because of old, outdated tools. After upgrading to a new email service provider (ESP), they soon found out how easy A/B testing can be.

The first tests they ran were on simpler elements, such as subject lines and ‘from’ names. They were impressed by how much testing the ‘from’ name improved their engagement.

From there, the Apollo Group ran A/B tests on whether the unsubscribe link performed better at the top or the bottom of their campaigns. They kept going, establishing a meeting to review results and understand what was working.

Speaking of success, the Apollo Group achieved:

  • A 39% overall increase in open rates thanks to ‘from’ name testing
  • A 58% overall increase in click-through rates from testing their email templates
  • A 9% increase in open rates from adding a name to the preheader

MVMT Watches See Movement in Revenue

Among many other tests, MVMT Watches ran A/B tests on email length and content. This process let them send each version to a small sample of their email list; the winning campaign was then sent to the remainder. By running A/B tests, and by testing to optimize their emails and send frequency, they saw a 105% increase in revenue!
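To picture that sample-then-send process, here’s a minimal sketch in Python. Everything in it is hypothetical: the list, the 10% sample size and the simulated open rates stand in for a real list and your ESP’s actual campaign reports.

```python
import random

def run_split_test(subscribers, open_rate_a, open_rate_b, sample_fraction=0.10):
    """Sample-then-send A/B test: email two variants to small random
    samples, then send the winner to the rest of the list. Open rates
    are simulated here; real numbers would come from your ESP's reports."""
    pool = subscribers[:]
    random.shuffle(pool)  # randomize so each sample is representative

    n = int(len(pool) * sample_fraction)
    group_a, group_b = pool[:n], pool[n:2 * n]
    remainder = pool[2 * n:]

    # Simulate opens for each sample (stand-in for real campaign results)
    opens_a = sum(random.random() < open_rate_a for _ in group_a)
    opens_b = sum(random.random() < open_rate_b for _ in group_b)

    winner = "A" if opens_a >= opens_b else "B"
    print(f"A: {opens_a}/{n} opens, B: {opens_b}/{n} opens "
          f"-> sending variant {winner} to {len(remainder)} subscribers")
    return winner

# Example: a 5,000-address list where variant B genuinely performs better
subscribers = [f"user{i}@example.com" for i in range(5000)]
run_split_test(subscribers, open_rate_a=0.18, open_rate_b=0.22)
```

In a real campaign, of course, you’d wait for opens to accumulate before picking the winner, rather than reading results instantly.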

The Proof is in the Pudding

As you can see, each of these companies had different ideas about what might be successful. Only through testing were they able to see which ones actually worked.

Some strategies may seem counterintuitive or even impossible, but we can never know for sure until we test. By asking the right questions, running the tests and analyzing the data, these companies were able to find the answers they sought. They designed the right tests and made real progress toward their goals.

Don’t Just Trust Your Gut

So, when it comes time to send your next email marketing campaign, don’t just rely on your instincts. Look to the data. Remember, there are no marketing experts. Instead, you can become a marketer with experience by running tests and learning what will be successful with your subscribers, customers and leads.

Put everything you do to the test. With Benchmark Email, you can test subject lines, ‘from’ names and even full email campaigns.

I’ve said it many times: using A/B testing is like having your very own crystal ball. It enables you to see how your subscribers, customers and leads will engage with your email campaigns. That way, you’ll always know which of your ideas will perform best.

Now It’s Your Turn

When you go to run an A/B test on your own email campaigns, be careful not to test too many variables at once. You need to be able to identify which change was successful.

Keep it to one variable at a time. Identify your goal, then test a single idea aimed at achieving it.

If your open rate is low, try to test the elements that can help improve that. If you want to see an increased engagement rate, focus on those items. You get the picture.

Here are some A/B tests that you can run (a quick way to judge the results follows the list):

  • Subject line or ‘from’ name: helps to improve your open rate
  • Headline: can work to boost your engagement rate
  • Link or button: see which CTAs increase your click-through rate
  • Copy or voice: testing your content can help both engagement and click-through rates
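Whichever test you run, make sure the difference you see is big enough to trust before crowning a winner. One common way to check, sketched below in Python, is a two-proportion z-test on the raw open numbers. The figures are made up for illustration; in practice they’d come from your ESP’s reports, and this check isn’t specific to any one tool.

```python
import math

def two_proportion_z(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test for an open-rate A/B test. As a rule of
    thumb, |z| above about 1.96 means the difference is unlikely to be
    random chance (roughly 95% confidence)."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)  # combined open rate
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return (p_a - p_b) / std_err

# Hypothetical subject-line test: 230/1,000 opens vs. 180/1,000 opens
z = two_proportion_z(230, 1000, 180, 1000)
print(f"z = {z:.2f} -> {'call a winner' if abs(z) > 1.96 else 'keep testing'}")
```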


Let Us Know What You’re Doing

Share what you’ve tried. If you’ve already run A/B tests or you try one after reading this post, we want to hear about it. Tell us what you’ve learned in the comments below.