Ever since the dawn of email marketing, there has been the admonition to perform A/B split testing in order to accurately fine-tune your campaigns. This form of testing is simplicity itself: Send out half your emails with Subject Line A and the other half with Subject Line B, then check the metrics to see which one worked best. There is, of course, no end to what you can A/B test, proceeding to preheaders, calls to action, and even the background color or the newsletter's font. A/B testing's remarkable power has been applied to a vast range of purposes, from optimizing political websites to the constant barrage of tests conducted by Google on virtually all of its offerings. This widespread adoption should serve to prove the validity of the methodology and the necessity for all email marketers to embrace it.

Chop Up Your Subscriber List & Toss Different Looks at Your Customers

A/B testing is at the heart of a completely different and innovative model of online marketing where the brand can devise a huge variety of variables and just throw them all at the customer base to see what sticks. Instead of developing a single new email campaign template, for example, why not chop up your subscriber list into sections and serve up a number of different looks for your newsletter to see which one performs best? The critical aspect of A/B testing is that only a single element is changed in each variant, but there is no practical limit to how many variants can be presented to your customers at any one time, as long as each sample is large enough to be statistically valid. Therefore, the proper A/B testing methodology would call for a number of templates to be tested, then followed by a number of subject lines, then followed by preheaders, and so on.
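Chopping the list into variant groups is easy to automate. Here is a minimal sketch in Python, assuming your subscriber list is a plain list of email addresses; the function name, the seed, and the example addresses are illustrative, not from any particular email platform:

```python
import random

def split_into_variants(subscribers, num_variants, seed=42):
    """Shuffle the subscriber list (with a fixed seed so the split is
    reproducible) and deal it out into roughly equal variant groups."""
    shuffled = subscribers[:]  # copy so the original list stays intact
    random.Random(seed).shuffle(shuffled)
    groups = [[] for _ in range(num_variants)]
    for i, subscriber in enumerate(shuffled):
        groups[i % num_variants].append(subscriber)
    return groups

# Example: split ten addresses across templates A, B, and C
emails = [f"user{i}@example.com" for i in range(10)]
variant_a, variant_b, variant_c = split_into_variants(emails, 3)
```

The random shuffle matters: splitting alphabetically or by signup date can bake a hidden bias into the groups before the test even starts.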

The Tyranny of the Data vs. the Wisdom of the Exec

A/B testing’s strengths can be its weaknesses as well. Committing to A/B testing effectively shifts the decision-making process away from the C-suite and allows the data to make the determination. Some see this as the triumph of empirical evidence over the ethereal whims of the executive corps, while others see it as the tyranny of the mob: shifting the entire focus of a brand marketing strategy without the secure guiding hand of the executive’s experience and wisdom. Savvy A/B testers are well aware that customers may react very differently to an identical test in January than in July, so there is something to be said for letting the marketing strategy be determined by a blend of data and executive smarts.

Black or Silver Radiation Box? How about Neither!

Another criticism leveled at the proponents of omnipotent A/B testing is that it steers a brand into making incremental changes rather than revolutionary ones. The famous Henry Ford statement that “if I’d asked my customers what they wanted, they’d have said a faster horse” rings true, and it is questionable whether the original Apple computer or microwave oven would have survived A/B testing: “Do you want radiation to cook your food in a black or silver box? How about neither!”

When A/B testing is properly implemented, it can be a significant boon to any online marketing effort. However, it should be left to the experienced executive to decide to what extent the data should be allowed to single-handedly run the company.

Quick Options for Split Testing Your Email Marketing 

  • To use subject line personalization or not?
  • Call-To-Action verbiage
  • Landing Page copy
  • Different email templates
  • Sizing of the action button
  • To use emojis or not?
  • Overall email design options (color scheme, typography, etc.)
  • Different subject lines
  • Sender name
  • Send time

By implementing any of these quick split-testing options, you’ll be able to see the difference in results in your email open rates or click-through rates. You’ll need a decent sample size and proper event tracking in place to produce statistically significant results, but you’ll have plenty of data to review.
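How decent does "a decent sample size" need to be? A rough answer comes from the standard two-proportion sample-size formula. Below is a sketch, assuming you know your current baseline rate (say, open rate) and the smallest lift you'd care to detect; the function name and default alpha/power values are illustrative conventions, not prescriptions:

```python
import math
from statistics import NormalDist

def min_sample_per_variant(p_baseline, p_expected, alpha=0.05, power=0.8):
    """Approximate subscribers needed per variant to detect a change
    from p_baseline to p_expected, via the two-proportion formula."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p_baseline + p_expected) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p_baseline * (1 - p_baseline)
                             + p_expected * (1 - p_expected)) ** 0.5) ** 2
    return math.ceil(numerator / (p_expected - p_baseline) ** 2)

# Example: detecting a lift from a 20% to a 25% open rate
needed = min_sample_per_variant(0.20, 0.25)
```

For that 20%-to-25% scenario the formula lands in the neighborhood of a thousand recipients per variant, which is why subtle tweaks on small lists so often produce inconclusive results.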

Easy Metrics to Track When Email A/B Testing

  • Open Rates
  • Click-Through Rate / Click Rate
  • Ecommerce Conversion Rate
  • Call-To-Action Completion Rate

Once you send your test emails, you’ll be able to tweak the necessary items throughout your email marketing campaigns based on results that show a statistically significant winner. If version A has a higher open rate, then it’s safe to say version B gets dropped. If the CTA on test A has a higher conversion rate than test B, well…you know the drill.
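Before dropping version B, it is worth checking that the gap is actually significant rather than noise. A common way to do that is a two-proportion z-test; here is a minimal sketch, with the function name and example counts being illustrative:

```python
from statistics import NormalDist

def open_rate_significant(opens_a, sends_a, opens_b, sends_b, alpha=0.05):
    """Two-proportion z-test: is the difference between the two
    variants' open rates statistically significant at level alpha?"""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)  # pooled rate
    se = (p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return p_value < alpha, p_value

# 300/1000 opens for A vs. 250/1000 for B
significant, p_value = open_rate_significant(300, 1000, 250, 1000)
```

A 30% vs. 25% open rate on a thousand sends each clears the bar; a 26% vs. 25% gap on the same list sizes does not, and declaring a winner there would just be chasing noise.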