A/B Testing, also called Split Testing, is a digital marketing method that compares two versions of a single campaign to determine which performs better at driving social engagement and improving online conversion rates.
One version of the campaign content, called the A group, is the “control”; the other version, the B group, contains the variation content.
Testing differing campaign content in this manner can inform marketers like you and me which campaign version we should focus on and invest our marketing budget in.
A/B Testing is most often used for email marketing content and social media ad campaign content, but of course it can be used to test any digital content campaigns you wish.
An example of using A / B Testing in a non-marketing scenario would look something like this:
A 4th grader named Tommy has a hypothesis that a plant that receives adequate sunlight will grow faster and appear healthier than a plant that does not have access to sunlight. In order to test his hypothesis, Tommy buys two plants of the same species and size. The plants have been potted with the same soil. He places one potted plant on the windowsill where there is direct sunlight. He places the other potted plant in his closet where sunlight will not reach it. Every day for two weeks, he waters both plants with exactly five ounces of water.
The result that he discovers confirms his hypothesis. At the end of two weeks, the plant which received adequate sunlight grew significantly and now appears healthy with bright green leaves and flowering buds. By contrast, the plant which was kept in his dark closet looks nearly dead.
Since the only differentiating variable between the windowsill plant and the closet plant was the presence or lack of sunlight, Tommy is able to confidently conclude that plants that receive adequate sunlight will grow faster and appear healthier than plants that are deprived of sunlight.
Sure, this example might seem overly simplistic, but the big takeaway from the A/B Testing of the two plants is that, with the exception of sunlight, every single variable was the same, i.e. the species of the plants, the quality of the soil, the amount of water used daily, and even the indoor air quality. The only difference between the A plant and the B plant was the sunlight variable.
In the world of digital marketing, an example of an A / B Test could look something like this:
As an adult who waxes nostalgic for the career he almost had in horticulture, Tommy now works as a digital marketer, has an overwhelming mortgage, and often contemplates whether he should yank his children out of public school. He also has a hypothesis—the email marketing campaign that he’s about to launch, which includes a paid subscription sign-up CTA, will perform better if the email includes an incentive offer associated with purchasing the subscription.
The goal of the email marketing campaign is to convert recipients into paid subscribers, and Tommy needs as many new paying subscribers as he can get, because he has that mortgage and those kids to think about….
So, he creates a copy of his email campaign, naming the copy “Test B” and naming the original “Test A.” The only difference between the two email campaigns is that the A Test (original) does not contain an incentive offer, whereas the B Test includes an incentive offer.
Tommy composes a sentence regarding the CTA incentive offer, which happens to be a free ebook download, and plugs the sentence into the B Test email template. In the B Test email, anyone who signs up for a paid subscription receives a download link for the free ebook.
He launches both email campaigns, the A Test and the B Test, and waits… for two weeks.
Lo and behold, at the end of the trial period, Tommy checks the performance of his A Test versus his B Test, and discovers that the B Test, which contained the free ebook incentive, resulted in almost 300% more paid subscription sign-ups than the A Test.
Tommy determines he will enroll his children in the nearby Montessori school that he passes every morning on his way into the office.
As the two examples demonstrated, A/B Testing is a scientific method that compares two versions of a single variable for the purpose of determining which version will perform better and yield a desired result.
In the world of digital marketing, the desired result is usually conversion, whether it be converting website visitors to newsletter subscribers, newsletter recipients to monthly subscription customers, or monthly subscription customers to brand ambassadors who use word-of-mouth marketing to successfully refer their friends and family to sign up for monthly subscription packages. You get the idea.
A / B Testing is also used in digital advertising campaigns that appear on Facebook & Instagram, Google AdWords, and other online platforms.
A/B variables can be as simple as the color of a CTA button, or as complex as whether the CTA is a lead generation form versus a Shop Now button.
When used to compare digital marketing advertisement versions, the A / B Testing stage can last a few weeks to a few months, and the investments for both test campaigns tend to be low. Then, once the marketer ascertains which test campaign performs better, the failing test is pulled down and a large financial investment is then pumped into the winning campaign.
Using an A / B Test to compare two different options that could be used for one campaign variable will yield the clearest results. Meaning, when all campaign elements are identical except for one, you can conclude that if one campaign performs better than the other, it is because of the different variable.
For this reason, A / B Testing works best when only one variable, or campaign element, has two versions representing the A and the B. The results of the sunlight vs. darkness test told Tommy to “use” sunlight if he wants his plants to “perform” well. The results of the incentive offer vs. no incentive offer told Tommy to “use” incentive offers in his email campaigns if he wants the campaign to “perform” better, i.e. to successfully convert recipients into paying subscribers.
That being said, there is another form of A / B Testing that compares many different campaign elements between the A Test and the B Test. This form of A / B Testing is called “multivariate A / B Testing.” When this complex version of A / B Testing is implemented, it can be harder to pinpoint which specific variable or variables performed better.
For instance, let’s say you run a multivariate A / B Test for a website landing page. The goal of the landing page is to generate leads. In order to generate leads, the landing page uses a CTA incentive, which is a “locked” webinar. In order to unlock and watch the webinar, the visitor must complete the CTA. Both landing pages are identical except for the following variables:
● The Design Layout
● The CTA
● The Content Font Size & Color
The A Test uses a design layout that does not match the brand’s logo or website. The CTA is an email opt-in form. The landing page content uses an extremely large, purple font.
The B Test uses a design layout that so closely resembles a major competitor that it’s inevitable this thing is going to wind up in court. The CTA is a “Tweet This Now” button, and the landing page content uses a red, moderately-sized, cursive font.
A few weeks, or perhaps a few months go by, then the performance results are analyzed.
Did the A Test generate more email leads? Or did the B Test render more Tweets? How would a Tweet even generate a viable lead in the first place??? (Bonus points if you caught that critical error!) Which landing page succeeded?
As it turns out, Test B had more Tweets than Test A had emails… but what does that mean?
The data is, for lack of a better term, a pile of confusion.
Does this mean you should avoid launching multivariate A / B Tests? Not necessarily. If you have experience analyzing A / B Test data and feel confident you’ll be able to differentiate the performance results of many competing variables at once, then multivariate A / B Testing could potentially provide you with a wealth of valuable insights that help you launch an extremely successful campaign.
In fact, here are the pros of launching multivariate A / B Tests:
● Provides valuable insights regarding the interactions between multiple content elements
● Provides granular data regarding which campaign elements positively or negatively impact performance results
● Enables marketers to compare many versions of a campaign, not just two, and conclude which one will have maximum impact overall
But multivariate A / B Testing also comes with cons:
● Can be highly complex and might require an expert to conduct an analysis of the resulting data
● Requires significantly more traffic than a standard A / B Test in order to produce statistically significant results
● Too many campaign variable combinations could cause the results to be too difficult to interpret, rendering the entire test and its associated costs a waste of time and money
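That last con is easy to see with a little arithmetic: the number of test versions multiplies with every variable you add. A minimal sketch, using hypothetical landing-page variables loosely modeled on the example above:

```python
from itertools import product

# Hypothetical campaign variables, each with the versions you want to test.
variables = {
    "layout": ["control", "variation"],
    "cta": ["email_opt_in", "tweet_button"],
    "font": ["large_purple", "medium_red_cursive"],
}

# Every combination of versions is a distinct page that needs its own traffic.
combinations = list(product(*variables.values()))
print(len(combinations))  # 2 * 2 * 2 = 8 distinct test pages
```

Three variables with two versions each already means eight distinct pages competing for the same traffic; add a fourth variable and you are running sixteen.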
If you want to launch a multivariate A / B Test but don’t want to go it alone, then give us a call. The marketing specialists at FTx 360 will make sure your test campaigns result in performance data you’ll be able to analyze easily… and yes, we’ll analyze the data for you, too!
So, you’re ready to get down to business and devise a digital marketing A / B Test. You’ve done your research and you’re already excited about improving one or many areas of your business. You envision achieving tangible business goals as a result of using A / B Testing, such as:
● Increasing Website Traffic
● Increasing Conversion Rates
● Lowering Bounce Rates
● Lowering Cart Abandonment
Here are the steps you should follow if you want to ensure the best A / B Test results.
Why do you want to launch an A / B Test in the first place? Have you launched a blog that isn’t getting any web traffic? Have you noticed that your Facebook “Shop Now” native ad hasn’t resulted in higher eCommerce sales? Do you suspect the subject lines of your email marketing campaigns are the reason your emails aren’t being opened? Before you can solve the problem, and before you will know how to solve the problem, you first must identify what the problem is.
Once you’ve identified the problem, you can identify the goal you’d like to accomplish using A / B Testing. The more specific your goal is, the easier it will be for you to reach it. For example, if your promotional emails aren’t being opened at an acceptable rate, you’ll want to make note of the current open rate and then decide what the goal open rate should be. Meaning, if the current email open rate for your promotional campaigns averages 8%, you might set an open rate goal of 16%. This is a tangible goal against which you can easily measure your campaign data.
As we covered in this article, if you’re new to A / B Testing, then you should start with testing only one variable, or campaign element, as opposed to testing multiple variables at once. The precise variable you decide to test, however, should relate to both the problem and the goal you’ve identified. For instance, if the problem you’re facing is a low open rate on your email campaigns and you’ve set a goal of doubling the open rate, the only variables that pertain to your problem are the email subject line and the email lead-in description that appears in recipients’ inboxes. Those two variables are what your recipients can see, and based on what they see, they will either open your email or delete it. Choose to test only one of those variables. Let’s say, the email subject line.
This step is straightforward but will be the most time-consuming. It’s time to actually create the two email campaigns and set them up in your email automation software. Make sure that both tests are clearly labeled and that the different subject lines appear exactly as intended. Hopefully, you’ve spent adequate time researching how to write email subject lines that increase the chances of recipients opening them.
While splitting your sample groups might not always apply to the particular A / B Test you’re running, let’s take a look at this step anyway since it’s relevant to an email marketing campaign A / B Test. In this instance, you obviously can’t send two emails to the same recipient list, so you’ll need to create two recipient “groups.” The two groups should contain a cross-section of your subscribers that includes all demographics. Meaning, both groups should contain all ages, both genders, both new and old subscribers, both high-spending and low-spending customers, and so on. Contrast this with one group containing loyal, high-spending, mature subscribers and another containing new, low-spending subscribers, which would skew the results.
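One simple way to build those balanced groups is to bucket subscribers by a demographic attribute and then deal each bucket alternately into the two groups. The sketch below assumes a hypothetical subscriber list where each record carries a `tier` field; your own data will have different fields.

```python
import random
from collections import defaultdict

def stratified_split(subscribers, key, seed=42):
    """Split subscribers into two groups balanced on `key`
    (e.g. a spending tier) so each group is a cross-section of the list."""
    rng = random.Random(seed)
    segments = defaultdict(list)
    for sub in subscribers:
        segments[sub[key]].append(sub)
    group_a, group_b = [], []
    for members in segments.values():
        rng.shuffle(members)
        # Deal members alternately so each segment splits roughly in half.
        for i, sub in enumerate(members):
            (group_a if i % 2 == 0 else group_b).append(sub)
    return group_a, group_b

# Hypothetical subscriber list with a "tier" attribute.
subs = [{"email": f"user{i}@example.com", "tier": "high" if i % 3 == 0 else "low"}
        for i in range(100)]
group_a, group_b = stratified_split(subs, key="tier")
print(len(group_a), len(group_b))  # 50 50
```

Shuffling within each segment keeps the assignment random, while the alternating deal guarantees neither group ends up demographically lopsided.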
Once you launch your A/B Test email campaign, configured so that the A Test goes to one group and the B Test to the other, you can kick back and allow an appropriate amount of time to go by. Depending on the nature of the A/B Test, you may want to let a few weeks or a few months pass. In the case of our email campaign, waiting a full week will be enough. The main point here is that while your tests are running, you can monitor the evolving results, but do not alter your campaigns.
After an appropriate amount of time has passed, you can analyze the performance of the A Test versus the B Test. Compare the results of each test to the goal you set for yourself at the onset of your campaign. Which test performed better, and by how much? Did either test meet or exceed your goal? Simply put, whichever test performed better is the one you should invest in moving forward.
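When you compare the two tests, it’s worth checking that the difference is large enough to trust rather than random noise. Below is a minimal sketch of one common approach, a two-sided two-proportion z-test, using hypothetical sign-up counts; it is an illustration, not the only valid method.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion counts from the A and B groups.
    Returns (lift, p_value) from a two-sided two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical results: 80 sign-ups from 1,000 A recipients vs. 120 from 1,000 B.
lift, p = two_proportion_z_test(80, 1000, 120, 1000)
print(f"lift: {lift:.0%}, p-value: {p:.4f}")
```

A p-value below 0.05 is the conventional threshold for treating the difference as statistically significant rather than chance.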
Among digital marketing strategies, email marketing remains one of the core channels for communicating directly with consumers.
Email marketing can be personal, informative, and lucrative. In nearly every industry, businesses of all sizes benefit from marketing directly to their customers, clients, and subscribers via email. There is a direct correlation between a company’s email campaign performance and the overall marketing success of that company.
Put simply, the open rate, clickthrough rate, and conversion rate of a business’ email marketing campaigns will make or break their business growth.
How successful are your email marketing campaigns?
In order to understand whether your campaigns are successful or dismal, you first have to look at the average email benchmarks for your industry, including the average open rate, the average clickthrough rate, the average click-to-open rate, and the average unsubscribe rate.
Across all industries, the overall email marketing statistics for 2020 were:
● 18% average open rate
● 2.6% average clickthrough rate
● 14.1% average click-to-open rate
● 0.1% average unsubscribe rate
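Each of those benchmarks is just a ratio of raw campaign counts. A minimal sketch, with hypothetical numbers chosen to land near the 2020 averages:

```python
def email_metrics(delivered, opens, clicks, unsubscribes):
    """Compute the standard email benchmarks from raw campaign counts."""
    return {
        "open_rate": opens / delivered,
        "clickthrough_rate": clicks / delivered,  # clicks per delivered email
        "click_to_open_rate": clicks / opens,     # clicks per opened email
        "unsubscribe_rate": unsubscribes / delivered,
    }

# Hypothetical campaign: 5,000 delivered, 900 opens, 130 clicks, 5 unsubscribes.
m = email_metrics(5000, 900, 130, 5)
print({k: f"{v:.1%}" for k, v in m.items()})
# open 18.0%, clickthrough 2.6%, click-to-open 14.4%, unsubscribe 0.1%
```

Note that clickthrough rate divides by emails delivered while click-to-open rate divides by emails opened, which is why the latter is always the larger number.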
Take a moment to check out your email marketing metrics across all the campaigns you ran in the past six months. How do your averages compare to the 2020 statistics above?
Now, compare your averages to the campaign averages of the industries that typically benefit from email marketing the most.
If your company’s email campaign averages fall short, then the tips contained in this article will help you increase conversions and grow your business.
Before we get rolling, let’s first address the most important tip of all… the “Messiah of All Tips,” if you will.
Present professionally written emails, period. This is the most important piece of advice we will ever give you regarding your email marketing campaigns. You could even stop reading this blog post right now and focus solely on generating professional-level content, and your email marketing campaigns will achieve a higher conversion rate than you’re currently seeing.
Today’s consumers are smart and unforgiving. You cannot afford to launch email campaigns that contain typos, misspellings, and funky wording for obvious reasons. But it’s especially critical to use correct grammar, syntax, spelling, and punctuation, because email copy that falls short of this standard will reflect poorly on your brand and could damage your business’ reputation. Not to mention, poorly written copy generally fails to convey a clear message.
Remember: if your recipients can’t understand you, they won’t be able to engage with you.
Now that we’ve established the importance of quality copy, let’s get rolling. In this article, you’ll discover how to use:
● Subject Line Hooks
● Intriguing Preview Text
● Professionally Written Paragraphs
● Crystal Clear Messages
● Irresistible Calls-to-Action
Let’s take a look at each tip…
TIP #1 USE A SUBJECT LINE HOOK
The content you use for the subject line of your email marketing campaign will be the first thing your subscribers read, and if it isn’t written well, it’ll be the last thing they look at… Yup, the email will head straight for the Trash folder if the subject line doesn’t “hook” the recipient.
What is a subject line “hook”? You can think of a hook as an offer, incentive, or teaser. The purpose is to grab the recipient’s attention and compel them to open the email so that they can gain more information. Can you tell which of the following subject lines is “generic” and which includes a “hook”?
“Modernize Your Digital Identity” vs. “Our Digital Marketing Services Are Affordable”
Would you agree that the subject line “Modernize Your Digital Identity” is a bit mysterious? Reading a subject line like that, the recipient will probably ask themselves, “modernize… how?”, which will compel them to open the email and find out!
TIP #2 MAKE SURE YOUR PREVIEW TEXT INTRIGUES
Even before your recipients open your email, they can see the first line or so of the body of the email. This “snippet” of copy is called “preview text”, and it should be just as intriguing as the subject line hook you used.
The email marketing platform you launch your campaigns from will automatically pull text from the body of your email to use as preview text. Meaning, you might not be able to change the actual copy without editing the body of your email. This is a reason to be very mindful of Tip #3.
However, some platforms will let you edit the preview text that appears or allow you to replace it altogether. When you create the email copy itself, just be mindful that the first sentence will become the preview text and make adjustments accordingly.
TIP #3 COMPOSE PROFESSIONALLY WRITTEN PARAGRAPHS
The importance of presenting professional-quality copy in your emails cannot be overstated. But top-notch writing goes beyond grammatical correctness. The pros use a “beginning, middle, end” content structure that flows conversationally and never sounds too wordy.
The foundational subtext built into the “beginning, middle, and end” structure could unfold as follows:
Set up the “pain point”, otherwise known as the “problem” the recipient is experiencing.
Highlight the consequence of not solving the pain point problem, and then immediately offer your own product, service, or offer as the solution.
Highlight the additional benefits of your solution, and finish with a clickable CTA.
Each segment—beginning, middle, and end—should logically inform the next segment, presenting a clear message to the reader.
TIP #4 CONVEY A CRYSTAL-CLEAR MESSAGE
Professionally written copy will deliver a crystal-clear message, as we just touched upon above, but before you launch your campaign, ask a colleague or friend to read your copy. If they can’t articulate the “point” of the email, then you haven’t conveyed a crystal-clear message.
Ask yourself, what do you want your recipients “to do” as a result of reading your email? Probably click the Call-to-Action you’ve included at the bottom of the email, right? Then the “message” of your email has to convince the reader why acting on your CTA will benefit them.
TIP #5 MAKE YOUR CALL-TO-ACTION IRRESISTIBLE
If you used one heck of a subject line hook, made sure the preview text was intriguing, and composed professionally written copy that conveyed a crystal-clear message, then your CTA will automatically be irresistible to your recipients.
But make sure the exact wording you use for your CTA really pops. Can you tell which of the following CTAs is “generic” and which is “irresistible”?
“Register for Webinar” vs. “Join the Conversation”
We’re willing to bet you agree that “Join the Conversation” sounds far more interesting than the other, generic CTA.
TIP #6 FIND THE RIGHT EMAIL FREQUENCY
It’s true. The frequency with which you contact your email recipients will directly impact the overall open rate of each campaign, but this isn’t to suggest that the more you email your subscribers, the more likely they will be to open, read, and click.
In fact, the opposite is true.
The more you contact your subscribers via email, the more likely they will delete your email before opening it. Why? Because receiving too many emails can be annoying for the recipient, or worse, your emails can come across as “spammy”.
If you’ve noticed the open rate of your recent campaigns going down, it could indicate you’ve been emailing too frequently.
Be very careful not to overcorrect this issue, however.
If you slow your email frequency way down, you will inadvertently increase the risk of your campaigns winding up in your recipients’ Spam folders.
This is especially true when the sending email address contains “donotreply@” or suspicious characteristics such as a lot of numbers. Even using emojis in the subject line could trigger the recipient’s spam filter to automatically route your email into the Spam folder.
Play around with how often you contact your subscribers. There’s no one-size-fits-all solution, and successful frequencies will vary from one company to the next depending on the industry and whether the company operates within the B2C or B2B market.
Interested in increasing the open rate of your email campaigns, but aren’t quite sure how to put our tips into practice? Let us handle it! FTx 360 offers email marketing services that include content writing, campaign handling & strategy, and marketing automation.
Want to read more articles like this? Enter your email below to subscribe to our mailing list and be the first to know about the latest marketing trends!