Marketers are a confident bunch, but no team is perfect. You can always deepen your subjective understanding of your audience’s preferences and tendencies, but even the best marketers rely on data to supplement those impressions.
In Pardot and other marketing automation platforms, A/B tests (also known as split tests) act as a kind of dress rehearsal before the real campaign. Instead of going straight to the main event, you can compare different variations on a small scale before committing to a winner. For example, you could change the subject line of an email and see which version leads to a higher open rate.
In this guide, I’ll walk you through the steps involved in running a successful A/B test through Pardot, including:
Identify your audience
Create a new email
Develop two variants
Finalize the test
Analyze existing tests
Recommendations for your A/B tests
Whether you want to focus on clicks, conversions, or another metric, consistently running A/B tests will help you identify your own strengths and weaknesses and gradually improve your marketing results.
Are A/B Tests Worth It?
Digital marketing is an incredibly complicated field, and you might be wondering whether it’s worth adding another wrinkle to your marketing practices. The good news is that A/B tests don’t require much time or money, and they can lead to substantial gains in conversions.
Harvard Business Review found that a single change, vetted through A/B testing, increased Microsoft Bing’s revenue by 12 percent.
Setting up A/B tests in Pardot is a relatively straightforward process, so there’s essentially no downside to consistently testing your content. Every test you run is another opportunity to evaluate ideas and get the information you need to run your campaigns more efficiently.
How to Get Started
As Pardot consultants and email marketing experts, we recommend that you plan your test before you start building it in Pardot:
1) Define your goals, which could include increasing the open rate or click-through rate (CTR). Make sure you select the right metrics based on the tests you are running.
2) Establish a process to implement changes in the future, based on the results of your A/B tests.
3) Choose the element you want to test. Here are some examples you could explore:
Button colors.
Video thumbnails.
"From" copy: Personal vs. corporate.
Subject line: A small change can increase open rates significantly. Check the use of emojis, lowercase, personalization, and more.
Merge tags for dynamic, personalized content.
Now let's move on to see how to actually run the split test in Pardot.
1. Identify Your Audience
Unless you want to add recipients manually, you will need to create a list for your new test. Under the Marketing tab, navigate to Segmentation and select Lists. From that screen, you can view existing lists or create a list by clicking Add List.
💡PRO TIP: Use segments of subscribers who are active and regularly interact with your content.
💡PRO TIP: Ensure that the testing group is large enough to produce statistically significant results. Optimizely has a great A/B test sample size calculator to help you out (the underlying calculation is sketched below).
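If you’re curious about the math behind those calculators, here is a minimal Python sketch using the standard two-proportion sample-size approximation. The baseline and target open rates (20% and 24%), the 95% confidence level, and the 80% power are all illustrative assumptions, not Pardot defaults.

```python
# Minimal sketch: per-variant sample size for detecting a lift in open rate,
# using the standard two-proportion approximation. All numbers here are
# illustrative assumptions -- swap in your own baseline/target rates.
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.80):
    """Approximate recipients needed in EACH variant (two-sided test)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Detecting a lift from a 20% to a 24% open rate:
print(sample_size_per_variant(0.20, 0.24))  # ~1,680 recipients per variant
```

If your test group is much smaller than the number this returns, the “winner” may simply reflect random noise.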
2. Create a New Email
You don’t need to go through the A/B testing page to run a test. Just go to the Marketing section and select Emails, then New List Email. Pardot will ask for some basic information about the new message.
On this screen, just click Enable A/B Testing to configure the message for a test. You will also need to add a name, select a folder and campaign, and choose whether the new email will be text only or HTML and text. Pardot will automatically copy the message’s content into a second variant (“version B”), which can be edited separately.
At this point, Pardot will ask whether you want to use an existing template or create the email from scratch. As you can see in the image below, the email editor makes it easy to switch between the A and B versions.
3. Develop Two Variants
The next step involves building two emails that are similar enough to be used in the same place in your campaign, yet different enough to offer meaningful information. The Pardot editor gives you complete flexibility in creating the test, so you can change as much or as little as you want.
Pardot doesn’t support testing more than two variants at once. Even with only two, it’s critical to test each variable individually in order to gather valuable data.
Changing several variables between just two variants makes it impossible to identify the best option for each individual variable; it will only tell you which overall combination performed better.
In other words, if you want to compare two different subject lines and two different images, you should run two separate A/B tests rather than combining them into one comparison. This approach takes longer, but it will lead to more consistent results over time.
4. Finalize the Test
Once you’ve created both variants, you can send yourself a proof of each version before the real test begins. You can add one or multiple test lists, and there’s also a field for individual email addresses.
Just below these options is the rendering test, which lets you see how the message will appear in a number of email clients. Note that some email clients don’t support certain fonts; you can check this article to learn more about using web fonts in emails.
After previewing your new email, you’ll be ready to configure some final options and start the A/B test. The final “sending” tab gives you one last opportunity to review settings before sending the first emails.
Along with the recipients, you’ll also need to select a sender and reply-to address for each variant. You can use a different sender name for each variant, but it isn’t necessary if you’re already testing other changes.
Next, you’ll be able to enter subject lines and customize your testing criteria. Tests can run for as little as an hour or as long as 30 days. You can test for either opens or clicks, and there’s also a slider that lets you choose the percentage of your audience to use for testing (a quick sizing sketch follows below).
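To make the slider concrete, here is a minimal sketch of the arithmetic, assuming a 10,000-prospect list, a 20% test allocation, and an even split of the test portion between versions A and B (all illustrative assumptions; confirm the split behavior in your own send settings).

```python
# Minimal sketch of the audience-split arithmetic. The 10,000-prospect list and
# 20% test allocation are made-up numbers; adjust to your own settings. This
# assumes the test portion is split evenly between versions A and B.
def split_test_sizes(list_size, test_pct):
    test_group = int(list_size * test_pct / 100)
    per_variant = test_group // 2               # half get version A, half get version B
    winner_group = list_size - per_variant * 2  # remainder receives the winning version
    return per_variant, winner_group

per_variant, winner_group = split_test_sizes(list_size=10_000, test_pct=20)
print(per_variant, winner_group)  # 1000 per variant, 8000 receive the winner
```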
Finally, Pardot gives you the option to add completion actions to your new test. Completion actions can be set to trigger when a subscriber opens the message, clicks through, or unsubscribes. Actions range from adding or removing the lead from particular lists to sending a certain email.
5. Analyze Existing Split Tests
Of course, A/B tests are only helpful if you use them to extract valuable data. It’s easy to access the A/B testing module from the Pardot dashboard.
Select the Marketing tab, then navigate to Emails and click on A/B Tests. You’ll arrive at a screen similar to the one below:
From there, you’ll be able to view your A/B tests while filtering results by campaign or date created. The Tools option from the screen above also allows you to print the entire list or export it as a .csv file.
Pardot stores a deep set of data points for every A/B test. You’ll be able to see the overall delivery rate, HTML open rate, click-to-open ratio, and more, along with detailed metrics for both variants. This page gives you all the information you need to identify the right variant for each campaign; if you want to double-check that the winner’s lead is meaningful, see the significance check sketched below.
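Before acting on a result, it can be worth checking that the gap between variants is larger than random chance would explain. Here is a minimal, standard-library-only Python sketch of a two-proportion z-test; the open and send counts are placeholders, so substitute the figures from your A/B test report or CSV export.

```python
# Minimal, stdlib-only sketch: a two-proportion z-test to check whether the gap
# between the variants' open rates is likely real or just random noise.
# The counts below are placeholders -- use the numbers from your A/B test report.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(opens_a, sent_a, opens_b, sent_b):
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return z, p_value

z, p = two_proportion_z_test(opens_a=210, sent_a=1000, opens_b=255, sent_b=1000)
print(f"z = {z:.2f}, p = {p:.3f}")  # here z ~ 2.38, p ~ 0.017: likely a real lift
```

A p-value below 0.05 is the conventional threshold for treating the difference as real rather than noise.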
6. Recommendations for Your A/B Tests
To recap, below are a few takeaways and recommendations so you can get started with your A/B test email campaigns.
💡Limit your split tests to one variable at a time, with a clear hypothesis and pre-defined metrics aligned with your campaign goals.
💡Use a test audience that is both large enough for statistical significance and engaged, so your split test results provide accurate information.
💡Review your A/B test results and keep testing; A/B testing should be a routine practice in email marketing.
💡Share the results of your email split tests with other marketing team members. Unless your tests are specific to a region or vertical, the A/B testing process should reveal insights that can be leveraged across the marketing team.
A/B testing might sound complicated at first, but Pardot makes it easy to gather valuable data about your own strategies. Click here to learn how our team of Pardot consultants can help you generate better returns on future campaigns.