Dynamics 365 for Marketing gets better and better. We just published a bunch of features for you to preview. One of these is the highly awaited A/B testing feature.
You can use email A/B testing to test two slightly different email designs on a small part of your target segment to find out which design is more successful, and then automatically send the winning design to the rest of the segment.
To test this out, let's start by creating a segment of 10 contacts.
Next, let's create an email in which we define an A/B test (1) with two versions of the subject line (version A and version B). Version A (2) will include the First Name (3) of the recipient.
…and version B will include the Full Name (3) of the recipient.
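If you haven't used dynamic content in the email designer before, the difference between the two versions boils down to which contact field the subject line pulls in. The snippet below is only an illustration: the subject copy is made up, and the handlebars-style expressions and field names are assumptions, not copied from the actual email.

```typescript
// Illustrative only: the subject copy is invented and the dynamic-content
// expressions ({{contact.firstname}}, {{contact.fullname}}) are assumptions
// about the field names used in the walkthrough.
const subjectVersionA = "We have news for you, {{contact.firstname}}!"; // version A: first name
const subjectVersionB = "We have news for you, {{contact.fullname}}!";  // version B: full name
```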
Having created a segment and an email, we'll proceed to create a customer journey and add both to it.
When adding the email with the A/B test, you'll notice the A and B icons in the upper-right corner of the email tile. They are greyed out until you flip the A/B testing switch in the Data task pane. Once the switch is flipped, you can set the A/B testing parameters to your liking (a rough sketch of these settings as data follows the list below).
A/B testing parameters:
1: Name of the test
2: Distribution (percentage to receive version A, and percentage to receive version B)
3: Metric – opens or clicks
4: Duration – how long to wait before declaring a winner
5: Fallback – what to do if the test is inconclusive
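To make these settings easier to reason about, here is the test from this walkthrough expressed as a plain TypeScript object. This is only a sketch: the type and property names are invented for illustration and are not the Dynamics 365 for Marketing API, and the fallback value is an assumption, since the walkthrough doesn't change it.

```typescript
// Illustrative sketch only: these types and names are not the actual
// Dynamics 365 for Marketing API.
type AbTestMetric = "opens" | "clicks";

interface AbTestSettings {
  name: string;          // 1: name of the test
  distributionA: number; // 2: percentage of the segment that receives version A
  distributionB: number; //    percentage of the segment that receives version B
  metric: AbTestMetric;  // 3: metric used to pick the winner
  durationHours: number; // 4: how long to wait before declaring a winner
  fallback: "A" | "B";   // 5: which version to send if the test is inconclusive
}

// The test used in this walkthrough: 10% of the segment gets version A,
// 10% gets version B, and the winner is decided on opens after six hours.
// The name and the fallback value here are assumptions.
const subjectLineTest: AbTestSettings = {
  name: "Subject line test",
  distributionA: 10,
  distributionB: 10,
  metric: "opens",
  durationHours: 6,
  fallback: "A",
};
```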
When the customer journey goes live, it starts out as expected by sending version A to 10% of the segment (one recipient, since the segment holds 10 contacts) and version B to another 10%.
Note that the numbers (1) above the email tile reflect the fact that 10 contacts went into the email tile, but eight of them haven't received an email yet (as expected). They will receive the winning email version once the A/B test concludes.
If we click the View Details link (2), we can see which two contacts received the test emails.
Opening the first recipient's mailbox, we see that the recipient (Renee Lo) did receive version A (the first-name subject) but didn't open the email.
The same can be verified on the Insights tab of the contact Renee Lo.
Opening the second recipient's mailbox, we see that the recipient (Diane Prescott) did receive version B (the full-name subject), and she did in fact open the email.
The same can be verified on the Insights tab of the contact Diane Prescott.
The same can be verified on the Insights tab of the email (two deliveries, one open).
To wrap up this example, we wait the six hours defined above for the A/B test to conclude.
The expected outcome is that the remaining 80% of the segment gets version B of the email, since the recipient of version A did not open the email, whereas the recipient of version B did.
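Conceptually, the decision at the end of the test window comes down to comparing the chosen metric for the two versions and using the fallback when the result is inconclusive. The following sketch only illustrates that decision rule with the numbers from this walkthrough; it is not the product's actual implementation.

```typescript
// Conceptual sketch of the winner-selection rule, not actual product code.
type Metric = "opens" | "clicks";

interface VersionResult {
  delivered: number; // test emails delivered for this version
  opens: number;     // unique opens recorded for this version
  clicks: number;    // unique clicks recorded for this version
}

// Pick the winning version once the test duration has elapsed. If the result
// is inconclusive (a tie), fall back to the configured default version.
function pickWinner(
  a: VersionResult,
  b: VersionResult,
  metric: Metric,
  fallback: "A" | "B"
): "A" | "B" {
  const rate = (r: VersionResult) =>
    r.delivered === 0 ? 0 : (metric === "opens" ? r.opens : r.clicks) / r.delivered;

  const rateA = rate(a);
  const rateB = rate(b);
  if (rateA === rateB) return fallback; // inconclusive test: use the fallback
  return rateA > rateB ? "A" : "B";
}

// In this walkthrough, version A had one delivery and no opens, while
// version B had one delivery and one open, so version B wins and is sent
// to the remaining 80% of the segment.
const winner = pickWinner(
  { delivered: 1, opens: 0, clicks: 0 },
  { delivered: 1, opens: 1, clicks: 0 },
  "opens",
  "A"
);
console.log(`Winning version: ${winner}`); // prints "Winning version: B"
```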
After the six hours, the customer journey reflects the fact that all members of the segment received an email (as expected).
One of the remaining recipients is Eric Gruber (2).
And if we open Eric Gruber's mailbox, we can verify that he did in fact receive version B (1) exactly six hours after the A/B test was initiated – 6:17 PM vs 12:17 AM (2).
You can now ensure that the majority of your segment receives the winning design, making your email marketing more successful.
Stay tuned for more first looks, including the new email designer, the new segment designer, and more.