We've touched on a number of best practices for testing emails in previous articles. Here, we'll summarize those best practices for handy reference.

Best Practice for Testing Emails: List Segmentation by Email Service Provider

If you are going to segment your email marketing list by email service provider, chances are that you will later find yourself sub-segmenting by one of the other categories. Simply be sure to keep a careful eye on which lists are which to avoid duplicate email sends or over-sending to a single individual.

Also, be sure to regularly check in to ensure that a specific email service provider still accounts for enough of your list to warrant having its own email list segment.

Best Practice for Testing Emails: List Segmentation by Customer Behavior

As we noted earlier, stay organized! You can most likely come up with dozens of ways to segment your email marketing list by customer behavior. However, if you bite off more than you can chew, you'll soon find yourself overwhelmed with data and unable to tell which strategies are most effective with your users. Create a systematic plan, track what works and what doesn't, and repeat the segments that have proven most effective.

Also, keep data up-to-date. A user with a history of one type of purchase may change to a different type of user as lifestyles, age, and other factors change. In order to ensure the best response to your list segmentation, query fresh data each time you segment a list.

Finally, don't be afraid to experiment. Your industry or market segment may have unique customer behaviors that make an "off-the-wall" list segmentation make sense. Anything is worth trying once!

Best Practice for Testing Emails: List Segmentation by Demographics

Don't fall prey to stereotypes! There is a fine line between using demographic data effectively and slipping into stereotypes or, worse, bigotry, racism, or homophobia. While demographic data can inform more effective offers and better ways to communicate with your clients, keep it respectful and remember that, even though you're working with a database, the records in it are still people.

Don't be pushy about collecting data! Just because you want demographic data in order to segment your database doesn't mean that your users are comfortable giving it to you. Accept that there will always be users who simply don't want to share, and don't force them into it. You'll end up alienating users and doing more harm than the segmentation is worth.

Best Practice for Testing Emails: How to Run an A/B Test

Before you run your email test, make sure that you understand the best way to split your email list for a clean A/B test. Because you want the resulting lists to be as even as possible in terms of valid email addresses and demographic information, consider taking the following steps before splitting your email list into two parts (a rough code sketch follows these steps):

Remove Inactive Users: Begin by removing all of the most inactive users from your main list. This may simply be users who have never opened an email, or it may be users who haven't opened an email in a very long time. You'll need to decide for yourself what the criteria will be.

Remove Highly Active Users: Also remove your most active users before sending testing emails. This most likely will mean people who open the majority of the emails that you send, but it may also simply mean anybody who has opened an email within the last month or the last two email sends.

Sort Alphabetically: Often, the best way to prepare a list for splitting down the middle is to sort it alphabetically. This tends to give you the most randomized split.

Do Not Sort by Join Date: However you choose to sort your list before splitting it, be sure that the list is not sorted by the sign-up or join date of the users. Your most recent sign-ups will be more active and likely to open than your older ones. If you sort by join date, one list is likely to respond to the email better than the other one based simply on the fact that they have engaged with your company more recently.

Split Your Highly Active or Highly Inactive Users Separately: After you have sorted and split your main list, use the same technique to sort and split your highly active and highly inactive users. Then fold the results back into the two new segments of your main list. This will ensure that when you send testing emails, each of your A/B segments has a sampling of average, highly active, and highly inactive users.

Check the Percentage of Email Service Provider Addresses: Do a quick check of each of your newly halved email lists. Make sure that you have roughly the same percentage of Hotmail, Gmail, Yahoo!, and any other large email service providers on each list. If you do not, you risk having your results skewed if one of those email service providers sends your message to the junk or spam folder.
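
To make those steps concrete, here is a minimal sketch in Python. It assumes each subscriber is a simple record with hypothetical "email" and "opens_last_90_days" fields, and that you pass in your own activity thresholds; treat it as an illustration of the splitting logic, not a drop-in tool.

    # Rough illustration of the split described above, using hypothetical
    # subscriber records: {"email": ..., "opens_last_90_days": ...}.
    from collections import Counter

    def split_for_ab_test(subscribers, inactive_max_opens=0, active_min_opens=10):
        # 1. Pull out the extremes so the core list holds only "average" users.
        inactive = [s for s in subscribers if s["opens_last_90_days"] <= inactive_max_opens]
        active = [s for s in subscribers if s["opens_last_90_days"] >= active_min_opens]
        extremes = {s["email"] for s in inactive + active}
        core = [s for s in subscribers if s["email"] not in extremes]

        # 2. Sort alphabetically (never by join date) and split down the middle.
        core.sort(key=lambda s: s["email"].lower())
        mid = len(core) // 2
        group_a, group_b = core[:mid], core[mid:]

        # 3. Split the extremes the same way so each group gets a sampling
        #    of average, highly active, and highly inactive users.
        for bucket in (inactive, active):
            bucket.sort(key=lambda s: s["email"].lower())
            half = len(bucket) // 2
            group_a.extend(bucket[:half])
            group_b.extend(bucket[half:])
        return group_a, group_b

    def provider_mix(group):
        # 4. Share of each email service provider, for the final sanity check.
        domains = Counter(s["email"].split("@")[-1].lower() for s in group)
        total = sum(domains.values())
        return {d: round(100 * n / total, 1) for d, n in domains.most_common()}

Comparing provider_mix(group_a) against provider_mix(group_b) gives you the quick email-service-provider check from the last step; if the percentages diverge noticeably, reshuffle before you send.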

Best Practice for Testing Emails: Image Testing

As with any email test, make sure that you're testing a pure A/B split of your list, and don't change any variable other than the image. You're looking to see whether a change in how you choose and display images can improve your email performance; you can't do that if you change other factors at the same time.

Also, be sure to use similar alt and title text behind the images in your testing emails. For email clients that don't load images, the alt and title text can significantly impact performance. Don't allow differences in alt or title attributes to muddy the data in your test.

Finally, be sure to plan out your image tests so that you know what you're testing when. Consider starting with placement or density and then moving on to image type or color.

Best Practice for Testing Emails: Copy and Text Tests

As with all tests, make sure you are doing a pure A/B split of your list, and don't try to change more than one factor at a time. If you are testing "command" headlines against funny headlines, then make sure that the headline font, size, and placement are the same for both. Be sure that what you're testing is isolated.

Make sure that the same copywriter creates the copy for both versions of your test. Small differences in the tone or style of a specific writer can introduce variations that have nothing to do with what you set out to test.

Best Practice for Testing Emails: Subject Line Tests

As always, be sure that your A/B list split is clean. This is particularly true of subject line tests: if you have an uneven balance of user activity levels or email service providers, you're very likely to end up with misleading open-rate data.

Test email subject line concepts over several sends. Even if you are incredibly cautious about the quality of your A/B split, there are still factors that you won't be able to account for, such as long-tail opens and possible timing issues. In general, test a subject line concept three to five times before deciding what the data means.
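
If you record opens and deliveries per variant for each send, pooling the results is straightforward. The sketch below uses made-up numbers purely to illustrate combining three sends before comparing two subject line concepts.

    # Pool results from several sends before judging a subject line concept.
    def pooled_open_rate(results):
        """results: list of (opens, delivered) tuples, one per send."""
        opens = sum(o for o, _ in results)
        delivered = sum(d for _, d in results)
        return opens / delivered if delivered else 0.0

    variant_a = [(412, 5000), (388, 5000), (430, 5000)]  # "command" subject line
    variant_b = [(455, 5000), (460, 5000), (418, 5000)]  # question subject line
    print(round(pooled_open_rate(variant_a), 3))  # 0.082
    print(round(pooled_open_rate(variant_b), 3))  # 0.089

Pooling this way keeps a single noisy send from deciding the winner on its own.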

Be aware of inbox deliverability! Your subject line can have the greatest impact on your ability to get into the inbox instead of the spam or junk folder. If you're testing multiple subject line concepts, be sure to always check deliverability before you send to your list.

Best Practice for Testing Emails: List Segment Tests

Segmenting your list may be tricky, and you want to be sure not to over-email people. Take the time to run an extra manual check to ensure that you haven't inadvertently included the same people on multiple email lists that are being sent at the same time. Few things will get you marked as spam by a user faster than delivering multiple emails to their inbox on a given day.
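
That manual check is easy to script. This sketch assumes each same-day segment is simply a list of email addresses; the segment names and addresses are invented for illustration.

    # Flag addresses that appear in more than one segment scheduled for the same day.
    from itertools import combinations

    def find_overlaps(segments):
        overlaps = {}
        for (name_a, list_a), (name_b, list_b) in combinations(segments.items(), 2):
            shared = {a.lower() for a in list_a} & {b.lower() for b in list_b}
            if shared:
                overlaps[(name_a, name_b)] = sorted(shared)
        return overlaps

    todays_sends = {
        "new_customers": ["pat@example.com", "lee@example.com"],
        "lapsed_buyers": ["lee@example.com", "sam@example.com"],
    }
    print(find_overlaps(todays_sends))
    # {('new_customers', 'lapsed_buyers'): ['lee@example.com']}

Any address that shows up in the output should be removed from all but one of that day's sends.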

Don't segment too small! You still need a critical mass of users in order to make the time and effort of sending a unique email worth your time. You'll need to determine "how small is too small" for a list based on your own business needs, but don't fall so in love with list segmentation that you create email lists that don't really have any value.

Take the time to think through your business needs and the best way to segment a list for you. You may have a list segmentation that is entirely different from the ones that we've suggested above.

Best Practice for Testing Emails: From Address Tests

As always, make sure that your A/B split is clean. Because your success metric will be open rate, it's imperative that your lists be as similar as possible.

Test the email "from" address early. Because it can have such an impact, you'll want to settle on your preferred format early and stick with it.

Be aware of spam issues! Your "from" address greatly impacts your ability to make it into the inbox. Test to make sure your messages are being delivered before you send to your entire list.

The best "from" address can be situational. One may work better for transactional emails and a different one may be better for marketing emails.

Best Practice for Testing Emails: Send Date and Time Tests

As always, make sure that your A/B split is even. Younger people will be online more often, while older people may be tied to a more regular email schedule. When sending testing emails, be sure that each list is an even sample of your entire database.

Be sure to run your day and time tests several times. Sometimes there are factors that you cannot control for, such as ISP slowdowns, holidays, and unexpected news events that keep users offline. All of those factors may skew the data from your test of a send day and time. Repeat the test several times before being comfortable with your final data.

Think outside the box! Just because most case studies say to send on Monday or Wednesday at noon or four o'clock Eastern Standard Time, that doesn't mean that's what will work best for you. Take the time to think about when you might see the best results.

Best Practice for Testing Emails: Offer and Promotion Tests

In addition to testing your offers for bottom-line revenue and profit margin, consider the viral and acquisition value of an offer. Did an offer get you a lot of shared attention online, or did it bring in many new customers? That may make it worthwhile even if it didn't generate as much up-front revenue as you were hoping for.

Don't lose money! Particularly in the era of social shopping and huge discounts, it can feel necessary to offer massive promotions to generate customer interest. However, the math on that often doesn't add up, and you end up losing money. An offer with a lower response rate that still turns a profit may be better than a popular offer that actually loses money.
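
To show how that math can go wrong, here is a back-of-the-envelope example with invented numbers: a $40 product with a $25 unit cost, comparing a popular 50%-off offer against a quieter 15%-off offer.

    # Invented numbers: a deep discount can win more orders and still lose money.
    def offer_profit(orders, price, discount_rate, unit_cost):
        revenue_per_order = price * (1 - discount_rate)
        return orders * (revenue_per_order - unit_cost)

    print(offer_profit(orders=900, price=40, discount_rate=0.50, unit_cost=25))  # -4500.0
    print(offer_profit(orders=400, price=40, discount_rate=0.15, unit_cost=25))  # 3600.0

In this made-up scenario, the louder offer more than doubles the order count and still finishes thousands of dollars in the red.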

Remember that there may be segments of your database that respond to one type of offer better than to another type of offer. Offers may not be "one size fits all", and you may want to test different offers to different portions of your database.

Consider that timing plays a role in offer success as well. You may want to run an offer test several times in case you've inadvertently sent an offer at a time when a product was unpopular or people were generally not spending as much money as usual.
