Like all other forms of writing, there’s an art to writing an effective sales email. Seeing a product through the eyes of a prospective customer is an act of imagination, as is having a sense of which words will pack the most emotional punch.

However, once inspiration has struck, you need to switch gears and test whether your hypothesis (your artfully crafted email) will work as written, or whether it needs some changes to get the most responses possible.

To do this, you use a technique called “A/B Testing” where you compare two variations of the same email to see which performs better. With each test, you vary only one element and leave everything else the same. After multiple tests, you have the best email possible.

Before going any further, there are two important metrics we use in email marketing:

  1. Open rate – the percentage of emails that are opened.
  2. Response rate – the percentage of emails that get a response.
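As a minimal sketch, both metrics are simple percentages of the number of emails sent. The campaign figures below are hypothetical, purely for illustration:

```python
def open_rate(opened: int, sent: int) -> float:
    """Percentage of sent emails that were opened."""
    return 100.0 * opened / sent

def response_rate(responses: int, sent: int) -> float:
    """Percentage of sent emails that received a response."""
    return 100.0 * responses / sent

# Hypothetical campaign: 1,000 emails sent, 300 opened, 40 responses
print(open_rate(300, 1000))     # 30.0
print(response_rate(40, 1000))  # 4.0
```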

Many email marketers tout a high open rate as a good thing. However, a high open rate is only useful if it generates a high response rate.

You can get a high open rate, for instance, by using the SUBJECT line: “RE:”. However, you’ll get a very low response rate because people who open that email (thinking it’s in reference to ongoing business) will be irritated when they discover it’s not.

1. Test the Open Rate

There are five elements of an email that determine whether it’s opened. All these elements appear in the typical Inbox listing, which is what the recipient is looking at when deciding whether to open an email. Here they are, in order of importance:

  1. SUBJECT line.
  2. Teaser, which is the first 20 or so words of the body of the email.
  3. Date and Time the email was sent.
  4. Salutation. (e.g., “Mr. Jones” vs. “Joe” vs. “Hi, Joe”)
  5. Sender’s Email Address.

Because the SUBJECT line is so important, we’d typically A/B test the SUBJECT first.

Email marketing companies periodically publish statistics about which emails tend to get opened, so we know, for example, that short SUBJECT lines work better than long ones. Thus, we probably don’t want to bother testing a long SUBJECT line against a short one.

Instead, we’ll pick a couple that are similar length but worded differently, like so:

A. “SUBJECT: Risk Management Structure”

B. “SUBJECT: Cost of Risk Management”

For this test, we send an equal number of otherwise identical emails to similar recipients at the same time of day. The only difference is the SUBJECT line. Whichever email gets opened more frequently is the winner.
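Picking the winner can be as simple as comparing the two open rates, but with small mailings the difference may just be noise. One common way to check this (not mentioned in the article itself, but standard practice) is a two-proportion z-test; the sketch below uses a normal approximation and hypothetical campaign numbers:

```python
import math

def ab_winner(opens_a: int, sent_a: int, opens_b: int, sent_b: int):
    """Compare open rates of variants A and B.

    Returns the variant with the higher open rate, plus a flag
    indicating whether the difference is statistically significant
    at roughly the 95% confidence level (two-proportion z-test).
    """
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    # Pooled proportion under the null hypothesis (no difference)
    p = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(p * (1 - p) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    winner = "A" if p_a > p_b else "B"
    significant = abs(z) > 1.96  # ~95% confidence threshold
    return winner, significant

# Hypothetical test: variant A opened 300/1000 times, variant B 250/1000
print(ab_winner(300, 1000, 250, 1000))
```

If the difference isn’t significant, the practical answer is to send more emails per variant before declaring a winner.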

We can continue testing other elements (different teasers, different days and time of day, etc.) to increase the open rate but there’s usually a point when changes don’t significantly change the number of opens.

It’s also useful to know whether you’re in the general ballpark of a reasonable open rate. I’ve seen multiple statistics on this, so I’ll boil it down to a rule of thumb: look for an open rate of at least 20%; anything over 50% is excellent.

2. Test the Response Rate

There are three elements of an email that determine whether, once opened, it will get a response:

  1. Benefit. “What’s in it for me?”
  2. Differentiator. “Why buy it from you?”
  3. Call-To-Action. “What’s the next step?”

To make this part of the discussion easier to understand, I’ll combine the benefit and differentiator.

Let’s suppose you have two sales emails, one where the benefit is “saving money” and the other where the benefit is “increasing revenue.” In that case your “teasers” might be:

A. “You may be able to save 25% of your shipping cost by handling your…”

B. “You can increase your sales revenue by 25% by handling your…”

Those aren’t particularly compelling benefits (they’re too “sales-y” and generic) but they’ll do for the purposes of illustration. To A/B test, you send out both versions (with everything else identical), and see which benefit gets the better response rate.

As with the open rate, you continue testing until you reach a point of diminishing returns. Here, too, you need to be aware of average response rates. My rule of thumb is you should expect at least a 5% response rate and anything over 20% is excellent.

Needless to say, the response rate will vary according to the customer and the offering. CEOs, for instance, are less likely to open cold emails than CIOs. Similarly, executives from large companies open fewer emails than those from smaller firms.

Based on my experience, if you start with a solid draft (you may need some help with this), you should be able to triangulate onto an email that gets around a 20% response rate after about six to eight A/B tests.
