
How a Goldfish Can Inspire You to A/B Test

Lynette Rambo | September 30, 2015 

This may come as a shock, but a goldfish has a longer attention span than most consumers today. That tidbit of information is based on a study conducted by Microsoft Canada that looked into how attention spans in people have changed over the years.

We’re not marketing to goldfish.

Microsoft found that the average human attention span in 2013 was 8 seconds – 4 seconds less than the average in 2000.  The attention span of a goldfish is reportedly 9 seconds. Anyone who has ever had goldfish knows that they basically swim around, eat crumbs, blow bubbles and stare out the sides of the fish tank … for 9 seconds at a time. Those of us in marketing today would be thrilled to have consumers paying attention to our emails or ads for a whole 9 seconds.

We, as marketers, should make it our goal to capture those coveted seconds of attention from the humans outside of the fish tank who want or need what we’re selling.


To grab your customers’ or prospects’ attention, you first have to know who they are and what they like. Testing is an excellent way to find out.

A/B Testing (also known as split testing) is basically comparing two versions of an email to see which one performs better. By creating an “A” version and a “B” version, you can validate how design changes, variations in messaging, different subject lines, etc. impact email opens, click-throughs, and purchases.

Most email platforms provide some level of A/B testing.  The Salesforce Marketing Cloud (ListEngage’s ESP of choice) makes A/B testing quite easy. Not only can you compare two static emails, but you can test dynamic content areas, as well.  After choosing a random segment of subscribers to receive each email, the Marketing Cloud can declare a winner and automatically send the winning email to the rest of your list.
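The mechanics described above can be sketched in a few lines of code. This is not Marketing Cloud's actual API, just a conceptual illustration of the split-and-declare-a-winner workflow: carve off a random test segment, split it evenly between versions A and B, and send the version with the better open rate to the remaining subscribers.

```python
import random

def ab_split(subscribers, test_fraction=0.2, seed=42):
    """Randomly split a test segment into A and B groups; the rest await the winner."""
    rng = random.Random(seed)
    shuffled = subscribers[:]
    rng.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    test_segment, holdout = shuffled[:test_size], shuffled[test_size:]
    half = len(test_segment) // 2
    return test_segment[:half], test_segment[half:], holdout

def pick_winner(opens_a, sends_a, opens_b, sends_b):
    """Declare the version with the higher open rate the winner."""
    return "A" if opens_a / sends_a >= opens_b / sends_b else "B"

subscribers = [f"user{i}@example.com" for i in range(1000)]
group_a, group_b, holdout = ab_split(subscribers)
# With a 20% test segment: 100 subscribers get version A, 100 get version B,
# and the remaining 800 receive whichever version pick_winner() selects.
```

In practice the platform handles the randomization and sending for you; the point is that the test segment is random, the split is even, and the holdout only ever sees the winner.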

Here are some suggestions and ideas to help make your A/B testing more successful.


Have a Hypothesis

The first rule of A/B testing is to have a hypothesis that you wish to test. Without some idea of possible outcomes, A/B testing just becomes a guessing game. And, without a clear objective in mind, it will be more difficult to discern the true impact of the variables being tested.

Keep a Granular Approach

One of the most common mistakes people make in A/B testing is comparing results of emails that are radically different from one another. Test just one element at a time. For instance, when testing two subject lines, make sure the email creative is identical for each send. Testing one email against another that is vastly different will not generate accurate results. And, if there are multiple elements of difference, it will be hard to determine which factors caused the improvement or decline.

Test Early and Often

In A/B testing, you are not restricted to performing just one test. The first test may not provide great insight into user behavior, but it can narrow the field so that additional tests can be run that will promote a better understanding of what motivates your customers.

Begin testing as soon as possible to eliminate ineffective design choices or inaccurate assumptions. The sooner you get access to actual data, the sooner you can incorporate changes that more accurately match customer behavior.

Be Patient

Meaningful results take time. Let tests run their course, even if it seems like they aren’t doing anything. Resist the temptation to end a test early, even if you think a clear winner has already been determined. Seeing it through to the end may give surprising results. You are building a statistical base upon which future assumptions can be tested.

Keep an Open Mind

Even though you developed a hypothesis in the beginning, it doesn’t mean the outcome will be what you expected. It can be tempting to dismiss data that differs significantly from what you expected. But, keeping an open mind to new ideas based on actual data and proven user behavior is essential to the success of marketing efforts that increase conversions.

Maintain Momentum

Successful testing not only helps improve user engagement, but also forms the basis of future tests. Learn from the initial tests so you can create more specific hypotheses for further testing. Becoming an A/B Testing expert takes time and practice. Always rely on the hard data.

Ensure Statistical Significance

Your sample size needs to be large enough to determine whether your test results actually mean anything. If you only test with 10 customers and find some shocking result, that doesn’t mean that what happened with 10 will apply to 10,000. Before jumping to conclusions, ensure that you have enough data to fully back or shoot down your hypothesis.
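One common way to check this (a sketch, not a substitute for your platform's built-in reporting) is a two-proportion z-test on the open rates. The example below compares the same 25%-vs-20% lift at two different sample sizes; only the larger sample is statistically significant.

```python
import math

def two_proportion_z(opens_a, sends_a, opens_b, sends_b):
    """Z-statistic for the difference between two open rates, using a pooled rate."""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# The same 25%-vs-20% open-rate lift at two sample sizes:
z_small = two_proportion_z(5, 20, 4, 20)          # 20 sends per version
z_large = two_proportion_z(500, 2000, 400, 2000)  # 2,000 sends per version
# |z| >= 1.96 corresponds to significance at the 95% confidence level;
# only the large sample clears that bar.
```

The intuition: the standard error shrinks as the sample grows, so a small lift that looks like noise at 20 sends per version becomes a confident result at 2,000.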


Here are a few ideas of areas you may want to test in your emails.

Calls to Action (CTA) vs. Calls to Value (CTV)

CTAs speak to the action you want from your customer (e.g., “Download Our Free Guide”). CTVs focus on the reason behind the action (e.g., “Become an Expert”). Early in the journey, CTVs tend to perform better. Test at various points in the campaign to see which is most effective.

Button Design

Test button sizes, shapes, colors, position, proximity to other elements, etc.

Testing Colors

Colors have psychological and emotional impacts on customers. Test which colors get the most responses. Which ones lead to higher conversion rates? For more information on the use of colors, see The Emotional Triggers of Color wheel by Talia Wolf of The Conversioner.

Testing Subject Lines

  • Personalization – using the customer’s name in the subject line vs. not using it.
  • Experiment with questions in subject lines.
  • Be accurate and deliver what you promise in the subject line.
  • Does a subject line with an incentive or a teaser work best?
  • Does including the name of your company increase opens?
  • Test a subject line that invites the user to buy vs. one that directly provides value to the user.
  • Use a symbol in one subject line but not the other.

Different Offers

For instance, an “eBook” vs. a “whitepaper.” (Note: using the word “Free” in the subject line may increase the chance of triggering spam filters.)


Formatting and Style

Test different fonts, plain paragraphs vs. bullets, all CAPS vs. lower case, bold text vs. plain, etc.

Sender address

Using a personal “from” address vs. a generic business address.

Images

Try two different images, but make sure you keep them in the same place in each email. If testing location changes, then be sure to use the same image in both emails.


Element Placement

Experiment with different placements of copy, images, CTA buttons, etc.

Length of Email

A complete email message vs. one that requires a click-through (such as a “Read More” link).

Delivery Date & Times

What day of the week gets better open rates? Does time of day affect click rates? Be sure to keep different time zones in mind.


Linked Images vs. Linked Text

Test a linked image vs. linked text. Which are subscribers most likely to click?


Here, “swimming with the fishes” doesn’t mean sinking to the bottom of the sea. Quite the opposite. Go ahead and SWIM with that goldfish whose attention span is a second longer than that of the consumer you’re after! A/B testing will give you a whole new perspective on your current and future customers and show you how to not only reach them but keep them.


Lynette Rambo | Marketing Consultant

Lynette Rambo is a Salesforce Certified Marketing Cloud Consultant, Administrator, Email Specialist, and Trainer. She has worked in marketing and communications for more than 20 years and with Salesforce Marketing Cloud since 2012. As a Marketing Consultant and Trainer for ListEngage, Lynette helps clients learn Marketing Cloud functionality, email marketing best practices, and effective campaign management. She also works with the Salesforce CRM and connecting Sales and Marketing initiatives.