Inspect What YOU Expect
I spent last weekend at Screencast Camp, a once-a-year gathering put on by TechSmith, the company that sells Snagit and Camtasia. It was a great experience where I met marketing executives, education technologists, and learning and development specialists. We talked about how “visual content” is transforming customer engagement and interaction. The research shows that images and video are far superior to text at getting clients and prospects to share content. The bottom line is that companies need to distribute video and image content to get noticed. But how do CEOs know what is effective?
Intuition, Experience, Testing
Many marketing experts might say it is their intuition or their experience. However, data-driven, state-of-the-art CEOs and CMOs will respond, “We test.” They might have a preference or a strong indication of which way to go, but they test to make sure their approach is on target. One of the sites I look at weekly to see which tests work is www.WhichTestWon.com. I think I am better than most at knowing what resonates with potential customers, but this site is sometimes very humbling. What I like, what makes me click through a website or decide to buy or not to buy, may not be the same as what makes the target customers buy. And if I go with my preferences, I might be leaving many thousands of dollars on the table.
Customers Do What We Ask
Here, when I talk about buying, I am not necessarily talking about a sale that results in immediate revenue. Rather, I am referring to the potential customer or client doing what we want them to do: signing up for a newsletter, leaving an e-mail address, watching a video until the CTA (Call to Action), downloading a white paper, filling out a form; in other words, taking a prescribed action.
To get them to do this, we make an offer. The question that A/B testing helps answer: “Is the offer we are making as effective/impactful as we want?”
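As a concrete illustration of what “we test” means in practice (this sketch is not from the article; the function name and visitor counts are hypothetical), a common way to judge an A/B test is a two-proportion z-test, which asks whether variant B’s conversion rate differs from variant A’s by more than chance would explain:

```python
import math

def ab_test_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: how many standard errors apart
    are the two variants' conversion rates?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    return (p_b - p_a) / se

# Hypothetical example: 10,000 visitors per variant;
# 100 buy with variant A, 140 buy with variant B.
z = ab_test_z(100, 10_000, 140, 10_000)
print(round(z, 2))  # about 2.6; |z| > 1.96 is significant at the 95% level
```

A result like this says the difference is unlikely to be luck, which is exactly the reassurance a data-driven CMO wants before rolling out the winning offer.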
Let’s imagine we are on a website, looking at a product and the CTA says “Buy Now”.
How many of us press the button?
What if it instead says “Next,” and we press it and see the item in our shopping cart?
The question here is simple: how many people press “Buy Now” and purchase versus “Next” and purchase? The answers can be eye-opening. Let’s say you have a business with $10 million in revenue and an average sale of $1,000; to reach $10 million, you need 10,000 customers to buy, which is 1% of 1,000,000 website visitors. If 1% of visitors press “Buy Now” and purchase, but 1.4% would press “Next” and purchase, you are losing revenue with “Buy Now.” Switching the CTA to “Next” yields 14,000 purchases, and revenue goes up by $4,000,000. One change in the CTA increases revenue.

And it is not just wording. Significant differences can be seen based on including images or videos, the color of your CTA, the size of the font, the white space surrounding the CTA, etc. In fact, there are so many factors that they can be overwhelming. I once read that Google tested 30 shades of blue before deciding on the optimum. Mid-size businesses usually can’t do that. But some simple A/B tests are very cost effective.
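The revenue arithmetic above can be sketched in a few lines (the 1,000,000-visitor figure is inferred from the 1% / 10,000-buyer example, and the function name is illustrative):

```python
def projected_revenue(visitors, conversion_rate, avg_sale):
    """Projected revenue: visitors who convert times average sale value."""
    return visitors * conversion_rate * avg_sale

visitors = 1_000_000  # implied by 10,000 buyers at a 1% conversion rate
avg_sale = 1_000      # dollars per sale, as in the example

buy_now = projected_revenue(visitors, 0.010, avg_sale)   # ~ $10,000,000
next_cta = projected_revenue(visitors, 0.014, avg_sale)  # ~ $14,000,000
print(next_cta - buy_now)  # the roughly $4,000,000 lift from one CTA change
```

Small percentage-point changes in conversion translate directly into large dollar amounts, which is why a one-word CTA test can be worth running.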
Which Test Won recently had a contest to evaluate tests. Here are some of the results. I have also included some others in the picture above.
Is a newsletter sent by e-mail once a week, shorter and with more pictures, more effective than a longer one sent each month? (The answer is yes: shorter, more frequent, with pictures wins.)
Does an offer of a free gift that costs you five dollars result in more sales than a money-back guarantee that costs you, based on additional returns, under two dollars? (The money-back guarantee wins in most cases.)
We talk a lot at Chief Outsiders about being data-driven marketing experts. Testing, analyzing, and iterating make a difference.
What about you? Tell us about your testing experiences.