10 Case Studies To Help You Get More Clicks

Regardless of whether you want people to join your email list, make a purchase, follow you on social media or read the rest of an article, you’ll most likely need them to click on something. This “something” is how you call your visitors and readers to action.

There are many ways to do this, and the words you choose can have a big impact. You’ll need to do some testing to find what gets the best response from your audience. To give you an idea of where to start, here are 10 case studies you can learn from.

1. What Role Does Relevance Have?

Unbounce reported on a test WriteWork ran on button text:

“Read Full Essay Now” is the winner; “Get Instant Access Now” decreased conversions by 34%. This is most likely an issue of relevance – people are reading part of the essay and want to read the rest of it. “Read Full Essay Now” satisfies that need better.

2. Are You Asking or Telling?

ABTests reported on a test Rypple ran on the text in their buttons:

“Respond Now” brought in 13% more clicks than “Give Feedback.” This might be because “Respond Now” sounds more direct and influences the person to act promptly, especially with the exclamation point added at the end. “Give” also tends to entail more sacrifice, e.g. you give gifts to people.

3. Does “Free” Trump “Try”?

ABTests reported on a test Firefox ran on their button text:

“Download Now – Free” had 4% more conversions. This might be because of the word “free”; other tests have also shown that using “free” can bring in more results. It’s possible that the results were close because “Try Firefox 3” sounds less intimidating than downloading something – the next study will show why that might be the case.

4. Do Your Words Scare People Away?

We posted on our blog about a test the Cabot Heritage Corporation ran on their sign-up form buttons:

“Start my free subscription” decreased conversions by 22.9% in just two days. The word “subscription” entails more commitment, and that can scare anyone away. Notice the winning text also has the word “Free” in it.

5. Is It Clear What You Want To Happen?

ABTests reported on a test Dustin Curtis ran on his link text:

The winning link brought in 172% more clicks. This could be because it was clearer what would happen: “I’m on Twitter” could have just been linked to Twitter’s homepage, which wouldn’t be very helpful. The winning link makes it obvious that you will be set up to follow Dustin Curtis on Twitter if you click it. It’s also more actionable; it tells you what you should do.

6. Does “Free” Always Win?

ABTests reported on a test InDefero ran on navigation bar links:

“See plans and pricing” had a 52% increase in clicks. “Free Hosting” is certainly desirable, but it doesn’t communicate how much information will be shared. It would be interesting to run this test somewhere other than the navigation bar; visitors may be looking for specific information in navigation links, so a special offer may work better in a different spot on the page.

7. What Does Your Audience Want?

Visual Website Optimizer reported on a test Veeam ran on a link in their sidebar:

“Request pricing” brought in a 161.6% increase in clicks. It’s actually pretty easy to determine why this happened: Veeam surveyed their audience and found that most people wanted an easier way to find pricing on the site. By changing “quote” to “pricing,” this test confirmed that people were looking for that exact word.

8. Are You Using the Right Vocabulary?

ABTests reported on a test Gamesforlanguage.com ran on their button text:

“Instant Demo!” increased clicks by 83%. While “play” sounds more fun, “instant” means it will be quick and less time-consuming. “Demo” has a helpful connotation – I’ll learn what it is I’m getting into before being thrown into it.

9. Is It Better To Be Personal?

ABTests reported on a test Xemion Web Designer Directory ran on their navigation link text:

“Add Your Company” increased clicks by 43%. Both of these communicate the same goal, but “Advertising” doesn’t speak in terms of “your company.” This personal touch makes it feel like Xemion is speaking to you, not at you. “Yes, I would love to add my company to your directory,” might be the answer in the visitor’s mind.

10. Do Multiple Requests Impact Each Other?

Sometimes the words of other calls for action you have included can influence click-through rates. Visual Website Optimizer reported on a test by Artsy Editor that demonstrated this:

Artsy Editor’s goal was to get people to their pricing page. The first variant had the best result – a 47% increase in click-throughs to that page. The second brought in a 17% increase, and the last showed no improvement. Having the price in the button could have created more friction; people weren’t thinking about the cost until they saw it in that button. Artsy Editor might want to test different price points, but it’s clear that people prefer a demo/trial offer over a pitch to buy something.

Don’t Forget To TEST!

Researching case studies and experiments run by other companies is a great way to figure out what you should test. Keep in mind that you won’t necessarily see the same results – your audience is your own, and it might respond differently. These results are not rules to follow; think of them as inspiration to run your own split tests.
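When you do run your own split tests, a quick significance check keeps you from acting on noise. Here is a minimal sketch using a two-proportion z-test; the variant names and all of the numbers are invented for illustration:

```python
from math import sqrt, erf

def two_proportion_z(clicks_a, visitors_a, clicks_b, visitors_b):
    """Return the z-score and two-sided p-value for a difference in click rates."""
    rate_a = clicks_a / visitors_a
    rate_b = clicks_b / visitors_b
    # Pooled click rate under the null hypothesis that both variants are equal
    pooled = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_a - rate_b) / se
    # Two-sided tail probability from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results for two button texts
z, p = two_proportion_z(120, 1000, 85, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below 0.05 is the conventional threshold for treating the difference as real rather than luck; with small samples, a large-looking percentage lift often fails that bar.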

Do You Have Test Results To Share?

We love to hear about test results! Please share what you’ve found in your own testing – be it emails, sign-up forms or website pages.

Need help getting visitors to sign up on your web form?

This guide walks you through how to split test the web forms on your site. Whether you want to test the design, the call to action or different images, this guide will help you with where to start and includes more case studies.


Do Buttons Get Clicked More Than Text Links?

Many of our readers have already signed up to the live seminar on split testing that we announced last week.

But even if you can't make it, you're probably interested in learning more about split testing now, right?

Fortunately, we happen to have a case study on hand that shows just the sort of information you can learn about your email marketing campaigns by conducting split tests.

Today, let's look at a split test that we ran on our own blog newsletter to get more of you to come to the site and read the latest posts.

The Test

Last year, Marc and I were discussing how to increase clickthroughs on the emails we send to our blog subscribers.

One of the ideas that came up was to replace the text links that we had been using to drive people to the blog with a "button."

Previous testing on the website had shown that in many cases, buttons make better calls to action than text links do. We thought the same might hold true for email.

So, Marc created a button-shaped image with the words "Read More" stamped on it:

We then created A/B split tests for our Blog Broadcasts, inserted this image into one version as the call to action (to read the full post on our blog) and continued to use text links in the other version as we had before.

The emails were otherwise identical — we kept subject lines, sending dates/times and templates the same for each version.

Measuring Our Success

Clicks-to-Opens: clicks divided by opens.


Since we're trying to get people to visit our site to read the full blog posts, we want to compare clickthrough rates.

We chose to use clicks-to-opens as our measure of success. Instead of dividing clickthroughs by the number of emails sent, we divided them by the number of emails opened.

That way, if one version of the message got an unusually high number of opens, it wouldn't skew the results to make that version's call to action look more effective than it really was.
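The metric itself is simple arithmetic. A sketch of how the two versions might be compared (the counts below are invented for illustration):

```python
def clicks_to_opens(clicks, opens):
    """Clicks-to-opens rate: clicks divided by opens, not by emails sent."""
    return clicks / opens

# Hypothetical broadcast: the button version got more opens, so comparing
# raw clickthrough counts alone would overstate its call to action.
button = clicks_to_opens(clicks=180, opens=1200)
text = clicks_to_opens(clicks=150, opens=1150)
lift = (button - text) / text * 100
print(f"button {button:.1%}, text {text:.1%}, lift {lift:.1f}%")
```

Normalizing by opens isolates the call to action: once someone has opened the email, did they click?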

Our Expectations

Going into this test, we expected the button to beat the text links handily, for a few reasons:


  • It was physically larger than the text link.

  • It contained a clear call to action – “Read More” – while a contextual link might be less obvious.

  • It was an image placed in a part of the email where readers hadn’t previously been shown images.

Basically, we expected the button would grab people's attention as they scanned through the email.

On the flipside, we knew that readers might have images disabled and wouldn't see the button.

So we added the ALT text "Read More" to the button image.

Since the text “Read More” would appear in place of the button, we felt that even for readers with images disabled, the button should do at least approximately as well as the text link.
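In email HTML, that fallback is just the image tag’s alt attribute. Here is a sketch of what such a button’s markup might look like, built as a string (the URLs and dimensions are placeholders, not the markup we actually used):

```python
# If the reader's email client blocks images, the alt text "Read More"
# is rendered in place of the button graphic, preserving the call to action.
button_html = (
    '<a href="https://example.com/blog/latest-post">'
    '<img src="https://example.com/images/read-more-button.png" '
    'alt="Read More" width="120" height="40" border="0">'
    '</a>'
)
print(button_html)
```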

Initial Results

In our first split test, the button call to action outperformed our text link by 51.4%.

We started running the test on our Blog Broadcasts last year.

As we expected, the button grabbed readers' attention and enticed them to click through far better than the text link did.

Clicks-to-opens for the button was repeatedly higher — a lot higher — than it was for the text link.

In our first five split tests, the button drew a clicks-to-opens rate that was on average 33.29% higher than the text link clicks-to-opens rate.

At this point, about 2 weeks into our test, it was tempting to say, "The button clearly draws more attention and clicks than text links. Let's just start using buttons and move on to another test."

...But, We Kept Going!

We could have stopped after those first few tests — and in many cases, one-time or short-term split tests are appropriate.

However, even in our initial few tests, the text had beaten the button once, and by a large margin.

I wanted to see whether the button was doing better because it was a more compelling call to action in general, or because of the "novelty" of it.

So we continued to split test our Blog Broadcasts...

Later Results

Further testing showed that using buttons instead of text was not a good long-run tactic.

We ultimately ran the button-versus-text split test about 40 times, over the course of several months.

For a while, the button continued to beat the text links — but we noticed that it wasn't doing so by as large a margin as it first had.

While over our first five tests, the button beat the text by over 33%, after 20 tests it was only winning by an average of 17.29%, and the text version was beginning to hold its own in the win column.

Final Outcome

With each new split test, the text asserted itself as the better call to action.

By the time we ended our experiment, text links were consistently outperforming our button, winning nearly two-thirds of the time, by double-digit margins as high as nearly 35%.

Conclusions: What Happened?

The button is effective in the short run, but after a while readers become "numb" to it and no longer respond at the same initial high rate.

Consider the following two stats from our split tests:

  • Overall, text links outperformed buttons 53% of the time.

  • After an initial period where the button was more effective, text links outperformed buttons 67% of the time.

That first stat doesn't seem important in and of itself — 53% is barely over half the time.

However, for the first three weeks of the experiment, the button won two-thirds of our split tests. After that, the opposite became true — the button just stopped "working."
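Those two win rates are just counting over the roughly 40 tests. A sketch with an invented win/loss record that reproduces the same shape (the per-test outcomes below are made up, not our actual data):

```python
# Hypothetical record over 40 tests (1 = text link won, 0 = button won):
# the button dominates roughly the first 15 tests, then the text link
# wins about two-thirds of the remaining 25.
early = [0, 0, 1] * 4 + [0, 0, 0]   # 15 tests, text wins 4
late = [1, 1, 0] * 8 + [1]          # 25 tests, text wins 17

overall_rate = sum(early + late) / len(early + late)
late_rate = sum(late) / len(late)
print(f"text won {overall_rate:.1%} of all tests, "
      f"{late_rate:.1%} after the early period")
```

This is why the overall figure looks like a near coin flip even though the later trend is decisive: the early button wins drag the overall rate down toward 50%.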

Which brings us to conclusion #2:

Test results are not forever.

What works today may not work tomorrow.

Had we stopped our testing after one broadcast, or even one or two weeks, we would have concluded that buttons were better than text links.

It's important to continually test your email campaigns to make sure that you know what works, rather than assuming you know what works.

Finally, one last point I feel obligated to make:

What works for someone else may not work for you.

The text links won out in our split test, but that doesn't mean a button can't be an effective call to action for you.

Buttons may work well for you in the short run. Split test them.

We send our blog newsletter often — 2 or 3 times a week. So we exposed subscribers to the button often, which may have increased the speed with which they started to ignore it.

If you send less frequently, or only use a button for emails where you have a particularly compelling offer, you may find it to be effective.

Plus, we tested a specific button. Perhaps another one, with different design or wording, may have been more effective.

Again, don't just take our word for it. Find out for yourself through your own testing.

Learn More About Split Testing

Unfamiliar with split testing? Want to see how you can methodically raise response rates for your email marketing campaigns?

Join us for a free one-hour seminar on split testing tomorrow — Wednesday, March 26th:

Split Testing Emails and Web Forms

Wednesday, March 26, 2008

2:00 - 3:00PM ET


Register Now

What Do You Think?

What are your thoughts on this case study?

Have you tested calls to action or other elements of your email newsletters? What were your findings?

Can you think of any other factors that may have influenced our results?

Share your reactions on the blog!
