
Why Your A/B Tests Fail


Today I wrote a blog post on ConversionXL about what we learned from A/B tests performed on the A/B testing suite Convert Experiments. Here's a short quote from that article.

It’s only after collecting and analyzing as much research as possible, and doing some basic hypothesis testing with wireframes, that the agencies get into the actual A/B testing process.

Now, you might not believe it, but there’s a fairly common issue with A/B testing tools: a “blink” occurs while the tool decides which variation to show the visitor, and it can really skew test results.

This is mostly attributed to slow site speed or poor test setup, but it’s an important thing to consider. We have seen that these blinks can lower conversions by up to 18%.

Much of this has to do with how the tool serves the variation (is the variation served client side or server side?). Certain tools are also vulnerable, allowing competitors to peek at your experiments using this script. Some A/B testing tools that I’m sure don’t have this blink problem are Sitespect, Google Analytics Content Experiments, and Convert.com (ours).
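To make the client-side mechanics concrete: a client-side tool rewrites the page *after* the original has already rendered, which is what produces the blink. The standard mitigation is an "anti-flicker" snippet that hides the page until the variation is applied, with a safety timeout so a slow script never leaves the page blank. Here is a minimal sketch of that idea; the helper names and the `window.abToolReady` flag are hypothetical, not the API of any particular tool:

```javascript
// Minimal anti-flicker sketch (all names hypothetical).
// A client-side A/B tool swaps content after first render; hiding the
// page until the swap is done trades the visible "blink" for a short delay.

// Build the temporary style that hides the whole page.
function buildAntiFlickerStyle() {
  return 'body { opacity: 0 !important; }';
}

// Reveal when either the tool has finished applying the variation,
// or the safety timeout has elapsed (so a blocked script never
// leaves the visitor staring at a blank page).
function shouldReveal(toolLoaded, elapsedMs, timeoutMs) {
  return toolLoaded || elapsedMs >= timeoutMs;
}

// Browser-only wiring; skipped outside the DOM (e.g. under Node).
if (typeof document !== 'undefined') {
  var style = document.createElement('style');
  style.id = 'ab-anti-flicker';
  style.textContent = buildAntiFlickerStyle();
  document.head.appendChild(style);

  var start = Date.now();
  var TIMEOUT_MS = 2000; // reveal no matter what after 2 seconds

  var timer = setInterval(function () {
    // window.abToolReady would be set by the testing tool once the
    // chosen variation is in place (hypothetical flag).
    if (shouldReveal(window.abToolReady === true, Date.now() - start, TIMEOUT_MS)) {
      clearInterval(timer);
      var el = document.getElementById('ab-anti-flicker');
      if (el && el.parentNode) el.parentNode.removeChild(el);
    }
  }, 50);
}
```

The timeout is the key design choice: it caps the worst case at a brief delay instead of a blank page, which is why server-side serving (where the variation arrives already rendered) avoids the problem entirely.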

Enjoy the full article on ConversionXL, here.

Originally published August 28, 2014 - Updated January 18, 2022
