The Blink in Testing Matters!

By Dennis van der Heijden · July 11, 2012

We built a predecessor to Convert Experiments and launched it in May 2011. That tool had a powerful infrastructure without any client tagging and used only JavaScript. As you might know, there are several ways to test website visitors:

  • Server-side backend testing (i.e., code on the server randomizes the variation and has internal reporting tools)
  • Server-side front-end testing (i.e., tagging templates), but still using client-side swapping of content
  • DNS rerouting and swapping content at that point
  • Client-side testing with JavaScript tagging

Each of these methods has pros and cons. The method we selected while building Convert Experiments was client-side JavaScript tagging: a specific script placed in the header of the page, loaded asynchronously, that swaps content in the client's browser. We selected this method because only one snippet of script is needed to install it, which lowers the barrier to testing and lessens the impact on the technology departments that deal with the agencies. Reducing friction around accepting a new script that never changes can increase the acceptance of testing in a larger company, since no future change requests have to be made.
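
For context, a typical installation of such a snippet looks roughly like the sketch below. This is an illustration only; the CDN URL and account ID are placeholders, not Convert's actual snippet:

    <head>
      <title>Your Page</title>
      <!-- Hypothetical testing snippet, loaded asynchronously right after the title tag -->
      <script type="text/javascript">
        (function () {
          var s = document.createElement('script');
          s.src = '//cdn.example-testing-tool.com/js/12345.js'; // placeholder account script
          s.async = true;
          var first = document.getElementsByTagName('script')[0];
          first.parentNode.insertBefore(s, first);
        })();
      </script>
      <!-- ...style sheets and other scripts follow -->
    </head>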

How do we determine who sees what?

At first glance, A/B testing looks easy: just build a randomizer on the server in whatever programming language you are comfortable with and you are done... right? Well, no. In our software, the following steps are taken when a visitor comes to a website where our software is installed.

  1. The script in the header is the first script triggered by a pageview. (You did place it right behind the title tag, as we suggested, right?)
  2. We determine whether the latest version of the jQuery library is loaded on your site and, if not, we send it to the visitor's browser.
  3. The script checks whether a cookie can be set or whether blocking (opt-out) cookies are present.
  4. If blocking cookies are found, the process stops.
  5. The visitor's details are sent to our servers, including IP address, browser details, OS details, and an optional cookie ID.
  6. From the IP address we look up the geolocation (continent, country, region, and city), give the user a unique ID, and then delete the IP from the cache.
  7. We run the targeting rules for each test to see whether the user matches the group that should be targeted. To do that, we might look at up to 12 months of a user's history.
  8. Once we determine that a visitor qualifies for a specific test, we randomize the user into either a variation or the original version of the test (a minimal sketch of this step follows the list).
  9. We collect the content the visitor should see and present it.
  10. We record the goals the visitor hits in the test.
  11. If revenue is connected to the goal (as with Google Analytics e-commerce revenue tracking), we connect the transaction value and the ordered items to the specific variation or the original.
  12. Then we aggregate that data and present reports.
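
As a rough illustration of steps 7 and 8, here is a minimal sketch of how a client-side tool might bucket a qualified visitor. The function names, the hash, and the 50/50 split are assumptions for the example, not Convert's actual implementation:

    // Minimal sketch: deterministically assign a visitor to the original or a variation.
    // Hashing a stored visitor ID keeps the assignment stable across page views.
    function bucketVisitor(visitorId, testId) {
      var str = visitorId + ':' + testId;
      var hash = 0;
      for (var i = 0; i < str.length; i++) {
        hash = (hash * 31 + str.charCodeAt(i)) >>> 0; // unsigned 32-bit rolling hash
      }
      return (hash % 100) < 50 ? 'original' : 'variation'; // 50/50 split
    }

    // Example usage, assuming the ID was read back from the tool's cookie:
    if (bucketVisitor('a1b2c3d4', 'test-42') === 'variation') {
      // steps 9-10: fetch the variation content, swap it in, and track goals
    }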

Just imagine a site with 10 million page views, and a return visitor comes back. We then need to find all the actions this user took across 12 months of those records, and get the answer back in milliseconds. I have a lot of respect for our engineering team for returning results in milliseconds while searching 120 million visitor records.

Loading speed matters and, ohhh-no, it blinks

Now you can see that this is an involved process that can take time. In our case, it takes a couple of hundred milliseconds on the first page load (even though the file size of the snippet is between 24 KB and 80 KB) and is then four times faster on subsequent pages. Since this loading is done asynchronously, users will hardly notice it: we load while all of your content is loading (text, HTML, video, images, other scripts, and so on). Because we know page speed is important, we aim to make our system fast, and that makes Convert Experiments fast, very fast. Sometimes you still see a blink that is hardly noticeable, but as neuroscientists know well, this blink is very important for conversion.

If you are familiar with testing tools that work client-side (i.e., testing in the visitor's browser), then you know there can be a delay in the content being served. When the original version is shown, there is obviously no extra load time for that content. But variations take time: imagine the page being ready and then, on top of that, having to load the changes. It can happen that you see the original for a fraction of a second and then the variation. The user's eyes and thoughts are simply faster than the variation being loaded. What does this blink mean for conversion optimization?
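
One widely used mitigation, shown here purely as an illustration and not as Convert's patented approach, is to hide the tested element until the experiment code has run, with a short timeout as a fail-safe. The class name and timing below are assumptions:

    // Anti-flicker sketch: a style rule placed before the testing snippet hides
    // the tested element, e.g. .blink-guard { visibility: hidden !important; }
    // The experiment code then reveals it once the swap is done.
    function revealTestedContent() {
      var el;
      while ((el = document.querySelector('.blink-guard'))) {
        el.className = el.className.replace(/\bblink-guard\b/, ''); // unhide
      }
    }
    // Reveal right after applying the variation, and no later than 500 ms
    // after page start, so content is never hidden indefinitely:
    setTimeout(revealTestedContent, 500);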

Research on “The Blink”

When I attended the Conversion Conference in Chicago at the end of June, two speakers were keen on getting the first impression right. In his opening speech, Tim Ash said, "The brain forms its first impression of a website in just 50 ms, and change is scary and is usually bad." Michael Summers gave an amazing presentation in which eye tracking showed that users who participated in an A/B test completely missed the button swap being tested: their eyes were already farther down the page, so they missed an important call-to-action button.

Urban Outfitters and Godiva are good examples where the blink was clearly visible on pages where a test or personalization was active, and it can be seen happening on Qualaroo.com as well.

We ran our own experiments on the effects of the blink. We had a client selling B2B through a very large e-commerce site. We took his shopping cart apart and rebuilt it in the client's web browser. After hours of fire-bugging, we had it: almost no element kept its original place and, on top of that, it was all dynamic content. We loaded it into one of the early versions of Convert Experiments and found that even though we had thought of everything (trust, a simple page, the correct order, great photos, etc.), the page underperformed versus the original. It is a sad day when that happens to a conversion agency. We had to swallow our ego, go to the client with our tails between our legs, and say, "We learned great things today, but we did not lift conversions." What can you do, if your testing tool is right... right?

Is your testing tool right?

I doubted our own old tool, and since we had more insight into the technology and access to a larger set of data, we went hunting for the possible reason why my beautiful test failed.... It just could NOT BE ME (haha; I would be proven wrong... read on). We looked at this test in different browsers and found that, on average, the original page was shown for 90 milliseconds before the variation loaded. That is a century to the eyes and brain. Had I finally found proof that it was not me who failed, not my beautiful cart, but our own tool?

The A/B test on the cart page showed an 18% decrease in conversion at 99% significance using client-side testing with our own old tool. We then hard-coded the entire test inside the cart, ran the experiment manually as a split URL test, and ran a report on the data. We found a decrease of only 4% at 99% significance on the same test. What? A difference of 14 percentage points between the two ways of testing, hard-coded split testing versus dynamically swapping the versions in the browser?

I found some indication that I might have been right in my initial thought that the improved variation performed better than our old tool had reported... but it still underperformed compared to the original. We took a look at what happened and still saw a small blink, this time from the redirect. This is not uncommon: other tools, like Google Analytics™ Content Experiments (the former Google Website Optimizer) and some other big-profile tools in the testing space, experience this on redirects. Next, more hacking, since we had not solved it yet.

There are tools that publish the entire test details in a JavaScript file on their CDN, making it easy for business competitors (or a presidential campaign) to learn a great deal about the existing test variations. We did not want to stimulate a new industry of test mining, so we looked closer at the alternatives. Speeding up the CDN infrastructure would get us only a small step toward reducing the blink. So my co-founder Claudiu Rogoveanu played with the idea of not waiting for the browser's official signal that we could swap content, but instead analyzing the page structure as it loads so we could replace content before the human eye could capture the original version and notice the swap.
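
The details of that approach are in the patent filing, but the general idea can be sketched as polling the partially parsed DOM and swapping an element the moment it appears, instead of waiting for the DOMContentLoaded event. The selector, interval, and replacement content below are hypothetical:

    // Sketch: swap content as soon as the target exists in the streaming DOM,
    // rather than waiting for the browser's "document ready" signal.
    function earlySwap(selector, newHtml) {
      var timer = setInterval(function () {
        var el = document.querySelector(selector);
        if (el) {
          el.innerHTML = newHtml; // swap before the original is likely to be noticed
          clearInterval(timer);
        }
      }, 10); // poll every 10 ms while the document is still loading

      // Fail-safe: stop polling once parsing finishes, in case the
      // element never appears on this page.
      document.addEventListener('DOMContentLoaded', function () {
        clearInterval(timer);
      });
    }

    earlySwap('#add-to-cart', '<button id="add-to-cart">Buy now</button>');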

Implementing this (now patent-pending) technology, we had our first success in April 2012, when we could prove that on the same test we could reduce the effect of the blink to next to nothing. We again got a negative result, but this time it was 1% at 99% significance. Although I had proven that my test really did not help conversion (it hurt it), we learned something invaluable for the testing industry. Besides a hurt ego, the big lesson was: blinking matters.

The blink matters

In my opinion, we have arrived at an interesting point in Internet time, where we can personalize and test everything, but the blink that people might see has a very bad effect on users' perception of the website and, as my small test indicated, a very bad effect on the conversion rate while testing. I hope we will see more public studies on this blink so there is more data than my experience. But rest assured that Convert Experiments will not show a blink if the installation instructions given in the three-step wizard of any project are followed. To benefit from this no-blink feature, the code must be loaded before any other code or style sheet, preferably right under the closing title tag in the header. We are working on other amazing features and zero-code integrations with Google Analytics™ that require this code in the header, so please don't be too creative with the positioning of the code. If you want to integrate Google Analytics™ (GA) revenue tracking and send the test variables to GA as custom variables, having the code up high is the best way, if not the only way, to get it all working.
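
For readers wiring this up by hand, reporting a test assignment to classic Google Analytics (ga.js) as a custom variable looks roughly like the sketch below. The slot number, names, and values are placeholders, and newer GA versions use different APIs:

    // Hypothetical example: send the assigned variation to GA as a
    // session-scoped custom variable, then record the pageview.
    var _gaq = _gaq || [];
    _gaq.push(['_setCustomVar',
      1,                 // custom variable slot (1-5)
      'convert-test-42', // name: the test identifier
      'variation-b',     // value: the assigned variation
      2                  // scope: 2 = session-level
    ]);
    _gaq.push(['_trackPageview']);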

What does this mean for testing tools?

Everyone who tests should consider ways to avoid the blink, since it negatively impacts results. Besides our solution, there is split URL testing, but that introduces a redirect that can increase load time, produces misleading analytics data, and is not very useful on dynamic pages or for multivariate testing.
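
For comparison, a split URL test usually boils down to a redirect like the sketch below, which is exactly where the extra load time and the noisy analytics come from. The URLs are placeholders, and a real tool would persist the assignment in a cookie so return visits stay consistent:

    // Split URL sketch: send half the visitors to a separate variation page.
    // The full round trip to /cart-b is what adds load time, and the extra
    // pageview is what muddies the analytics data.
    if (window.location.pathname === '/cart' && Math.random() < 0.5) {
      window.location.replace('/cart-b'); // the variation lives at its own URL
    }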

Moving all variation content to the server side might be an alternative way to reduce the blink if you are not using Content Experiments, as would dropping targeting rules, such as geo-targeting or rules based on historical data stored in cookies, that require a round trip to the testing tool. Any traffic sent to the testing tool for decision-making can slow the client-side swap of content and introduce a longer lag in content display.
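
A server-side version of the same idea, sketched here in Node.js purely as an assumption of how such a setup might look, picks the variation before any HTML is sent, so there is nothing left to swap in the browser:

    // Minimal Node.js sketch: the server chooses the variation, so the
    // visitor's browser never renders the original first (no blink).
    var http = require('http');

    http.createServer(function (req, res) {
      // A real setup would read/set a cookie so the visitor keeps the same
      // variation on return visits; this sketch randomizes every request.
      var headline = Math.random() < 0.5 ? 'Free shipping' : 'Ships today';
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end('<h1>' + headline + '</h1>');
    }).listen(8080);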

It is important to reduce the blink if you are testing; we got some strong signals that it affects conversions in a negative way. Users of Convert Experiments do not have to change anything in their setup, as we have removed the blink so it is not noticeable to visitors. We hope you like it.

If you are interested in the technical details, take a look at our pending patent US20140013203.

Originally published July 11, 2012 - Updated January 20, 2022
Dennis van der Heijden is co-founder & CEO of Convert, a passionate community builder and out-of-the-box thinker.
