
42 Conversion Rate Optimization and A/B Testing Stats For 2022

28th May 2021

Are you interested in the state of CRO and experimentation in 2022?

Then look no further. We’ve curated, vetted, and organized the most up-to-date A/B testing stats below.

As can be expected, last year was a unique time for digital businesses and CRO. Some websites saw a burst in traffic from visitors at home, while other industries completely shut down.

This has led to some big changes in certain areas, while also giving us far more data points in other industries that we can use right now.

Wherever possible, we’ve tried to create a hypothesis of why certain things are happening.

So let’s dive in…

Stats About A/B Testing Tools and the A/B Testing Industry

1. The global CRO software market is set to grow to $1.81 Billion dollars by 2025. (Business Insider)

2. Less than 0.11% of the total websites online are using CRO tools or running tests. (Builtwith)

At the time of writing this (May 2021), there are 1,197,982,359 websites on the internet.

According to Builtwith, a tool that tracks the software that websites are using, they can only find 1,360,910 sites using CRO testing tools at this time, meaning only around 0.11% are testing!

3. Of the top 10,000 largest sites in the world (based on traffic volume), 48.4% or more of those sites are running CRO tests. (Builtwith)

So these are the top traffic sites in the entire world.

In reality, there is probably a much higher number of them testing, but the top traffic sites may be using custom-built tools that Builtwith can’t track.

4. When tracking the top 100,000 sites in the world (in terms of site traffic), only 19.05% of them are using CRO tools. (Builtwith)

5. When tracking the top 1 Million sites in the world (in terms of site traffic), only 5.83% are using CRO tools. (Builtwith)

(In each result we excluded the previous segment to get a more accurate representation of the change.)

It’s clear that sites with lower traffic volumes focus less on CRO, while those with the highest traffic volumes focus on it much more.

It would be interesting to see the cause and effect.

Do those sites test more because they have more traffic, or are more tests helping them to bring in more traffic by being more efficient?

Stats about the A/B Testing Process

6. 94% of beginner testers are not prioritizing tests objectively. (Speero)

This could mean 2 things:

  • People are deciding which tests to run next based on gut feel or random choices, rather than testing the most important, high-impact thing first.
  • They’re doing the math in their heads instead of using a hypothesis generator.

The thing is, we all have biases. Without using a tool to decide objectively, we can waste time on tests that mean nothing instead of running the tests that could see the largest return.
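If you want a feel for what objective prioritization looks like in practice, here’s a minimal sketch of PIE-style scoring (Potential, Importance, Ease), one common prioritization framework. The test ideas and scores below are made up purely for illustration:

```python
# A minimal sketch of PIE-style test prioritization (hypothetical ideas and scores).
# Each idea is scored 1-10 on Potential, Importance, and Ease,
# and the backlog is ranked by the average of the three.

def pie_score(potential, importance, ease):
    """Average the three PIE dimensions into one priority score."""
    return (potential + importance + ease) / 3

# Hypothetical backlog of test ideas: (Potential, Importance, Ease)
ideas = {
    "Simplify checkout copy": (8, 9, 7),
    "New homepage hero image": (5, 6, 9),
    "Rebuild pricing page": (9, 8, 3),
}

ranked = sorted(ideas.items(), key=lambda kv: pie_score(*kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: {pie_score(*scores):.1f}")
```

Even a simple spreadsheet version of this beats deciding in your head, because the scores force you to justify each idea before it eats your traffic.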


Codecademy started prioritizing their tests and saw a 28% increase in the annual plan conversion rate.

7. Only 20% of experiments reach the 95% statistical significance mark. (Convert)

We ran a study of 28,000+ users of our Convert app. 80% of the tests run were stopped before they reached statistical significance.

This can be down to 2 things:

Either the idea didn’t work like you thought it might, or people were stopping tests too soon.
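For the curious, here’s roughly what “reaching statistical significance” means under the hood. This is a generic two-proportion z-test sketch (not Convert’s actual stats engine), with hypothetical visitor and conversion numbers:

```python
# A generic two-proportion z-test sketch (not Convert's stats engine).
# Given conversions and visitors for two variations, it returns the
# two-sided p-value; p < 0.05 corresponds to the 95% significance bar.
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF via erf

# Hypothetical test: 10,000 visitors per variation
p = z_test(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
print(f"p = {p:.4f} -> significant at 95%: {p < 0.05}")
```

The key practical point: the p-value only means something if you pick your sample size up front and let the test finish, which is exactly what 80% of testers aren’t doing.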

The thing is…

8. The average lift on experiments that reached statistical significance was 61%. (Convert)

Not only that, but 1 out of every 7.5 tests gave a significant lift in results.

On average you won’t find a winner, but when you do, it will often outweigh the tests before it (if not from a single large win, then from compounding future results).

The key of course is to run more tests.
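A quick back-of-envelope calculation shows why. Taking the numbers above (1 winner per 7.5 tests, with winners averaging a 61% lift):

```python
# Back-of-envelope arithmetic on the stats above:
# if 1 in 7.5 tests wins, and winners average a 61% lift,
# what's the expected lift per test you run?
win_rate = 1 / 7.5          # ~13.3% of tests produce a winner
avg_winning_lift = 0.61     # 61% average lift among winners

expected_lift_per_test = win_rate * avg_winning_lift
print(f"Expected lift per test: {expected_lift_per_test:.1%}")  # ≈ 8.1%
```

An expected ~8% lift per test run is exactly why volume matters: the losers cost you little, and the occasional winner pays for all of them.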

9. Over 40% of CRO testers are creating archives of past tests to examine and learn from in the future. (CXL + Convert)

10. Convert users run more tests per month than the average tester. (Convert)

In a study of 28,000+ users of the Convert app, we found Convert users consistently running more tests per month than the industry average. (We removed some outliers who were running hundreds of tests per month!)

11. Only 17.8% of testers are running server-side tests, while 27.1% are running server- and client-side together. (CXL + Convert)


Did you know that the Convert tool can do both server- and client-side tests?

12. A/B tests are the most popular experiment among Convert’s users, at 80.92% of all tests run. (Convert)


Only 0.78% of people are running Multivariate tests and 0.57% are running personalization campaigns.

The lack of MVT experiments can be attributed to traffic requirements for a statistically significant test for multiple variations.
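To see why, here’s a rough sample-size sketch using the standard two-sample formula for detecting a lift in a conversion rate (95% confidence, ~80% power). The baseline rate and lift below are hypothetical:

```python
# A rough sketch of why multivariate tests need so much more traffic.
# Standard two-sample size formula for detecting a relative lift in a
# conversion rate (95% confidence, ~80% power), multiplied by the
# number of variants being compared.
from math import ceil

def visitors_needed(base_rate, relative_lift, variants=2,
                    z_alpha=1.96, z_beta=0.84):
    """Approximate total visitors needed across all variants."""
    delta = base_rate * relative_lift            # absolute lift to detect
    p_avg = base_rate * (1 + relative_lift / 2)  # average rate across arms
    n_per_variant = (z_alpha + z_beta) ** 2 * 2 * p_avg * (1 - p_avg) / delta ** 2
    return ceil(n_per_variant) * variants

# Hypothetical: 3% baseline conversion rate, hoping to detect a 10% relative lift
print(visitors_needed(0.03, 0.10, variants=2))  # simple A/B test
print(visitors_needed(0.03, 0.10, variants=8))  # 2x2x2 multivariate test
```

An eight-variant MVT needs four times the traffic of a simple A/B test at the same sensitivity, before you even account for correcting for multiple comparisons. For most sites, that traffic simply isn’t there.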

Personalization may just be a case of people needing to learn how to implement it better. (Especially when it can affect your lift so much.)

Stats About A/B Testing Maturity

We’ve already seen that most companies are not even practicing CRO. (Less than 1% of all sites online!)

But there’s a big difference between those who are just starting to run tests and those who have baked testing into their company culture.

Spoiler alert:

The companies who are heavily focused on CRO and ‘maturing’ their experimentation programs are usually the market leaders as you’ll see from the results in just a second.

I just wanted to give a quick shout-out to the team at Speero for their fantastic work on their new Maturity Benchmark report.

They walk you through the 5 levels of maturity in experimentation programs (from beginner to transformative), where you can improve and mature your own testing, along with some really interesting data.

I highly recommend that you read the full thing, but I also wanted to point out a few interesting stats that stood out to me.

13. Industries with a focus on conversions such as SaaS, Tech, Retail, and Ecommerce tend to have the most maturity in their testing programs. (Speero)

Why is this?

Like I said, these particular industries are usually very conversion-focused. With that comes the math and understanding behind tracking metrics like AOV, LTV, MRR, churn rate, etc.

These numbers are ingrained into these types of companies. Rather than having to convince their boss to run tests, testing and data analysis are embraced at all levels.

This means they will often run more tests more often, get more results, improve their conversions and pull ahead of their competitors.

14. Surprisingly, 33% of the businesses that are at the highest maturity level have only been A/B testing for a year or less. (Speero)

You would think the companies that have been testing for 5+ years would have the most mature testing programs in place, but they make up only 44% of the companies surveyed.

So why would this happen?

Let’s say you’re a fairly new SaaS company with VC funding and the goal of scaling up. You already know your current metrics, and you’re probably looking at new methods of recording data points and creating a strategy around them.

If you’re already tracking that core information, it’s relatively easy for you to justify doubling down on CRO methods, as every new winner compounds, and you have scaling goals to hit.

15. 12% of companies at the ‘beginner’ maturity level have been A/B testing for 3-4 years. (Speero)

Although they’ve been running for longer, their campaigns are only at the beginner level. (Perhaps performing a few tests per month.)

Why would this be?

The reality is that companies that are less ‘conversion focused’ tend to start CRO, but never really evolve their process. Usually, someone on the team takes on the additional responsibility and runs a few tests, but struggles to get buy-in from management to dive deeper or add further funds.

(A 2020 Hubspot study of 3,400 marketers found that 46.15% of them did NOT measure customer acquisition costs.)

This creates a bottleneck in tests performed and results achieved, causing management to not see the ROI they should.

They’re just not getting enough data to make decisions, or worse, they’re making choices based on gut feelings instead.

16. 0.11% of companies are running A/A tests. (Convert)


Now, this could be because of a few things.

An A/A test is run to make sure that:

  • Your current results are accurate, and
  • Your testing tool is giving you correct results.

Not many people realize this or test it. (We’ve actually started improving our onboarding process to help people start with A/A tests).

The most mature programs will have run an A/A test when starting out and possibly again during the year just to keep a check that everything is working correctly.
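If you’ve never run an A/A test, here’s a toy simulation (all numbers hypothetical) of what it’s checking: split identical traffic in two, test for significance, and count how many “winners” are pure noise. At a 95% threshold you’d expect a false-positive rate of roughly 5%:

```python
# A toy A/A simulation (hypothetical numbers): both "variations" are the
# same page with the same true conversion rate, so any significant result
# is a false positive. Significance uses a two-proportion z-test.
import random
from math import sqrt, erf

random.seed(42)

def significant(conv_a, conv_b, n, alpha=0.05):
    """Two-proportion z-test: is the difference between arms significant?"""
    p_pool = (conv_a + conv_b) / (2 * n)
    se = sqrt(max(p_pool * (1 - p_pool) * 2 / n, 1e-12))
    z = abs(conv_a - conv_b) / n / se
    p_value = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))
    return p_value < alpha

n, rate, runs = 1_000, 0.05, 1_000  # visitors per arm, true rate, simulations
false_positives = 0
for _ in range(runs):
    a = sum(random.random() < rate for _ in range(n))  # arm A conversions
    b = sum(random.random() < rate for _ in range(n))  # arm B conversions (same page!)
    if significant(a, b, n):
        false_positives += 1

print(f"False positive rate: {false_positives / runs:.1%}")
```

If your real tool reports “winners” in an A/A test far more often than about 5% of the time, something in your setup (randomization, tracking, or stats) deserves a closer look.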

17. 88% of companies with revenue of $500+ Million are using Customer Feedback tools. (Hotjar)

18. 67% of companies with the most mature experimentation programs share their key findings across the company, compounding their results further, while 94% of beginner programs don’t share their findings outside of their department and hierarchy. (Speero)

Imagine if you found certain images and copy that doubled the lift on your landing page, but you never told your ad department, so they kept running the same campaigns.

The information alone could both cut your ad cost and increase CTR and sales.

It’s clear that the most mature programs build a culture of sharing and experimentation. Not only are they sharing their findings, but the market leaders are even teaching other departments how to test for themselves and giving them the freedom and tools to do so.


19. Out of 2,000 companies surveyed by Hotjar, 80% don’t collect qualitative data, and those same companies make less than $1 Million per year. (Hotjar)

20. At the same time, 60% of those companies that use data to make one-off decisions make on average $1 Million+ a year, while 16% of the companies surveyed who use data to make continuous improvements made more than $500 Million per year. (Hotjar)

Challenges Faced by Testers (and How to Solve Them)

Based on the last section, you can probably guess some of the key issues that testers are facing.

Some you might not have thought of before…

21. Language barriers and population sizes can directly affect the number of tests you might run. (Convert)


In a study of 28,304 users of our app, we found that those in the US run more statistically significant test variations than those in Western Europe.

It could be that US-based companies are simply testing more variations, but it probably comes down to audience size. A larger audience in the same language allows you to run more variations overall.

22. The 4 biggest challenges that testers face are: having testing processes, getting ‘buy in’ from management, having enough traffic to test effectively, and being able to spare someone to test! (CXL + Convert)

23. 27.4% of companies have a single tester responsible for their experiments, while 26.3% have a dedicated team in place. (CXL + Convert)

Again, this ties into what we’re seeing with regards to the ‘maturity’ of testing programs.

You have companies testing now and again, some with a single full-time tester, some with experimentation teams, and others who have even taught each team how to test.

24. Although agency and in-house teams seem to get the same average lift results from winning tests, agencies had almost 21% MORE wins. (Convert)

One of the major issues we see testers talk about is having a lack of process when performing experiments.

This could well be a maturity issue. You might have a single person running tests while also doing their day job, etc.

The difference with agencies is that tests are all that they do, so they need to have set processes they can follow for every new client and each campaign so that they can be both efficient and cost-effective.

All of those tests and clients give them experience of high-priority areas that can cause the biggest lift in the shortest time. The agencies are prioritizing areas where they see the easiest wins first.


Stats About Conversion Rate Optimization Effectiveness

25. Approximately 33% of digital marketers in the US and UK are devoting more than 50% of their marketing budget to personalization. (Statista)

According to a 2019 report, 93% of U.S. internet users received marketing content that was not relevant to them, while 62% were willing to opt in to more personalized marketing campaigns.

It’s no wonder that personalization has seen such a focus in the past year.

26. The number of marketers using Machine Learning for personalization grew from 26% in 2018, up to 46% in 2020. (Statista)

27. The highest converting landing pages of 2020 were in the Catering and Restaurants industry at 18.2%, followed by Media at 18.1%. (Unbounce)


Just a quick note:

Unbounce put together a fantastic report using their tool’s Machine Learning to measure and analyze the conversion rates on their customers’ pages. It’s definitely worth a read!

I’ve pulled some of their most interesting stats below for the SaaS and B2B space, along with some other reports.

28. The highest landing page CTR in 2020 belonged to Catering and Restaurants pages at 21.2%, followed by Legal pages at 19%. (Unbounce)

At the same time, it’s interesting to see that Media pages had the highest form completion rate of 11.5%.

(I’m guessing it wasn’t just me who ordered food deliveries and watched every streaming platform then.)

29. Simplifying your landing page copy can double your conversion rate in the SaaS space. (Unbounce)

Software companies saw a jump from a 4% conversion rate to 8% when they wrote at an elementary school reading level.

This seems to be true across every industry to varying degrees.

30. B2B pages can see a 50% lift by simplifying the language on their landing pages. (Unbounce)

31. Forget scare tactics. Focusing your copy on a positive outcome for the consumer can see a higher conversion rate. (Unbounce)

32. The average B2B conversion rate is just 3.5%. (Unbounce)

Ironically, Lead Gen companies seem to have a lower conversion rate than some other B2B industries.

33. Social and Paid Social convert at almost double that of other channels in the B2B space. (Unbounce)

Far more traffic is coming through from these channels than others, perhaps indicating a focus on paid acquisition.

It could also indicate that the content these sites create isn’t performing or ranking in organic search.

34. Physical stores can see a 168% increase in conversions when tracked back to a digital touchpoint. (Wolfgang Digital and Google)

Obviously, this needs to be tested more and tracking set up, but it helps to see the value in improving online UX and touchpoints for physical locations.

35. User-Generated Content (UGC) can help increase conversion rate by 161%. (Yotpo)

Yotpo ran a study of 200,000 ecommerce stores and 163 million orders to see how UGC content affected sales.

The study results varied by industry, but in every instance, it provided a lift.

36. Tablets had the highest conversion rate of any device in 2020 (3.32%), beating out Desktop at 2.1% and Mobile at 2.01%. (Statista)

(It varied slightly depending on the location.)

37. Tablets also had the highest average page views of any device. (Statista)

Although not strictly a conversion stat on its own, it might be worth checking if your campaigns are optimized for tablet views.

38. Chrome users had the highest conversion rate of any browser at 3.76%. (Statista)

39. Chatbots can have an interaction rate (those who saw and clicked) between 2% and 12%. (Leadoo)


With more people interacting online, it seems chatbots are helping people to find more information.

Even more interesting is the conversion rate from clicking on the bot to becoming an MQL.

40. Chatbots can convert anywhere from 5% to 35% of chats to leads. (Leadoo)

If you’re selling a complex offer, then chatbots could help increase conversions through automation.

41. Only 17% of marketers who use Hubspot are A/B testing their landing pages when running paid traffic. (Hubspot)

This kind of makes sense: most landing pages are tested and improved with a control audience first, before new traffic channels are pushed to them.

In this instance, you can see that the marketers are focusing on improving lift elements in the ad itself and then will probably test the page again once they have a control ad to work with.

42. Conversion rates on opt-in forms can improve by almost 50% by reducing the number of form fields from 4 to 3. (Hubspot)

In a study of 4000 customers, Hubspot noticed that the conversion rate dropped the more form fields someone had to fill out.

The largest decrease was when they went up from 3 form fields to 4.


So there you have it. The most interesting, important, and up-to-date stats that we could find on CRO and A/B testing.

What are your takeaways?

As I see it, the majority of sites are not testing, the middle-of-the-road performers are running some A/B tests, and the top performers are testing heavily.

It’s also clear that UX and customer feedback are important. Is it the factor that’s causing the growth of the most mature teams? Perhaps… it certainly can’t hurt to improve your customer experience and retain them for longer.

One thing is certain: The most mature teams are not just running tests. They are building a culture of experimentation, training, and sharing the results. They will often build an ongoing repository of their past tests to learn from and then they take action based on those results, not opinions.

Originally published May 28, 2021 - Updated May 08, 2023
