Testing Mind Map Series: How to Think Like a CRO Pro (Part 68)

By Oliver West · November 18, 2024

Interview with Oliver West

Every couple of weeks, we get up close and personal with some of the brightest minds in the CRO and experimentation community. 

We’re on a mission to discover what lies behind their success: to get real answers to your toughest questions, share hidden gems and unique insights you won’t find in the books, and condense years of real-world experience into actionable tactics and strategies.

This week, we’re chatting with Oliver West, Head of Customer Experience at VML and an award-winning, industry-respected Customer Experience leader, author, speaker, and mentor with over 25 years of hands-on experience working with some of the world’s biggest brands.

Oliver, tell us about yourself. What inspired you to get into testing & optimization?

Unlike some of your other guests, I’m not a dedicated optimisation expert. I come from a UX background, but not the visual design side—I’m honestly rubbish at that! My focus has always been on the strategic, research and analytical aspects of CX. 

Early in my career, it was really important to me to show the value of UX thinking and user research, and I found experimentation to be the perfect tool for that. At first, I used testing to prove a point: if research uncovered something valuable, experimentation allowed me to prove it to stakeholders in a way that was indisputable. 

It became a method to demonstrate that our hypotheses weren’t just based on opinions—they were backed by data. 

Plus, I love that A/B testing is so easy to explain to stakeholders and can deliver clear, actionable results.

How many years have you been testing for?

I started doing basic testing in the early 2000s when I was running my own digital agency. Back then, the tools weren’t as powerful, were nowhere near as easy to use, and were expensive. So, we didn’t have the luxury of dedicated tools; instead, we set up some basic server-side experiments by directing users to different versions of key pages. Honestly, looking back at the analytics we had set up then, I wouldn’t be surprised if they weren’t as accurate as we thought. If I were to share those early experiments, my reputation might take a hit! But hey, we all start somewhere, right?
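The kind of basic server-side split described above can be sketched in a few lines. This is a hypothetical illustration, not the actual setup Oliver used: it deterministically buckets each visitor into a page variant by hashing a stable identifier (such as a session cookie), so returning users keep seeing the same version.

```python
import hashlib

# Hypothetical variant names for illustration only.
VARIANTS = ["control", "variant_b"]

def assign_variant(user_id: str, experiment: str = "homepage_test") -> str:
    """Deterministically assign a user to a variant for a given experiment.

    Hashing the experiment name together with the user ID means the same
    user always lands in the same bucket, and different experiments split
    users independently of one another.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]

# A request handler would then render or redirect accordingly, e.g.
# serve /index_b.html when assign_variant(session_id) == "variant_b".
```

Logging which variant each visitor saw alongside the conversion event is what makes the later analysis possible, however crude the analytics stack.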

What’s the one resource you recommend to aspiring testers & optimizers?

I’d highly recommend growth.design. It’s an amazing resource that introduces sometimes complex behavioural science principles in a really digestible and relatable format for anyone to learn from. They take everyday experiences of household brands like McDonald’s or Netflix and highlight areas for improvement based on psychology.

Answer in 5 words or less: What is the discipline of optimization to you?

Strategic, insight-backed improvement.

What are the top 3 things people MUST understand before they start optimizing?

  1. The human brain: Understanding basic behavioural science and how the brain works is essential. Data is great, but when you combine it with an understanding of why people behave the way they do, it becomes incredibly powerful. As a hiring manager, I always prioritise candidates with psychology backgrounds. It’s hard to design effectively if you don’t know the basics of how humans operate. It’s not complicated—evolutionary psychology, for instance, teaches us that when cavemen saw crowds running, either from danger or towards food, they would follow. That’s “social proof” in its simplest form. You can learn a lot from following people like Rory Sutherland if you don’t want to spend years studying.
  2. Understanding what an insight really is: A lot of people throw the word “insight” around, but true insights are rare and incredibly powerful when you find one. An insight isn’t just a piece of data or an interesting observation—it’s a deeper understanding that fundamentally shifts how you see a problem. For example, discovering that 60% of users drop off at a certain point is not an insight—it’s a finding. The insight comes when you dig deeper and realise why that drop-off happens. Maybe it’s because users are confused by the language, or because there’s an emotional barrier they’re hitting.

    True insights often uncover hidden motivations, frustrations, or needs that weren’t obvious at first glance, and they lead to solutions that are more impactful and meaningful. This kind of deep understanding can completely transform your approach and have a significant impact on results. When you hit on a real insight, it not only helps you solve the immediate problem but often uncovers opportunities you didn’t even know existed. If you’re looking for a deeper dive into what constitutes an insight and how to find one, I cover this extensively in my book Customer Journey Mapping for Business Growth, which you can download for free on my website.
  3. Use multiple sources for test ideas: Don’t just rely on one input for your test plan. Mix it up—lean on the analysts for interesting patterns that highlight an issue or opportunity, use user testing to identify usability issues, run in-depth interviews to get deeper insights into what users really want, and don’t be afraid to trust your instincts, especially if you’ve been doing user research for years. Some of my best results came from gut instincts, informed by years of experience in front of customers.

How do you treat qualitative & quantitative data to minimize bias?

That’s a brilliant question, and honestly, I could talk about this for hours because there are so many layers to consider. In simple terms, I always try to cross-reference qualitative and quantitative data to minimise bias. One informs the other—so, for example, you can use quantitative data to generate hypotheses that you then explore further with qualitative methods, like interviews. Or you start with qualitative insights and validate them with numbers. This back-and-forth between the two is essential to avoid drawing conclusions based on one source alone.

But it goes much deeper than that. I like to pay close attention to segmentation and sampling to make sure the right audience is being analysed. We should also ensure the testing environment itself is unbiased, and I like the data to be properly cleansed to avoid skewing results. On the qualitative side, it’s about creating an environment where participants feel comfortable and asking neutral questions to avoid influencing their responses. Whether you’re conducting interviews, user testing, or focus groups, the way you structure those interactions is key to avoiding bias.

Ultimately, this question isn’t just about data—it’s about recognising the potential for bias at every stage of the process. From the way data is collected, to how it’s analysed, to how we interpret results, you have to be conscious of these biases creeping in. So, it’s not just about validating data with different sources but interrogating every step to ensure you’re getting a true reflection of reality. It’s a much deeper question than it might first appear, and getting it right takes a lot of attention to detail across both sides—qualitative and quantitative.

That said, let’s not forget the practical side to this. While it’s critical our findings are valid and reliable, we can’t afford to spend so much time perfecting the process that we lose sight of our main goal or exhaust resources. It’s all about striking a balance between depth and ROI—ensuring that the hypotheses are well-founded without overcomplicating the process. At the end of the day, it’s about driving actionable insights that move the needle, so you need to be smart about how far you dig and when to pull back to focus on delivering results with the resources you have.

How (to you) is experimentation different from CRO?

CRO is about improving specific metrics, often tied to moving users through a sales funnel for a product or service. It’s very metric-driven. 

Experimentation, on the other hand, is a much broader discipline. You’re testing ideas—not just for CRO, but for anything, really. You could be testing a new product concept, or even running Google ads for a product that doesn’t exist yet, just to gauge demand. Experimentation is the best friend of innovation—once you have a solid insight and an idea, you need to experiment to see if it’s worth pursuing before you invest heavily. It helps you decide whether to scale the idea or shelve it.

Talk to us about some of the unique experiments you’ve run over the years.

Maybe not all of them are unique, but a few really stand out to me, and I use them as examples, especially because they show how simple actions can have such a big impact on growth.

One example was a headline test for a trading platform. Taking cues from psychology, we changed the headline on the signup form from “Join today for free” to “Join over 2.5 million traders.” That tiny change resulted in an 11% increase in signups. It’s a great example of using behavioural science, in this case, social proof. People want to do what others are doing, so showing that millions of others were already on the platform made a huge difference.
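Before acting on a lift like that, you’d want to check it isn’t noise. A common way to do so is a two-proportion z-test on the signup rates of the two headlines. The visitor and signup counts below are invented for the example; only the roughly 11% relative lift mirrors the result described above.

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: 5.0% vs 5.55% conversion, an ~11% relative lift.
z = two_proportion_z(1000, 20000, 1110, 20000)
# |z| > 1.96 corresponds to p < 0.05 for a two-sided test.
```

With these made-up sample sizes the lift clears the conventional significance bar; with far fewer visitors, the same 11% lift would not, which is why sample size matters as much as the headline change itself.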

Another was with a retail bank’s credit card signup process. Conversions were poor, and it was easy to see why—there were so many things wrong with it. The page was cluttered, broke multiple usability principles, and had an animated background that was a massive distraction. The bank had partnered with a well-known brand and insisted on using this animated image to highlight the partnership. To them, it looked beautiful and dynamic, but to a trained eye, it was a disaster in terms of usability and focus. We ran three versions: the original with the animated background, one with a static image, and one with no image at all. Unsurprisingly, the animation dragged down conversions considerably. This was one of those occasions where my gut instinct was spot on and highlighted the need to remove distraction.

Finally, I ran a test to evaluate product demand using Google ads. We were trying to figure out which type of training course would resonate most with users. Google ads were a quick, relatively cheap way to get insights into which concepts had potential. You could argue that the sample was biased because it’s based on who’s searching, but it’s still an effective way to weed out bad ideas early. You can even take it a step further by setting up a landing page with a waitlist. If people are willing to leave their email, you’ve got a stronger signal of commitment beyond just clicking an ad. This ties in with Alberto Savoia’s concept of “skin in the game”—getting users to show commitment by giving up something, even just their email address, adds a layer of validation. It’s a great way to test product ideas and messaging before investing too heavily in development.


Cheers for reading! If you’ve caught the CRO bug, you’re in good company here. Be sure to check back often; we have fresh interviews dropping twice a month.

And if you’re in the mood for a binge read, have a gander at our earlier interviews with Gursimran Gujral, Haley Carpenter, Rishi Rawat, Sina Fak, Eden Bidani, Jakub Linowski, Shiva Manjunath, Deborah O’Malley, Andra Baragan, Rich Page, Ruben de Boer, Abi Hough, Alex Birkett, John Ostrowski, Ryan Levander, Ryan Thomas, Bhavik Patel, Siobhan Solberg, Tim Mehta, Rommil Santiago, Steph Le Prevost, Nils Koppelmann, Danielle Schwolow, Kevin Szpak, Marianne Stjernvall, Christoph Böcker, Max Bradley, Samuel Hess, Riccardo Vandra, Lukas Petrauskas, Gabriela Florea, Sean Clanchy, Ryan Webb, Tracy Laranjo, Lucia van den Brink, LeAnn Reyes, Lucrezia Platé, Daniel Jones, May Chin, Kyle Hearnshaw, Gerda Vogt-Thomas, Melanie Kyrklund, Sahil Patel, Lucas Vos, David Sanchez del Real, Oliver Kenyon, David Stepien, Maria Luiza de Lange, Callum Dreniw, Shirley Lee, Rúben Marinheiro, Lorik Mullaademi, Sergio Simarro Villalba, Georgiana Hunter-Cozens, Asmir Muminovic, Edd Saunders, Marc Uitterhoeve, Zander Aycock, Eduardo Marconi Pinheiro Lima, Linda Bustos, Marouscha Dorenbos, Cristina Molina, Tim Donets, Jarrah Hemmant, Cristina Giorgetti, Tom van den Berg, and Tyler Hudson.

Written By
Oliver West
CX leader, author, speaker, and mentor
Edited By
Carmen Apostu
Head of Content at Convert
