Testing Mind Map Series: How to Think Like a CRO Pro (Part 14)
Interview with ‘Positive’ John Ostrowski (Positive Experiments)
Good judgment and decision-making skills are essential for success in CRO. For John Ostrowski, optimization is all about understanding the problem, knowing your audience, and making growth a team effort.
He believes that any outcome metric can be improved to some degree through experimentation – but it’s not always worth the effort. You can’t just look at numbers in a vacuum – you have to consider opportunity costs, risks, and other factors. Knowing when “the juice is worth the squeeze” (a.k.a. prioritization) is one of the biggest challenges that experimenters face. This is where qualitative and quantitative data should be used to increase decision confidence.
John offers a fresh approach to CRO that is unlike anything else you’ve read before.
Let’s dive in…
John, tell us about yourself. What inspired you to get into testing & optimization?
From Brazil to the US, Hungary, and Poland. From mechanical engineer to data analyst, experimenter, and growth mentor.
I’ve been an optimizer no matter where or what I was doing.
The tools evolved.
Lean Manufacturing became Lean Startup.
Six Sigma became product growth frameworks.
Continuous improvement, Kaizen, stayed as the common denominator.
The tipping point for going full digital was a vision I had back in Hungary: “Imagine a world where I work in front of a laptop, from anywhere, with no commute to the factory…”
How many years have you been optimizing for? What’s the one resource you recommend to aspiring testers & optimizers?
Counting my days in industry, I’ve been optimizing something for about nine years. I’ve been running online optimization efforts, specifically, for about five years.
The first experiment I remember was between two cold-stamping machines with different setups to learn about the different tolerances between production lines. Fun times.
As far as resources go, I’ll recommend one course, one book, one podcast, and one newsletter.
Online course:
Reforge’s Experimentation Deep Dive takes the optimization game to a strategic level.
Book:
I can’t recommend Ronny Kohavi’s Trustworthy Online Controlled Experiments enough. Just reading it gives you superpowers.
Podcast:
Countless gold nuggets in Guido Jansen’s CRO.CAFE.
Newsletter:
Shameless plug for the Positive Experiments newsletter, my corner of the internet where I share my thoughts on growth experiments, financial freedom, and positivity 🙂
Answer in 5 words or less: What is the discipline of optimization to you?
Extracting juice worth the squeeze.
— Note that defining what “worth” means to your business is an exercise in itself.
What are the top 3 things people MUST understand before they start optimizing?
“If I had one hour to solve a problem, I’d spend 55 minutes thinking about the problem and 5 minutes thinking about solutions.”
- Understand the problem. What are you trying to solve, and why is it important? How does it help the business grow? What is the story you’re telling?
- Know your audience. Power dynamics are the invisible force in organizations that you are not exposed to during onboarding. How do stakeholders respond to data, and what types of data do they prefer? What makes them tick? Know your audience. Be a champion.
- Growth is a team effort and a long-term game. To me, “growth hacks” live in the same category as “get rich quick.” You hear stories about the few that succeeded, while the millions that failed are suppressed. Survivorship bias. WYSIWYG platforms only let individual contributors do so much. Know what is within your control and what is not. Learn how to find leverage, be it labor or capital.
How do you treat qualitative & quantitative data so it tells an unbiased story?
My decision-first approach to research is a mix of Behzod Sirjani and Douglas Hubbard.
To me, life, business, and investments are all about good judgment and decision-making.
Therefore, qual and quant become pieces of evidence for some decision that carries a risk.
I think through the five steps of Applied Information Economics (DDCMM):
- Define the decision.
- Determine what you know now.
- Compute the value of additional information against the cost of not deciding (a quick sketch of this check follows the list). If none, go to step 5.
- Measure where information value is high. (Return to steps 2 and 3 until further measurement is not needed.)
- Make a decision and act on it.
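To make step 3 above concrete, here is a minimal, hypothetical sketch of weighing the expected value of additional information against the cost of collecting it, in the spirit of Hubbard’s Applied Information Economics. The uplift distribution, the research cost, and the use of Python/NumPy are all illustrative assumptions, not John’s actual figures or tooling.

```python
# Hypothetical sketch of the step-3 check: is more evidence worth paying for?
# Loosely follows the "expected value of information" idea from Hubbard's
# Applied Information Economics. Every number here is made up for illustration.
import numpy as np

rng = np.random.default_rng(42)

# What we know now (step 2): a rough belief about the annual uplift (in £)
# from shipping a redesign. It might well be negative.
uplift = rng.normal(loc=20_000, scale=30_000, size=100_000)

# If we decided to ship today, the scenarios where the uplift is actually
# negative are the ones perfect information would have saved us from.
# Their average loss approximates the expected value of perfect information.
evpi = -uplift[uplift < 0].sum() / uplift.size

research_cost = 4_000  # hypothetical cost of one more round of user interviews

# Step 3: only pay for more evidence if its value exceeds its cost.
if evpi > research_cost:
    print(f"EVPI ~ £{evpi:,.0f} > £{research_cost:,}: measure where value is high (step 4).")
else:
    print(f"EVPI ~ £{evpi:,.0f} <= £{research_cost:,}: decide and act (step 5).")
```

Real research is imperfect, so its value sits below this perfect-information ceiling, which is exactly why step 4 concentrates measurement where information value is high.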
Selecting the research method (step 4) is a prioritization step in itself.
Know your audience and what evidence will increase confidence in the decision, then define the best method.
Avoid defaulting to “let’s run research method X” without first defining what type of evidence will be most useful to your decision.
Be aware of your audience – sometimes a stakeholder will respond better to customer interviews or anecdotes.
It’s worth mentioning that my most impactful experiments were hypotheses that originated from qual research, more specifically from happy-user interviews.
What kind of learning program have you set up for your optimization team? And why did you take this specific approach?
Our experimentation chapter is a group of 10 growth-minded individuals, lifelong students who love to be proven wrong by data.
Seventy years of combined optimization experience, from multiple backgrounds.
Apart from what you can expect from chapter ceremonies and peer review, we have Reforge for our team.
We ask everyone to go through both the Experimentation Deep Dive and the Growth Series.
Having a common language is an underestimated superpower for high-performing teams.
What is the most annoying optimization myth you wish would go away?
Experimentation? Err.. you mean CRO, right?
This is how I think this through, in four steps:
- There are foundational building blocks (Strategy, Tools & Tech, Process, People)
- Which help shape the governance structure (hybrid, centralized, decentralized)
- That must be in place for a culture of experimentation to flourish (AB testing flywheel)
- And depending on your organization structure (e.g. product-led), different workflows take shape. At iTech Media, those are OKR initiatives, CRO, and Safety Net (read more here).
As an industry, we are better off moving away from this false dichotomy.
We can do better.
Hopefully, our interview with John will help guide your experimentation strategy in the right direction!
What advice resonated most with you?
Be sure to stay tuned for our next interview with a CRO expert who takes us through even more advanced strategies! And if you haven’t already, check out our interviews with Gursimran Gujral, Haley Carpenter, Rishi Rawat, Sina Fak, Eden Bidani, Jakub Linowski, Shiva Manjunath, Andra Baragan, Rich Page, Ruben de Boer, Abi Hough and our latest with Alex Birkett.