Testing Mind Map Series: How to Think Like a CRO Pro (Part 13)
Interview with Alex Birkett
When it comes to optimization, there are few people as well-versed as Alex Birkett.
But what exactly is “optimization”?
Alex says it’s a combination of disciplines, including copywriting and experimentation.
It’s about building the right systems and processes to reduce (not eliminate) uncertainty, and understanding that there is a point at which attempting to eliminate uncertainty has diminishing returns.
And no, optimization isn’t the answer to all business or UI problems.
In this interview, we’ll delve into Alex’s top tips for developing data literacy and establishing a successful experimentation program. You’ll learn what to consider before you even begin optimizing your site and how to keep track of your tests effectively. So read on if you want to take your optimization efforts to the next level!
Alex, tell us about yourself. What inspired you to get into testing & optimization?
My name is Alex Birkett. I live in Austin, Texas and have a dog named Biscuit.
I write at alexbirkett.com, run a content marketing agency called Omniscient Digital, and run the experimentation program and team at Workato. Obviously, this keeps me pretty busy. But outside of that, I’m passionate about health and wellness. So I spend a lot of time in yoga studios, saunas, CrossFit gyms, ski resorts, etc.
When I was in college, I read a lot of Ryan Holiday’s stuff, which got me interested in marketing. He wrote a book called “Growth Hacker Marketing” right around the time I was graduating college, and it included tons of figures like Sean Ellis, who talked about A/B testing and quantitative, data-driven growth. That intrigued me, so I signed up for Optimizely when they offered free accounts and started playing around.
I got a job at a very early stage tech company in Austin, but I kept reading blogs like CXL, Conversion Sciences, and Marketing Experiments, so when I saw Peep Laja open up a role in Austin for a “content & growth marketer,” I jumped at the opportunity.
And that was the start of the rabbit hole. The next few years I spent at CXL were like getting a graduate degree in optimization, experimentation, and data-driven marketing in general. I got to network and learn from the top experts in the industry, run experiments myself, and write about all the stuff I was learning. It was a dream opportunity for a young and hungry experimentation nerd.
How many years have you been optimizing for? What’s the one resource you recommend to aspiring testers & optimizers?
I took a bunch of social psychology classes in university where we ran classic behavioral experiments, but the first time I ran a test on a website was in 2014. I didn’t know what I was doing, though. The first time I knew what I was doing was at CXL in 2015/2016. So I’ve been optimizing professionally, or at least semi-proficiently, for about 6-7 years.
If there’s one resource I can recommend, it’s CXL (including CXL Institute).
“Optimization” isn’t really a discipline, though – it’s several overlapping disciplines. So I’d probably recommend indexing heavily on one of those and getting good at it before you try to become an “optimizer” (which in my opinion doesn’t really exist – that’s more of a mindset).
Copywriting? Copyhackers.com and the classic direct response copywriting books.
Experimentation? Netflix and Airbnb engineering blogs, Ronny Kohavi’s book, and a lot of practice.
It really depends what rabbit hole you want to go down.
I also suggest joining a community for whatever you hope to accomplish. CXL has a great Facebook group. The Measure Slack is great for analytics and general data-driven stuff.
Answer in 5 words or less: What is the discipline of optimization to you?
Making better business decisions.
What are the top 3 things people MUST understand before they start optimizing?
- What we call optimization is mostly uncertainty reduction (you collect X amount of information to reduce uncertainty by Y).
- You can never fully reduce uncertainty and there’s a point of diminishing utility in trying to do so.
- Optimization isn’t always the answer to your business problems, and knowing when it is and isn’t is a big strategic advantage.
How do you treat qualitative & quantitative data so it tells an unbiased story?
You’re never going to get a fully “unbiased” story, so I optimize for “expected value” when working with data.
There’s always a cost to collecting data – in terms of time (the opportunity cost of running an experiment or collecting survey responses) or money (software, developers, designers, etc.).
The impact or risk of a given decision also factors into how much I want to “spend” on data to reduce that uncertainty.
If a decision is make-or-break for the business, and there’s a feasible way I can collect enough data to be quite sure of a good decision, I’m going to spend the time and money to do it.
If a decision doesn’t really matter, it’s a waste of my time and money to spend any time collecting qualitative or quantitative feedback. I just make the decision in that case.
But generally speaking, I like to collect *enough* data and no more than that to make what I feel is an appropriately risk-weighted decision for the task at hand. Sometimes I (*gasp*) go with my gut. Sometimes I talk to like 5 customers and feel confident in my qualitative data. Sometimes I run a rigorous experiment for 4 weeks and use statistical analysis to move forward.
It all depends; there’s no one-size-fits-all answer.
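When the situation does call for the "rigorous experiment with statistical analysis" option, the standard workhorse for comparing two conversion rates is a two-proportion z-test. This is a minimal stdlib-only sketch with made-up traffic numbers, not a description of any tool Alex uses.

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int,
                          conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided z-test comparing conversion rates of control (A) and variant (B)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))        # two-sided p-value
    return z, p_value

# Hypothetical 4-week test: 5.0% vs 6.5% conversion on 4,000 visitors each.
z, p = two_proportion_z_test(200, 4000, 260, 4000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these illustrative numbers the lift is significant at the conventional 0.05 level; with smaller samples the same observed lift often isn't, which is exactly why the decision's stakes should dictate how long you run the test.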
One thing I’ve come to learn is that more data can also create more problems, especially for organizations with little data literacy and limited ability to parse the data to make good decisions.
What kind of learning program have you set up for your optimization team? And why did you take this specific approach?
At Workato, the process is pretty simple.
Experiment ideas can come from multiple teams and sources – sometimes it’s a strike of creative inspiration from the sales team, sometimes it’s a well-researched idea from the brand team. Sometimes it’s the product of conversion research (all of which we log in our Airtable database).
Once an idea is prioritized, the owner is required to fill out an experiment document – this includes the learning objective, hypothesis, background research, experiment design, and action items upon conclusion.
Once the experiment is finished, it is analyzed and the experiment document is updated with conclusions and learnings. This is tagged and added to our Airtable archive.
This Airtable is available to anyone in the company, and we also do a weekly experiment review meeting as well as a weekly newsletter with running, planned, and concluded experiments (which anyone can sign up for).
Our team also speaks at periodic company meetings to teach and evangelize the ways of experimentation.
I took this approach because I believe in the power of iteration and learning as well as process and education, but you can’t overwhelm a team and company. Everyone has their own goals and tasks, and while we think experimentation is the life and death of a business, it’s not the first thing everyone else thinks about when they wake up. My job is to evangelize and educate people, but also reduce friction in doing so. I want people to be *excited* about experimentation and want to get involved, not to think of it as burdensome or homework.
So my learning program is designed to be as lightweight and frictionless as possible, with escalating opt-ins for people who want to be more involved.
What is the most annoying optimization myth you wish would go away?
I don’t know if this is a myth, but it’s a common belief that those who work in CRO just sort of have the answers to your UI problems. They don’t. They can have a broader swath of data points for pattern matching (which is what best practices are), and that can be helpful. But you can’t just look at a website or landing page, tear it down, and automagically “optimize” it for profit. If you could, you would be rich as f*ck, because that takes a few hours and you could charge hundreds of thousands for the value of doing so if it actually produced ROI.
The system and process you build around making better decisions is what experimentation or optimization is about. Not a bunch of pattern matching in a CRO ninja’s head.
Sometimes, finding the right test to run next can feel like a difficult task. Download the infographic above to use when inspiration becomes hard to find!
Hopefully, our interview with Alex will help guide your experimentation strategy in the right direction!
What advice resonated most with you?
Be sure to stay tuned for our next interview with a CRO expert who takes us through even more advanced strategies! And if you haven’t already, check out our interviews with Gursimran Gujral, Haley Carpenter, Rishi Rawat, Sina Fak, Eden Bidani, Jakub Linowski, Shiva Manjunath, Andra Baragan, Rich Page, Ruben de Boer, and our latest with Abi Hough.