Testing Mind Map Series: How to Think Like a CRO Pro (Part 32)
Interview with Sean Clanchy
Swanky Australia’s Sean Clanchy is giving optimization aficionados some seriously sound advice. Sean recommends taking advantage of the vast resources available from CXL Institute and mentor networks and emphasizes the three core components of optimization success: having multiple data points to validate a hypothesis, picking one prioritization framework and sticking to it, and focusing on learning instead of winning.
For those hoping a one-size-fits-all approach will help them achieve their optimization goals, Sean quickly dispels that misguided notion — making it clear that every customer base and acquisition strategy requires a different approach.
So let’s dive right in and learn from the best!
Sean, tell us about yourself. What inspired you to get into testing & optimization?
Hi, I’m Sean Clanchy, Managing Director of Swanky’s Australian operation.
Swanky is a full-service Shopify Plus Expert agency. We are considered part of the Shopify agency old guard: our team has been working on the platform since 2010, when we helped Golden Bear toys launch their ecommerce store selling merchandise for the London Olympics.
Since then, Swanky have worked with some of Shopify’s largest global brands including HelloFresh, Huel, Wilkinson Sword, and Deepika Padukone’s new cosmetic range, 82E.
As for me… I’ve been everywhere!
I have a diverse background and a piggy bank of experiences, spanning procurement and warehouse operations in the Australian mining industry and advertising sales at one of Australia’s largest media groups, before stepping sideways into optimisation via my time with the team at Good Growth in the UK (prior to joining Swanky).
I joined Swanky as the Shopify ecosystem was going through the initial Shopify Plus boom (ably supported by Adobe/Magento’s push from Magento 1 to 2).
For how many years have you been optimizing?
6 years across 2 agencies and spanning media, B2B, and retail brands.
What’s the one resource you recommend to aspiring testers & optimizers?
CXL Institute is a great learning program that helps build foundational skills and an understanding of core CRO insight, prioritisation, and experimentation strategy.
Beyond that, I think mentors are extremely important. Whether you practice what they preach, or forge your own path, having the chance to sanity check your ideas and processes is invaluable. If you can learn from someone else’s mistakes… It saves a lot of pain and loss. Like anything, don’t assume what they say (or you think) is right.
Develop your own hypothesis > Test > Learn > Improve
Answer in 5 words or less: What is the discipline of optimization to you?
Opinion-free. Testing. Learning. Improving.
What are the top 3 things people MUST understand before they start optimizing?
- Have multiple data points indicating the what and where of your problem/opportunity and let them at least logically validate your hypothesis before you build a testing roadmap around it. One indicator is a waste of time.
- Decide on a prioritisation framework and stick to it. Frameworks are only as good as the rigour with which you apply them, and they are imperative to a sustained, efficient testing program. Flip-flopping test prioritisation is as frustrating for stakeholders and optimisers as flip-flopping tests themselves.
- Don’t get stuck on WINNING. Losing tests are fine. Inconclusive tests are the devil as they (typically) don’t provide insight. Get hooked on learning.
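To make the second point concrete, here is a minimal sketch of one common prioritisation framework, ICE (Impact, Confidence, Ease). Sean doesn't name a specific framework, so ICE and the backlog entries below are purely illustrative; the point is to score the backlog once and work through it in order rather than reshuffling priorities mid-programme.

```python
# Hypothetical ICE-style prioritisation sketch -- one of several common
# CRO frameworks (PIE, PXL, etc.); the interview only says to pick one
# and stick to it. Test names and scores below are made up.

def ice_score(impact: int, confidence: int, ease: int) -> float:
    """Average the three 1-10 ratings into a single priority score."""
    return (impact + confidence + ease) / 3

backlog = [
    {"test": "Simplify checkout form", "impact": 8, "confidence": 7, "ease": 5},
    {"test": "New hero headline",      "impact": 5, "confidence": 6, "ease": 8},
    {"test": "Add trust badges",       "impact": 4, "confidence": 5, "ease": 8},
]

# Rank once, highest score first, then execute the roadmap in order.
ranked = sorted(
    backlog,
    key=lambda t: ice_score(t["impact"], t["confidence"], t["ease"]),
    reverse=True,
)

for t in ranked:
    score = ice_score(t["impact"], t["confidence"], t["ease"])
    print(f'{t["test"]}: {score:.1f}')
```

Any comparable framework works; the value comes from the consistency, not the formula.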
How do you treat qualitative & quantitative data to remove bias?
To start with, we standardise terms and data definitions to avoid mis-classification of data when utilising different data sources.
We also determine the scope of dependent and independent variables to mitigate the risk of going down a rabbit hole.
We leverage multiple analysts to code qualitative data, with the data anonymised for initial coding. We then move on to the evaluation of segment specific data after coding.
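One way to sanity-check that multiple analysts are coding qualitative data consistently is an inter-rater agreement statistic such as Cohen's kappa. The interview doesn't specify Swanky's tooling, so this is just an illustrative sketch with hypothetical labels and responses.

```python
# Hypothetical sketch: measuring agreement between two analysts who
# independently coded the same anonymised feedback. Cohen's kappa
# corrects raw agreement for the agreement expected by chance.
from collections import Counter


def cohens_kappa(codes_a: list, codes_b: list) -> float:
    """Chance-corrected agreement between two raters' code assignments."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    # Observed agreement: fraction of items coded identically.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected agreement: chance overlap given each rater's label frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    labels = set(codes_a) | set(codes_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)


# Two analysts coding the same six survey responses (made-up data).
analyst_1 = ["price", "shipping", "price", "ux", "ux", "shipping"]
analyst_2 = ["price", "shipping", "ux", "ux", "ux", "shipping"]
print(f"kappa = {cohens_kappa(analyst_1, analyst_2):.2f}")
```

A low kappa would flag that the code definitions need tightening before moving on to segment-level evaluation.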
We also use multiple data sources to validate quantitative data to ensure any reporting error is picked up and corrected before moving into hypothesis development and prioritisation.
What is the most annoying optimization myth you wish would go away?
“I’ve done this before and it always works!” or “I’ve seen it on their website so it must work!”
Every website/app/experience platform has a unique customer base, acquisition strategy and inbound user expectation. Thinking that a one-size-fits-all approach works well is ludicrous.
Audience composition and expectation are continuously changing. You should never be “finished testing”.
Download the infographic above and add it to your swipe file for a little inspiration when you’re feeling stuck!
Thank you for joining us for this exclusive interview with Sean. We hope you’ve learned some valuable insights from his experiences and advice, and we encourage you to implement them in your optimization efforts.
Check back twice a month for upcoming interviews! And if you haven’t already, check out our past interviews with CRO legends Gursimran Gujral, Haley Carpenter, Rishi Rawat, Sina Fak, Eden Bidani, Jakub Linowski, Shiva Manjunath, Deborah O’Malley, Andra Baragan, Rich Page, Ruben de Boer, Abi Hough, Alex Birkett, John Ostrowski, Ryan Levander, Ryan Thomas, Bhavik Patel, Siobhan Solberg, Tim Mehta, Rommil Santiago, Steph Le Prevost, Nils Koppelmann, Danielle Schwolow, Kevin Szpak, Marianne Stjernvall, Christoph Böcker, Max Bradley, Samuel Hess, Riccardo Vandra, Lukas Petrauskas, and our latest with Gabriela Florea.