I’ve run almost exactly the same number of experiments as the author, and experienced the same frustrations until recently.
It sounds as though the author needs to combine the "How we improve conversion without A/B testing" section with user tests and data analysis to build stronger hypotheses. Since we've put more effort into the research phase, our success rate has improved dramatically.
The real value of A/B testing is validating meaningful hypotheses that help you learn what matters to your customers, not unrelated individual improvements. By only observing the overall trend, you miss out on this.
Disagree. The main reason we abandoned a/b testing is that it's a major resource drain. Getting better at a/b testing would be another resource drain. Time is finite & how you use it determines where you end up.
The point of the article isn't to say a/b testing doesn't work or shouldn't be done. It's simply to say many companies are misusing their resources by following the standard thinking on a/b testing.
We could spend another month figuring out how to get better at a/b testing or we could spend our time on another activity that produces a higher return on time spent. It turns out that's what we did and the return has been much better.
I couldn't agree more. Simply spending the time to understand why you are testing certain elements is valuable on its own.
The author doesn't mention what kind of tests were performed, but often when I read about the "futility" of A/B testing, it's usually due to a lack of up-front preparation and discussion of the test's true objectives. So the classic "red button vs. green button" might get you a result in the testing software, but it doesn't necessarily translate to more sales/leads or whatever the ultimate goal is.
The post wasn't about the "futility" of a/b testing. It's about the costs of a/b testing that are rarely considered. There's plenty of "pro" a/b testing advice in the world that leads startups to mistakenly waste resources on it.
A solid a/b testing process doesn't just happen without time & effort. You need someone smart analyzing user behavior and forming hypotheses as well as a talented design team to develop concepts that test them.
We simply acknowledged the costs of doing a/b testing well and decided our resources are better spent elsewhere.
I wrote about this recently here: https://medium.com/@nickboyce/5-steps-to-a-better-a-b-testin...