This is brilliant. As someone who tries to fight the good fight in an organisation that plays fast and loose with statistics to bias decisions towards the outcomes everyone wants to hear, it's cathartic to read such a hilarious proof of my point of view.
Human intuition and statistics are incompatible, and it really is a field where a little knowledge is a dangerous thing.
The golden rule is this: your intuition is wrong, your assumptions about what you can and can’t determine from your results are wrong, and you need to ask a statistician.
This is such an incredibly stupid article: the author does a bunch of things wrong and then tries to turn it into some kind of gotcha about how the whole idea of testing is flawed.
Nobody should be trying to make statements like "I got a 300% increase in clicks" when one group had 12 clicks and the other 3.
That's not statistics; that's not testing. That's someone who saw an opportunity to dunk on a topic they knew nothing about to make themselves feel smart.
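To put a number on why a "300% increase" from 12 vs 3 clicks is mostly noise, here's a minimal sketch. One assumption not stated in the thread: the two groups were the same size, so under the null hypothesis each click is equally likely to land in either group, and conditioning on the total gives an exact two-sided binomial test.

```python
from math import comb

# Hypothetical counts from the comment above: one variant got 12
# clicks, the other 3. Assumption (not in the thread): equal-sized
# groups, so under the null each of the 15 total clicks is equally
# likely to fall in either group. Conditioning on the total turns
# the comparison into an exact binomial test -- a standard trick
# for comparing two small counts.
a, b = 12, 3
n = a + b  # 15 total clicks

def pmf(k: int) -> float:
    """Binomial(n, 0.5) probability of exactly k clicks in group A."""
    return comb(n, k) * 0.5**n

# Two-sided exact p-value: total probability of splits at least as
# lopsided as the one observed.
observed = pmf(max(a, b))
p_value = sum(pmf(k) for k in range(n + 1) if pmf(k) <= observed + 1e-12)

print(f"two-sided exact p = {p_value:.4f}")  # ~0.035: borderline at best
```

A 12-vs-3 split sits right at the edge of conventional significance, and the uncertainty on the ratio itself is enormous, so quoting "300%" as the headline is exactly the kind of overreach the comment is complaining about.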
Sometimes they're the boss, and it's a real uphill battle to persuade them of how stupid they're being. I don't work in that industry, but I know enough to know when to stop and have a very hard think about some numbers.
It's a common mistake to think that the trivial fundamentals of the field you're expert in are understood by the whole of humanity, or even, in some cases, by everyone who claims to be an expert in your field.