Do you mean to ask if I sent three of my colleagues on a multi-year doctorate research program for a question that everybody already seems to know the answer to and agree on, or if I let my team work on my product?
Sounds like you're a decision maker and you don't like being told your decisions are being made on a fragile basis. :)
Let me tell you that the history of statistics is littered with unsound but plausible-looking decisions to which "everyone already knew the answer".
One example: https://en.m.wikipedia.org/wiki/Survivorship_bias
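A toy simulation makes the bias concrete (all numbers here are invented, purely for illustration): if the failures drop out of the dataset before you measure, the average you compute is inflated, even though no individual measurement is wrong.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: 10,000 ventures whose true mean annual return is 0%.
true_returns = rng.normal(loc=0.0, scale=0.15, size=10_000)

# Ventures that lose more than 10% "die" and vanish from the records.
survivors = true_returns[true_returns > -0.10]

print(f"True mean return:    {true_returns.mean():+.3f}")  # ~ +0.000
print(f"Survivors-only mean: {survivors.mean():+.3f}")     # noticeably positive
```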
So yes, that's exactly what I mean. Though if they need multi-year training for a simple A/B test, then you should fire them :)
> Sounds like you're a decision maker and you don't like being told your decisions are being made on a fragile basis. :)
I have bad news for you if you think this doesn't apply to like 99% of management decisions (and product decisions, and...). Even, and perhaps especially, the ones that are trying to be "data driven": usually the data's shit, but everyone just pretends it isn't because they don't want to spend the time and money to do it right.
The "modern" "scientific" "data-driven" business world mostly runs on hunches and prejudice.
If it were so simple, everyone would be doing it correctly. Even you don't give a simple account of what needs to be done. What you have done is talk down to someone else. All in all, I think your entire argument would be a lot more sympathetic if you made it by giving a concrete example of your own approach (what would you do? Is that really simple?); otherwise you will have to get used to folks not listening to you.
> Even you don't give a simple account of what needs to be done.
Fair enough - though I did mention a few basic things like A/B testing and survivorship bias (trying to be aware of one's own cognitive biases is the first step in making better decisions; to be aware of them, you first have to know them).
And yes, it is not trivial at all; it depends on the context.
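For what it's worth, the mechanics of a bare-bones A/B significance test fit in a few lines once the data is in hand - here's a sketch using scipy's chi2_contingency, with invented conversion counts:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: [converted, not converted] for each variant.
table = [[120, 880],   # variant A: 1000 users, 120 conversions
         [150, 850]]   # variant B: 1000 users, 150 conversions

chi2, p, dof, expected = chi2_contingency(table)
print(f"p-value: {p:.4f}")  # small p -> the difference is unlikely to be chance
```

The hard part is the design (randomization, sample size, when to stop), not the arithmetic.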
The best and simplest starting point would be to make a list of potential confounding variables (I mentioned a few of them) and then just look at the R^2 statistic.
That alone should already point him in the right direction.
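To sketch what I mean (statsmodels here; the variable names and data are placeholders): include the candidate confounders in the regression alongside the decision variable, then read off the coefficients and R^2.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500

# Placeholder data. 'tenure' drives both the decision and the outcome,
# so it is a genuine confounder in this toy setup.
tenure  = rng.normal(5.0, 2.0, n)
team_sz = rng.normal(8.0, 3.0, n)
treated = (tenure + rng.normal(0.0, 2.0, n) > 5.0).astype(float)
outcome = 2.0 * treated + 0.8 * tenure - 0.3 * team_sz + rng.normal(0.0, 1.0, n)

# Regress the outcome on the decision *and* the candidate confounders.
X = sm.add_constant(np.column_stack([treated, tenure, team_sz]))
fit = sm.OLS(outcome, X).fit()

print(fit.params)     # [const, treated, tenure, team_sz] estimates
print(fit.rsquared)   # share of variance the model explains
```

Leave 'tenure' out of the fit and the coefficient on 'treated' absorbs part of its effect - exactly the kind of fragile conclusion I'm talking about.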
I was bothered less by his methodology not being 100% correct than by his casually making a statistically unsound decision which has a big effect on people's lives (so getting it right is important) - and then, when called out, dismissing it.
I have a lot of sympathy for people honestly trying to make decisions rigorously and getting it wrong; but you have to try.