
It's not impossible. We know how many fabrications are discovered. You can take a random sample of papers and analyze them in depth to estimate the number of papers that are fabricated.
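As a rough sketch of what that estimate looks like (the audit numbers below are made up, and a Wilson score interval stands in for whatever estimator you'd actually use):

    import math

    def wilson_interval(successes, n, z=1.96):
        # 95% Wilson score interval for a binomial proportion.
        p_hat = successes / n
        denom = 1 + z**2 / n
        center = (p_hat + z**2 / (2 * n)) / denom
        half = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
        return center - half, center + half

    # Hypothetical audit: 12 of 500 randomly sampled papers found fabricated.
    lo, hi = wilson_interval(12, 500)
    print(f"point estimate {12/500:.1%}, 95% CI {lo:.1%}-{hi:.1%}")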


>You can take a random sample of papers and analyze them in depth to estimate the number of papers that are fabricated.

That assumes that the fabrications in those papers will be revealed upon further analysis.

And, what's worse, that such analysis is economically feasible.

Most meta-studies don't even verify the original results; they just assume they're sound.


That makes it merely expensive, not impossible.

The Reproducibility Project got a long way on a modest budget: it attempted to replicate 100 studies for about $15M and got similar results in only 35% of them. https://en.wikipedia.org/wiki/Reproducibility_Project


Since you're looking for a (presumably) small signal, you need to analyse quite a lot of papers to determine anything. And of course, who can trust a giant analysis study without further analysis?
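To put a very rough, back-of-the-envelope number on that, here's the standard sample-size formula for estimating a proportion; the 2% base rate is purely hypothetical:

    import math

    def papers_to_audit(p_guess, margin, z=1.96):
        # Audits needed so the 95% CI on the rate is +/- margin.
        return math.ceil(z**2 * p_guess * (1 - p_guess) / margin**2)

    # Hypothetical: pin down a ~2% fabrication rate to within +/-0.5%.
    print(papers_to_audit(0.02, 0.005))  # -> 3012

And each of those ~3000 audits is itself an in-depth analysis, which is where the cost blows up.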



