
Sabine Hossenfelder tweets sarcastically about this.

https://twitter.com/skdh/status/1623728595297345538



Relevant xkcd "Linear Regression": https://xkcd.com/1725/


From one of the authors: "It's not a linear regression. This representation is a typical mistake when using mixed effect statistics. It is a "dyadic regression" which is a complex mixed effects procedure. We avoid such representation for ME but reviewers ask." (https://twitter.com/ulugeyik/status/1623654892639330304)

So there's more to the story, but that graph is really easy to make fun of because it looks like a linear regression.
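For anyone else who had to look it up: "dyadic regression" here essentially means a mixed-effects model where observations come in pairs (dyads), and each dyad gets its own random effect so the paired measurements aren't treated as independent. A minimal sketch of the idea in Python with statsmodels; the file name and column names (dyad_id, brain_measure, outcome) are made up, and the paper's actual model is surely more elaborate:

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: one row per paired observation, with a
    # dyad_id column identifying which pair it belongs to.
    df = pd.read_csv("dyads.csv")  # made-up file name

    # Fixed effect of interest plus a random intercept per dyad,
    # so measurements within a pair aren't treated as independent.
    model = smf.mixedlm("outcome ~ brain_measure", df, groups=df["dyad_id"])
    print(model.fit().summary())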


Interesting. That's not from the authors though: https://twitter.com/ulugeyik/status/1623749246322089984


Oh ok, it just seemed like it from the way he talked. But yeah, it is a dyadic regression, so there's more to the story (no idea what it is). It's still such a terrible graph that I would never want to put it in a paper.


Expert with controversial opinions in her area of expertise feels qualified to criticize the work of experts in an area she knows nothing about.

EDIT: Turns out it’s a dyadic regression and Sabine doesn’t know what she is talking about.


Brain Thing Correlates With Other Brain Thing is the expected result of any two Brain Things, because all the Brain Things correlate. The failure here is a variant of the Streetlight Effect; we looked for this one correlation, we found it, so it must be important. It's not, unless it's really strong, because in this space, everything correlates to everything. To be important, there needs to be something more than "a correlation"; that's so expected as to be a near zero-bit contribution to science. For there not to be any correlation would actually be the surprising thing!
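A toy illustration of that point: simulate two "Brain Things" that are each just a noisy readout of the same underlying factor, and a "significant" correlation falls out for free; only the strength of the correlation carries any information. (Python sketch; everything here is made up for illustration.)

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n = 2000

    latent = rng.normal(size=n)  # shared underlying factor
    # Two "Brain Things", each a weak, noisy readout of the same factor.
    brain_thing_a = latent + rng.normal(scale=3.0, size=n)
    brain_thing_b = latent + rng.normal(scale=3.0, size=n)

    r, p = pearsonr(brain_thing_a, brain_thing_b)
    print(f"r = {r:.3f}, p = {p:.2e}")  # r is tiny, yet p looks "significant"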


From my brief reading on dyadic regression, your first sentence is exactly why dyadic regressions are used in circumstances like this paper's. I don't know the validity of the paper, but I do strongly suspect Sabine doesn't either, and that her tweet says more about her ignorance of the topic than it does about the validity of the paper.


I find there is a substantial difference between the intended performance of statistical tools meant to compensate for such things and their actual performance. See also the attempt to filter out "confounding" variables in more conventional statistical analyses; I personally believe the effectiveness of such tools is greatly overstated and overestimated. There is a certain intrinsic circularity to any such tool.
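A quick made-up illustration of why "controlling for" a confounder can fall short: if you only observe a noisy proxy of the true confounder, regression adjustment removes just part of its influence, and a spurious association survives. (Python sketch under those toy assumptions; none of this is from the paper.)

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 100_000

    confounder = rng.normal(size=n)
    x = confounder + rng.normal(size=n)      # driven by the confounder
    y = confounder + rng.normal(size=n)      # also driven by it; no direct x->y effect
    proxy = confounder + rng.normal(size=n)  # the noisy measurement we actually have

    fit = sm.OLS(y, sm.add_constant(np.column_stack([x, proxy]))).fit()
    # The true direct effect of x on y is exactly zero, but the
    # "adjusted" estimate is not:
    print(f"x coefficient after 'controlling' for proxy: {fit.params[1]:.3f}")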


I don’t know who she is, but you don’t need to be a neuroscientist or a statistician to see how laughably bad that correlation claim is.


I don’t know neuroscience. I do know that sometimes there is so little known about a given area that one has to grasp at straws, so to speak. You look for clues and present the findings. I don’t know what one does with the data represented. It appears to me that the data doesn’t represent a function; there’s a lot of “verticality” in it. It’s obviously not exponential, logarithmic, parabolic, or some other higher-degree polynomial. So how does one fit such data? I don’t know. Is it dumb to use a linear fit? Or is it common practice with such data?


I mean, do you honestly think that the graph represents a good linear fit?


The data does not appear to me to represent a function; there is a lot of verticality to it. It doesn’t appear to me to be logistic, exponential, or a higher-degree polynomial. What is the best model? I don’t know. I’m not a statistician or a neuroscientist. Maybe they have reasons for doing a linear fit in this case that we aren’t aware of. Typically experts have reasons for what they do, and those reasons would probably seem reasonable to me if I knew them.
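For what it's worth, one way to ask "is a linear fit dumb here?" numerically rather than by eye is to look at R-squared instead of the slope's p-value: with enough points, the slope can be "significant" while the line explains almost none of the variance. A toy Python sketch (all data simulated):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    x = rng.normal(size=1000)
    y = 0.1 * x + rng.normal(size=1000)  # weak true slope, lots of scatter

    fit = sm.OLS(y, sm.add_constant(x)).fit()
    print(f"slope p-value: {fit.pvalues[1]:.2e}")  # likely "significant"
    print(f"R-squared:     {fit.rsquared:.3f}")    # ~0.01: explains ~1% of the variance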



