A good heuristic is that if an argument resorts to "actually not doing <something complex sounding>" or "just doing <something simple sounding>" etc, it is not a rigorous argument.
That seems somewhat similar to perplexity-based detection, although you can read off the probability of each token directly instead of picking from an n-best list, and you don't have to generate anything.
It kinda works, but is not very reliable and is quite sensitive to which model the text was generated with.
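The perplexity idea can be sketched with a toy unigram model standing in for a real language model's per-token probabilities (a real detector would score tokens under an actual LM; the corpus and the unknown-token floor here are illustrative assumptions):

```python
import math
from collections import Counter

def unigram_logprobs(corpus_tokens):
    # Toy "language model": unigram log-probabilities estimated
    # from a small reference corpus.
    counts = Counter(corpus_tokens)
    total = sum(counts.values())
    return {tok: math.log(c / total) for tok, c in counts.items()}

def perplexity(tokens, logprobs, floor=math.log(1e-6)):
    # exp of the average negative log-probability per token.
    # Tokens the model has never seen get a small floor probability.
    nll = -sum(logprobs.get(t, floor) for t in tokens) / len(tokens)
    return math.exp(nll)

corpus = "the cat sat on the mat the dog sat on the rug".split()
lp = unigram_logprobs(corpus)

# Text that matches the model's distribution scores low perplexity;
# out-of-distribution text scores high.
print(perplexity("the cat sat".split(), lp))
print(perplexity("quantum flux capacitor".split(), lp))
```

The sensitivity mentioned above falls out of this directly: perplexity is measured against one particular model's distribution, so text generated by a different model (or a different sampling temperature) can score anywhere.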
> Have you tried Polars? It really discourages the inefficient creation of intermediate boolean arrays such as in the code that you are showing.
The problem is not usually inefficiency, but syntactic noise. Polars does remove that in some cases, but in general it is even more verbose (apparently by design), which gets annoying fast when doing exploratory data analysis.
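To make the trade-off concrete, here is the kind of pandas filtering being discussed, where each comparison materializes an intermediate boolean Series (the DataFrame is invented for illustration; the Polars equivalent is shown as a comment since it may not be installed):

```python
import pandas as pd

df = pd.DataFrame({"price": [10, 25, 3], "qty": [1, 0, 5]})

# Each comparison below creates an intermediate boolean Series,
# and the repeated df["..."] references are the syntactic noise
# complained about above.
mask = (df["price"] > 5) & (df["qty"] > 0)
subset = df[mask]

# The (roughly equivalent) Polars expression avoids the intermediate
# boolean arrays by building a lazy expression tree, but repeats
# pl.col(...) instead:
#
#   import polars as pl
#   subset = pl.DataFrame(df).filter(
#       (pl.col("price") > 5) & (pl.col("qty") > 0)
#   )
```

Whether `pl.col("price")` is less noisy than `df["price"]` is exactly the verbosity-by-design question raised above.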
I wonder what the last example of "logistics without libraries" would look like in R. Based on my experience of having to do "low-level" R, it's gonna be a true horror show.
In R, things for which there are ready-made libraries and recipes tend to be easy, but when those don't exist, things become extremely hard. And the usual approach is that if something isn't easy with a library recipe, it just doesn't get done.
My understanding is that you typically pay something like this in the US for a specialist visit even if you have insurance, especially if you haven't already paid the year's deductibles.