
Natural language has a high entropy floor. It's a very noisy channel. This isn't anything like bit flipping or component failure; it's a whole different league. And we've been pouring outrageous amounts of resources into diminishing returns. OpenAI keeps touting AGI and burning cash. It's being pushed everywhere as a silver bullet, helping spin layoffs as a good thing.
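To make the "entropy floor" point concrete, here's a minimal sketch (my own illustration, not the parent's numbers): even a naive unigram estimate of English text comes out around 4 bits per character, and Shannon's context-based experiments put the true figure near 1 bit/char, which is still many orders of magnitude noisier than the failure rates we engineer around in hardware.

    # Rough sketch: order-0 (unigram) entropy of a text sample, in bits/char.
    # The sample string is arbitrary; real corpora give similar ballpark values.
    import math
    from collections import Counter

    def char_entropy(text: str) -> float:
        """Unigram entropy in bits per character."""
        counts = Counter(text)
        total = len(text)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    sample = "natural language is a noisy channel with a high entropy floor"
    print(f"{char_entropy(sample):.2f} bits/char")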

LLMs are cool technology, sure. There are a lot of cool things in the ML space. I love it.

But don't pretend the context of this conversation isn't the current hype, or that the hype isn't reaching absurd levels.

So yeah, we're all tired. Tired of the hype, of pushing LLMs, agents, whatever, as some sort of silver bullet. Tired of the corporate smoke screen around it. NLP is still a hard problem, we're nowhere near solving it, and bolting it onto everything is not a better idea now than it was before transformers and scaling laws.

On the other hand, my security research business is booming, so hey, the rational thing for me to say is: by all means, keep putting NLP everywhere.


