
It is not "an agent" in the sense you are implying here: it does not will, want, or plan; none of those words apply meaningfully. It doesn't reason or think, either.

I'll be excited if that changes, but there is absolutely no sign of it changing. To be explicit: the possibility of thinking machines is where it was before this whole thing started, maybe slightly higher, but more so because a lot of money is being pumped into research.

LLMs might still replace some software workers, or lead to some reorganising of tech roles, but for a whole host of reasons, none of which are related to machine sentience.

As one example: software quality matters less and less as users get locked in. If some juniors get replaced by LLMs and code quality plummets, causing major headaches and higher workloads for senior devs, then as long as sales don't dip, managers will be skipping around happily.




I didn't mean to imply AI was sentient or approaching sentience. Agency seems to be the key distinction between it and other technologies. You can have agency, apparently, without the traits you claim I imply.


Ah, ok, you must be using agency in some new way I'm not aware of.

Can you clarify what exactly you mean then when you say that "AI" (presumably you mean LLMs) has agency, and that this sets it apart from all other technologies? If this agency as you define it makes it different from all other technologies, presumably it must mean something pretty serious.


https://en.wikipedia.org/wiki/Agency_(philosophy)

This is not my idea. Yuval Noah Harari discusses it in Nexus. Gemini (partially) summarizes it like this:

  Harari argues that AI is fundamentally different from previous technologies. It's not just a tool that follows instructions, but an "agent" capable of learning, making decisions, and even generating new ideas independently.

> If this agency as you define it makes it different from all other technologies, presumably it must mean something pretty serious.

Yes, AI does seem different and pretty serious. Please keep in mind that the thread I was responding to said we should think of AI as we would a hammer. We can think of AI like a tool, but limiting our conception like that omits exactly what is interesting and concerning about it (even in the context of the original blog post).





