
> LLMs today can already automate many desk jobs.

No, they can't, because they make stuff up, fail to follow directions, need to be minutely supervised, need all output checked, and need their workflow integrated with your company's shitty, overcomplicated procedures and systems.

This makes them suitable at best as an assistant to your current worker, or more likely as an input to your foo-as-a-service, which will be consumed by your current worker. In the ideal case this increases the output of your workers and means you will need fewer of them.

An even greater likelihood is that someone dishonest at some company will convince someone stupid at your company that it will be more efficacious and less expensive than it ultimately is, leading your company to spend a mint trying to save money. They will spend more than they save, expecting to be able to lay off some of their workers, with the net result of increasing the workload on the remaining workers and shifting money upward to the firms exploiting executives too stupid to recognize snake oil.

See outsourcing to underperforming overseas workers: the desirable workers who could have ably done the work are A) in management because it pays more, B) in-country or working remotely for real money, or C) costing almost as much as locals once the increased costs of doing the work externally are factored in.



> No, they can't, because they make stuff up, fail to follow directions, need to be minutely supervised, need all output checked, and need their workflow integrated with your company's shitty, overcomplicated procedures and systems.

What’s the difference between what you describe and what’s needed for a fresh hire off the street, especially one just starting their career?


> What’s the difference between what you describe and what’s needed for a fresh hire off the street, especially one just starting their career?

The fresh hire has the potential, after training and working for a while, to become a much more valuable and reliable senior.


> has the potential

Good choice of wording! Definitely not a given though.


Real talk? The human can be made to suffer consequences.

We don't mention this in techie circles, probably because it is gauche. However, you can hold a person responsible, and there is a chance you can figure out what they got wrong and ensure they are trained.

I can’t do squat to OpenAI if a bot gets something wrong, nor could I figure out why it got it wrong in the first place.


The bell curve is much wider for humans than for LLMs; I don't think this needs to be said.


The difference is that an LLM is like hiring a worst-case-scenario fresh hire who lied to you during the interview process, has a fake resume, and isn't actually named John Programmer.


Entirely disagree.


Boy, do I love being in the same industry as people like you… :) While you are writing silly stuff like this, those of us who actually do shit have automated 40-50% of what we used to do and now have extra time to do more amazing shit :)



