
There's no reality in the next twenty years where a non-technical individual is going to converse with a persistent agentic AI to produce a typical SaaS product and maintain it over a period of many years. I think we'll see stabs at this space, and I think we'll see some companies try to replace engineering teams wholesale, and these attempts will almost universally feel kinda sorta ok for the first N weeks, then nosedive and inevitably result in the re-hiring of humans (and probably the failure of many projects and businesses as well).

Klarna said they stopped hiring a year ago because AI solved all their problems [1]. That's why they have 55 job openings right now, obviously [2] (including quite a few listed as "Contractor"; the utterly classic "we fucked up our staffing"). This kind of disconnect isn't even surprising; it's exactly what I'd predict. Business leadership nowadays is so disconnected from the day-to-day reality of their businesses that they say things like this with total authenticity, they get a bunch of nods, and things just keep going the way they've been going. Benioff at Salesforce said basically the same thing. These are, put simply, people with such low legibility on the mechanism of how their business makes money that they believe they understand how it can be replaced. They're surrounded by men who nod and say "oh yes mark, of course we'll stop hiring engineers", and then somehow that message conveniently never makes it to HR, because those yes-men are the real people who run the business.

AI cannot replace people; AI augments people. If you say you've stopped hiring thanks to AI, what you're really saying is that your growth has stalled. The AI might grant your existing workforce an N% boon to productivity, but that's a one-time boon barring any major model breakthroughs (don't count on it). If you want to unlock more growth, you'll need to hire more people, but what you're stating is that you don't think more growth is in the cards for your business.

That's what these leaders are saying, at the end of the day; and it's a reflection of the macroeconomic climate, not of the impacts of AI. These are dead businesses. They'll lumber along for decades, but their growth is gone.

[1] https://finance.yahoo.com/news/klarna-stopped-hiring-ago-rep...

[2] https://klarnagroup.teamtailor.com/#jobs



Nicely said.

> AI cannot replace people; AI augments people.

Here’s where we slightly disagree. If AI augments people (and it 100% does), it makes those people more productive (from my personal experience, I’d ballpark that I’m currently 40-45% more productive), and hence some people will get replaced. Plausibly, in high-growth companies we’ll all just be more productive and crank out 40-45% more products/features/… But in other places, me being 40-45% more productive may mean other people aren’t needed (think every fixed-price government contract; that’s a market worth hundreds of billions of dollars…)
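To make that replacement arithmetic concrete: a quick sketch of how a productivity multiplier shrinks required headcount when the workload is fixed (as on a fixed-price contract). The function name is mine, and the 40-45% figure is the commenter's own ballpark, not measured data.

```python
def headcount_needed(current_team: int, productivity_gain: float) -> float:
    """People required to deliver the same fixed output after everyone
    becomes `productivity_gain` (e.g. 0.40 = 40%) more productive."""
    return current_team / (1 + productivity_gain)

# A 10-person team on a fixed-scope contract, everyone 40% more productive:
print(round(headcount_needed(10, 0.40), 1))  # -> 7.1
```

The key assumption doing all the work is "fixed output": in a growth business the numerator grows too, which is exactly the disagreement in this thread.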


Directionally, I think that's certainly true for some professions. It probably isn't true for software engineering, though. The biggest reason is that software, as a business, involves a ton of problem-space exploration and experimentation. Most businesses have an infinite appetite for software; there's always another idea, another feature to sell, another expectation to meet, another bug to fix, or a performance problem to improve.

Tooling improvements leading to productivity boons ain't a new concept in software engineering. We don't code in assembly anymore, because writing JavaScript is more efficient. In fact, think on this: Which would you grade as a higher productivity boon, as a percentage: Using JavaScript over Assembly, or using AI to write JavaScript over hand-writing it?

Followup: if JavaScript was such a 10,000% productivity boon over assembly, compared to AI's paltry 40-45%, why aren't we worried when new programming languages drop? I don't shake in my boots when AWS announces a new service, or when Vercel drops a new open source framework.

At the end of the day: there's an infinite appetite for software, and someone has to wire it all up. AI is, itself, software. One thing all engineers should know by now is that introducing new software only increases the need for the magicians who wrangle it, and AI is as subject to that law as JavaScript, the next expensive AWS service, or the next useless Vercel open source project.


> At the end of the day: there's an infinite appetite for software, and someone has to wire it all up.

I agree with this 100%... The core issue to ponder is this: JavaScript was probably a 1,000,000% productivity boon over assembly, no question about that, but it did not offer much in the way of "automation" so to speak. It was just a new language that, luckily for it, became the de facto language that browsers understand. You and I have spent countless hours writing JS, TS code etc... The question here is whether LLMs can automate things or not.

The single greatest trait in the absolute best SWEs I ever worked with (and that's a lot of them; two and a half decades plus doing this) is LAZINESS. The greatest SWEs are lazy by nature, and we tend to look to automate everything that can be automated. JavaScript is not helping me a whole lot with automation, but LLMs just might: writing docs, writing property-based tests for every function I write, writing integration tests for every endpoint I write, etc etc...

In these discussions you can always tell "junior" developers from "senior" developers: "juniors" will fight the fight of "oh no way imma get replaced here, I do all this shit LLMs can't even dream of", while "seniors" are going "I have already automated 30-40-50% of the work that I used to do..."

The most fascinating part to me is that the same "juniors" in the same threads are arguing things like "SWEs are not just writing code, there are ALL these other things that we have to do", without realizing that it's exactly all those "other things" that LLMs just might let you automate your way out of, fully or partially...


I agree:

- if your response to LLMs taking over is "they're bad and it's not going to happen", I think you've basically chosen to flip a coin, and that's fine, you might be right (personally I do think this take is right, but again, it's a coin flip)

- if your response is "engineering is about a lot more than writing code" you're right, but the "writing code" part is like 90% of the reason why you get paid what you do, so you've chosen the path of getting paid less, and you're still flipping a coin that LLMs won't come for that other stuff.

- the only black-pilled response is "someone has to talk to the llm", and if you spend enough time with that line of thinking you inevitably uncover some interesting possibilities about how the world will be arranged as these things become more prevalent. For me, it's this: larger companies probably won't get much larger (we've hit peak megacorp), but smaller companies will become more numerous as individual leverage is amplified.


> if your response is “engineering is about a lot more than writing code” you’re right, but the “writing code” part is like 90% of the reason why you get paid what you do

[…]

> the only black-pilled response is “someone has to talk to the llm”,

This is literally the exact same response as “engineering is about a lot more than writing code”, since “talking to the LLM” when the LLM is the main code writer is exactly doing the non-code-writing tasks while delegating code writing to the LLM which you supervise. Kind of like a lead engineer does anyway, where they do much of the design work, and delegate and guide most of the code writing (while still doing some of it themselves.)



