Hacker News

I hear this argument all the time, and I think “this is exactly how people who coded in assembly back in the day thought about those using higher level programming languages.”

It is a paradigm shift, yes. And you will know less about the implementation at times, yes. But will you care when you can deploy things twice, three times, five times as fast as the person not using AI? No. And also, when you want to learn more about a specific bit of the AI written code, you can simply delve deep into it by asking the AI questions.

The AI right now may not be perfect, so yes, you still need to know how to code. But five years from now? Chances are you will go into your favorite app builder, state what you want, tweak what you get, and end up with the product you want, with maybe one dev checking in every once in a while to make sure you're not messing things up - maybe. So will new devs need to know high-level programming languages? Possibly, but maybe not.



1. We still teach assembly to students. Having a mental model of what the computer is doing is incredibly helpful. Every good programmer has such a model in my experience. Some of them learned it by studying it explicitly, some picked it up more implicitly. But the former tends to be a whole lot faster, without the stop along the way where you flounder as a mid-level with a horribly incorrect mental model for years (which I've seen many, many times).

2. Compilers are deterministic. You can recompile the source code and get the same assembly a million times.

You can also take a bit of assembly, look at the source code of the compiler, and tell exactly where that assembly came from. And you can change the compiler to change that output.
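To make the determinism point concrete, here's a minimal sketch using Python's own bytecode compiler as a stand-in for a C compiler: compiling the same source twice yields byte-identical output, which is exactly the property the comment is contrasting with LLM generation (the `add` function here is just an arbitrary example, not anything from the thread).

```python
# Compile the same source text twice and compare the generated bytecode.
src = "def add(a, b):\n    return a + b\n"

code1 = compile(src, "<mem>", "exec")
code2 = compile(src, "<mem>", "exec")

# Deterministic: the two compilations produce identical bytecode.
assert code1.co_code == code2.co_code

# Prompting an LLM with the same request twice gives no such guarantee.
```

The same sketch also hints at the traceability point: given a bytecode sequence, you can read the compiler's source and say exactly which construct emitted it.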

3. Source code is written in a formal unambiguous language.

I’m sure LLMs will be great at spitting out greenfield apps, but unless they evolve into honest-to-goodness AGI, this won’t get far beyond existing low-code solutions.

No one has solved or even proposed a solution for any of these issues beyond “the AI will advance sufficiently that humans won’t need to look at the code ever. They’ll never need to interact with it in any way other than through the AI”.

But getting to that point will require AGI, and at that point the AI won’t need input from humans at all; it won’t need a manager telling it what to build.


The point of coding is not to tell a machine what to do.

The point of coding is to remove ambiguity from the specs.

"Code" is unambiguous, deterministic and testable language -- something no human language is (or wants to be).

LLMs today make many implementation mistakes where they confuse one system with another, assume some SQL commands are available in a given SQL engine when they aren't, etc. It's possible that these mistakes will be reduced to almost zero in the future.
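The dialect-confusion failure mode described above is easy to reproduce by hand. A minimal sketch using Python's built-in `sqlite3`: `DATE_FORMAT` is a MySQL function that does not exist in SQLite, so a query that an LLM might confidently emit for "a SQL database" fails at runtime (the `orders` table is a made-up example, not from the thread).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, created TEXT)")

# MySQL-flavored query: syntactically valid SQL, but DATE_FORMAT
# is not a function SQLite provides.
mysql_style = "SELECT DATE_FORMAT(created, '%Y') FROM orders"
try:
    conn.execute(mysql_style)
except sqlite3.OperationalError as e:
    print("rejected:", e)  # SQLite reports the function does not exist

# The SQLite-native spelling of the same idea works fine:
rows = conn.execute("SELECT strftime('%Y', created) FROM orders").fetchall()
```

Nothing in the query text itself signals the mistake; only executing it against the actual engine does, which is why this class of error keeps slipping through.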

But there is a whole other class of mistakes that cannot be solved by code generation -- even less so if there's nobody left capable of reading the generated code. It's when the LLM misunderstands the question, and/or when the requirements aren't even clear in the head of the person writing the question.

I sometimes try to use LLMs like this: I state a problem and a proposed approach, and ask the LLM to shoot holes in the solution. For now, they all fail miserably at this. They recite "corner cases" that have little or nothing to do with the problem.

Only coding the happy path is a recipe for unsolvable bugs and eventually, catastrophe.


You seem strongly opinionated and sure of what the future holds for us, but I must remind you that in your example, "from assembly to higher-level programming languages," the demand for programmers didn't go down; it went up. As companies were able to develop more, more development and more investment followed, more challenges showed up, new jobs were invented, and so on... you get where I'm going.

What I'm questioning is how lazy new technologies make you. Many programmers, even before LLMs, had no idea how a computer works and only programmed in higher-level languages. It was already a disaster, with many people claiming software was bad and the industry going down a road where software quality matters less and less. Now take that situation, turbo-boosted by LLMs because "doesn't matter, I can deploy 100x a day," disrupting the user experience. IMO that won't lead us far.





