I wouldn't. But most software developers didn't care whether translators liked the new jobs they had to find. So I think most non-programmers won't care whether you'll like your new job.
>Do you also expect I mention that I used Intellisense and syntax highlighting too?
No, but I expect my software to have been verified for correctness and soundness by a human being with a working mental model of how the code works. But I guess that's not a priority anymore if you're willing to sacrifice $2400 a year to Anthropic.
$2400? Mate, I have a free GitHub Copilot subscription (Microsoft hands them out to active OSS developers), and work pays for my Claude Code via our cloud provider backend (and it costs less per working day than my morning Monster can). LLM inference is _cheap_ and _getting cheaper every month_.
> No, but I expect my software to have been verified for correctness and soundness by a human being with a working mental model of how the code works.
This is not mutually exclusive with AI tools:
- Use AI to write dev tools to help you write and verify your handwritten code. Throw the one-off dev tools in the bin when you're done.
- Handwrite your code, have the AI generate test data, then review that test data like you would a junior engineer's work.
- Handwrite tests, have the AI generate an implementation, and let the agent run the tests in a loop to refine it (rough sketch after this list). Works great for code that follows a strict spec. Again, review the final code like you would a junior engineer's work.
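To make that last workflow concrete, here's a minimal sketch in Python. Everything in it is illustrative: `generate_implementation` is a placeholder for whatever agent, CLI, or API you actually use, and the file names are made up. The only real rules are that the tests stay handwritten and the loop is only allowed to rewrite the implementation file.

```python
# Minimal sketch of a "handwritten tests, generated implementation" loop.
# Assumes pytest is installed; generate_implementation() is a stand-in for
# whatever agent/CLI/API you actually use -- not a real library call.
import pathlib
import subprocess

SPEC = "Implement parse_duration(s) -> seconds for strings like '1h30m' and '45s'."
TESTS = "tests/test_parse_duration.py"    # handwritten; the agent never touches these
IMPL = pathlib.Path("parse_duration.py")  # the only file the loop is allowed to write

def generate_implementation(prompt: str) -> str:
    """Placeholder: call whatever agent/API you use and return Python source."""
    raise NotImplementedError

def run_tests() -> subprocess.CompletedProcess:
    # Run only the handwritten tests; capture output to feed back to the agent.
    return subprocess.run(["pytest", TESTS, "-q"], capture_output=True, text=True)

prompt = SPEC
for attempt in range(5):  # cap the retries; don't let the loop spin forever
    IMPL.write_text(generate_implementation(prompt))
    result = run_tests()
    if result.returncode == 0:
        print(f"Green on attempt {attempt + 1}. Now review it like a junior engineer's PR.")
        break
    # Feed the failing output back into the next prompt and try again.
    prompt = f"{SPEC}\n\nYour previous attempt failed these tests:\n{result.stdout}"
else:
    print("Gave up after 5 attempts; write it by hand.")
```

The cap and the handwritten-tests-only rule are the point: the human stays the source of truth for what "correct" means.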
>Perhaps the issue of cognitive decline comes from sitting there vegetating rather than applying themselves during all that additional spare time.
The push for these tools is to increase productivity. What spare time is there to be had if you're now expected to produce 2-3x the amount of code in the same time frame?
Also, I don't know if you've gotten outside of the software/tech bubble, but most people already spend 90% of their free time glued to a screen. I'd wager the majority of critical thinking people do on a day-to-day basis happens at work. Now that we may be automating that away, I bet you'll see many people cease to think deeply at all!
There has been growing evidence for years that modern technology is not without its side effects (mental health issues from social media use, destruction of attention spans among the youth from cell phone use, erosion of societal discourse and outright political manipulation, and now impacts on cognitive ability from LLMs).
>We’re about to move on to more interesting problems, and our collective abilities and motivation will still be stratified as it always has been and must be
Who is "we"? There are more people out there in the world doing hard physical labor, or data entry, than there are software engineers.
Yes. If you stop doing something, you get worse at it. There is literally no exception to this that I'm aware of. In a future where everyone depends on ever larger amounts of code, the possibility that nobody will be equipped to write or debug that code should scare you.
The amount of weightlifting a strength athlete needs to do to stay near their peak (but outside medal range) is ~15% of a full training workload. People can play instruments once a month and still be exceptional once the pathways are set down. Are you getting slightly worse at direct code jockeying? Sure, but not a lot, and you're getting superpowers in exchange.
The superpower you speak of is to become a product manager, and lose out on the fun of problem solving. If that's the future of tech, I want nothing to do with it.
In the previous scenario, programmers were still writing the code themselves. The compilers, if they were any good, generated deterministic code.
In our current scenario, programmers are merely describing what they think the code should do, and another program takes their description and then stochastically generates code based on it.
Compilers (a) aren't strictly deterministic in practice and (b) produce different code from one version to the next, from one vendor to the next, and from one set of flags to the next.
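Point (b) is easy to see for yourself. A rough sketch, assuming a C compiler is on PATH as `cc` (the function and file names are just for illustration): compile the same trivial function at a few optimization levels and hash the assembly.

```python
# Rough sketch: the same C source yields different machine code depending on
# optimization flags (and likewise across compiler versions and vendors).
# Assumes a C compiler is available on PATH as "cc".
import hashlib
import pathlib
import subprocess
import tempfile

C_SOURCE = """
int sum(int n) {
    int total = 0;
    for (int i = 0; i < n; i++) total += i;
    return total;
}
"""

with tempfile.TemporaryDirectory() as tmp:
    src = pathlib.Path(tmp) / "sum.c"
    src.write_text(C_SOURCE)
    for opt in ("-O0", "-O2", "-O3"):
        asm = pathlib.Path(tmp) / f"sum{opt}.s"
        # -S stops after codegen and emits assembly instead of an object file.
        subprocess.run(["cc", "-S", opt, str(src), "-o", str(asm)], check=True)
        digest = hashlib.sha256(asm.read_bytes()).hexdigest()[:12]
        print(f"{opt}: {digest}")
# -O0 vs -O2 will differ on any mainstream compiler; -O2 vs -O3 usually do too.
```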
Are the companies funding this push for LLMs contributing to healthy cultures? The same companies who ruined societal discourse with social media? The same people who designed their algorithms to be as addictive as possible to drive engagement?
In the US of A, most blue-collar jobs:
1. Pay less
2. Offer poor or zero benefits
3. Are hard on your body
Not to mention that if suddenly millions of people flood the markets for blue-collar work, then wages will drop further in those fields.
Why would you expect someone to be comfortable with that future?