> Almost every feature required multiple iterations and refinements. This isn't a limitation—it's how the collaboration works.
I guess that's where so much of the messaging about generative AI in coding misses for me, and why the Fly.io skepticism blog post irritated me so much as well.
It _is_ how collaboration with a person works, but when you have to fix the issues that the tool created, you aren't collaborating with a person, you're making up for a broken tool.
I can't think of any field where I'd be expected to not only put up with, but also celebrate, a tool that screwed up and required manual intervention so often.
The level of anthropomorphism required to advocate for generative AI leads to claims like "it's how collaboration works" here, when I'd never say the same thing about the table saw in my woodshop, or even the relatively smart cruise control on my car.
Generative AI is still just a tool built by people following a design, and which purportedly makes work easier. But when my saw tears out cuts that I have to then sand or recut, or when my car slams on the brakes because it can't understand a bend in the road around a parking lane, I don't shrug and ascribe them human traits and blame myself for being frustrated over how they collaborate with me.
Likewise with all these benchmarks for "intelligence": the tool will still do the silliest things, things you'd consider unacceptable from a person once you'd told them a few times not to do them.
I love the paradigm shift, but I hate when the hype is uninformed, dishonest, or indifferent to quality.