I see the same mistake made everywhere: thinking that in software engineering the hard part is writing new code.
A large chunk of the work is dealing with people, understanding what they really want/need and helping them understand it.
On the technical side, most of the work is around fixing issues with existing software (protecting an investment).
Then, maybe 1 to 10% of the workload is making something new.
AI kinda works for the "making something new" part but sucks at the rest. And when it works, it's at most "average" (in the sense of how good its training set was; it prefers whatever it sees most often, regardless of quality).
My gut instinct is that there's going to be an AI crash, much like in the late 90s/early 2000s. Too much hype, and then, after the crash, maybe we'll start to see something a bit more sane and realistic.