I think part of the problem with that is that physical engineering has clear, well-understood, deterministic, and enumerable requirements: as long as you, the engineer, understand them and take them properly into account, your bridges and buildings won't fall down.
With software engineering, yes, there are best practices you can follow, and we can certainly do much better than we've been doing...but the actual dangers of programming aren't based on physical laws that remain the same everywhere; they're based on the code that you personally write, and how it interacts with every other system out there. The requirements and pitfalls are not (guaranteed to be) knowable and enumerable ahead of time.
Frankly, what would make a much greater difference, IMNSHO, would be an actual industry-wide push for ethics and codes of conduct. I know that such a thing would be pretty unpopular in a place like Y Combinator (and thus HackerNews), because it would, fundamentally, be saying "put these principles ahead of making the most money the fastest"—but if we could start a movement to actually require this, and some sort of certification for people who join in, which can then be revoked from those who violate it...
If we could get such a cultural shift to take place, it would (eventually) make it much harder for unscrupulous managers and executives to say "you'll ship with these security holes (or without doing proper QA), because if you don't we make less money" and actually have it stick.
I think we're basically describing the same thing. Asking a software engineering process to be the same as a physical engineering process is not realistic. A PE for SEs would look more like a code of ethics and conduct than a PE for, say, civil engineering.
The key thing to borrow from physical engineering is the concept of a sign off. A PE would have to sign off on a piece of software, declaring that it follows best practices and has no known security holes. More importantly, a PE would have the authority and indeed obligation to refuse to sign off on bad software.
But expecting software to have clear, well-understood, deterministic requirements and follow a physical engineering requirements-based process? Nah. Maybe someday, but I doubt it'll happen in my lifetime.