> If I am to serve a 1 year prison sentence for faulty code
I wonder why a programmer would work on code at all when he is always in danger of going to prison for every mistake he makes. We all know that bugs happen in software and that it's impossible to write bug-free code.
> We all know that bugs happen in software and that it's impossible to write bug-free code
The go-to counterexample is the code that ran on the space shuttle, which took years and hundreds of people and $200 million to produce. I have been told that nobody has ever found a bug in the final production system. The development practices involved in creating it seem like they would make most software engineers want to curl up and die. One tidbit from the following link I found is that 60 people spent years working on the backup flight control system, intended to never be used!
Well, we could mandate that self-driving cars be programmed to the same standard. Granted, that means self-driving cars will simply never exist, but we could mandate it!
Maybe we shouldn't run safety-critical code when we don't know what it does. Hard real-time systems exist for a reason. Sure, it would also be nice if we could operate nuclear power plants from home via a smartphone app, so the most qualified expert could help in an emergency, but it's simply not a sane idea.
This cop-out of "software gonna have bugs" as a way to evade all liability doesn't hold in any other profession. I don't see why we get special treatment here.
It does hold, in some form or other, in other professions. The EPA formally defines how much cash they're willing to burn to save a human life, which seen from one perspective is an "environment gonna kill people" cop out. Nuclear missile silos have two operators because "people gonna launch nukes". Cars have airbags because "vehicles gonna crash". Every field has risks and risk management, every field has certain steps that they could take for the purpose of safety that are judged too expensive to justify the risk, and part of managing risks in software is that you should plan for bugs and plan to mitigate their impact.
Nobody calls out the emergency services for assuming cars are going to crash and planning accordingly, so why does the software engineering industry get called out for assuming software will go wrong and planning accordingly?
> I wonder why a programmer would work on code at all when he is always in danger of going to prison for every mistake he makes.
You don't go to prison for "mistakes", you go to prison for criminal negligence.
Given that you can go to prison for criminal negligence in any other profession (e.g. a dentist who exposes thousands of people to HIV), why would programmers be the one category of job that's exempt from criminal laws?
If a politically motivated prosecutor can convince a jury of twelve people that Bill killed Dave, by using his psychic brain powers, or, alternatively, by sticking pins into a voodoo doll, or some other form of non-scientific witchcraft, Bill is going to hang.
The hypothetical gullibility of juries doesn't mean that we shouldn't have laws against murder, juries, or prosecutors.
My apologies for potentially hijacking this, but... this is exactly why the term "software engineer" bothers me. Yes, the software you write isn't likely to cause a shopping mall to collapse on a crowd of people, but there can be huge financial and societal responsibility here, and yet, almost every software license in existence completely disclaims that. Engineering comes with a tremendous amount of ethical and legal responsibility.
...sorry for the sarcasm, but this is a message student engineers internalize in part because it is pushed by companies they want to work for, even if they can't explain why. They don't have strong boundaries between why it might be a justifiable philosophy for Facebook but not for Boeing.
Agreed. I'm a software developer who has a degree in mechanical engineering and changed over. I cringe at calling myself an engineer in my current role.
First - architects shouldn't be lumped in with engineers in this context. It seems like there's some confusion around how far the responsibilities of an architect extend, which makes me wonder if things are different in the US? Whenever I research it seems to be the same as Australia though. Anyway, architects know as much about structural engineering as engineers know about Le Corbusier. An architect would really have to mess up to be criminally negligent.
But in response to your comment, engineers have a very clearly defined set of rules, collectively called "the code". As long as they design to them you'll be free of criminal negligence (at least during the design & development phase, things get a little murkier in construction).
More to the point, engineering for the real-world means layer upon layer of uncertainty. e.g. civil/structural engineers use materials we can't model (we use pretty-good approximations for concrete behavior) in conditions which are unknown (soil) to resist forces we can't predict (weather, earthquakes, dynamic response to loading). How does the code deal with all this uncertainty? Slap safety-factor after safety-factor onto everything. Whoever comes up with a method for more effectively dealing with this stuff will make millions.
The most obvious example being that we design structures to resist a 1-in-100 year storm. In other words, we expect a structure to be within spitting distance of failure every hundred years. But as long as you design to that standard, you're fine.
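To put a number on what "1-in-100 year" means over a building's lifetime: assuming independent years, the chance of at least one exceedance compounds quickly. A quick back-of-the-envelope check (the 50-year design life here is my own illustrative assumption, not a figure from the comment above):

```python
# Chance of at least one "1-in-100-year" event during a design life.
# Assumes each year is independent, with a 1% annual exceedance probability.

def prob_at_least_one_event(annual_prob, design_life_years):
    """P(at least one exceedance) = 1 - (1 - p)^n."""
    return 1.0 - (1.0 - annual_prob) ** design_life_years

p = prob_at_least_one_event(0.01, 50)  # 50-year design life, assumed example
print(f"{p:.1%}")  # roughly a 40% chance of seeing a 100-year storm
```

So a structure designed to that standard is quite likely to actually face its design storm, which is exactly why the safety factors mentioned above sit on top of it.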
If the programmer can't be bothered to put in enough verification effort to avoid being negligent, they should not be in charge of safety-critical code. Plainly: don't pretend there aren't ways to ensure the correctness of systems.
But that same developer can put the same effort in at another field and not be constantly at risk of a mistake doing massive damage to them. You would have to be paid an incredible amount to make it worth it.
I think software is a lot more brittle than physical structures. It takes a lot to go from a sturdy building to something that could collapse, whereas in software the difference could be a single line of code, or two transposed words.
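As a hypothetical illustration of how small that difference can be (the function names here are made up for the example, not from any real codebase):

```python
# One missing character turns an inclusive range check into an exclusive one.

def in_range(value, lo, hi):
    # Intended behaviour: accept the inclusive range [lo, hi].
    return lo <= value <= hi

def in_range_buggy(value, lo, hi):
    # A single dropped '=' silently rejects the upper boundary.
    return lo <= value < hi

print(in_range(10, 0, 10))        # True
print(in_range_buggy(10, 0, 10))  # False: same inputs, one character apart
```

There is no physical analogue of a load-bearing wall quietly vanishing because one rivet is a millimetre short; in code, that failure mode is routine.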
Also, the software world doesn't have the benefit of millennia of accumulated best practices.
Finally, the senior engineers and architects who are licensed to sign off on things they will be held criminally liable for do get incredible amounts of compensation as compared to their more junior colleagues.
Anybody can build a bridge that doesn't collapse, it takes an engineer to build bridges that barely don't collapse. This is hard work, don't downplay it.
I'm not sure how facetious you're being so I'll just play it straight. Modern buildings are not supposed to barely not collapse, they're supposed to be safe even under scenarios quite a bit more extreme than what they're expected (or even legally allowed) to handle.
They're supposed to be exactly as stable as required by law for their intended use, using the minimal amount of labor and materials to satisfy the requirements. Making them more stable using more money is easy, but getting the calculations just right requires years of training.
I see what you mean now. My point is that even if there's a minor screwup, the architect probably won't be prosecuted for anything, because the building won't fail catastrophically thanks to modern standards which have substantial margins of safety built into them. For a building to collapse on you really does require criminal levels of negligence, or extreme circumstances that would almost certainly shield the architect from prosecution.
On the other hand, a minor screwup in software is far more likely to cause catastrophic failure because we just don't know how to workably build large, robust systems out of code.
Nit: meeting the requirements of law isn't barely collapsing; that requirement has so many safety factors built-in because the building code has to approximate so much. The approach isn't that dissimilar from how that "anybody" you mention would build their bridge that doesn't fall down: by guessing safely.
It's over 20 years since I finished my degree, and I've never worked in the industry, but I spent sufficient hours poring over the ISO standard document for pressure vessels as part of my final year project that I can still remember the sorts of things it covers.
Effectively, they test things to destruction, then publish minimum requirements. So if you want to pressurise your reactor vessel to 30 atmospheres, you can pretty much look up a table that'll tell you precisely how thick the vessel walls need to be for each of the commonly used materials. If you want to use something uncommon, then you need to pay somebody to test it.
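The lookup-table approach can be sketched with the standard thin-wall hoop-stress formula, t = P·r/σ. To be clear, this is a toy illustration of the idea, not the ISO standard: the material allowables and the safety factor below are placeholder assumptions, not real code values.

```python
# Minimal sketch of the "look it up in a table" approach, NOT the ISO standard.
# Thin-wall hoop stress gives t = P * r / sigma_allow; a safety factor sits on top.

ALLOWABLE_STRESS_MPA = {        # hypothetical design allowables (placeholders)
    "carbon steel": 120.0,
    "stainless 316": 115.0,
}

def min_wall_thickness_mm(pressure_mpa, inner_radius_mm, material,
                          safety_factor=1.5):
    """Minimum wall thickness for a thin-walled cylindrical vessel."""
    sigma = ALLOWABLE_STRESS_MPA[material]  # KeyError = uncommon material: go test it
    return safety_factor * pressure_mpa * inner_radius_mm / sigma

# 30 atm is about 3.04 MPa; a 500 mm radius carbon-steel vessel:
t = min_wall_thickness_mm(3.04, 500.0, "carbon steel")  # ~19 mm
```

The dictionary playing the role of the published table, and the `KeyError` for an unlisted material, is essentially the "pay somebody to test it" branch.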
If it fails in a catastrophic fashion, you can expect to be asked to show that you did your due diligence, and that there were extenuating circumstances a reasonable engineer could not have been expected to foresee and plan for. Or that you did foresee it, and somebody else chose to accept the (clearly defined) risk.
There are non-tech professions where the lives of other people are literally in your hands. And a lot of the people doing them get paid a lot less than software engineers.
Or there will be people who gain expertise in developing software good enough that software engineering will be considered an actual engineering discipline.