Tuesday, September 3, 2013

Homework #5 - 9/3

Responses for readings for 8/29 and 9/3

     In homework #2, we were asked to discuss whether or not we think programmers should be certified professionals like doctors and lawyers. I held the position that we should not, but after reading some of these articles, I have rethought my take on this subject. In the article about the Therac-25 incidents, the authors say that "more events like those associated with the Therac-25 will make such certification inevitable". I have to agree, at least in the case of safety-critical systems and perhaps systems that deal with sensitive data as well, where security is essential. Of course, one could argue that nearly all modern applications fit these categories - apps are getting more and more complicated, requesting access to all sorts of personal information on our phones, and even becoming integrated into our homes and cars. I think software engineers in these fields should undergo extra training on how to handle sensitive data and how to build safety features into their programming. The Therac-25 accidents were unfortunately due in part to simple programming mistakes, such as incrementing a one-byte flag variable instead of setting it to a fixed non-zero value, which caused the flag to periodically overflow to zero and allowed a safety check to be skipped.
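     To make that bug concrete, here is a rough C sketch of the failure mode. The actual Therac-25 software was PDP-11 assembly, so the names (like check_flag) and structure here are made up for illustration; only the wrap-to-zero behavior is the point.

```c
#include <stdio.h>
#include <stdint.h>

/* Sketch only: a one-byte flag is incremented to mean "check needed"
 * instead of being set to a fixed non-zero value. Every 256th increment
 * it wraps back to zero, and the safety check is silently skipped. */
int main(void) {
    uint8_t check_flag = 0;      /* one-byte shared flag */

    for (int pass = 1; pass <= 256; pass++) {
        check_flag++;            /* bug: should have been check_flag = 1 */

        if (check_flag != 0) {
            /* normal case: the position check would run here */
        } else {
            printf("pass %d: flag wrapped to 0, safety check skipped\n", pass);
        }
    }
    return 0;
}
```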
     Another focus of the readings was the importance of outlining thorough requirements before starting a project. The main example of this is the FBI's Virtual Case File (VCF) case-management system. It is hard to believe that such a massive software development failure could have happened so recently. The tight time constraints put on the project, combined with a focus on small details of the final product (such as where a button should be located in the interface) rather than the overall functionality of the program, set it up for failure from the beginning. It is also disturbing that the FBI planned what's called a 'flash cutover', replacing the legacy system with a totally new one in one fell swoop, with no way to revert if anything failed. This was all exacerbated by the aftermath of the September 11 attacks as well as several staff and management turnovers during development.
     Another common theme I noticed was complacency. Software from previous versions was reused and assumed to work correctly, when that was not necessarily the case. The Therac-20 contained some of the same software bugs as the Therac-25, but the earlier machine had hardware interlocks in place to prevent an overdose even if the software failed. This resulted in many blown fuses, but it prevented potentially fatal overdoses of radiation. The same pattern appears in the Ariane 5 incident, where software from the Ariane 4 was reused and some functions were left in even though the newer rocket had no use for them. The software "was assumed to be correct until it was shown to be faulty", as the accident report points out. Unfortunately, since projects have limited budgets, potential risks get downplayed and do not receive as much attention as they should. Although it shouldn't be this way, perhaps it is up to developers to ensure they are doing everything they can to practice safe programming and to avoid unnecessary complication when writing software.
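     To illustrate how reused code that was "assumed to be correct" can fail under new conditions, here is a rough, hypothetical C sketch loosely based on the Ariane 5 report. The real code was Ada running on the inertial reference system, and the function name and numbers below are invented; the idea is that a conversion which was safe for the old rocket's flight profile no longer fits when the new rocket produces larger values.

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical sketch: a wide floating-point value is converted into a
 * 16-bit integer. The reused code assumed the value would always fit,
 * which held for the old flight profile but not for the new one. */
int16_t convert_horizontal_value(double value) {
    if (value > INT16_MAX || value < INT16_MIN) {
        /* The reused code had no guard here; the unhandled overflow
         * took the guidance computers offline. This sketch just reports it. */
        printf("operand error: %.1f does not fit in 16 bits\n", value);
        return 0;
    }
    return (int16_t)value;
}

int main(void) {
    convert_horizontal_value(20000.0);   /* illustrative old-profile value: fine  */
    convert_horizontal_value(400000.0);  /* illustrative new-profile value: fails */
    return 0;
}
```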
