On January 28, 1986, America watched in shock as seven crew members, six astronauts and a schoolteacher, died in the explosion of Space Shuttle Challenger on live television. These needless deaths, and the fact that the accident unfolded on live television, have made the case a famous one that demands thorough analysis. Investigators established that the immediate cause of the disaster was an engineering one: a faulty O-ring. However, the circumstances of the case also present a perfect opportunity to understand the importance of both vertical and horizontal communication in an organization, and to clarify the role that ethics plays in promoting such communication.
NASA had flown the Space Shuttle on dozens of missions, and Americans had a high level of trust in the success of NASA projects. The accident was a shock to a public that demanded an immediate explanation of its cause. Morton Thiokol was the contractor responsible for building the shuttle's solid rocket boosters. Thiokol's engineers had reservations concerning the launch and communicated their misgivings to the managers of the launch, who did not heed their warnings. The engineers had harbored these misgivings for more than eight years.
The Rogers Commission, charged with establishing the cause of the accident, interviewed all parties, including top decision makers at NASA. After careful investigation and analysis, the commission found that the accident resulted from a flawed decision-making process. Had that process been sound, the launch would never have taken place. Management should have considered all the opinions of the engineering team; as this was an engineering project, the engineers' judgment should have been the major determinant of whether the launch was a go (McConnell 2).
What, then, led management to ignore the advice of the engineering team? We have to go back to 1970, when, for budgetary reasons, President Nixon canceled the Mars program. In its place, NASA began constructing a reusable, operational space shuttle, which President Reagan declared operational in 1982. This piled pressure on NASA to maintain an ambitious schedule of flights to justify the shuttle's creation, pressure that grew from NASA's need to become financially self-reliant by launching flights on behalf of a wide selection of customers.
Pressure from customers might have made decision makers ignore the engineers' reservations. NASA also had highly ambitious economic goals, another probable reason it launched Space Shuttle Challenger despite the engineers' objections. To achieve those goals, NASA had to launch a set number of flights per year, and when a client was influential or offered a great deal of money, NASA sometimes placed its economic goals first (McConnell 4).
NASA was attempting to fly this high number of missions in order to meet its military and economic objectives. In doing so, however, it encountered a number of problems that made it difficult to achieve its goals safely. Instead of redefining its goals, NASA solved its problems by neglecting its safety procedures for the sake of revenue.
The immediate cause of the accident, however, was without doubt the faulty O-ring. This, according to the commission, was the fault of Morton Thiokol. The commission also blamed NASA for failing to consider the engineers' misgivings. In 1982, the joint had been classified as Criticality 1, meaning that its failure could cause loss of life; the significance of the classification is that the ring had no backup. Both NASA and Thiokol were aware of this fact, but they ignored it (Ross 6).
Robert Ebeling was an engineer at Morton Thiokol, the manager of ignition systems and final assembly for the rocket motors. On January 27, he concluded that the prevailing cold weather would be a source of disaster. Accordingly, Ebeling called a meeting of engineers and told them of his uncertainty, and it was at this meeting that other engineers expressed their own misgivings concerning the temperatures at which the O-rings had been tested.
The prevailing temperatures were far lower than those at which the O-rings had been tested. Ebeling told the commission that engineer Arnold Thompson tried to convince the project managers of the potential dangers that cold temperatures posed to the rings. Both of these engineers attempted a practical demonstration of their worries: one used photographs, and the other showed drawings. However, they stopped when they realized that no one was listening to them.
Ebeling went on to say that after the engineers had finished talking, the managers made their final judgment without involving any of the engineers present. We should note here that the level three managers at NASA made this decision. Due to the barriers to vertical communication at NASA, the concerns of Thiokol's engineers never reached the level one and level two managers (Vaughan 3).
Thiokol's engineers made their concerns known to the managers at NASA. In my opinion, the responsibility for the failure of this launch rests squarely on the top managers at NASA. The level one managers failed to provide an environment of free communication. A project of this magnitude should also have required the authorization of NASA's top managers; instead, the final authorization came from the level three managers. Here again, the level one managers failed in their duty.
The Therac 25 was a linear accelerator designed to kill tumors without destroying the tissues in proximity to them. However, between 1985 and 1987, the Therac 25 massively overdosed six patients undergoing tumor treatment. The Therac 25 failure is one of the major disasters in the use of linear accelerators. Despite the great investment in the machinery, the Therac 25 still failed, and it is crucial to understand the cause of that failure.
The Therac 25 depended on computer control, and its safe operation was highly dependent on software. The machine was an improvement on the Therac 20 and the Therac 6, developed to correct the shortcomings of those two previous designs. However, some features of the previous designs were carried over, and the reuse of software modules from the Therac 6 and the Therac 20 is one of the reasons for the failure of the Therac 25 (Nancy 9).
Another probable cause of failure was the positioning of the turntable (Miller 6). The Therac 25 had three modes: the electron mode, the photon mode, and the field light position. All the accessories for these three modes sat on the turntable, so safe operation was highly dependent on the turntable's position, which had to be precise enough to prevent any accident. In the electron mode, the machine accelerated a high-energy beam that is dangerous to human tissue. A computer controlled this energy and kept it between 5 and 25 MeV, while scanning magnets spread the beam to safe, therapeutic concentrations. These magnets were on the turntable.
In the photon mode, the energy was a constant 25 MeV, and this mode required roughly 100 times more beam current than the electron mode. The mode required a flattener to produce a uniform treatment field; again, this flattener was on the turntable. If the flattener was not in position, the beam could cause considerable damage to the patient. The Therac 25 used a computer to monitor the positioning of the turntable, so a fault in the computer could leave the turntable, and with it the flattener, out of position, leading to substantial harm to the patient.
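The dependence of safety on turntable position can be made concrete with a small sketch. This is not the Therac 25's actual logic; the mode names, angles, and tolerance below are invented for illustration. It shows the kind of interlock check the text describes: the beam should be refused unless the accessory for the selected mode is verifiably in the beam path.

```python
# Hypothetical sketch of a turntable interlock check. All names, angles,
# and the tolerance are illustrative assumptions, not the real design.

TOLERANCE_DEG = 0.1  # assumed allowable deviation from the target angle

MODE_POSITIONS = {            # turntable angle each mode requires
    "electron": 0.0,          # scanning magnets in the beam path
    "photon": 120.0,          # flattener in the beam path
    "field_light": 240.0,     # light/mirror assembly in the beam path
}

def turntable_in_position(mode: str, measured_deg: float) -> bool:
    """True only if the measured angle is within tolerance for the mode."""
    target = MODE_POSITIONS[mode]
    return abs(measured_deg - target) <= TOLERANCE_DEG

def safe_to_fire(mode: str, measured_deg: float) -> bool:
    # Interlock: refuse the beam unless the mode's accessory is in place.
    return turntable_in_position(mode, measured_deg)

print(safe_to_fire("photon", 120.05))  # True: flattener in place
print(safe_to_fire("photon", 90.0))    # False: flattener out of position
```

The point of such a check is that it depends on an independently measured position, not on the software's belief about where the turntable should be; the Therac 25's reliance on software monitoring alone is exactly what made a computer fault dangerous.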
The operating manual was another cause of the failure of the Therac 25. For any equipment to perform safely, the user must be conversant with all the codes the machine uses, which requires proper documentation of those codes (Nancy 12). The Therac 25 manual did not explain its malfunction codes; it merely listed their numbers. The manual also did not state the limits within which the patient was safe.
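To see what adequate documentation would have changed, consider a minimal sketch of an operator-facing malfunction table. The code numbers, texts, and recommended actions below are invented for illustration; the real manual gave operators only bare numbers to look up.

```python
# Hypothetical sketch of a malfunction table that pairs each code with a
# human-readable explanation and a required operator action. The entries
# are invented for illustration, not taken from the real Therac 25 manual.

MALFUNCTIONS = {
    54: ("Dose discrepancy",
         "Delivered dose disagrees with the prescribed dose; "
         "do not resume treatment, call service."),
    41: ("Turntable position fault",
         "Turntable sensor disagrees with the selected mode; "
         "do not fire the beam, call service."),
}

def describe(code: int) -> str:
    """Return an explained, actionable message for a malfunction code."""
    name, action = MALFUNCTIONS.get(
        code, ("Unknown malfunction", "Stop treatment and call service."))
    return f"Malfunction {code}: {name}. {action}"

print(describe(54))
print(describe(99))  # unknown codes still get a safe default action
```

A table like this would also have undermined the habituation described below: a message that says "do not resume treatment" is harder to dismiss than an unexplained number.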
Malfunction errors were also so common that operators had become used to them. The malfunctions usually resulted in underdoses, never in overdoses, and such scenarios were easily rectified by technicians. Some operators had been told that the Therac 25 was too safe to cause any damage to a patient, and hence some of them ignored the prescribed safety procedures (Miller 4).
The failure of the Therac 25 in both Canada and the United States of America had various ramifications. First, it led to further analysis of the Therac 25 in order to find the source of the malfunctions and improve the design. Investigators found a flaw in the microswitch, and Symonds' report recommended the redesign of that switch. Investigators also traced the Malfunction 54 error message to the speed of data entry. Patients who had undergone the procedure several times received an overdose. The analysis also led to a redesign of the computer software that ran the Therac 25, since software faults were a key source of the machine's malfunctions, especially Malfunction 54 (Miller 8).
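How can the mere speed of data entry cause an overdose? The reported pattern is a race between the operator's edits and the machine's setup task. The sketch below is a deliberately simplified, deterministic model of that pattern; the class, timings, and current values are invented for illustration, not the Therac 25's actual code. The bug modeled is that a mode edit made while hardware setup is still in progress updates the displayed mode but leaves the beam current configured for the old mode.

```python
# Simplified, hypothetical model of an editing race of the kind reported
# for Malfunction 54. Names, tick counts, and current values are invented.

class TreatmentConsole:
    MAGNET_SETUP_TICKS = 8  # hardware setup runs for a fixed window

    def __init__(self):
        self.mode = None          # 'x' = photon, 'e' = electron
        self.beam_current = None  # current configured during setup
        self.setup_ticks_left = 0

    def enter_mode(self, mode):
        # Fresh entry: setup configures the current for this mode.
        self.mode = mode
        self.beam_current = 100 if mode == 'x' else 1  # photon ~100x current
        self.setup_ticks_left = self.MAGNET_SETUP_TICKS

    def edit_mode(self, mode):
        # BUG (the modeled flaw): an edit during the setup window changes
        # the mode, but the running setup task keeps the OLD beam current.
        self.mode = mode
        if self.setup_ticks_left == 0:
            self.beam_current = 100 if mode == 'x' else 1

    def tick(self):
        if self.setup_ticks_left > 0:
            self.setup_ticks_left -= 1

    def fire(self):
        # Electron mode at photon-level current means a massive overdose.
        return self.mode == 'e' and self.beam_current == 100

console = TreatmentConsole()
console.enter_mode('x')        # operator mistakenly selects photon mode
console.tick()                 # only a moment passes...
console.edit_mode('e')         # ...before a fast correction to electron
for _ in range(10):
    console.tick()             # setup window elapses with stale current
print(console.fire())          # True: electron beam at photon current
```

A slow operator who corrected the mode after the setup window had closed would never trigger the fault, which is why the problem surfaced only once operators became fast and practiced, and why it looked like "data entry speed" to investigators.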
This case highlights the ethical standards that a software engineer should uphold. The Therac 25 is a lifesaving machine; its use and operation should not pose any health risk to the very person it is trying to save (Nancy 4). The machine should be error-proof, with an error rate of zero. A computer professional has a responsibility to analyze thoroughly the program he or she creates; thus, had I noticed an error on the machine, I would have re-evaluated the program. It is also ethical to ensure that the public properly understands the program one has created, which means providing a well-detailed manual of its operations.
A computer professional also has to honor the contracts he or she has taken on. In this case, the hospitals required bulletproof equipment for treating cancer: equipment that would inspire confidence in both the hospitals and the patients, and that would not lead to any loss of life, as happened with the Therac 25. The professional should also provide an analysis of the possible risks the machine can pose. The Therac 25 was documented as highly error-free, which was not the case.
This case also has serious legal ramifications. It was the designers' fault that led to the deaths of innocent people. The victims' families would want a settlement, whether inside or outside the court; however, it is impossible to replace a lost life, and court settlements would also be a burden on the design company. A computer professional should be aware of this fact when releasing any program to the market (ACM Code of Ethics and Professional Conduct, 1).