Ethics is crucial to developing a corporate safety-first culture. To understand the consequences of poor ethical decisions, we first need to define ethics in general and from an engineering perspective. Poor ethical decision-making is a nontechnical cause of serious incidents. Many serious accidents that harmed the environment resulted from poor decision-making, including the Deepwater Horizon explosion, Hurricane Katrina, the Fukushima Daiichi Nuclear Power Station disaster, and the Piper Alpha explosion. However, many corporations in the US have developed a safety-first culture in manufacturing that has helped reduce the number of fatalities.
Ethics can be defined as the moral principles that govern a person's or group's behavior (Grubbe, 2016). In the context of engineering, ethics can be described as a mixture of complex principles related to professionalism, standards, risk management, and attitude. Ethical decision-making can be a nontechnical cause of serious incidents. For example: (1) an employee does what the manager instructs, even if the task is unsafe and contradicts the employee's own engineering risk evaluation; (2) an employee feels unable to disagree when the manager neglects the employee's advice about an unsafe task; (3) an employee violates laws or regulations to protect his or her position or job. I agree with Grubbe (2016) that, beyond the reasons above, pressures of money and time and the neglect of common sense are also solid causes of serious incidents. To improve the safety-first culture and limit the impact of poor ethical decision-making, senior management in any organization should encourage their teams to report any concerns. Managers should also act as role models when something goes wrong, because the team will watch the management's reaction and treat it as the standard of acceptable behavior.
Poor ethical decision-making can also cause disasters and environmental damage. In 2010, in the Gulf of Mexico, the Deepwater Horizon (DH), a semi-submersible drilling rig, exploded, causing the worst oil spill in the world (Reader & O'Connor, 2014). The Deepwater Horizon accident is a very clear example of wrong ethical decision-making. A series of inadequate operational behaviors and safety management issues were the root causes of the mishap. BP engineers ignored some important maintenance tasks to save money and time. They used a long-string production casing instead of a short-string casing, although the initial computer modeling showed that the long-string casing would hinder the cement job. The DH accident is also a sign of a lacking risk assessment by the platform team, which is an indicator of poor decision-making. Another vital factor behind the 11 fatalities is that BP engineers ignored the request for emergency alarm maintenance; as a result, the DH team failed to activate the evacuation alarm, which hampered the emergency evacuation of the platform.
A lack of preparation and disaster assessment, together with poor communication, were the causal factors in the failure to deal with Hurricane Katrina. The behavior of the Federal Emergency Management Agency (FEMA) director, Michael Brown, was inadequate during the natural disaster. He insisted on following FEMA's formal process as a backup agency instead of taking a lead, initiating role (Olejarski & Garnett, 2010). In the aftermath of Katrina, he took a very long time to make the decision to send 1,000 workers to the area. According to Olejarski and Garnett (2010), Brown did not show integrator behavior and demonstrated a lack of knowledge of his responsibilities: he waited for state and local officials to ask for federal involvement and failed to act on the National Hurricane Center warnings. New Orleans Mayor Ray Nagin was also late in ordering a mandatory evacuation of the city, waiting until one day before landfall because he was worried about the lost revenue the evacuation would cause the city's businesses. His concern about the city's liability affected his decision to evacuate, despite the warnings from the National Weather Service.
Inadequate ethical decision-making was also one of the main factors in the Fukushima Daiichi Nuclear Power Station disaster of March 11, 2011 (Kastenberg, 2015). The disaster had an impact on forests, rivers, and the wider environment. When the reactors started to melt, the engineers hesitated to pump seawater into the reactors to cool them, taking several hours to make the decision. This lack of immediate decision-making affected the safety of the plant. According to Kastenberg (2015), the Japanese decision-making process is based on group rather than individual decision-making, which may be the main reason for the late response to the emergency: the operators had no responsibility or authority to act in situations beyond the scope of the plant procedures. This was evident during the disaster, when the operators waited several hours to open the containment venting until they obtained the approval of the Prime Minister. Had the operators decided immediately and opened the containment venting, they might have avoided the worst consequences. Another failure of decision-making is that the engineers ignored new geotechnical data on the potential recurrence of severe earthquakes and tsunamis.
Turning to good ethical decision-making, US Airways Flight 1549 is an example of proper decision-making: the captain landed the aircraft safely in the Hudson River in New York on January 15, 2009 (Lin, 2012). In the very short time between the engine failure and the landing, Captain Sullenberger made a critical decision that saved the 155 lives on board (Lin, 2012). The captain decided to disregard the control tower's instruction to divert to the nearest airport; his risk-sensing helped him assess the situation and choose to land in the Hudson River. Had he relied on the control tower instructions, he would likely have failed to save the passengers' lives.
Not all serious incidents result from significant issues or massive ethical failures; small failures may also lead to an occupational catastrophe (Minter, 2014). The reduction in the number of manufacturing fatalities in the US over the last 20 years is a result of the safety-first culture that some corporations have developed. Some organizations have changed their entire culture from reactive to proactive by involving management in operational issues. According to a survey conducted by DNV GL, 90% of the business managers who participated believe that safety and health are an integrated part of their corporate culture (Minter, 2014). A corporate safety-first culture is mandatory not only to protect employees' health and safety but also to protect the company's reputation and revenue. The explosion of the Piper Alpha platform is said to have resulted from a lack of occupational safety implementation: the engineers continued production while equipment upgrades and maintenance tasks were being performed. Another factor was that the managers of the other platforms did not stop pumping gas to the Piper Alpha platform, claiming that they were not authorized to stop production. This is a clear sign of a poor corporate culture.
In summary, poor ethical decision-making can affect emergency evacuations and the environment. During a natural or human-made disaster, decisions should be made in a timely manner to prevent severe consequences; US Airways Flight 1549 is evidence of adequate decision-making under pressure. Some corporations have complicated processes and systems for decision-making. However, the new safety-first culture that some organizations in the US have implemented has had a significant impact on reducing the number of fatalities in manufacturing.
Kastenberg, W. E. (2015). Ethics, risk, and safety culture: Reflections on Fukushima and beyond. Journal of Risk Research, 18(3), 304-316. doi:10.1080/13669877.2014.896399
Grubbe, D. (2016). How to think about ethics? TCE: The Chemical Engineer, (905), 44-53.
Lin, M.-L. (2012, May). Tales of the unexpected. The Safety & Health Practitioner, 30(5), 37+. Retrieved from http://go.galegroup.com.libraryresources.columbiasouthern.edu/ps/i.do?p=ITOF&sw=w&u=oran95108&v=2.1&it=r&id=GALE%7CA290858457&asid=f8a1393c366e8598f641c9b92b71f42d
Minter, S. (2014). Workplace safety: Small failures and the occasional catastrophe. Industry Week/IW, 263(4), 38.
Olejarski, A. M., & Garnett, J. L. (2010). Coping with Katrina: Assessing crisis management behaviours in the Big One. Journal of Contingencies & Crisis Management, 18(1), 26-38. doi:10.1111/j.1468-5973.2009.00597.x
Reader, T. W., & O'Connor, P. (2014). The Deepwater Horizon explosion: Non-technical skills, safety culture, and system complexity. Journal of Risk Research, 17(3), 405-424. doi:10.1080/13669877.2013.81565