Forum talk:Flaming Telepaths - Wiki Dispute Resolution Guideline & Policy

Vetoed content
Safety culture and human errors

One relatively prevalent notion in discussions of nuclear safety is that of safety culture. The International Nuclear Safety Advisory Group defines the term as “the personal dedication and accountability of all individuals engaged in any activity which has a bearing on the safety of nuclear power plants”. The goal is “to design systems that use human capabilities in appropriate ways, that protect systems from human frailties, and that protect humans from hazards associated with the system”.

At the same time, there is some evidence that operational practices are not easy to change. Operators almost never follow instructions and written procedures exactly, and “the violation of rules appears to be quite rational, given the actual workload and timing constraints under which the operators must do their job”. Many attempts to improve nuclear safety culture “were compensated by people adapting to the change in an unpredicted way”. For this reason, training simulators are used to study and shape operator behaviour.

An assessment conducted by the Commissariat à l'Énergie Atomique (CEA) in France concluded that no amount of technical innovation can eliminate the risk of human-induced errors associated with the operation of nuclear power plants. Two types of mistakes were deemed most serious: errors committed during field operations, such as maintenance and testing, that can cause an accident; and human errors made during small accidents that cascade to complete failure.