Journal of Behavioral and Brain Science

Cultural Difference and Cognitive Biases as a Trigger of Critical Crashes or Disasters
—Evidence from Case Studies of Human Factors Analysis


Abstract

On the basis of an analysis of past case studies of crashes or disasters, this paper clarifies how cultural differences and cognitive biases can trigger serious crashes or disasters. Heuristic-based biases such as confirmation bias, groupthink, and social loafing consistently appeared in the process leading up to a crash or disaster. Overconfidence-based biases such as the illusion of control, the planning fallacy, and optimistic bias are also ubiquitous on the route to a critical crash or disaster. Moreover, framing biases contribute to distorted decision making and eventually become the main cause of a critical crash or disaster. Therefore, in addition to human factors and ergonomics approaches to designing man-machine systems, the prevention and elimination of cognitive biases are indispensable for keeping serious crashes or disasters from occurring. Until now, the distortion of decision making has not been discussed from the perspective of cultural differences in ways of thinking. Alongside the variety of cognitive biases, cultural differences in behavior are expected to be important for understanding the root causes of critical crashes or disasters. Through case studies of critical crashes or disasters, we found that cultural differences distorted judgment. It was also demonstrated that considering cultural differences, as well as cognitive biases, is important for preventing irrational and biased decision making in safety management.

