
REPROGRAMMING RECRUITMENT?


Abstract

In their book Building an Inclusive Organization: Leveraging the power of a diverse workforce, Stephen Frost and Raafi-Karim Alidina cite examples of AI perpetrating some horrific displays of prejudice. In the US, for example, the nonprofit newsroom ProPublica analysed an algorithm called COMPAS that was designed to predict the likelihood of criminal defendants reoffending in Florida. The study ultimately found that COMPAS was "twice as likely to misclassify black defendants as more likely to reoffend than their white counterparts". Last year, Amazon discovered that its candidate-filtering systems didn't seem to like women. As Reuters reported, this was because "Amazon's computer models were trained to vet applicants by observing patterns in the resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry." So why are the algorithms misfiring? Kate Glazebrook, founder and CEO of recruitment platform Applied, says it's important to remember who programmes the code: "AI often backfires, baking in existing biases."
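To make the mechanism Glazebrook describes concrete, here is a minimal synthetic sketch of how a model trained on historically biased hiring decisions can learn to reproduce them. This is not Amazon's or Applied's actual system; the features, data, and weights below are entirely hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Synthetic resumes: a genuine skill signal plus a proxy attribute that
# correlates with gender rather than ability (hypothetical example).
skill = rng.normal(size=n)
gender = rng.integers(0, 2, size=n)             # 0 = male, 1 = female
proxy = gender + rng.normal(scale=0.1, size=n)  # e.g. a gendered keyword

# Historical hiring labels: driven by skill, but past decisions also
# penalised female candidates -- the bias that gets "baked in".
hired = (skill - 1.0 * gender + rng.normal(scale=0.5, size=n)) > 0

X = np.column_stack([skill, proxy])
model = LogisticRegression().fit(X, hired)

# The model never sees gender directly, yet it learns a negative weight
# on the gender-correlated proxy, downgrading those candidates
# regardless of skill.
print("learned weights [skill, proxy]:", model.coef_[0])
```

On this synthetic data the proxy weight comes out clearly negative, which is the pattern Reuters described in Amazon's models: the learned bias survives even when the protected attribute itself is removed from the inputs.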

Bibliographic Details

  • Source
    CMI | Autumn/Winter 2019 issue | pp. 42-43, 45 | 3 pages
  • Author

    EMILY HILL

  • Author Affiliation
  • Indexing Information
  • Original Format: PDF
  • Language: English (eng)
  • CLC Classification
  • Keywords

