The Guardian | Job hunting already comes with enough stress and humiliation. Now robots are judging our résumés


Source: The Guardian

Originally published: 15 March 2021


Algorithms are increasingly used by employers and headhunting firms to find the “best” and most qualified candidates. Before your potential future employer even has a chance to see your application for a job opening, there is a good chance your application has been rejected by a computer for specific criteria and will never be seen by a person. Some of these algorithms were put in place to try to break through human unconscious bias – to give a better shot to people with names that do not scream “white man”, or to address the problem of thin, attractive people doing better in job searches than those who do not meet conventional beauty standards.



Employers like these sorting applications, then, because they give them the sheen of pure objectivity. Opportunities are simply offered to the most qualified. How can a computer be prejudiced? It would probably not surprise you to learn, however, that algorithms, which are created by humans, also recreate human bias. The working class; single mothers; people with chronic health issues; people who have spent time in prison – all are more likely to have gaps in their work history. And while there are countless websites that offer tips on how to explain those gaps or overcome a lack of references or credentials during an interview, that explanation doesn’t matter if you can’t even get your application or résumé in front of a human. And because many of these processes are not transparent, it can be difficult to challenge the algorithm’s assessment or even know what part of your application is setting off the rejection.



These changes also affect those looking for work that is generally understood to be in demand. The New York Times recently ran a story on doctors who couldn’t find jobs, even during a pandemic. Many could not get interviews, despite applying for dozens of positions, due to “gaps” in their applications. Applicants were rejected by algorithms for things like taking too long to complete their education, or being out of work for too long. The reasons for those deficiencies in their résumés were pretty predictable, from caretaking responsibilities to financial concerns.



Most of us have moments in our lives that need explanation. There are gaps in our histories, moments when we somehow just slid right out of other people’s and our own expectations for how things were going to go. These things leave their mark, not just on our psyches but also on the material world and our reputations via our credit scores, our rental histories, our work timelines, the Google results that come up when someone searches our name.



After you’ve done the hard work of making your way back and repairing the relationships and the deficits you abandoned, it turns out these official histories are the least forgiving.

