New Scientist | Recognising the Inequalities of Technology


Source: New Scientist

Original publication date: 7 April 2021


This article is relatively easy and suitable for candidates taking English Test II (英语二).


Technology is so embedded in our lives that we can sometimes forget it is there at all. Your Computer is on Fire is a vital reminder not only of its presence, but that we urgently need to extinguish the problems associated with it.

Translation

Technology is so deeply rooted in our lives that we sometimes forget it is there at all. Your Computer is on Fire is a vital reminder, not only of technology's presence, but also that we urgently need to extinguish the problems associated with it.


The book challenges us to a radical rethink so that we can tackle a large range of problems, from algorithmic bias to climate change. These are addressed in a collection of essays, each highlighting problems in our relationship with technology and proposing ways to fix them.

Translation

The book challenges us to rethink radically so that we can tackle a wide range of problems, from algorithmic bias to climate change. These issues are addressed in a collection of essays, each of which highlights problems in our relationship with technology and proposes ways to fix them.


To solve the issues of race and gender bias in algorithms, for instance, Mar Hicks at the Illinois Institute of Technology says we must recognise that these are deeply embedded features of the tech we rely on, not mere bugs. “These failures are not simply accidents,” Hicks writes, “they are features of how the systems were designed to work.”

Translation

To solve the problems of race and gender bias in algorithms, for instance, Mar Hicks at the Illinois Institute of Technology says we must recognise that these are deeply embedded features of the technology we rely on, not mere bugs. "These failures are not simply accidents," Hicks writes, "they are features of how the systems were designed to work."


The consequences of algorithmic bias can be severe, as in the case of facial-recognition software erroneously flagging up innocent people as criminals or as suspects, says Safiya Umoja Noble at the University of California, Los Angeles. But it isn’t too late, she writes: “We have a significant opportunity to transform the consciousness embedded in artificial intelligence and robotics, since it is in fact a product of our own collective creation.”

Translation

The consequences of algorithmic bias can be severe, says Safiya Umoja Noble at the University of California, Los Angeles, as when facial-recognition software erroneously flags innocent people as criminals or suspects. But it is not too late, she writes: "We have a significant opportunity to transform the consciousness embedded in artificial intelligence and robotics, since it is in fact a product of our own collective creation."


Your Computer is on Fire gives many examples of how our tech is often developed by and designed to work for a select few, despite having a diverse range of users globally. Halcyon Lawrence at Towson University writes that for speakers with a “nonstandard accent” – including herself, as a speaker of Caribbean English – “virtual assistants like Siri and Alexa are unresponsive and frustrating”.

Translation

Your Computer is on Fire gives many examples of how our technology is often developed by, and designed to work for, a select few, despite having a diverse range of users around the world. Halcyon Lawrence at Towson University writes that for speakers with a "nonstandard accent", including herself as a speaker of Caribbean English, "virtual assistants like Siri and Alexa are unresponsive and frustrating".


In addition to supporting programmes to introduce more young people from diverse backgrounds to coding, big tech needs to do more to increase diversity in its own institutions, argues Janet Abbate at Virginia Polytechnic Institute and State University.

Translation

Janet Abbate at Virginia Polytechnic Institute and State University argues that, in addition to supporting programmes that introduce more young people from diverse backgrounds to coding, big tech companies need to do more to increase diversity within their own institutions.


The collection also interprets its central metaphor in a more literal sense, with Nathan Ensmenger at Indiana University in Bloomington arguing that we need to reckon with the physical impact our current use of technology is having on the planet. His chapter, “The Cloud is a Factory”, starts by recognising that cloud computing is “profoundly physical”, requiring enormous amounts of energy, resources and labour.

Translation

The collection also interprets its central metaphor in a more literal sense: Nathan Ensmenger at Indiana University Bloomington argues that we need to reckon with the physical impact our current use of technology is having on the planet. His chapter, "The Cloud is a Factory", begins by recognising that cloud computing is "profoundly physical", requiring enormous amounts of energy, resources and labour.

