The Washington Post | What is the problem with self-driving cars? Many people hand the task of driving over to automated systems


Source: The Washington Post

Original publication date: June 26, 2022


The National Highway Traffic Safety Administration released a report this month on crashes involving vehicles with automated technology. Self-driving cars may not really be the problem — the problem is cars that don’t drive themselves but manage to convince the drivers that they do.



The report includes data collected over a 10-month period following an order last summer that required automakers to report incidents that included cars with advanced driver-assistance systems. Fully autonomous vehicles such as Google spinoff Waymo or General Motors-controlled Cruise LLC ended up in 130 crashes, most of them occurring when the car was struck from behind, 108 of which resulted in no injuries and only one of which resulted in a serious injury. Meanwhile, cars with partially automated systems experienced nearly 400 crashes. Six people died and five were seriously injured. A previous crash in a Tesla Model S ended in a fire that took four hours and more than 30,000 gallons of water to put out.



The study is a reminder not only that the fully self-driving future many people imagine is a long way off, but also that a present in which cars can perform on their own some functions traditionally reserved for humans can prove dangerous. The NHTSA also recently upgraded a probe of Tesla Autopilot to an engineering analysis; investigators are examining the feature’s responsibility for repeated collisions with parked emergency vehicles such as ambulances and police cruisers — which drivers should have been able to see about eight seconds before impact, but which they took no action to avoid until two to five seconds before impact.



The issue, it appears, may not be merely that automated systems themselves have flaws but also that drivers are relying too heavily on systems that aren’t designed to do all the work without human input. After all, when something is called “full self-driving” it’s easy to expect, consciously or subconsciously, that it will fully drive itself. Even when software supposedly requires drivers to pay attention, the fact that a car can take care of some things can lull people into thinking the car will take care of all things — or into relaxing more generally, so that if something does go wrong they are unprepared to respond. This is what the NHTSA means when it says it will examine whether Tesla Autopilot “may exacerbate human factors or behavioral safety risks.”



So far, there’s no data to show whether partial automation features render driving safer or less safe. The NHTSA could certainly try to make the former more likely by imposing minimum performance standards in addition to restrictions on terminology that exaggerates a vehicle’s capabilities. But drivers themselves would do well to remember that the era of self-driving cars for the most part hasn’t yet begun — even when they’re at the wheel of a vehicle that does some of the work for them.




