On the morning of May 5th, 2021, Steven Hendrickson was driving his Tesla Model 3 to work.
His car was in Autopilot as he drove through Fontana, California.
At about 2:30AM, an overturned semi-truck appeared in front of him.
Moments later, he was killed.
The 35-year-old left behind his wife and two kids.
This crash is one of more than a thousand that Tesla has submitted to federal regulators since 2021, the details of which have been hidden from the public.
Video and data gathered from these crashes by the Wall Street Journal show that Tesla's heavy reliance on cameras for its Autopilot technology, which differs from the rest of the industry, is putting the public at risk.
Teslas operating in Autopilot have been involved in hundreds of crashes across US roads and highways since 2016.
Over the years, Tesla CEO Elon Musk has maintained the technology is safe.
I do think that long term, it can reduce accidents by a factor of 10.
We're solving just a very important part of AI and one that can ultimately save millions of lives and prevent tens of millions of serious injuries.
The safety per mile is better than human driving.
Leading expert on autonomous driving, Missy Cummings, who's been critical of Tesla, warned of the risks of semi-autonomous vehicles in 2016.
There is no question that someone is going to die in this technology.
The question is when.
Cummings has also worked as a safety advisor for the federal regulator, the National Highway Traffic Safety Administration, or NHTSA.
I am besieged with requests from families of people who have been killed in Tesla crashes.
It's really tough to explain to them that, you know, this is the way the tech was designed.
Since 2021, NHTSA has launched a series of investigations into Autopilot.
They've released little information about them and the cars remain on the road.
Also, in 2021, NHTSA ordered car makers to report all serious crashes involving semi-autonomous driving systems like Autopilot.
But much of the data Tesla submitted has been hidden from public view because the company considers it proprietary.
To unlock it, the Journal gathered reports from individual states and matched them with the crash data Tesla submitted to NHTSA and found that longstanding concerns about the Autopilot technology are showing up on America's roads.
Of the 1,000-plus crashes Tesla submitted so far, the Journal pieced together 222 and found that 44 of them occurred when Teslas in Autopilot veered suddenly.
31 occurred when a Tesla in Autopilot failed to stop or yield for an obstacle in front of it.
Orlando, Florida.
A Model 3 drove into the back of a stopped police car that was attending to a disabled vehicle.
Guadalupe County, Texas.
A Model 3 plowed through an intersection and off the road.
These failure-to-stop crashes, like the one that killed Steven Hendrickson, account for the most serious injuries and deaths in the cases the Journal unlocked.
I had to hang up the phone and look at both my kids as they're crying and they're telling me to tell them it's not true, that their dad was in a car accident and he won't be coming home.
Tesla has said drivers operating in Autopilot need to be ready to take control at all times.
They say Steven Hendrickson was warned to keep his hands on the wheel 19 times before he crashed and that his car initiated braking before impact.
But whether or not a driver is alert, it's important to know how Autopilot performed and that can only be seen through the large amounts of internal video and data Teslas record.
The Journal obtained video and partial data from the Hendrickson crash and asked experts to analyze it.
The kind of things that tend to go wrong with these systems are things like it was not trained on the pictures of an overturned double trailer.
It just didn't know what it was.
There were some lights there, but the lights were in unusual positions.
A person would've clearly said something big is in the middle of the road.
But the way machine learning works is it trains it on a bunch of examples and if it encounters something it doesn't have a bunch of examples for, it may have no idea what's going on.
Experts say this illustrates a fundamental flaw in the Autopilot technology.
It relies mainly on cameras or computer vision with radar as a backup in some models.
Other driver-assist cars have radar, computer vision, and lidar laser ranging that kind of helps you detect obstacles.
But the lidar is very expensive.
Expensive sensors are unnecessary.
It's like having a whole bunch of expensive appendices.
Instead, Tesla's camera-based system relies on humans to train it on obstacles it may encounter.
Even though people think these cars are learning while they're driving, they're not.
They're only learning at the rate that the companies are deciding to retrain the algorithms.
So, I'm gonna turn Autopilot on right now.
I'll just pull my finger down like that one time.
And as you can see on the screen, the visualizations have started.
John Bernal is a former Tesla employee who was fired in 2022 after posting videos of Autopilot failures.
Tesla has not commented publicly on his dismissal.
So, as we go through this downtown, you'll see a lot of cars being visualized.
Just, you know, kind of parked on the side of the road.
You'll see pedestrians walking on the sidewalk.
I originally started at Tesla in August 2020 as a data annotation specialist.
That role primarily relied on me taking image data and labeling it to train the technology on what a car was, what a pedestrian was, maybe an animal, a lane line, or what a red light was.
That way, with this image data training, the vehicle would then know how to operate in a real-world situation when that data came through the cameras and through the car's computer.
The car has eight cameras, specifically three right here in the front.
And so as that truck just went across us, it's actually going through the plane of multiple different cameras.
In one camera, it may have looked perfectly on the ground and centered, but in another camera, it could have been six feet ahead or six feet behind, or maybe five feet floating in the air.
And those are things I've noticed while labeling these image clips, is that these cameras are not calibrated properly.
And so, what looks true in one camera will not be true in another camera.
If the cameras aren't seeing the same thing, they can encounter significant challenges identifying obstacles.
We can see this in an Autopilot crash, similar to Steven Hendrickson's, near Los Angeles in 2021.
The Journal obtained a rare set of raw data for the Tesla in this crash from an anonymous hacker.
When a crashed pickup truck becomes visible on the road, one of the car's cameras sees it, but the other doesn't.
Then as the obstacle gets closer, the Tesla doesn't recognize it and it crashes at full speed.
Getting this degree of information off of a Tesla after it crashes is nearly impossible for the average driver.
Tesla says they provide drivers with hundreds of data points with granularity down to the millisecond in a user-friendly format.
But that information doesn't show the exact decisions Autopilot made.
Those details need to be extracted from the car's internal computer.
A Tesla's pretty much a computer on wheels.
I work on salvaged Teslas.
Usually accident wrecks.
I've seen a lot of videos where it's captured a lot of the drivers being reckless, and I've seen videos where it gives you that question mark of, like, how did that happen?
And it makes you curious, like, was it the driver or was it the Autopilot?
The video only tells part of the story.
The data on the car's computer gives a fuller picture.
The computers, they all report data and link to the Tesla mothership.
And we have the computer off.
Diaz sends Autopilot computers he doesn't need across the country to a hacker who then extracts the information.
Janell Hendrickson has not been able to get data from her husband's car.
Her lawyers requested it over a year ago.
Tesla says they're in the process of gathering it.
It's actually been really difficult to get any information about this.
We know that Tesla themselves has all the video and all the information, but they will not share it.
One thing that's not reflected in the car's data is the driver's confidence in the technology.
He trusted that Autopilot, obviously, with his life.
He trusted it with my kids' lives.
He said that it let him be a little bit less worried when he was driving so much.
He'd leave about 3AM to get to work.
By six, all over SoCal.
If the cars do a pretty good job, not perfect, but pretty good, we develop a false sense of trust in them.
And this overconfidence is, it's both coming from the levers of the tech, but it's also coming from the top down.
The Model S and Model X at this point can drive autonomously with greater safety than a person.
We've gone over this multiple times, like, "Are we sure we have the right sensor suite?
Should we add anything more?" No.
The car currently drives me around Austin most of the time with no interventions.
The Department of Justice is investigating Tesla for its marketing of Autopilot.
Tesla disputes claims they misled the public about the car's capabilities.
Janell Hendrickson's case against Tesla is set to go to trial in 2025.
People are still gonna buy Teslas.
They're still gonna support Elon Musk.
Doesn't matter how many accidents there are, but at least understand your car and understand your car's capability before you just put your entire life behind that.
Computer vision is such a deeply flawed technology, and we're not doing enough to determine how to fix its gaps and how to make it recognize objects.
Like I look into my crystal ball, you know, "What do I see coming?" I see having the car do most of the driving for you and requiring you to pay attention to make sure nothing bad happens.
I don't think this is going to be a long-term technology that we're gonna keep in the cars.