A Tesla Driver Has Died In The First Ever Fatal Crash For A Self-Driving Car

Tesla has revealed that a Model S driver died in an accident on 7 May while Autopilot was activated, in what’s thought to be the first fatal crash involving an autonomous vehicle. The driver - 40-year-old Joshua D. Brown - was on a divided highway in Williston, Florida, when a tractor-trailer pulled out across the road, and neither Brown nor Autopilot reacted.
The Model S passed under the trailer, with the bottom of the trailer hitting the windscreen. The car then continued down the road before leaving the highway and hitting a fence. Brown died at the scene.
In a statement released on Thursday, Tesla said that the National Highway Traffic Safety Administration (NHTSA) has started a “preliminary evaluation” into the performance of Autopilot during the crash. “This is the first known fatality in just over 130 million miles where Autopilot was activated,” Tesla said, adding, “Among all vehicles in the US, there is a fatality every 94 million miles.”
Brown was well known in the Tesla community, and just a month before the fatal crash had posted a video on YouTube (below) of Autopilot successfully averting an accident. The video quickly clocked a million views.
Tesla’s Autopilot is currently intended as a driver assist, a ‘semi-autonomous’ mode that requires the driver to keep their hands on the steering wheel at all times. In the statement Tesla notes that “Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” but that hasn’t stopped some well-documented abuses of the system. It has been heavily criticised in some quarters for lulling its users into a false sense of security. Earlier this year, a senior Volvo engineer slammed Autopilot, labelling it an “unsupervised wannabe” that “gives you the impression that it’s doing more than it is.”
At this early stage of the investigation, it’s not known exactly why Brown didn’t brake himself. Tesla’s statement speculates that he simply did not see “the white side of the tractor trailer against a brightly lit sky”. However, according to a report by the Associated Press, the 62-year-old driver of the truck claimed to have heard one of the Harry Potter films playing from the car at the crash scene. Tesla responded to the claims, stating that it isn’t possible to watch videos on the Model S’s main screen.
Find out more about how Autopilot works by watching our video below:

Comments
I feel that people are trusting the autonomous system a bit too much even though it’s in a development phase. May as well remove the human from the equation. Condolences to the family tho. :(
Is it me or does Musk seem very detached from the entire situation? Especially for someone who seems to push this technology….
…I’ve never much liked the guy, but is anyone else getting this feeling?
As much as I love driving, and as much as I don’t want to think of a world full of self-driving cars, I understand what a convenience it might be. I read an article in which Eric Noble from CarLab, a car consulting company, accused Tesla of providing their customers with an untested technology, and I have to agree. No matter how difficult the conditions, the only way you can fail to spot a semi perpendicular to you while driving is if you are blind or not paying attention. If the sun was getting in his eyes he should have slowed down; that would have given him time to adjust to the brightness. If he was not paying attention then he is at fault, as stated by Tesla, but they are too. Further measures should be taken, like those taken by MB with their semi-autonomous cars, or this will be the first in a long series of tragic incidents..
At the end of the day, what’s the point of having any autopilot technology if it doesn’t brake when there’s an obstacle, whether it’s engaged or not haha. I know the driver has got to be aware at all times as well, but really and fundamentally it’s absolutely and utterly pointless at this stage if the car can’t even stop in the event of an accident. Basically it’s using humans to beta test a very dangerous piece of software and then, when it goes wrong, blaming the human for not using it right. IMO that’s not on. Tesla should only have released that software once it was up to full standards, and then when something does go wrong they should take liability. You can’t keep blaming the humans… oh, they mustn’t have been looking, or concentrating, or had their hands on the wheel… how do you know?! For all we know the guy tried to brake but the autopilot overrode him, or had a fault that kept the gas on?! Who knows. …Or did the software think: we’re on a highway, if we suddenly brake there’s going to be a massive pile-up behind us, there’s only one occupant in the car… so it makes the decision to keep the car moving to reduce loss of life elsewhere? What ya reckon, conspiracy theorists? :P :P
It is autonomous but YOU CANNOT rely on it. You should not lapse in attention when behind the wheel of an automobile, whether it is driving itself or not, because you never know when a hazard may present itself at short notice. Such as a tractor.
Isn’t the whole point of collision warning, autonomous braking and autopilot systems to stop the car if the driver isn’t paying attention? The driver was at fault, but the system failed too. The car travelled on even after the impact. It really shouldn’t activate unless you have a hand on the steering wheel etc.
Overall, it should never have been sold as ‘autopilot’ just yet. I agree with the Volvo engineer.
I guess Jesus is his copilot now…
It’s a shame that Tesla is choosing to make their customers the test pilots (passengers?) of a system that is potentially dangerous and very poorly regulated internationally. The responsible thing to do would be to sort it out first, then sell it; people will always try to abuse the system, so make sure it can hold up first.
Condolences to the family, but it seems he didn’t brake either. Let’s hope people can step back from this horrific incident and see that this is a fledgling technology that still requires human concentration too.