Autopilot Criticised In NTSB's Fatal Tesla Crash Report

The NTSB has released its report on the fatal Model S crash that claimed the life of Joshua D. Brown last year, with several criticisms levelled at the Autopilot system
In May 2016, Joshua D. Brown lost his life in what was at the time thought to be the first ever fatal crash involving an autonomous vehicle. 16 months on, the National Transportation Safety Board (NTSB) has released its report detailing exactly what went wrong when the 40-year-old’s Tesla Model S struck a turning lorry.

“A truck driver’s failure to yield the right of way and a car driver’s inattention due to overreliance on vehicle automation are the probable cause [of the crash],” the NTSB said. The report concedes that recognising and slowing down for the lorry wasn’t something Autopilot was designed for, but went on to say that the system “allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings.”

In other words, Brown was too dependent on Autopilot, and the system didn’t do enough to curb his inappropriate use of the semi-autonomous feature. “If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains,” the NTSB concluded.

The organisation’s preliminary findings had earlier indicated that Brown had his hands on the wheel for just 25 seconds over a 37-minute period.

The NTSB noted that changes have since been made to Autopilot, reducing the length of time users can keep their hands off the wheel for before being issued a warning. Nonetheless, numerous recommendations were made, urging manufacturers to “limit the use of automated control systems to conditions for which they are designed,” and develop better ways of measuring driver engagement.

NTSB Chairman Robert L. Sumwalt III summed up the situation rather aptly, stating:

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles.”

Comments

Anonymous

There’s a difference between autonomous and fully autonomous. Why do people view the Autopilot system as fully autonomous and literally hand over their lives to a beta program? I’ve even seen accounts of drivers SLEEPING while on Autopilot.

I know computers will be more accurate than humans driving in the future, but today isn’t the time. Notice how I said IN THE FUTURE.

One more thing. Autopilot shouldn’t be criticised because it can’t drive fully autonomously yet. It’s the driver who should be criticised for using the program in a stupid manner and not paying attention to his surroundings. This accident could have been prevented if the driver had been paying attention to the road while on Autopilot, so he’d have had far more time to take back control and react.

Even Tesla says that you must always keep your eyes on the road while on Autopilot.

09/13/2017 - 09:29 |
82 | 2
Anonymous

In reply to Anonymous

I don’t want Autopilot.

09/13/2017 - 09:48 |
17 | 1
Anonymous

In reply to Anonymous

I think calling it ‘Autopilot’ in the first place is a bit of an error. People see it in movies and expect it to work exactly the same IRL.

If the system was named ‘Assisted Driving’ or something like that, people might not be so reliant on it.

09/13/2017 - 11:02 |
28 | 0
ormim

In reply to Anonymous

Yeah, but I have what I think is an easy solution to this whole thing that would prevent lawsuits from happening. Teslas being 80k-plus cars, I assume there’s GPS on board. So why not only let Autopilot turn on when the GPS knows you’re on a highway? That way it can only be active where it’s intended to be used, and not on the streets, until it’s out of the beta phase.

09/13/2017 - 12:03 |
0 | 0
Anonymous

In reply to Anonymous

Why don’t we just ban it? Cheaper, and it helps clean the gene pool.

09/13/2017 - 12:41 |
6 | 0
TheMindGarage

In reply to Anonymous

Tesla isn’t even autonomous. It’s a driver ASSISTANCE program, not a driver REPLACEMENT.

09/13/2017 - 17:17 |
0 | 0
Anonymous

If you don’t want to drive, just take the bus or a taxi (or a plane, if there’s one).

09/13/2017 - 09:39 |
6 | 1
Anonymous

So you have a sleeper which basically destroys EVERYONE in a drag race, and you don’t even drive it?

This system makes sense on a Prius, not in a Tesla, IMO.

Still, every system has flaws, and until every car is fully autonomous, this system will be as dangerous as driving yourself.

Until then, it should only be able to function on a highway.

09/13/2017 - 09:54 |
1 | 14
Caro

In reply to Anonymous

Uhh… every part of that comment is either wrong or has already been said extensively before.

09/15/2017 - 22:46 |
1 | 0
Anonymous

This is a bit like the perennial battle of philosophies between Boeing and Airbus (not cars, but bear with me).
Boeing believe that the pilot, with much electronic assistance, is the overall master of the machine, and the aircraft ultimately responds to his inputs.
Airbus think that the aircraft knows best and that the pilot only has an advisory capacity on how to fly the machine.
It’s no coincidence that Airbus used to have quite a few incidents with their aircraft trying to kill everyone due to electronic malfunction or whatever.
I think these autopilot systems encourage a total disconnect from the business of driving, so when the electronics…

09/13/2017 - 10:21 |
12 | 0
Anonymous

In reply to Anonymous

…let you down, you’re in too distant a mental place to take back control quickly enough.

To sum up, I’d never be able to trust any autonomous driving system. Direct control is always going to be the best control.

Essay over :-P

09/13/2017 - 10:24 |
8 | 0
H5SKB4RU (Returned to CT)

In reply to Anonymous

Of course XD

09/17/2017 - 11:55 |
0 | 0
Klush

This is why Volvo calls their system “Driver Assist” and NOT F** Autopilot.

09/13/2017 - 13:54 |
3 | 0
Anonymous

Keeps hands off the steering wheel for almost 37 minutes and… the car gets blamed.

09/13/2017 - 16:49 |
0 | 0
Anonymous

People are too stupid to use the Autopilot beta. If Tesla wants to test it without everyone blaming the car in every possible situation, they should pick and choose who gets Autopilot, because people are idiots who will find a way to get it to kill them.

09/13/2017 - 21:40 |
0 | 0
Anonymous

I appreciate that the new NTSB conclusions factor in the blame of bad UX design and a lack of effective operational safeguards (dare I say, a landmark?). However, I fear voluntary guidelines might not be enough, and regulation/enforcement might be necessary regarding driver engagement in driving responsibilities. For context, I’m currently a PhD student in a Driver State Monitoring work package of a Human Factors of Automated Driving project.
Please see my LinkedIn “pulse” article on this news topic (an evolving account of the news coverage of this specific event): https://www.linkedin.com/pulse/tortuous-case-tesla-autopilot-christopher-cabrall?articleId=6313809679554347008

09/13/2017 - 23:54 |
0 | 0
Zanzaroni

After a short period of the driver not responding to the warnings, the car should safely find a space to pull over. Instead of investing in fancy marketing and the like, Tesla and other manufacturers should gather their resources around that. Once this issue is fixed, they can go about trying to make the system worthy of its “Autopilot” name.

09/14/2017 - 08:15 |
0 | 0