This Tesla Autopilot Crash Shows Why You Still Have To Pay Attention In Semi-Autonomous Cars

Impressive though semi-autonomous systems like Autopilot are, it's really not a good idea to leave them fully in charge of driving, as this Model S crash demonstrates

Semi-autonomous cars are not perfect. Classified as Level 2 autonomy, they’re a stepping stone to a future which - whether we like it or not - may not require a driver at all. But for now, with semi-autonomous systems like Tesla’s Autopilot and Volvo’s Pilot Assist, the squishy bit behind the wheel is still very much needed. Case in point? The accident in the video above, which happened near Dallas, Texas recently.

The driver posted pictures of the aftermath on Reddit, explaining that his Model S 70D had hit a barrier while running on Autopilot. Coincidentally, another Reddit user browsing the thread had a link to some dashcam footage taken from the car behind, which showed the full nature of the crash.

What it seems the original poster hadn’t mentioned initially is that he was travelling through roadworks at the time, and that the accident happened where a concrete barrier cut across the lane, arguably too suddenly and without enough warning. It’s unfortunate that the Tesla’s front collision warning system didn’t react to the barrier, but at the same time, you can see why Autopilot assumed it needed to keep going straight, given the road markings.

Crucially, had the driver been as alert as you’re supposed to be when behind the wheel of a semi-autonomous car - and as alert as Tesla itself tells you to be - he’d have been able to avoid it himself. As a silver lining, the driver at least escaped serious injury.

Source: Reddit via Electrek

Comments

Anonymous

Autopilot disengage.

03/03/2017 - 18:21 |
0 | 0
Anonymous

Well. Way to enable some terrible drivers. Why not just tell these people to walk if they aren’t willing to pay attention?

03/04/2017 - 05:09 |
1 | 0
Anonymous

As it already tracks the cars around it, they could patch in an algorithm so that if all the traffic is following a different pattern to the road markings, it follows the traffic as a priority. To avoid being taken onto unwanted slip roads, I’d make sure the sonar can detect a barrier before it overrides the road markings.
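The logic in that comment could be sketched roughly like this. This is a minimal illustrative snippet of the commenter's idea, not anything resembling Tesla's actual software; the function name, inputs, and threshold are all assumptions:

```python
# Hypothetical sketch: prefer the path surrounding tracked vehicles are
# taking when it disagrees with the lane markings, but only override the
# markings when a forward obstacle (e.g. a barrier) is also detected, so
# a stream of cars peeling off onto a slip road doesn't drag us along.
# All names and thresholds are illustrative, not from any real system.

def choose_heading(lane_heading_deg, traffic_headings_deg, obstacle_ahead,
                   disagreement_threshold_deg=10.0):
    """Return the heading (degrees) the car should follow."""
    if not traffic_headings_deg:
        return lane_heading_deg  # no tracked traffic to learn from

    # Average heading of the tracked vehicles ahead.
    traffic_heading = sum(traffic_headings_deg) / len(traffic_headings_deg)
    disagreement = abs(traffic_heading - lane_heading_deg)

    # Override the markings only when traffic clearly diverges from them
    # AND the sensors report an obstacle in the marked lane.
    if disagreement > disagreement_threshold_deg and obstacle_ahead:
        return traffic_heading
    return lane_heading_deg
```

With a barrier detected and traffic veering right of the markings, the sketch follows the traffic; without the obstacle, it sticks to the lane markings.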

03/04/2017 - 08:38 |
0 | 0
Tbryant

Autopilot is crap, just drive the car yourself

03/04/2017 - 09:31 |
1 | 1
H5SKB4RU (Returned to CT)

And this is why an autopilot will not be better than a human at adapting to situations

03/04/2017 - 13:04 |
2 | 0
Anonymous

Where in the Tesla manual does it say you can drive without your hands on the wheel or without looking at the road? Driver error. There is no such thing as a fully autonomous car yet…

03/05/2017 - 21:40 |
0 | 0
Florin 1

Autonomous cars will never work for 2 reasons. 1, unless everyone, and I mean EVERYONE, has autopilot, there will still be accidents. 2, you can’t program the fear of death into a robot, so it will not act accordingly to save lives.

03/06/2017 - 01:20 |
0 | 0
DannyWRX

It’s annoying that none of you drivers are going to rage that there were no signs saying merge right, or saying “lane closed up ahead”. That flipping barrier looks like it’s at 50 degrees. Being behind a pickup truck, I wouldn’t expect the Tesla to react perfectly.

03/09/2017 - 18:10 |
0 | 0
Jens Nielson

Stupid Autopilot!!!!!

03/30/2017 - 14:54 |
0 | 0