Self-Driving Cars Could Be Fooled Into Dangerous Manoeuvres By Anyone With A Handful Of Stickers

New research suggests that even common graffiti or stickers on road signs can make self-driving cars misread them, potentially causing them to behave unpredictably or even dangerously.

Autonomous cars could be made to behave unpredictably by simply putting stickers or graffiti on road signs, if new research is correct.

Boffins at the University of Washington, having previously hacked into an unnamed manufacturer’s autonomous driving software, carried out some simple tests on it. The results suggest that even basic changes or additions to signs can make the car completely misinterpret them.

Image: University of Washington

The researchers’ ultimate point is that if digital crims gain access to an autonomous car’s brain (and let’s be honest, they can and will), they can learn how to trick it. In one of the university’s tests, placing ‘love’ and ‘hate’ stickers on a ‘stop’ sign caused the car to misread it as a 45mph speed limit sign.

In another example, the black arrow on a ‘turn right’ sign was fogged out with a pixelated grey-white patch. The car saw nothing of the sort, having convinced itself it was looking at a ‘stop’ sign.

Image: University of Washington
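For the technically curious, tricks like these exploit what researchers call ‘adversarial examples’: small, carefully chosen changes to an image that nudge a neural network towards the wrong answer. Below is a minimal sketch of one of the simplest versions of the idea, the fast gradient sign method, assuming a generic PyTorch image classifier; the model, label index and numbers are hypothetical placeholders, and the University of Washington team’s physical sticker attack is considerably more sophisticated than this.

```python
# Minimal FGSM sketch, assuming a PyTorch sign classifier is available.
# `sign_classifier`, `stop_sign_image` and the label index are hypothetical;
# the actual research optimised physical stickers to survive changes in
# distance and viewing angle, which this simple digital attack does not.
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, true_label, epsilon=0.03):
    """Nudge every pixel slightly in the direction that increases the loss,
    pushing the model's prediction away from the correct sign class."""
    image = image.clone().detach().requires_grad_(True)
    logits = model(image.unsqueeze(0))                   # add a batch dimension
    loss = F.cross_entropy(logits, torch.tensor([true_label]))
    loss.backward()                                      # gradient w.r.t. the pixels
    perturbed = image + epsilon * image.grad.sign()      # tiny per-pixel nudges
    return perturbed.clamp(0.0, 1.0).detach()            # keep pixel values valid

# Hypothetical usage:
# adversarial = fgsm_attack(sign_classifier, stop_sign_image, true_label=14)
# sign_classifier(adversarial.unsqueeze(0)).argmax()     # may no longer say 'stop'
```

The unsettling part is that, to a human eye, the resulting changes can look like nothing more than faded paint or a few stickers.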

Cars are already on the road with technology that can read and respond to road signs, most commonly for speed limits. If such a car hits the brakes seemingly at random, thinking it has seen a lower speed limit sign and needs to adjust, unsuspecting drivers behind could end up in all sorts of trouble.

While the likes of Elon Musk will no doubt have a response up their sleeves, the results of this study suggest there’s a long way to go before machines can match the complex reasoning powers of the human brain, even if that brain is disguised as a seat.

Already, Volvo has apparently found it almost impossible to design software to avoid kangaroo collisions in Australia, while according to reports, Waymo has had to develop tiny windscreen wipers to clean bird crap off cameras and sensor arrays that would otherwise go blind…

Via: The Telegraph


Comments

Anonymous

Best alternative is probably a small chip (maybe RFID) built into most street signs that the cars read instead to gain confirmation

08/09/2017 - 00:37 | 2 | 0
LordFokas

In reply to Anonymous (not verified)

Doesn’t RFID have an extremely limited range? Plus you could just drop an RFID card with the code for “STOP” in the middle of a highway and wreak havoc.

08/09/2017 - 03:55 | 1 | 0
Fouck hahaha

How lazy can an imbecile be to not drive his own car… only a self-crashing EV could be worse than that… Get the bus, lazy people!

08/09/2017 - 02:37 | 2 | 0
......

sticker bomb your car and watch the world burn

08/09/2017 - 03:19 | 1 | 0
Dosai

Ayyee University Of Washington!

08/09/2017 - 06:34 | 2 | 0
Driven to Drive 1

In reply to Dosai

Yeet! It’s funny to see my school on CT.

08/10/2017 - 00:50 | 0 | 0
Anonymous

Useful to kidnap a small and defenceless car in the woods…

I know, that’s creepy

08/09/2017 - 08:23 | 0 | 0
H5SKB4RU (Returned to CT)

Why don’t we just ban them? Then we keep the drivers who can drive, and everyone else can use public transport and bicycles?

08/09/2017 - 11:20 | 2 | 0
Jefferson Tan(日産)

Hackers? (Furious 7 hacks intensifies)

08/09/2017 - 11:57 | 0 | 0
Mark Stanton

No matter how smart the machine, it can still be fooled by an idiot with a spray can

08/09/2017 - 15:33 | 3 | 0