Researchers Mess With Stop Signs To Fool Autonomous Cars


Automakers around the world are furiously developing autonomous driving systems, but researchers from the University of Washington have shown that current systems can be easily fooled.

According to University of Washington computer-security researcher Yoshi Kohno, the cameras used by most current semi-autonomous vehicles can be tricked into thinking stop signs are actually speed limit signs.

Kohno found that most vision systems in use rely on an object detector and a classifier: the detector spots objects in the scene, while the classifier decides what each object is and reads what a road sign says. In his research, Kohno says that if a hacker gains access to this classifier, they can use an algorithm and a photo of a road sign to generate an image which, when stuck onto the sign, tricks the car.
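
As a rough illustration of the kind of attack Kohno describes, the sketch below optimises a sticker-shaped perturbation against a classifier to which the attacker has white-box access. It is a minimal, hypothetical example: `classifier`, `sticker_mask` and the class indices are assumptions for illustration, not the researchers' actual code, models or labels.

```python
# Illustrative sketch only: given white-box access to a sign classifier,
# optimise a sticker-shaped perturbation so the predicted label flips
# (e.g. from "stop" to "speed limit 45").
import torch
import torch.nn.functional as F

def craft_sticker(classifier, sign_image, sticker_mask, target_class,
                  steps=200, step_size=0.01):
    """Optimise pixels inside `sticker_mask` so `classifier` outputs `target_class`.

    classifier:   any torch.nn.Module mapping (1, 3, H, W) images to class logits
    sign_image:   (1, 3, H, W) tensor of the original road sign, values in [0, 1]
    sticker_mask: (1, 1, H, W) tensor, 1 where stickers may be placed, 0 elsewhere
    target_class: integer index of the label the attacker wants (e.g. "45 mph")
    """
    classifier.eval()
    # The perturbation is only allowed where the mask is 1 (the "sticker" area).
    delta = torch.zeros_like(sign_image, requires_grad=True)
    target = torch.tensor([target_class])

    for _ in range(steps):
        adv = (sign_image + delta * sticker_mask).clamp(0.0, 1.0)
        loss = F.cross_entropy(classifier(adv), target)
        loss.backward()
        with torch.no_grad():
            # Gradient descent on the target-class loss pushes the
            # prediction toward the attacker's chosen label.
            delta -= step_size * delta.grad.sign()
            delta.clamp_(-1.0, 1.0)
        delta.grad.zero_()

    return (sign_image + delta.detach() * sticker_mask).clamp(0.0, 1.0)
```

In a physical attack like the one described in the article, the computed sticker pattern would then be printed and applied to the real sign rather than fed to the camera digitally.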

Working with colleagues from the University of Michigan, Stony Brook University and the University of California, Kohno was able to get a self-driving system to interpret a stop sign as a 45 mph speed limit sign simply by sticking small pieces of paper onto it. Additionally, by printing stickers disguised as graffiti reading ‘Love’ and ‘Hate’ and applying them to a stop sign, the researchers fooled the vision system into reading the sign as a 45 mph limit 73.3 per cent of the time.

According to a senior research scientist from autonomous vehicle start-up Voyage, carmakers will need to harden their self-driving systems against such attacks.

“Many of these attacks can be overcome using contextual information from maps and the perceived environment. For example, a ‘65 mph’ sign on an urban road or a stop sign on a highway would not make sense. In addition, many self-driving vehicles today are equipped with multiple sensors, so failsafes can be built in using multiple cameras and lidar sensors,” he told Car and Driver.
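
As a hedged illustration of the failsafe the Voyage scientist describes, the sketch below only accepts a camera-classified sign when it is consistent with map context and corroborated by a second sensor. The function names, road-type labels and lidar check are hypothetical placeholders, not any vendor's actual API.

```python
# Plausibility check sketch: a classified sign is acted on only if it
# fits the mapped road type and is confirmed by another sensor.
def accept_sign(detected_sign, road_type, lidar_confirms_sign_shape):
    """Return True only if the camera's classification is contextually plausible."""
    # A stop sign on a highway, or a 65 mph limit on an urban street,
    # contradicts the map and should be treated as suspect.
    implausible = (
        (detected_sign == "stop" and road_type == "highway")
        or (detected_sign == "speed_limit_65" and road_type == "urban")
    )
    if implausible:
        return False
    # Require independent corroboration (e.g. lidar sees an octagonal plate)
    # before acting on a safety-critical sign.
    return lidar_confirms_sign_shape
```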
