With more and more automakers aiming to create self-driving cars, we're all dreaming of the day when our streets will be filled with cars that follow the rules instead of cutting us off, while we sit back and enjoy a cup of coffee, read the news, or play whatever we like on our smartphones. This dream, however, may be in danger.
A group of researchers from the University of Washington, the University of Michigan, Stony Brook University, and the University of California, Berkeley, has discovered that self-driving cars can be tricked if someone tampers with street signs.
We're not even talking about some sort of cyber attack in which an evil mastermind takes control of your car. We're talking about pranksters and vandals who would like nothing more than to mess with this type of car and its passengers, creating havoc.
According to the paper, titled Robust Physical-World Attacks on Machine Learning Models, there are two possible scenarios. In the first, an attacker prints an entirely new poster and overlays it on the existing sign, which would fool regular drivers as well. In the second, attackers attach small stickers to a legitimate sign in order to fool the car's system into thinking it's a different kind of street sign.
The paper's eight authors even give some examples. In one image, they added "love" and "hate" stickers to a Stop sign, and the car's system read it as a Speed Limit 45 sign instead. Other signs adorned with stickers or graffiti had the same effect, although less reliably.
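To see why a few stickers can flip a classifier's answer, here is a toy sketch (not the paper's actual method or model): a linear two-class "sign classifier" scores an image with `W @ x`, and with enough input features, a tiny nudge to each one, all pushed in the direction that favors the wrong class, is enough to change the winner. All names and numbers here are invented for illustration.

```python
import numpy as np

# Hypothetical linear classifier: scores = W @ x, the higher score wins.
rng = np.random.default_rng(0)
n = 4096                        # number of features, e.g. flattened pixels
W = rng.normal(size=(2, n))     # invented weights: class 0 vs. class 1
x = rng.normal(size=n)          # a "clean" sign image

pred = int(np.argmax(W @ x))    # what the model sees today
target = 1 - pred               # what the attacker wants it to see

# Nudge every feature a small step toward the target class -- the same
# idea that sticker-style physical perturbations exploit.
grad = W[target] - W[pred]                 # gradient of (target - pred) score
margin = -(grad @ x)                       # how far the model is from flipping
eps = 1.01 * margin / np.abs(grad).sum()   # just enough per-feature budget

x_adv = x + eps * np.sign(grad)

print(f"per-feature change: {eps:.3f} (features are roughly N(0, 1))")
assert np.argmax(W @ x_adv) == target      # the small nudge flips the label
```

The point of the sketch is the scaling: the margin grows roughly like the square root of the feature count while the attack budget grows linearly with it, so the per-feature change needed shrinks as inputs get larger.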
Of course, other applications of this method are surely possible, and that's a bit scary given the obvious implications, especially since most self-driving car designs offer no option for humans to take over. Even if they did, it wouldn't do much good: an accident takes only a split second to happen, and avoiding one would demand far more attention to the road than a relaxed passenger normally gives.
Prevention is best when it comes to self-driving cars
There are, of course, ways to counteract this type of situation. For instance, a city that wants to truly embrace the smart era would have to keep street signs clear of anything that might cover them, even a tree branch. The authors of the paper also mention that street signs could be made of non-stick materials.
Car makers, for their part, should probably make sure their systems can tell when a street sign has been tampered with or misplaced. These systems shouldn't just recognize street signs as the car approaches them; they should also remember where each sign belongs and notice when something is "off".
Obviously, it's a rather odd situation to be in, but the truth is that we're still a few years away from such self-driving cars even making it to market. That's plenty of time for manufacturers to come up with ways to keep everyone safe. It will likely be at least a decade before such cars are even remotely embraced, and even longer before they take over the streets, if ever. There are many risks that come with self-driving cars, mostly revolving around their susceptibility to hackers, but hopefully we will one day overcome our fears and embrace the best technology has to offer.
Realistically, however, considering that electric cars have been around for quite some time now and have only reached 2 million units worldwide, expectations for self-driving cars' adoption rate shouldn't be set too high. We also have to take into account the high price these cars will carry, which will put them out of reach for a great number of people who might otherwise be open to the idea.