A two-year-old video of a prankster confusing a self-driving car has gone viral for a second time, and it demonstrates a critical reason why Tesla’s self-driving cars will never become ubiquitous.
Going viral once is a big event, but a video that goes viral a second time suggests its content has struck a visceral chord with viewers.
In this case, it's the fact that human ingenuity can fool a self-driving car. It drives home why, no matter which company makes them, these vehicles seem unlikely to ever achieve mass-market appeal.
How to Confuse a Self-Driving Car
The video below shows a self-driving car surrounded by a solid, unbroken circle of salt, which is in turn enclosed by a circle of dashed salt lines.
The video demonstrates that the self-driving car understands it may cross the dashed lines, and even the solid line just beyond them.
As anyone who has driven a car knows, a dashed line paired with a solid one is the standard road marking permitting a car to move into the opposing lane to pass the vehicle ahead.
However, while a human driver knows they must then return to their original lane to avoid a head-on collision, the self-driving car evaluates each marking in isolation. It does not remember having crossed the dashed line on the way in, so it cannot reason that it is allowed to cross back out.
Consequently, the car ends up stuck inside the circle of solid salt. The car's software treats the solid line as uncrossable, since crossing it would, as far as the system can tell, risk a head-on collision.
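The trap logic described above can be reduced to a toy rule. The sketch below is purely illustrative (no real vehicle's lane-keeping software is this simple, and the function name is invented for this example); it shows how a stateless rule that only looks at the nearest marking gets a car in but never out:

```python
def may_cross(marking: str) -> bool:
    """Stateless rule: dashed lines may be crossed, solid ones may not.
    The rule has no memory of any line crossed earlier."""
    return marking == "dashed"

# Entering the trap: approaching from outside, the dashed ring is on the
# car's side, so the dashed/solid pair reads as a passing zone.
print(may_cross("dashed"))  # True -> the car drives in

# Trying to leave: from inside, the solid ring is on the car's side.
# With no memory of having entered, the rule simply forbids crossing.
print(may_cross("solid"))   # False -> the car is stuck
```

A human driver carries state between the two decisions ("I crossed over, so I must cross back"); the stateless rule, evaluated fresh each time, cannot.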
It seems likely that,