Despite having complex sensor arrays and thousands of hours of research and development poured into them, self-driving cars and autopilot systems can be gamed with consumer equipment and 2D image projections.

Researchers at Ben-Gurion University of the Negev in Israel were able to trick self-driving cars, including a Tesla Model X, into braking and taking evasive action to avoid “depthless phantom objects.”

The image below demonstrates how two-dimensional image projections tricked a Tesla’s Autopilot system into thinking there was a person standing in the road. To the human eye, it’s clear that this is a hologram of sorts and wouldn’t pose a physical threat, but the car perceives it otherwise.

[Image: a phantom projection of a person on the road ahead of a Tesla]

This technique of phantom image projection can also be used to trick autopilot systems into “thinking” any number of objects lie in the road ahead, including cars, trucks, people, and motorcycles.

Researchers were even able to trick the system’s speed limit warning feature with a phantom road sign.

Using phantom images, researchers were able to get Tesla’s Autopilot system to brake suddenly. They even managed to get the Tesla to deviate from its lane by projecting new road markings onto the tarmac.

Take a look at the results of the researchers’ experiments in their video below.

Drone attacks

What’s more terrifying, though, is that “phantom image attacks” can be carried out remotely, using drones or by hacking video billboards.

In some cases, the phantom image could appear and disappear faster than the human eye can detect, or before a person notices. The image could still be picked up by the high-speed sensors used in autopilot systems, though.
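To see why a split-second flash can fool a camera but go unnoticed by a person, here’s a rough back-of-the-envelope sketch in Python. The frame rate and human perception threshold below are illustrative assumptions, not figures from the study.

```python
# Back-of-the-envelope sketch: how briefly can a phantom be shown and still
# be guaranteed to land on at least one camera frame? Numbers below are
# illustrative assumptions, not values from the paper.

CAMERA_FPS = 36           # assumed ADAS camera frame rate
HUMAN_THRESHOLD_MS = 100  # rough duration below which a flash is easy to miss

frame_interval_ms = 1000 / CAMERA_FPS

# A flash lasting at least one full frame interval must overlap at least one
# frame capture, no matter how it is aligned in time.
min_reliable_flash_ms = frame_interval_ms

print(f"Frame interval: {frame_interval_ms:.1f} ms")
print(f"A ~{min_reliable_flash_ms:.0f} ms phantom lands on at least one frame,")
print(f"while staying well under the ~{HUMAN_THRESHOLD_MS} ms a person is likely to notice.")
```

In other words, under these assumed numbers there is a comfortable window where the car’s camera reliably sees something that a human driver almost certainly won’t.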

By attaching a mini-projector to the base of a drone, researchers projected a phantom speed limit sign onto a nearby building. Even though the sign was visible for just a split second, it was enough to trick the autopilot system.


A similar attack can be carried out by hacking video billboards that face the road. By injecting phantom road signs or images into a video advert, autopilot systems can be tricked into changing speed, as sketched below.
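For illustration, here’s a minimal Python sketch of the frame-splicing idea behind a billboard attack. The 130 ms duration, 30 fps rate, and placeholder frames are assumptions for demonstration, not details from the study.

```python
# Sketch of the billboard-injection idea (illustrative only): splice a
# phantom frame into a video ad for a handful of frames, short enough to
# go unnoticed by viewers but long enough for an ADAS camera to catch.

def inject_phantom(frames, phantom_frame, start_s, duration_s, fps=30):
    """Replace the frames covering [start_s, start_s + duration_s) with the phantom."""
    first = int(start_s * fps)
    count = max(1, round(duration_s * fps))
    doctored = list(frames)
    for i in range(first, min(first + count, len(doctored))):
        doctored[i] = phantom_frame
    return doctored

# Example with placeholder frames: a 10-second ad at 30 fps, with a phantom
# sign shown for ~130 ms starting at the 4-second mark.
ad = ["ad_frame"] * 300
doctored = inject_phantom(ad, "phantom_sign_frame", start_s=4.0, duration_s=0.13)
print(doctored.count("phantom_sign_frame"))  # -> 4 frames (~130 ms at 30 fps)
```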

Is this the future?

It all sounds extremely dystopian. Picture it now.

It’s 2035, and you’ve just been forced by the government to buy an electric car because everything else is banned. You’re cruising to work on autopilot when, all of a sudden, a drone appears and projects new road markings that cause your car to veer into oncoming traffic. At best, you spill your coffee.

Of course, this scenario assumes that the “phantom attack” issue is never resolved.

Thankfully, the researchers disclosed their findings to Tesla and the makers of the other autopilot systems they tested. Hopefully, fixes are on their way before more people figure out how to carry out these attacks.

Indeed, the researchers themselves were able to train a model that can accurately detect phantoms by examining the image’s context, reflected light, and surface. It can be overcome.
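The article doesn’t include the researchers’ code, but as a rough idea of the approach, here’s a minimal PyTorch sketch of a multi-branch classifier that scores a detected object’s context, surface, and reflected light separately, then combines the scores into a phantom/real verdict. The architecture, crop choices, and layer sizes are illustrative assumptions, not the researchers’ actual model.

```python
# Minimal sketch (not the authors' actual model): a multi-branch classifier
# that scores a detected object's context, surface texture, and reflected
# light separately, then combines the three scores to flag phantoms.
import torch
import torch.nn as nn

class Branch(nn.Module):
    """A small CNN mapping one 3x64x64 view of the object to a single score."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, 1),
        )

    def forward(self, x):
        return self.net(x)

class PhantomDetector(nn.Module):
    """Combines per-cue scores into a single phantom probability."""
    def __init__(self):
        super().__init__()
        self.context = Branch()  # wide crop around the object (its setting)
        self.surface = Branch()  # tight crop of the object's surface texture
        self.light = Branch()    # view emphasizing brightness/reflected light
        self.combine = nn.Sequential(nn.Linear(3, 8), nn.ReLU(), nn.Linear(8, 1))

    def forward(self, context_view, surface_view, light_view):
        scores = torch.cat([
            self.context(context_view),
            self.surface(surface_view),
            self.light(light_view),
        ], dim=1)
        return torch.sigmoid(self.combine(scores))  # P(object is a phantom)

# Example usage: random 64x64 views standing in for real camera crops.
detector = PhantomDetector()
views = [torch.randn(1, 3, 64, 64) for _ in range(3)]
print(detector(*views))  # e.g. tensor([[0.49]]) before any training
```

The intuition matches the cues the researchers name: a projected “person” tends to have the wrong surroundings, an unnaturally flat surface, and lighting that doesn’t match the scene, and each branch gets a chance to catch one of those tells.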
