Researchers claim to have devised methods to deceive the Autopilot system of the Tesla Model S. While this may sound damning for the Model S in autonomous mode, those involved in the research warn that their findings are not grounds for writing off the system just yet.
The team, comprising researchers from the University of South Carolina, China's Zhejiang University, and the Chinese security firm Qihoo 360, said it used off-the-shelf radio equipment to fool Autopilot into perceiving an obstacle where none existed. The researchers also achieved the reverse, making Autopilot fail to detect an object that was actually there.
However, as Wired reported, the research was conducted largely on a stationary car under controlled conditions. That, coupled with the high cost of the equipment used (a signal generator priced upward of $90,000 and a frequency multiplier costing several hundred dollars more), should be reason enough for current Model S owners to keep relying on Autopilot mode as before.
What makes such an attack far less likely against a Model S moving at speed is that the interfering radio beams must be aimed at precisely the right angle to jam the car's radar.
What the researchers did prove is that Autopilot can still pose a risk, as there is always a chance, however slim or impractical, of the system being hacked. They also showed that the ultrasonic sensors on the Model S, which come into play during self-parking and the Summon function, can be defeated using sound-dampening foam.
However, according to Motor1, attempts to 'blind' the sensors with laser beams or LED lights simply caused Autopilot to switch off automatically, followed by a warning for the driver to take back control of the car, which should come as a relief to Model S owners.
The findings come in the wake of the fatal crash in May, when the Model S Autopilot system failed to 'see' a white truck trailer against a brightly lit sky. Investigations are still underway to determine what went wrong. Tesla, for its part, has maintained there is nothing wrong with Autopilot, which relies on inputs from cameras, ultrasonic sensors, and radar.
"We appreciate the work Wenyuan and team put into researching potential attacks on sensors used in the Autopilot system," said a Tesla spokesperson in response to the above finding. "We have reviewed these results with Wenyuan's team and have thus far not been able to reproduce any real-world cases that pose risk to Tesla drivers."