Researchers from the University of California, Irvine found that placing an ordinary item on the side of the road can deceive autonomous vehicles into coming to a standstill or exhibiting other dangerous driving behavior.
It is already known that a single miscalculated decision by a self-driving car can cause severe harm to passengers as well as pedestrians.
However, this new revelation indicates the need to further refine the technology to improve its security.
To understand and identify this vulnerability, the researchers at UCI’s Donald Bren School of Information and Computer Sciences created PlanFuzz, a testing tool that can automatically uncover vulnerabilities in commonly used automated driving systems. They also released a video on YouTube to demonstrate their findings.
The researchers discovered that vehicles were forced to stop on empty thoroughfares and at crossroads by cardboard boxes and bicycles placed at the side of the road.
Qi Alfred Chen, UCI professor of computer science, said, “A box, bicycle, or traffic cone may be all that is necessary to scare a driverless vehicle into coming to a dangerous stop in the middle of the street or on a freeway off-ramp, creating a hazard for other motorists and pedestrians.”
He went on to say that the vehicles cannot tell the difference between objects that end up on the road by chance and those purposely placed there as part of a physical denial-of-service attack.
The research team primarily analyzed security flaws unique to the planning module, the component of autonomous driving software that handles the vehicle’s decision-making, such as when to cruise, change lanes, or stop.
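To make the attack surface concrete, here is a minimal, hypothetical sketch of the kind of over-conservative planning check the researchers describe. All names, types, and thresholds below are illustrative assumptions, not code from PlanFuzz or any real driving stack: a planner that halts for any static object within a fixed clearance margin can be stalled by a harmless box placed just inside that margin.

```python
# Hypothetical sketch of an over-conservative planning-module check.
# Every name and threshold here is illustrative, not from a real AD system.

from dataclasses import dataclass


@dataclass
class Obstacle:
    lateral_offset_m: float  # distance from the planned path centerline
    is_moving: bool


SAFETY_MARGIN_M = 2.0  # assumed clearance threshold for this sketch


def should_stop(obstacles: list[Obstacle]) -> bool:
    """Return True if any static obstacle sits within the safety margin.

    Because this check cannot distinguish a harmless roadside box from a
    genuine hazard, an object placed just inside the margin forces a
    permanent stop: a physical denial-of-service on the planner.
    """
    return any(
        not ob.is_moving and abs(ob.lateral_offset_m) < SAFETY_MARGIN_M
        for ob in obstacles
    )


# A cardboard box 1.5 m from the path centerline keeps the vehicle
# stopped, even though the driving lane itself is clear.
roadside_box = Obstacle(lateral_offset_m=1.5, is_moving=False)
print(should_stop([roadside_box]))  # True
```

The point of the sketch is the missing intent check: the planner reacts to geometry alone, so an attacker controls its behavior simply by controlling where objects sit.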