When we think of autonomous vehicle safety, we’re usually focused on ensuring a vehicle’s self-driving technology is sound. But what if high-quality self-driving tech can be sabotaged by bad actors from the outside? According to new research, it can—and the consequences could be deadly for pedestrians.
Most companies involved with autonomous vehicle technology use light detection and ranging (LIDAR) for navigation. LIDAR, which typically takes the form of a spinning sensor on the vehicle’s exterior, constantly captures information about the vehicle’s surroundings by emitting laser light. The resulting reflections allow the system to calculate the distance between itself and any obstacles present in the vehicle’s environment. When LIDAR is working properly, it helps the vehicle stop or change course to avoid obstacles in its path.
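The distance calculation described above is simple time-of-flight math: the sensor measures how long a laser pulse takes to bounce back, and halves the round-trip path. A minimal sketch (function name and values are illustrative, not from any vendor's SDK):

```python
# Time-of-flight distance: a LIDAR pulse travels to the obstacle and
# back, so the one-way distance is half the total path length.

SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second

def distance_from_echo(round_trip_time_s: float) -> float:
    """Distance to an obstacle, given a pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A reflection arriving 100 nanoseconds after emission puts the
# obstacle roughly 15 meters away.
print(round(distance_from_echo(100e-9), 2))  # → 14.99
```

Because the pulse covers the distance twice, even nanosecond-scale timing errors translate directly into meter-scale ranging errors, which is why injected pulses are so disruptive.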
Unfortunately, it’s disconcertingly easy to interrupt this process from the outside. Researchers from the University of Florida, the University of Michigan, and Japan’s University of Electro-Communications have found that a well-aimed laser is all it takes to confuse an autonomous vehicle’s LIDAR and prevent the system from detecting objects in its path. When timed correctly, a laser pointed at an oncoming LIDAR sensor creates a cone-shaped blind spot in which pedestrians, equipment, and other obstacles can’t be seen. The vehicle then assumes its current path is clear and can crash into whatever is in its way.
Though at first it sounds as if lasers “blind” LIDAR, this isn’t the case. In an experiment simulating malicious interference, the researchers pointed a laser at LIDAR-equipped vehicles and robots from approximately 15 feet away. The LIDAR still registered obstacles, but it immediately discarded that data in favor of the attacker’s laser, which essentially spoofed a second reflection and scrambled the sensor’s readings. In simulations involving vehicles, this caused moving pedestrians to go unnoticed, allowing the vehicles to proceed toward a potentially deadly collision.
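To see why a spoofed reflection can displace a genuine one, consider a toy model (not the researchers’ actual attack code) of a naive receiver that keeps only the earliest echo per scan angle. An injected pulse timed to arrive before the real reflection simply wins:

```python
# Toy model of return spoofing: a naive receiver keeps the first echo
# it hears for each scan angle, so an attacker's pulse that arrives
# before the genuine reflection replaces it in the point cloud.

def first_return(echoes):
    """Keep the earliest echo (smallest arrival time) for one angle."""
    return min(echoes, key=lambda e: e["time_s"])

# Genuine reflection from a pedestrian ~10 m away (~67 ns round trip).
genuine = {"time_s": 66.7e-9, "source": "pedestrian"}
# Attacker's pulse, timed to arrive earlier.
spoofed = {"time_s": 20e-9, "source": "attacker"}

kept = first_return([genuine, spoofed])
print(kept["source"])  # → attacker; the pedestrian's echo is discarded
```

Real sensors use more sophisticated return selection, but the principle is the same: whichever pulse the sensor trusts determines what the vehicle “sees” at that angle.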
It’s difficult to imagine what incentive someone would have for wreaking havoc by inhibiting an autonomous vehicle’s LIDAR. Regardless, the researchers—who specialize in artificial intelligence, data science, programming, and electrical engineering—think such interference can be avoided. Manufacturers could update their vehicles’ LIDAR to include Fake Shadow Detection, or FSD, which would identify and discard “shadow regions” caused by spoofed reflections, including those maliciously generated using lasers.
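One way to picture shadow-region checking, in the spirit of the researchers’ proposal (this is a hypothetical sketch, not their algorithm; the thresholds and scan format are invented for illustration): a genuine shadow in a scan should sit behind a closer object that could have cast it, while a spoofed gap has no such occluder.

```python
# Hypothetical plausibility check for "shadow" gaps in a 1-D scan of
# distances (None = no return). A genuine shadow should border at
# least one reading well in front of the background, i.e. an occluder.

def shadow_is_plausible(scan, background_m=30.0, margin_m=5.0):
    """Return a plausibility verdict (True/False) for each gap of
    missing returns in the scan."""
    results, i = [], 0
    while i < len(scan):
        if scan[i] is None:
            start = i
            while i < len(scan) and scan[i] is None:
                i += 1  # consume the whole gap
            neighbors = []
            if start > 0 and scan[start - 1] is not None:
                neighbors.append(scan[start - 1])
            if i < len(scan) and scan[i] is not None:
                neighbors.append(scan[i])
            # Plausible only if something close enough borders the gap
            # to have cast the shadow.
            results.append(any(d < background_m - margin_m
                               for d in neighbors))
        else:
            i += 1
    return results

# Gap behind a close (10 m) object: a believable shadow.
print(shadow_is_plausible([30, 30, 10, None, None, 30]))  # → [True]
# Gap with nothing in front of it: likely spoofed.
print(shadow_is_plausible([30, 30, None, None, 30, 30]))  # → [False]
```

A production system would reason in 3-D over full point clouds, but the underlying idea is the same geometric sanity check: every shadow needs something to cast it.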
Now Read:
- New Ransomware Attack Tries to Frame Security Researchers
- The Air Force Wants LIDAR on Robot Dogs
- New X-Ray Technique Could Help Detect Explosives, Tumors