SJ Munoz
With efforts to deploy autonomous vehicles ramping up, where does safety factor in?
Missy Cummings, director of the Autonomy and Robotics Center at George Mason University, joined Trucking with OOIDA to discuss the topic.
“If not designed correctly, automation or autonomy can lead to some catastrophic events,” Cummings said. “When I was a fighter pilot in the 1990s working with some of the most advanced automation, there was already a problem with mode confusion.”
Cummings’ recent research took aim at claims by autonomous vehicle developer Waymo, which has said its vehicles are safer than human drivers.
“I wanted to take a fresh look at the data that was coming to light through NHTSA and Waymo’s California program,” Cummings said. “When you look at that data, it’s actually quite clear that Waymo’s claims are not true. They are more on par with rideshare drivers, who have accidents four to six times more often than your average driver. That’s pretty concerning. But it’s not a direct and fair comparison, because Waymo uses remote operators, unlike the average driver. I wanted people to understand this. We should not start making these comparisons, because they are not the same.”
Still, the data yielded some conclusive findings.
“In any given month, Waymo is experiencing almost two times more rear-end collisions than your average driver,” Cummings said. “It’s quite clear there is a problem in self-driving cars with the computer vision system. Even (with) their augmented sensors like LiDAR, they will see things that aren’t there. This causes aggressive hard-braking maneuvers, much harder than your average driver makes. It’s one thing for a Nissan Sentra to slam on its brakes going 65 mph, but completely different for a tractor-trailer to do it. If we can’t figure out how to address the phantom braking problem, self-driving trucking is DOA.”
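To put the car-versus-truck difference in perspective, here is a back-of-the-envelope Python sketch of stopping distances under hard braking at 65 mph. The deceleration figures are illustrative assumptions (roughly 0.8 g for a sedan and 0.4 g for a loaded tractor-trailer), not measured values; real numbers vary with load, tires, brakes and road conditions.

```python
# Rough comparison of hard-braking stopping distances at 65 mph.
# Deceleration values are assumed for illustration: ~0.8 g for a
# passenger sedan, ~0.4 g for a loaded tractor-trailer.

G = 9.81             # gravitational acceleration, m/s^2
MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def stopping_distance(speed_mph: float, decel_g: float) -> float:
    """Distance in meters to brake from speed_mph to a stop at constant deceleration."""
    v = speed_mph * MPH_TO_MS
    return v ** 2 / (2 * decel_g * G)

for vehicle, decel in [("sedan", 0.8), ("loaded tractor-trailer", 0.4)]:
    d = stopping_distance(65, decel)
    print(f"{vehicle}: ~{d:.0f} m (~{d * 3.281:.0f} ft) to stop from 65 mph")
```

Under these assumptions the truck needs roughly twice the distance to stop (about 108 meters versus 54), before accounting for the reaction time of following traffic, which is why a phantom hard-brake is far more dangerous in a tractor-trailer.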
Artificial intelligence is not reasoning or thinking like a human, Cummings added.
“This is why you see videos of driverless vehicles going the wrong way down the street,” she said. “They struggle to get out of the situation they’re in. Fortunately, these types of instances are happening at a much lower rate than things like phantom braking.”
Cummings’ research aims to pinpoint the sources of these problems and find a viable solution, starting with how the underlying computer vision models are trained.
“When the training models are developed, human annotators will typically cut off the tip so that the true shape does not get learned,” Cummings said. “The jury is still out on how much we can improve the training, or whether we’re just going to have to use a different kind of sensor. It’s hard to make these technologies scale to operate at low and high speeds. I think Waymo has done a good job of showing how operations can be conducted, particularly at suburban speeds. I don’t want to take away from them. But it’s not clear that their solution is going to scale for highways.”
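One plausible reading of the “cut off the tip” problem is a labeling issue: if annotators crop a protruding part off an object’s bounding box, the model learns a shape smaller than the object’s true extent. The hypothetical Python sketch below (the coordinates and the intersection-over-union helper are illustrative, not taken from Cummings’ research) shows how much of an object such a truncated label can miss.

```python
# Hypothetical illustration of a truncated annotation: the label
# omits a protruding "tip," so the learned shape understates the
# object's true extent. All coordinates are made up for illustration.

def area(box):
    x1, y1, x2, y2 = box
    return (x2 - x1) * (y2 - y1)

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    return inter / (area(a) + area(b) - inter)

true_extent = (0, 0, 100, 40)  # full object, including a long protruding tip
annotated = (0, 0, 70, 40)     # label with the tip cropped off

print(f"truncated label overlaps only {iou(true_extent, annotated):.0%} of the true shape")
```

A detector trained on labels like this can leave part of an object outside its predicted box, so downstream planning underestimates the clearance it needs.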
Cummings pointed to Aurora’s autonomous truck rollout as a case in point.
“That (Aurora) failed rollout is exactly what I’m talking about,” Cummings said. “I’d like to see this technology succeed. I run a robotics lab. I’m here to make the technology work. It’s not clear the theoretical underpinnings are going to work. If we can’t make sure these vehicles can detect object problems at highway speeds and act accordingly, self-driving trucking is never going to work. It’s also going to severely limit the self-driving car.”