Self-driving cars are racking up a crash rate twice that of cars with human drivers, challenging the notion that these cutting-edge creations will lead to a world without accidents.
Where is the problem?
First of all, these driverless vehicles obey the law all the time — that’s how they’re programmed and wired.
But good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit.
These cars don’t fare well in those conditions — reckless or inattentive humans keep slamming into them.
As the accidents have piled up — all minor scrape-ups for now — the arguments among programmers at places like Google Inc. and Carnegie Mellon University are heating up, Bloomberg reports.
Should they teach the cars how to commit infractions from time to time to stay out of trouble?
“It’s a constant debate inside our group,” said Raj Rajkumar, co-director of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh.
“And we have basically decided to stick to the speed limit. But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you. And I would be one of those people.”
Last year, Rajkumar offered test drives to members of Congress in his lab’s self-driving Cadillac SRX sport utility vehicle.
The Caddy performed perfectly, except when it had to merge onto I-395 South and swing across three lanes of traffic in 150 yards (137 meters) to head toward the Pentagon.
The car’s cameras and laser sensors detected traffic in a 360-degree view but didn’t know how to trust that drivers would make room in the ceaseless flow, so the human minder had to take control to complete the maneuver.
“We end up being cautious,” Rajkumar said.
“We don’t want to get into an accident because that would be front-page news. People expect more of autonomous cars.”
Turns out, though, their accident rates are twice as high as for regular cars, according to a study by the University of Michigan’s Transportation Research Institute in Ann Arbor, Michigan.
Driverless vehicles have never been at fault, the study found — they’re usually hit from behind in slow-speed crashes by inattentive or aggressive humans unaccustomed to machine motorists that always follow the rules and proceed with caution.
“It’s a dilemma that needs to be addressed,” Rajkumar said.
It’s similar to the thorny ethical issues driverless car creators are wrestling with over how to program them to make life-or-death decisions in an accident. For example, should an autonomous vehicle sacrifice its occupant by swerving off a cliff to avoid killing a school bus full of children?