Driverless cars pose several challenges

The accident in Arizona in which a driverless car struck and killed a woman Sunday night appears to be the first pedestrian death involving self-driving technology. It won’t be the last, but that’s not a reason to abandon the idea of autonomous vehicles. People just need to understand that no technology is perfect.

Still, there are many issues to be worked out before self-driving cars are allowed to graduate from testing to full operation, and that process should not be hurried.

The incident in Tempe, Arizona, is still being investigated, but early indications suggest the driverless car was not at fault, and neither was the human safety driver at the wheel of the Volvo XC90 SUV operated by Uber. The victim, a 49-year-old woman, was walking her bicycle across the street, in the dark, and reportedly stepped into the path of the Volvo, which was traveling 40 mph in a 45-mph zone, according to police. It appears neither the vehicle nor its human occupant had time to react.

That’s problem No. 1 with driverless cars. They cannot be expected to anticipate and avoid every possible action by human beings. People will step in front of moving vehicles, either accidentally or on purpose. Human drivers will suffer medical episodes such as seizures or heart attacks and suddenly lose control of their vehicles. Or they will simply swerve into the path of another car for no apparent reason. Even the best computer technology can’t compensate for every possible occurrence.

Self-driving cars can, however, eliminate a great deal of the human error that causes an estimated 94 percent of automobile crashes. Every year, crashes kill more than 30,000 people in this country and injure 3 million.

Self-driving cars don’t drink, text, speed or fall asleep behind the wheel. And so far, their safety record is stellar. Google reported over a year ago that its self-driving cars had 13 accidents in 1.8 million miles of driving — all of them minor fender-benders — and not once was the self-driving car at fault.

Still, autonomous cars are being tested in only a few places, mostly in the Western U.S., in good weather. Peter Hancock, a professor of psychology and engineering at the University of Central Florida, points out that much of the data comes from tests on unidirectional, multi-lane highways, where the driver's job is keeping the car in the lane and not following too closely. Self-driving cars are very good at that, he notes, but so are human drivers. Two-lane roads, rain, snow and mountainous terrain pose greater challenges.

A computer-driven car can react to another vehicle and avoid hitting it. But what happens when a self-driving car encounters an accident scene with a police officer directing traffic? Computers cannot yet recognize and interpret a flagger's hand signals.

Security is another huge consideration. Hackers already have demonstrated they can take control of some driverless cars remotely. How prone will autonomous cars be to hacking by pranksters or terrorists?

Eventually, driverless cars will likely become part of daily life. But that will happen gradually, and for a long time, autonomous cars will share the road with human drivers. Managing that mix will be no small task.

In the meantime, government officials should make sure driverless cars are perfected as much as possible and tested thoroughly before allowing them to operate at will.
