Self-driving cars are nearer than we think, and here’s how we know: The U.S. government has decided that they are badly in need of federal regulation.
The National Highway Traffic Safety Administration has begun a four-year study to plan guidelines for the regulation of cars that drive themselves — or, as the bureaucrats describe them, “autonomous vehicles.”
Using a combination of radars, lasers, cameras, computers and GPS systems, these cars can drive to their destinations and park themselves, without human intervention.
And there’s one of the questions the feds are wrestling with: Should a human even be required aboard the car, and, if so, should he be required to sit in the driver’s seat? The feds think so, and go even further, recommending a special driver’s license to “operate” (meaning, sit in the front seat of) a driverless vehicle.
The test for a license to “operate” an autonomous vehicle should be quick. All you need to know is how to turn the car on and off. The traffic regulations and rules of the road would already be programmed into its computer.
The professional worriers have come up with the usual melodramatic scenarios that seem to cry out, as the regulators intended, for federal regulation.
As one worrier told the Sacramento Bee, “There’s often a difficult balance that we, as a regulator, face. You’ve approved something, it crashed and killed (a) choir on the way to a tsunami relief concert. Who’s responsible for that?”
Oh, I don’t know. God? The concert promoter? The choir singing “Ninety-nine bottles of beer on the wall”? On-board electronic devices screwing up the computer like the airlines tell you they will when you’re getting ready to take off?
We are a country of litigants, and already there is an embryonic driverless-car bar. Ryan Calo, a law professor at the University of Washington, has co-founded the Legal Aspects of Autonomous Driving Center at Stanford University.
As Calo told The New York Times:
“The first time that a driverless vehicle swerves to avoid a shopping cart and hits a stroller, someone’s going to write, ‘Robot car kills baby to save groceries.’ It’s those kinds of reasons you want to make sure this stuff is fully tested.”
If the car is so smart, why doesn’t it run over the shopping cart? Happens all the time in supermarket parking lots. And why are the hypothetical victims always choirs and babies? Why not, as an example, have a driverless car plow into a group of al-Qaeda operatives?
The real danger of driverless cars is not two of them simultaneously arriving at the same parking spot, or a computer taking offense when another car refuses to yield the right of way where three lanes merge into two.
The auto designers want to use the cars to collect all kinds of information about your driving habits, purchases and destinations, meaning somebody may have to explain why the car was parked in front of an hourly-rate motel for three hours.
You shrug sheepishly and explain, “What can I say? Our minivan has been seeing a Chevy Camaro on the side.”
Dale McFeatters is a senior writer for Scripps Howard News Service. Send comments to firstname.lastname@example.org.