OPINION:
Machines with a mind of their own are the future, and self-driving automobiles will soon be sharing the road with cars and trucks with real drivers. Labor-saving devices are always welcome, and driving on roads in the congested communities where most Americans live is certainly a chore. But motorists should keep their hands on the steering wheel until autonomous vehicles are demonstrably safer than human-piloted versions. Smart technology isn't always smart enough.
California is renowned for its pursuit of trends that the rest of the nation strains to follow, but with the new selfie-mobiles California is proceeding with uncharacteristic caution. Google, the Silicon Valley firm that leads the pack of software companies and automobile manufacturers vying for a big slice of the driverless transportation pie, is impatiently pushing state regulators to approve its computer-driven car with no steering wheel, no accelerator and no brake pedal. It's thrilling (but still scary) to be at the mercy of a machine at the amusement park, but so far, not on the open road.
California is one of four states that have authorized the use of self-driving cars, and it is completing rules that would require trial models to contain working steering wheels and pedals, and a licensed driver to sit behind the wheel, just in case. Such caution gives Google hiccups. Google argues that if driver error on busy roads is to be eliminated, the place to start is with the human drivers themselves, with all their flaws and frailties. "We need to be careful about the assumption that having a person behind the wheel [will make driving safer]," says Chris Urmson, leader of Google's self-driving car project.
Major U.S. automobile companies like an incremental approach, too. Ford is developing a "Ford Smart Mobility" program, and General Motors has commissioned an "Autonomous and Technology Vehicle Development Team." Both are intended to supervise a careful transition from human- to computer-piloted cars. Some self-driving features are available now, such as automatic braking and assisted parking. Still unresolved is how a self-guided model can cope with the unexpected, such as blowing trash or a police officer waving a car to the shoulder. Some tough ethical issues must be ironed out as well, such as whether a self-driven car that senses an approaching catastrophic crash will decide to sacrifice itself rather than the other car. Passenger protection shouldn't simply be a function of price.
Hackers will inevitably learn how to commandeer self-driving cars for nefarious purposes. Rogue programmers who steal information and money from government and private institutions every day might find the pickings easy in a self-driven car. Nissan has used NASA technology to guide its self-driving test cars in California, and the hacker organization AnonSec says it has pilfered 276 gigabytes of data from NASA's computer network. The group said it had taken over a $222 million Global Hawk military drone and tried, fortunately without success, to send it nose-first into the Pacific Ocean.
Self-driven vehicles are only as safe as the software that controls them, and anyone with a computer knows that the idea of software perfection is a daydream. Software failure eventually strikes every user, whether behind the wheel or in front of a keyboard. It’s far too soon for a motorist in a self-driven Belchfire 8 to climb into the back seat for a snooze and let the computer drive.