Remember the movie Total Recall, when Arnold Schwarzenegger takes a ride in a Johnnycab?
That ride didn't turn out so well, but Eric Jaffe's ride in one of Google's self-driving cars went a lot better.
Google has been testing its self-driving cars on California freeways for years. Now they're moving to city streets, which are orders of magnitude more complicated. That's because so many things share regular streets: bicycles, pedestrians, jaywalkers, delivery trucks backing up, garbage trucks stopping suddenly, buses constantly weaving in and out of traffic, right-turn-on-red, cats, dogs, walk signals, crossing guards, potholes, patches of ice, red-light runners and left-turn-lane jumpers -- it's crazy out there.
Jaffe's story is highly complimentary to Google's system, even though the first rule of self-driving cars is not to compliment the self-driving car. However, the test driver had to intervene twice during the ride: once when some traffic cones appeared on the road and the computer stupidly couldn't figure out what to do, and again when a truck appeared out of nowhere and the computer didn't seem to respond to the impending collision fast enough (though the Google team said later that the machine would have stopped in plenty of time). Jaffe came away favorably impressed, and it is an impressive system.
But I'm much more skeptical about the practicality. One of the most difficult problems in artificial intelligence has always been computer vision. Though the software is getting better at recognizing its surroundings, Google's system is completely dependent on extremely intricate maps and GPS. (Though, truth be told, a lot of people are now equally helpless without their GPS.)
But what happens when conditions on the road don't match the map (say, because of road construction or an accident), or when the computer can't get a GPS signal? The machine is highly dependent on a laser array on the car's roof to build a 3D map of its surroundings. Does that laser system work in rain, fog, or snow? Can the machine see brake lights through the windows of the car ahead and know that means traffic is stopping?
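To make that concern concrete, here's a minimal sketch of the kind of graceful-degradation policy such a car would need when its inputs go bad. Everything here -- the mode names, the sensor fields, the thresholds -- is my invention for illustration, not anything Google has described:

```python
from dataclasses import dataclass
from enum import Enum, auto

class DrivingMode(Enum):
    NORMAL = auto()
    CAUTIOUS = auto()    # slow down, widen following distances
    PULL_OVER = auto()   # stop safely and give up

@dataclass
class SensorState:
    gps_fix: bool              # is there a usable GPS signal?
    map_match_score: float     # 0..1 agreement between laser scan and stored map
    laser_visibility_m: float  # effective laser range; rain, fog, snow shrink it

def choose_mode(s: SensorState) -> DrivingMode:
    """Hypothetical fallback policy: degrade as the inputs degrade.

    The thresholds are pure invention; a real system would have to
    tune them against enormous amounts of road data.
    """
    if not s.gps_fix and s.map_match_score < 0.5:
        # Both localization sources are gone: the car no longer
        # knows where it is relative to its map.
        return DrivingMode.PULL_OVER
    if s.map_match_score < 0.8:
        # The road disagrees with the map: construction? an accident?
        return DrivingMode.CAUTIOUS
    if s.laser_visibility_m < 40.0:
        # Weather is eating the laser returns.
        return DrivingMode.CAUTIOUS
    return DrivingMode.NORMAL
```

The branching is trivial; the hard part is that every threshold is a guess about the world, and the world keeps producing conditions the guesses don't cover.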
The article mentions that they're working on getting the software to recognize people standing behind poles. Working with incomplete data is something humans are good at; if I see the bottom of a bicycle tire's rim under a truck, I know there's a biker up ahead. Can Google's hardware recognize those kinds of details, and can their programmers code that kind of knowledge into the software? On the other hand, if the car perceives everything as a potentially deadly situation, it will never go anywhere.
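As a toy illustration of how brittle it is to hand-code the bicycle-rim kind of inference, here's what an explicit "fragment implies whole object" rule might look like. The classes and confidences are invented, not Google's:

```python
# Toy "fragment implies whole" knowledge base, invented for illustration.
PART_IMPLIES_WHOLE = {
    "bicycle_wheel_rim": ("cyclist", 0.90),
    "pair_of_feet":      ("pedestrian", 0.80),
    "stroller_wheel":    ("child_nearby", 0.95),
}

def infer_hidden_objects(fragments: list[str]) -> list[tuple[str, float]]:
    """Map each visible fragment to the whole object it implies."""
    return [PART_IMPLIES_WHOLE[f] for f in fragments if f in PART_IMPLIES_WHOLE]

# The rim peeking out from under the truck implies a cyclist.
print(infer_hidden_objects(["bicycle_wheel_rim"]))  # [('cyclist', 0.9)]
```

A table like this is easy to write and hopelessly incomplete: the set of meaningful fragments in the real world is open-ended. And making every entry maximally paranoid runs straight into the go-nowhere problem above.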
One of the arguments for self-driving cars is that they should be better at obeying traffic laws: they should obey the speed limit and yield the right of way (as long as they can reliably detect other cars). The software shouldn't get impatient and pull into a lane of fast-moving traffic from a parking spot.
But the technical aspects are probably the least of Google's problems in making self-driving cars a reality. I predict that legal and liability issues will be the biggest stumbling block. If a self-driving car runs down a child chasing a ball into the street, whose fault is it?
One could argue that a cautious driver, seeing children playing next to a street, would slow down to 5 mph and shift out of the right-most lane, and on a narrow street perhaps even move into the oncoming traffic lane to ensure that there would be enough time to avoid any darting children.
What if Google's algorithm doesn't include that specific scenario? Were the programmers negligent? Could the car company and Google be sued for the child's death, and could the programmers be held criminally and financially liable for the oversight? And if the self-driving car were programmed perfectly to follow all the laws and take all the precautions, would any humans want to be chauffeured by such a slow and timid vehicle?
Jaffe says that 90% of car accidents are due to human error. Self-driving cars, the argument goes, will eliminate human error and make the roads much safer. Except that's completely false. Humans will write the software and build the hardware that control the car. Yes, those humans will take a lot of time and do a lot of testing to make that software and hardware as reliable as possible. But, as we know from all the bugs we find in the software in our computers and mobile phones and cars and microwave ovens, human-designed software and hardware are far from perfect. Will that software be open source, available for everyone to examine?
To make it worse, these cars will almost certainly have black boxes that record every piece of data collected during the trip, allowing the entire country to second-guess every traffic accident these cars are involved in. Let's say a baseball rolls out from between two parked cars. Any decent driver would immediately slam on the brakes, assuming a child will be chasing it. Will Google's software do the same? If it doesn't, and a child is run down by a car that doesn't know what a baseball is, what kind of liability will Google and the car company have?
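If you tried to encode that reflex as an explicit rule, it might look something like this hypothetical sketch, where every object class and rule is invented for illustration:

```python
# Hypothetical "precursor" rules: a harmless object in the road can
# signal something worse about to follow it. Entirely invented.
PRECURSORS = {
    "ball":          "child chasing it",
    "dog_leash":     "dog, then owner",
    "shopping_cart": "person pushing it",
}

def should_emergency_brake(detected: str, in_roadway: bool) -> bool:
    """Brake hard if a detected object implies a hidden hazard behind it."""
    return in_roadway and detected in PRECURSORS

assert should_emergency_brake("ball", in_roadway=True)       # slam the brakes
assert not should_emergency_brake("ball", in_roadway=False)  # ball on the sidewalk
```

And the black box will faithfully record whether the table had the right entry in it when the lawyers come asking.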
Of course, people make these kinds of driving errors and kill themselves and others all the time (at a rate of about 33,000 deaths each year in the United States). At this point the car companies (and Google) can just shrug and say, "human error."
In fact, the biggest legal protection that car manufacturers have is that 90% human error rate: they can almost always blame accidents on the driver. But when we have self-driving cars, these companies will be legally exposed to everything that happens on the road.
Airplanes have autopilot systems, but they're typically used in very controlled circumstances, in clear skies when the aircraft is at cruising altitude. Modern autopilots can even land a plane, but human pilots are typically in control at critical junctures. And even then, airports are tightly run by air traffic controllers; planes have several pairs of eyes watching them at all times. Google is proposing that no one will be watching any of the cars on the road, except some hardware and software.
Admittedly, driving on a freeway is a lot like flying an aircraft on autopilot in open skies. I can see how Google's system could be made to work on a sunny freeway with light traffic. City streets, however, are completely different. At any point something totally random can happen. Such streets are far more unpredictable than an airport runway, and there's no air traffic controller monitoring all the comings and goings.
Technically, I can see how Google's system could be made to work. I would even grant that it could work if only cars were on the street, because most car-on-car collisions at city-street speeds are very survivable with the seat belts and air bags in today's cars. It would be even safer if the vehicles operated in their own zones, say on monorail tracks suspended above the streets.
But when you have a mix of cars, pedestrians, children, bicycles, buses, and massive trucks on surface streets, I find it hard to believe that any company's lawyers would let it relinquish the "human error" defense that now covers almost all car accidents. Everything will be the company's fault, even accidents caused by weather, because the car should have "known" it was going too fast for the conditions.
I'm not sure if Google's programmers realize it, but people are going to want the software to incorporate Asimov's Three Laws of Robotics. They're going to expect these robot cars to make moral and ethical judgments about what to do in an emergency. Suppose your car is tooling down the road and an old man steps right out in front of it. There's barely enough time to swerve, but on the right is a sidewalk restaurant packed with bearded hipsters drinking lattes, and on the left is a school bus filled with adorable children that you would hit head-on.
How will the car decide who will live and who will die? Run down the geezer because he has the fewest years left (and based on his ratty clothing is least likely to have a good lawyer)? Front-end the bus, assuming that its greater mass will protect the children and the car's airbag will miraculously save you? Or plow through the restaurant, because, well, bearded hipsters drinking lattes.
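Strip away the jokes, and any swerve-or-brake decision is implicitly a cost minimization; write it down and the moral judgment stares right back at you. A deliberately crude sketch, with every number invented:

```python
# A deliberately crude cost minimization over the three options.
# Every number here is a moral judgment wearing an engineering costume.
OPTIONS = {
    "hit_pedestrian": {"expected_deaths": 1.0, "expected_injuries": 0.0},
    "hit_bus":        {"expected_deaths": 0.2, "expected_injuries": 5.0},
    "hit_restaurant": {"expected_deaths": 2.0, "expected_injuries": 8.0},
}

DEATH_WEIGHT = 10.0  # how many injuries equal one death? who decides?

def pick_option(options: dict) -> str:
    return min(
        options,
        key=lambda o: options[o]["expected_deaths"] * DEATH_WEIGHT
                      + options[o]["expected_injuries"],
    )

print(pick_option(OPTIONS))  # 'hit_bus', under these made-up numbers
```

The point isn't which option the function picks; it's that some programmer has to choose DEATH_WEIGHT, and the deposition will ask who, and why.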
I'm afraid Google's vision of Johnnycabs ferrying us around the city is going to be crushed by those meanies in Legal.