How the Google accident shows we need self-driving cars

Google recently had, in all likelihood, a worst-case scenario happen to one of its self-driving cars. On February 14th, one of the Google SUVs that roam the streets of Mountain View, California, crashed into a city bus. Headlines were made, critics are gleefully claiming victory, and Google’s PR is on the defensive.

But how big was the incident? A bus failed to yield to the Google car as it swung slightly into the neighboring lane to avoid some sandbags on the road. The artificial-intelligence-driven SUV did what, arguably, any other human driver would have done. In fact, the human driver in the SUV, there by law, didn’t intervene. That fact is the most telling part of the story. This human driver, undoubtedly a typical, average Joe behind the wheel, figured the bus would yield. The SUV’s computers made the same assumption.

But the bus didn’t yield. Most urban drivers won’t find that action surprising. Buses are notoriously pushy in traffic. They don’t have any yield requirements, and some areas actually have laws requiring everybody else to yield to them. Google claims that not all of the fault lies with its car, and that makes sense. The bus was not being driven by an AI but by a stubborn human who just kept driving; had an AI been at the wheel, one could argue it would have stopped.

There are two things we can take away from this accident, and neither of them is Google’s fault. Saying this incident exemplifies the reason we should keep steering wheels and human drivers ignores some of the key facts:

  • The Google human driver didn’t intervene because they didn’t feel the need to: in other words, the SUV’s computer wasn’t doing anything differently than they, or most people, would have done in that situation.
  • The bus was approaching from the rear, and its driver likely saw the SUV creeping out at 2 mph. The driver chose to hold their ground, and their line.
  • Google’s AI was able to intelligently pace the SUV so that, in this event, the speed of the collision was slow enough that negligible damage was done. The bus was doing over 15 mph.

Google AI is programmed by human drivers

This is really the crux of all computer-driven cars. The programming within the AI is made by humans, designed to mimic human reactions and the driving style of the humans who made it. Google has taken great steps to ensure the attitude of its SUVs is extremely passive. If the SUV determined it was clear to veer slightly into the right lane next to it to avoid the debris, it clearly believed it was not breaking into the flow of traffic. Any one of the hundreds of millions of drivers around the world would have made the same swerve. Most of those drivers wouldn’t even have slowed down to analyze it, cutting off the bus. Of those that did, only a small fraction would have stopped outright on a major roadway and let the bus pass. Google’s self-driving car acted in a way that would frustrate most drivers on a busy multi-lane road: it slowed down, it paced itself, and it crept its way around a minor obstruction.

Could there be fault on the bus driver’s part? Without seeing footage of the incident, it’s hard to tell. But knowing that the SUV was slow moving, molasses slow, perhaps the bus driver believed they could rush past while it puttered around. Hopefully the transit system will have the integrity to admit that its driver’s actions contributed as well.

Computer driven cars are surrounded by human driven cars

We humans may not want to admit it, but the biggest flaw on the road today is the fleshy pilot behind the wheel. Human error contributes to over 90% of road accidents in the United States, according to the NHTSA. That’s astonishing. These self-driving cars have been in 12 recorded accidents before this one, an impressive figure considering they’ve accumulated over 1.5 million miles to date. All 12 of those accidents were caused by other cars driven by humans. In this most recent case, at least part of the cause lies with a bus driven by a human. As good, even near perfect, as self-driving cars can be, they’re still at the mercy of the plethora of horrible driving styles humans employ: the scared slow driver, the hyper-aggressive, the road hog, the speed demon, the blind-spot ignorer, the drunk driver, the high driver, the distracted texting driver. More than half the new technology in cars today is geared towards keeping us from getting into accidents.

Until the vast majority of cars on the road are self-driving, accidents involving Google’s SUVs will happen. Critics will squeeze every piece of pseudo-evidence from the results to discredit the technology. The truth, however, is that computers make consistently better drivers than humans, and that fact scares a lot of automotive purists. Quickly ignored are the 12 other accidents involving Google cars. Quick to be blamed is the human at the wheel of the Google SUV. Irony? Perhaps. But what is clear is that the common fault in this accident may eventually be narrowed down to one thing: human error.
