Self-Driving Cars Must Be Banned Now!

People have been getting killed while using Tesla’s Autopilot for quite some time. The first known fatal incident happened in 2016, when a Tesla Model S with Autopilot engaged slammed into an 18-wheel tractor-trailer. The Autopilot didn’t notice the white side of the crossing tractor-trailer against a brightly lit sky, and the brakes were never engaged. The Tesla was going so fast that the roof of the car was sheared off, and the vehicle didn’t come to a stop until it snapped a telephone pole a quarter-mile down the road. The driver in this particular accident was watching a Harry Potter DVD while driving.

Now self-driving cars have taken it to the next level, and are killing not only their drivers, but also unsuspecting pedestrians. On March 18, a woman crossing the road with her bicycle in Tempe, Arizona, was struck by a self-driving car operated by Uber. She was taken to the hospital, where she later died from her injuries.

Uber naturally pulled all of their self-driving cars off the road after the accident. Not surprisingly, many people are now calling for all autonomous vehicles to be removed from public roads until the cause of the Uber crash has been found. Others are demanding that the whole idea of the self-driving car be banned and buried.

Crying out for a ban is often the knee-jerk reaction when something new and scary goes awry. But it’s not necessarily the rational thing to do.

Computers. Computers everywhere!

In the case of the fatal Tesla Autopilot accident, the driver was watching a DVD instead of paying attention to the road. With the first iterations of their Autopilot technology, Tesla stated that drivers should keep both their eyes on the road and their hands on the steering wheel while Autopilot was engaged. Why? Because the Tesla Autopilot at the time wasn’t sophisticated enough to handle certain driving scenarios. The name “autopilot” might have been slightly misleading.

The Uber involved in the fatal crash in Arizona was traveling in autonomous mode, but with a safety driver on board. The safety driver is supposed to pay attention to the road and take control of the vehicle in case the autonomous system fails. Video footage released by the Tempe police suggests that the Uber safety driver was not paying enough attention, and thus failed to detect the pedestrian crossing the road.

But if the car is self-driving, why is it necessary to have a safety driver? Isn’t the car supposed to drive all by itself? Well, nothing is perfect on the first try, or the second. Or the tenth, for that matter. Or ever. A modern car is an amazingly complicated piece of machinery. It’s basically a computer on wheels. Then you take a shitload of sensors, cameras, radars, lidars, and a metric fuckton of new computers, and add everything on top of the already complicated car. This setup then runs software written by computer programmers just like me, and we make mistakes. A lot of them. There will always be bugs in the code, and those bugs will inevitably result in things going horribly wrong.

But with every iteration, bugs are ironed out, and the chance of something going wrong decreases.

ABC15 screen grab showing the smashed-up Uber vehicle.

Accidents happen

Uber’s self-driving technology might have bugs that prevented it from detecting a pedestrian it should have spotted. Maybe the cars have been allowed out on public roads prematurely? The investigation of the Arizona accident will hopefully reveal the cause of the crash. If it’s a technical glitch, it can be fixed. If it was an accident that neither the technology nor the safety driver could have prevented, then the Uber cars should be allowed out on the streets again as they are.

I’m not saying that the pedestrian in Arizona is just collateral damage, or merely an unfortunate accident on the road - pun intended - to the holy grail of self-driving vehicles. I’m saying that accidents happen, and they will continue to happen even when every single vehicle on the road is driving on its own.

But there might be far fewer accidents with autonomous vehicles than with vehicles controlled by humans. Early data released by Tesla showed that after 47 million miles (~75 million kilometers), the probability of an accident is at least 50% lower when using their Autopilot technology. Since the numbers were released by Tesla themselves, they should of course be taken with a grain of salt. But a lot of car accidents are caused by incompetence, lack of attention, drunk driving, and other factors that would be eliminated with the use of autonomous vehicles.

All hail our autonomous overlords!

There’s a good chance you can kiss some traffic congestion goodbye as well. Ultimately, self-driving vehicles should be capable of communicating with each other, which makes it possible for them to drive a lot closer together than cars do today. Perhaps even the need to own your own car will be eliminated. Most cars spend the majority of their time parked. With fully autonomous cars, you could simply call one up on demand and have it take you to your destination. I realize this is very similar to today’s taxi services, but with an important difference: every car is a taxi. You can probably see why Uber is spending a lot of money on autonomous vehicle research.

In a world where we’ve, unfortunately, made ourselves dependent on a polluting, counter-productive killing machine on wheels, we need the self-driving car. Let’s not go completely overboard and ban progress because of a single, tragic accident. Do we even consider banning airplanes when they fall out of the sky? No. We analyze the incident, figure out what went wrong, and then make sure it doesn’t happen again.

In the airline industry, this approach has been wildly successful. The number of miles flown increases each year, yet the number of fatal accidents keeps decreasing. In 2017 there were zero accident deaths on commercial passenger jets.¹

The same should be the case for car travel, and with autonomous vehicles, we can achieve just that.


  1. 2018, unfortunately, isn’t looking that good.


Feedback

Do you have any thoughts you want to share? A question, maybe? Or is something in this post just plainly wrong? Then please send an e-mail to vegard at vegard dot net with your input. You can also use any of the other points of contact listed on the About page.
