When Computers Choose Who Lives and Who Dies

Two cars are heading toward each other on a narrow bridge. Their brakes are failing, and each driver has two options: swerve or continue straight. If both drivers do the same thing, both die. But if one swerves off the bridge and the other doesn’t, one driver lives. In most cases, both drivers will choose to do nothing, and both will die. But what if they don’t have a choice?

With the arrival of self-driving cars, which are controlled by computers rather than people, the choice of what to do in the case of an imminent crash may no longer be the driver’s to make.
Instead, life-and-death decisions may be determined by a so-called “death algorithm” installed in the car when it is built.

In July, the owner of a Tesla died when his vehicle, operating on Autopilot, failed to brake and collided with a tractor-trailer. He was not driving, and some consumer advocates argue that Tesla should be held responsible.

With the fast pace of technological innovation, we are destined to see more and more self-driving cars on the roads in the near future. Many will still have features that let humans override autonomous control, but the cars will eventually be forced to make more of the choices about imminent collisions.
Soon, autonomous cars may be able to exchange information, such as how many passengers each is carrying and who those passengers are, in fractions of a second, and then act accordingly.

But should we choose who lives and who dies based on this information, and if so, how?
What if one of the cars on the bridge contained a family of six, and the other a single passenger? A convicted felon in one, and the President in the other?

I know these scenarios are hypothetical, but self-driving cars are here. I am neither a philosopher nor any kind of expert in morality, but someone will have to answer these questions before autonomous cars become widely available. So here’s what I believe.

The minute we start attaching different values to different lives, we cross the moral line. Call me unsympathetic, but I do not care whether the crash is going to be between a school bus of children and a bus of inmates on death row; a life is a life. There cannot be a grey area.

We could ask the algorithm to save as many lives as possible, even if that meant sacrificing the passengers of certain cars. But who would buy a car they knew could choose to kill them? How would you feel if someone close to you died because their vehicle was on track to collide with a group who happened to carpool that day?
That leaves one last option: the algorithm is told to always act in the best interest of its own passengers. More lives would be lost, but this is the way we drive now. We prioritize ourselves. We cannot swerve off the bridge.
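
To make the contrast concrete, here is a minimal sketch of the two policies in Python. This is not any manufacturer’s actual code; the function names, inputs, and numbers are all hypothetical.

```python
# Hypothetical sketch of the two collision policies discussed above.
# Names and numbers are invented for illustration only.

def swerve_minimize_deaths(own_passengers: int, other_passengers: int) -> bool:
    """Utilitarian policy: the car carrying fewer people swerves off the bridge.

    Returns True if this car should sacrifice its own passengers.
    """
    return own_passengers < other_passengers


def swerve_self_preserving(own_passengers: int, other_passengers: int) -> bool:
    """Self-preserving policy: never sacrifice this car's own passengers."""
    return False


# The family of six meets the single passenger:
print(swerve_minimize_deaths(own_passengers=1, other_passengers=6))   # True
print(swerve_self_preserving(own_passengers=1, other_passengers=6))   # False
```

Note that under the utilitarian rule, the single-passenger car computes that it should swerve, while under the self-preserving rule neither car ever will, which is exactly the bridge standoff described at the start.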

Looking at the bigger picture, we are selfish. But that’s OK. Self-preservation is what makes us human. The value we place on our own lives is not wrong; rather, it is what distinguishes us from the technology we build.

Computers don’t have instincts or emotions. They’re highly analytical and able to prioritize society over the individual with ease. That unsettles us, and rightly so.

Whatever your opinion, try to think about these things now rather than later. Write to Congress (if you believe the government should have a role in regulating death algorithms), to Google, to Tesla; make these decisions about your life and the lives of those around you rather than leave them up to the manufacturers of self-driving cars.

As our world becomes more technologically advanced, we will need to figure out exactly how we want that world to look. Computers cannot think on their own (yet), so we are the ones who get to tell them what to do. Let’s make sure we’re telling them the right things, whatever you interpret that to be.

Senior Paige Watson lets her car do all the driving. Photo by Mireille Leone

Posted by Hope Anderson

Hope Anderson, senior, is a Senior Editor and a waffle enthusiast. In her spare time she watches British dramas and eats off-brand organic snack products.
