March 13, 2017
Autonomous Cars and Moral Driving Dilemmas
It’s an inescapable trend in the auto industry that car computers and machines are doing more of the work that used to be the responsibilities of the driver. We have automatic transmissions, ABS, and traction control systems. Brakes activate in anticipation of a crash. There are even systems that can automatically parallel park your car. Soon, they’re going to be driving themselves.
The Need for Autonomous Cars
It’s hard to deny the fact that humans, as a group, aren’t really good drivers. In 2015 alone, more than 35,000 people died in motor vehicle accidents. A study conducted by the US Department of Transportation determined that up to 94% of all US car crashes were due to human error.
People drive drunk or while distracted, they lose concentration, or they speed for the adrenalin rush. Sometimes they’re simply not good drivers at all, and people can panic and do stupid things like step on the gas when they mean to step on the brakes. Automated systems don’t have these problems at all.
But driving isn’t all about skill and expertise. There’s a moral and ethical component to it. Now autonomous cars are expected to address various “what if” situations in ways that people and experts deem ethical and proper.
The Moral Dilemma
Here’s the typical “what if” situation. You’re driving fast on the road when suddenly a group of little children runs right into your path. You step on the brakes, but the car doesn’t stop on a dime. It just keeps on going.
However, if you do turn the car away from the kids, you will crash and die. Perhaps there’s a huge truck on one side and a cliff on the other side, so you can’t safely swerve away from the kids.
So you have two options. You can swerve and die, saving the children. Or you can just step on the brakes and hit the children, which saves your life.
Various experts, including ethics professors, government regulators, and lawyers, have debated this issue for years. A recent survey of almost two thousand people, published in Science magazine, found that the majority of respondents considered it ethically preferable for the car to crash into a cliff than into the pedestrians.
However, Mercedes-Benz disagrees. The car company has stated that its most advanced future autonomous cars (Levels 4 and 5) will always prioritize the car’s occupants.
The MB Rationale
Why has Mercedes-Benz taken up this position? There are several possible answers, and it may start with the aforementioned Science survey. That same survey also discovered that the majority of respondents would not buy an autonomous car if that car prioritized pedestrians over the car’s occupants.
In other words, people agree it’s more moral for a self-driving car to save the pedestrians. But they would rather ride in a car that’s programmed to be “less moral” by putting the car’s occupants first.
Mercedes-Benz also recognizes the inevitable fact that it will get sued no matter what when the first accidents involving self-driving cars occur. If the car protects its occupants and hits the pedestrians, then the pedestrians’ lawyers will argue that the car computer killed those helpless pedestrians.
But if the pedestrians are spared, then the car occupants’ lawyers will argue that the car betrayed and killed the people that it was supposed to serve and protect.
Mercedes-Benz will therefore save the people in the car. That’s partly because it’s those people who bought the car, and partly because there’s a much greater chance of actually saving the occupants than the pedestrians. There’s too much uncertainty if the pedestrians are the car computer’s higher priority.
Mercedes-Benz does say that its autonomous systems can keep such problems from arising in the first place. So in the scenario, perhaps the car wouldn’t be going as fast, or it could have detected the children well before they cut across the road. The self-driving car won’t be perfect, but at least it will be better and safer than a human-driven car.
Currently, the thinking is that it’s still not up to the computer to make that sort of moral decision. It’s still up to the driver, because drivers are required to be ready to take control of the car at any point.
But soon more advanced cars will be driving themselves. Mercedes-Benz hopes that by then the question will be moot, because the car computer’s earlier decisions will ensure the occupants are never placed in such a dilemma. Hopefully that actually comes to pass. If not, then we face a future that lets computers make moral decisions—and those decisions may not always be all that ethical.