Tuesday 18 October 2016

The Morality of Self-Driving Cars

About two years ago, I wrote a flash fiction story called "An Automatic Decision", which you can find in the collection Heaven and Earth: Paranormal Flash Fiction.

In it, I speculated about what might happen if a self-driving car had to make a decision between saving the life of its driver and saving the life of an innocent bystander.

When I wrote it, I had no idea how prophetic that would turn out to be. Yesterday I read an article on Co.Exist entitled Self-Driving Mercedes Will Be Programmed To Sacrifice Pedestrians To Save The Driver. Here's a quote from that article:

One of the biggest debates about driverless cars concerns the moral choices made when programming a car's algorithms. Say the car is spinning out of control, and on course to hit a crowd queuing at a bus stop. It can correct its course, but in doing so, it'll kill a cyclist for sure. What does it do? Mercedes's answer to this take on the classic Trolley Problem is to hit whichever one is least likely to hurt the people inside its cars. If that means taking out a crowd of kids waiting for the bus, then so be it.

Scary, isn't it?

I don't know about you, but if you ask me, that's definitely a decision that should be left up to the driver!

Maybe the driver should be able to set some parameters beforehand, to tell the car what should happen in that situation.
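Purely as a thought experiment, here's a rough sketch of what such a driver-set parameter might look like. Everything in it is made up for illustration (the names protect_occupants_weight, Maneuver, and choose_maneuver are mine, not anything a manufacturer actually uses):

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_risk: float    # estimated chance of harm to people in the car (0-1)
    bystander_risk: float   # estimated chance of harm to people outside (0-1)

def choose_maneuver(maneuvers, protect_occupants_weight=0.5):
    """Pick the maneuver with the lowest weighted risk.

    protect_occupants_weight is the hypothetical driver-set dial:
    1.0 always favours the occupants (the stance attributed to Mercedes),
    0.0 always favours bystanders, and 0.5 treats both equally.
    """
    def weighted_risk(m):
        return (protect_occupants_weight * m.occupant_risk
                + (1 - protect_occupants_weight) * m.bystander_risk)
    return min(maneuvers, key=weighted_risk)

# Example: swerve toward the cyclist, or stay on course toward the bus stop.
options = [
    Maneuver("swerve", occupant_risk=0.1, bystander_risk=0.9),
    Maneuver("stay_on_course", occupant_risk=0.7, bystander_risk=0.4),
]
print(choose_maneuver(options, protect_occupants_weight=0.3).name)
```

With the dial turned toward protecting bystanders (0.3 in the example), the car stays on course and accepts more risk to its occupants; turn it the other way and it swerves. Whether anyone would ever let drivers touch a dial like that is, of course, a whole other question.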

Of course, it would be ideal if the car could do what it did in "An Automatic Decision" and simply stop time. But that's not going to happen, now is it?

What do you think? Should a self-driving car be allowed to make the call between saving a single human life and potentially dozens?

