Who is protected in a self-driving car?

A self-driving car will have to decide how best to navigate an imminent collision, and maybe it's best to leave that decision up to algorithms. But if artificial intelligence can make a decision that costs someone his or her life, should the owner of the car be able to choose settings that might alter the outcome?


In one study that asked people to decide between killing one person or five, reportedly only Buddhist monks and psychopaths could make the choice without being emotionally torn. An AI, likewise, will carry no guilt over whatever decision it makes. If the self-driving car is programmed to spare as many lives as possible in any scenario, then it will aim for the one person in its path rather than the five.


Of course, a variant would be if the five people have at least a small chance of survival while the single pedestrian has none. What if there are five bicyclists wearing helmets in one direction and a single child in the other? A team of prison workers cleaning the roadway versus a young mom? Cars will have the resources to know far more than a human driver ever could about an impending collision, but someone still has to make the decision for the computer, even if that decision is to always default to the most lives saved.
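Purely to make that tradeoff concrete, here is a minimal sketch of a "fewest expected deaths" rule, the kind of default I'm describing. The names and numbers are mine, invented for illustration, not anything a real car actually runs:

```python
# Hypothetical sketch of a "minimize expected deaths" default rule.
# Class names, fields, and numbers are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Path:
    label: str
    people_at_risk: int        # how many people are in this path
    survival_chance: float     # estimated chance each of them survives the impact

    def expected_deaths(self) -> float:
        # Expected fatalities if the car steers down this path.
        return self.people_at_risk * (1.0 - self.survival_chance)

def choose_path(paths: list[Path]) -> Path:
    # Default policy: pick whichever path costs the fewest expected lives.
    return min(paths, key=lambda p: p.expected_deaths())

# Five helmeted cyclists with decent odds vs. one child with none.
options = [
    Path("five cyclists in helmets", people_at_risk=5, survival_chance=0.9),
    Path("one child, no chance", people_at_risk=1, survival_chance=0.0),
]
print(choose_path(options).label)  # 0.5 expected deaths beats 1.0: the math picks the cyclists
```

The uncomfortable part isn't the arithmetic; it's that somebody has to sit down and assign those survival numbers ahead of time.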


But what if you, as the owner of the vehicle, could choose settings that would alter those fates in at least some situations? For example, self-preservation dictates that even a good husband (like me) will steer his wife into harm's way when presented with a collision. If I could set my car to take a hit on my side (when she's my passenger), I would choose that setting. The problem is that your passenger could also be that jackass from HR who's in your rideshare, or your wife's old-ass uncle.


Also, what if the one life being saved in a scenario is (or isn't) yours? Should a car be able to choose to end its owner's life (getting smashed by a semi) in order to save five people standing on a street corner? Maybe the owner could be given the right to choose in a one-to-one scenario, as in avoid the semi if there's only one pedestrian. Or even the more love-thy-neighbor setting: take the crash yourself rather than hit the pedestrian. And if there's no option, then what will the computer decide? If my wife were driving one of these vehicles with my kids, I'd have to say I'd rather the car take out a pedestrian than get hammered by a truck. But if I don't get a choice and don't know for sure what the vehicle might do, I might worry.
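Just to show what an owner-selectable setting might look like, here's a hedged sketch layered on top of the default rule above. Everything in it, the setting names, the tie margin, is hypothetical; the point is only that a preference could be allowed to tip the close calls without overriding a clear many-versus-few difference:

```python
# Hypothetical owner preferences layered on the default "fewest expected deaths" rule.
# All names and thresholds are invented for illustration only.
from enum import Enum

class OwnerPreference(Enum):
    DEFAULT = "fewest expected deaths"
    SACRIFICE_SELF = "love thy neighbor"     # take the semi rather than hit the pedestrian
    PROTECT_OCCUPANTS = "protect my family"  # swerve away from the semi, toward the pedestrian

def resolve(occupant_deaths: float, pedestrian_deaths: float,
            preference: OwnerPreference, tie_margin: float = 0.25) -> str:
    """Decide 'take the semi' or 'hit the pedestrian' in a one-vs-one scenario."""
    if abs(occupant_deaths - pedestrian_deaths) > tie_margin:
        # A clear difference: the utilitarian default wins no matter what the owner chose.
        return "take the semi" if occupant_deaths < pedestrian_deaths else "hit the pedestrian"
    # A near-tie: let the owner's setting break it.
    if preference is OwnerPreference.SACRIFICE_SELF:
        return "take the semi"
    if preference is OwnerPreference.PROTECT_OCCUPANTS:
        return "hit the pedestrian"
    return "take the semi" if occupant_deaths <= pedestrian_deaths else "hit the pedestrian"

# One occupant vs. one pedestrian with roughly even odds: the setting decides.
print(resolve(occupant_deaths=0.5, pedestrian_deaths=0.5,
              preference=OwnerPreference.PROTECT_OCCUPANTS))  # hit the pedestrian
```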


My great uncle was in a scenario like the ones I've discussed here. He was about to get hit by a truck, so he swerved and killed a bicyclist. He was cleared of any wrongdoing, but I'm sure it wasn't easy to live with having killed someone, even if it was to avoid getting killed himself. I remember asking about the victim, and I was glad it was also an old man rather than some kid on a bike. But my uncle didn't have time to decide, even if it had been a twelve-year-old or a group of five kids. His self-driving car would have that time. With no conscience, it's the programming that will dictate who lives, essentially making gods of our cars. I suppose 100 years down the road, when every vehicle is self-driving and they can all avoid one another in every circumstance, these early philosophical debates will seem silly. But right now, someone has to decide how the cars react in each scenario, and I suppose most programmers aren't psychopaths or Buddhist monks.