Would you let your self-driving vehicle kill you?

midian182

Staff member

One of the interesting moral questions posed by self-driving vehicles is one of self-sacrifice. Specifically, if a situation arose where an autonomous car could save numerous lives by sacrificing its occupants, should it choose the lesser of two evils and kill its passengers?

According to a recent study in Science, “The social dilemma of autonomous vehicles,” 75 percent of participants said self-driving cars should be programmed to prioritize minimizing the number of public deaths in such situations, even if it means those inside the car perish.

While the statement sounds noble, many respondents added that if a car was designed to act in this fashion, they wouldn’t buy it, preferring instead to ride in a vehicle with an AI that protects the lives of its passengers above all else. Essentially, those surveyed said a self-driving car should definitely sacrifice its occupants for the sake of the greater good, just not when it’s their own lives or the lives of their children at stake.

Additionally, most people said self-driving vehicles should not put their passengers at risk if it meant saving only one or two pedestrians. So when autonomous vehicles become the norm, remember to walk around in large groups.

Researchers say that if both self-sacrificing and self-preserving autonomous cars went on sale, even the most altruistic among us would probably start putting themselves first.

“Even if you started off as one of the noble people who were willing to buy a self-sacrificing car, once you realize that most people are buying self-protective ones then you’re going to really reconsider why you’re putting yourself at risk to shoulder the burdens of the collective when no one else will,” said co-author Iyad Rahwan.

One way to avoid choosing between AVs that will save you and ones that may kill you is introducing government regulations that legally force manufacturers to program vehicles to take the self-sacrificing “utilitarian approach.” However, the survey participants said they were much less likely to purchase any autonomous vehicles if such regulations appeared.
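To make the distinction between the two policies concrete, here is a purely hypothetical sketch. None of these function names, fields, or numbers come from the study or from any real vehicle software; it only illustrates how a "utilitarian" rule and a "self-protective" rule would rank the same set of outcomes differently.

```python
# Hypothetical illustration only -- not real autonomous-vehicle logic.
def choose_maneuver(options, policy="utilitarian"):
    """Pick the maneuver with the fewest expected deaths.

    options: list of dicts like
        {"name": str, "passenger_deaths": int, "pedestrian_deaths": int}
    policy: "utilitarian" minimizes total deaths;
            "self-protective" minimizes passenger deaths first,
            breaking ties on pedestrian deaths.
    """
    if policy == "utilitarian":
        key = lambda o: o["passenger_deaths"] + o["pedestrian_deaths"]
    else:  # self-protective
        key = lambda o: (o["passenger_deaths"], o["pedestrian_deaths"])
    return min(options, key=key)

options = [
    {"name": "swerve off bridge", "passenger_deaths": 1, "pedestrian_deaths": 0},
    {"name": "stay on course", "passenger_deaths": 0, "pedestrian_deaths": 3},
]

print(choose_maneuver(options, "utilitarian")["name"])      # swerve off bridge
print(choose_maneuver(options, "self-protective")["name"])  # stay on course
```

The same dilemma, fed to the two policies, produces opposite answers, which is exactly the gap the survey respondents (and any proposed regulation) would have to resolve.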

The crux of this issue could be that while many people would sacrifice themselves when behind the wheel to save a large number of pedestrians, they don’t want to be in a situation where the choice is taken away from them. What if, for example, the autonomous vehicle makes a mistake and kills its passengers needlessly?

It’s worth remembering, though, that it will be a long time – decades, even – before self-driving vehicles populate our roads, and by then we’ll have hopefully solved some of the moral and social dilemmas posed by the vehicles. They may improve congestion, pollution, and road safety, but it seems the cars will bring with them some new problems of their own.


 
There are still too many unknowns for me to say what it should do. The main question is: what is the range of the front sensor?

The other thing I wanted to mention is that there are certain types of people who, when crossing the street, whether at a crosswalk or not, take their golly good time like they rule the world or something, and I'm not talking about elderly or disabled people here.
 
This is a dumb question. The only time a car EVER drives into a crowd is when the person driving the car does it intentionally - or passes out or something. It's happened a couple times in recent years. If the car is screwing up enough to drive into pedestrians - why are we talking about whether it's going to decide WHO to hurt?
 
I would say it should be allowed to kill the driver. The sheer number of traffic deaths avoided through the implementation of self-driving cars would offset, by a wide margin, the number of passengers killed by an artificial intelligence deciding to avoid killing others. I think we would have to defer to the bigger picture.

However... someone brought up a weird point that this mechanism could be hacked. Someone could jump out in front of your car on a bridge forcing your car to veer off the bridge killing you. How would situations like this be handled?
 
What if I'm surrounded by an angry mob hell bent on killing me because I just happened to be in the way or have a certain bumper sticker? Self driving is ok but where the hell is manual override? Are there no pedals and a steering wheel?

I would never buy a car I couldn't drive. Hell, I wish I could buy flagship cell phones without fingerprint and iris scanners, let alone a car without a steering wheel.

While reading this I thought of the time I was driving home late and a pedestrian tried to run in front of my car. I kept my cool and made an easy swerve into the empty oncoming traffic lane, avoiding him; he seemed upset that I missed. Wonder what the self-driver would do? Crank the wheel into the ditch?
 
And here you have it folks.
A vehicle that removes individualism...only it's not the vehicle. It's a group of self righteous new world order types who firmly believe in 'The minority report' as perfection and think they can run your life (and death) better than you can. This is of course assuming you should have the privilege of making decisions in the first place.
 
What a horrible article, written to provoke anti-autonomous-car propaganda and sentiment. First of all, the person in the car might get injured at worst; the people getting run over will die from the impact. They don't have seat belts or steel cages surrounding them for protection. Second, if a collision is unavoidable through unforeseen factors, something that seems outlandish in the first place, you know an autonomous vehicle won't still be at full speed if it has to self-crash. I don't expect anything worse than whiplash would be within the realm of possibility. Honestly, I don't even see how the car wouldn't have given itself enough room to come to a complete stop, so I don't see it as a legitimate possibility that a rider would be at risk of being "murdered" by their vehicle.

If anything, I see this intentional self-crash feature making the system much more vulnerable to a catastrophic bug. A car can't accidentally do what it was never programmed to do. But if you program the car to crash and the conditions triggering that response somehow get triggered in error, your car might self-crash for no reason.

I really don't like any of this conversation. We need to be spreading stories about all the safeguards and planning that are going into making these cars reliable, not fear-mongering to the general population.
 
If a bunch of *****s jump in front of my car, they can die.
Otherwise, apply the brakes ahead of time.
 
I think that by the point self-driving cars are widely adopted, cars that prioritize the driver over everything else will be banned. Why? It could easily be used by hackers, terrorists, or pretty much anyone who wants to break the law.
 
And here you have it folks.
A vehicle that removes individualism...only it's not the vehicle. It's a group of self righteous new world order types who firmly believe in 'The minority report' as perfection and think they can run your life (and death) better than you can. This is of course assuming you should have the privilege of making decisions in the first place.

I don't think major traffic incidents have anything to do with individualism. Yes, a choice is being taken away from you but it's not a choice you want or should make in the first place. Unless you are saying that everyone should have the choice to decide who lives and who dies.
 
Hell no, I don't want to be sacrificed for what will most likely be the stupidity of others. Dumbasses walking out from between cars without looking, cyclists and skateboarders wearing headphones while crossing the street, etc. Lots of people not paying attention out there.

Focus on softening the hit for pedestrians, and on recognition speed and response times, to hopefully avoid a collision in the first place.
 
I'm more concerned about someone else's self driving car killing me. Self driving car....ha ha
 
I think this is all a dumb question. Long before something like this comes to be, we as a society would need vastly better tech: protection against hacking and stupid accidents, and sensors that reach beyond basic information (heat, cell phone signals, even audio can be planned for). Besides, the situations where such tech caused a driver's death would be so obscure that we'd likely hear of plane crashes more often.

Oh, and if you are really worried about hacking, you should not look up videos about how vulnerable your vehicles are today (especially ones with Bluetooth connectivity).
 
"So when autonomous vehicles become the norm, remember to walk around in large groups." A bit presumptuous to say "when" and not "if," aren't we? I will not relinquish my grip on the steering wheel to an AI, period. Humans are certainly imperfect, but an AI lacks the power of reason. That ability is critical in making choices affecting the driver, passengers, and those outside the vehicle.
So, if a bunch of drunks or stoners (or users of other drugs, illegal and/or legally prescribed) get into a self-driving car (owned by one of them) and this car subsequently kills a pedestrian, who is to blame? In my estimation: everyone inside, the auto manufacturer, the insurance company that willingly covers these vehicles, and the local, state, and federal governments for allowing this to happen.
 
......if a collision is unavoidable through unforeseen factors, something that seems outlandish in the first place...........
You do know the incredible number of vehicle accidents, right? Couldn't it be said that the overwhelming majority of these are caused by 'factors' like those mentioned and not mentioned in the article (like icy roads making braking distance uncertain)? This is why they are called 'accidents': they weren't planned. Other types of incidents are 'criminal'. Geez, do you work for an autonomous vehicle project?
 
I don't think major traffic incidents have anything to do with individualism. Yes, a choice is being taken away from you but it's not a choice you want or should make in the first place. Unless you are saying that everyone should have the choice to decide who lives and who dies.

Ofc we should have this choice. Let's say a school bus breaks traffic laws and drives in front of me; it's them who need to die, assuming that my self-driving car did not break any traffic laws, obviously. I don't have to die because they were in the wrong place at the wrong time doing the wrong things. Ethical or not.
 
Ofc we should have this choice. Let's say a school bus breaks traffic laws and drives in front of me; it's them who need to die, assuming that my self-driving car did not break any traffic laws, obviously. I don't have to die because they were in the wrong place at the wrong time doing the wrong things. Ethical or not.

A self-driving school bus can't break laws. Also, no, a school bus possibly carrying 20 or more children is not going to self-sacrifice for one guy.
 
Let's just hope the software in these autonomous cars is better than what we currently run in our devices or I will be seriously afraid.
Is there going to be an autonomous-car "Patch Tuesday"?
 
I can think of many scenarios where AI would have trouble making a decision. What if someone is carjacking your car and you have the opportunity to run him over and escape? Your car wouldn't let you! What if you are going through a drive-through zoo and a group of chimps jumps in front of your car? Will your car kill you to avoid the chimps? On the other hand, what if it is Halloween and a group of people wearing chimp masks gets in the way of your car? Will it run them over? What if you actually wanted to kill someone? Could you just have a group of people run in front of their self-driving car? You could just annoy someone by getting in the path of their car! Instead of slowly making them move by inching your car forward, the AI would just sit there! Annoying people would love this. In fact, what if a crowd is trying to kill you by smashing your windows and trying to get into your car? (I have seen this in the news many times.) Your AI car would just let them kill you.
 
OMG, if it's going slow enough to make the turn, it's going slow enough to stop with minimal damage to outsiders. But no, I bought the protection of self-drive. If the *****s want to jaywalk, I'm not going to die for 'em. Their fault.
 