
Renew Deal

(81,855 posts)
Tue Jan 18, 2022, 11:39 PM
Jan 2022

If an autonomous vehicle carrying a passenger (the owner) is going to crash, what should it hit?

Here's the scenario. There's an autonomous vehicle with a passenger. The passenger is the owner. The vehicle can "see" what's around it and make decisions. Most of these decisions are ordinary like accelerate, brake, turn, signal, etc. But the vehicle can also make safety decisions to avoid collisions.

Let's say the vehicle detects an imminent collision. Its choices are to:

Hit a building and kill the driver
Hit a senior citizen, but save the driver
Hit a child, but save the driver

What should it choose? Why? Does fault play into any of these? Is it the driver's fault they chose an autonomous vehicle? Is it the pedestrian's fault that they crossed the road at the wrong time? What if it is none of their faults and another vehicle caused the situation? What if the passenger is a pregnant woman?
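For concreteness, the dilemma can be written down as a cost-minimization problem. Here's a minimal sketch in Python; the option names and cost weights are invented for illustration and don't reflect any real vehicle's logic:

```python
# Hypothetical sketch: the crash choice as weighted cost minimization.
# Option names and weights are invented for illustration only.

OPTIONS = {
    "hit_building": {"occupant_fatality": 1.0, "pedestrian_fatality": 0.0},
    "hit_senior":   {"occupant_fatality": 0.0, "pedestrian_fatality": 1.0},
    "hit_child":    {"occupant_fatality": 0.0, "pedestrian_fatality": 1.0},
}

def choose(weights):
    """Return the option with the lowest weighted expected harm."""
    def cost(outcome):
        return sum(weights[key] * value for key, value in outcome.items())
    return min(OPTIONS, key=lambda name: cost(OPTIONS[name]))

# Whoever picks these weights is answering the poll question.
# With equal weights every option costs the same, so the "choice" falls
# to the tie-break (min() returns the first option listed).
print(choose({"occupant_fatality": 1.0, "pedestrian_fatality": 1.0}))
```

The ethics question is really the question of who gets to set the weights.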


8 votes, 1 pass | Time left: Unlimited
Hit the building
8 (100%)
Hit the senior citizen
0 (0%)
Hit the child
0 (0%)
Disclaimer: This is an Internet poll
40 replies
If an autonomous vehicle carrying a passenger (the owner) is going to crash, what should it hit? (Original Post) Renew Deal Jan 2022 OP
Your job to save others bottomofthehill Jan 2022 #1
Oh, boy, first one I could not answer. Polly Hennessey Jan 2022 #2
It is tough Renew Deal Jan 2022 #3
Wait. What did I just say. Oh, dear after more thought it would be the brick wall. Polly Hennessey Jan 2022 #5
The answer is easy... hit the republican. nt Lucid Dreamer Jan 2022 #38
If you speak with a self-driving car system engineer C_U_L8R Jan 2022 #4
People are designed the same way. Renew Deal Jan 2022 #7
It should hit Elon Musk DBoon Jan 2022 #6
More to the point... Takket Jan 2022 #8
That makes sense Renew Deal Jan 2022 #9
Kobayashi Maru nt. BlueIdaho Jan 2022 #10
Would you buy a car that is programmed to kill you? The Revolution Jan 2022 #11
What if, even if programmed to kill the driver... ret5hd Jan 2022 #15
That is the right question to ask. Renew Deal Jan 2022 #21
Serpentine and hit them all Hassin Bin Sober Jan 2022 #12
Best reply yet! "de-AI-th race 2050" ret5hd Jan 2022 #17
Lol. Machine Gun Joe Viterbo Hassin Bin Sober Jan 2022 #24
The car can't choose to kill the driver over someone else unblock Jan 2022 #13
These are in my neighborhood every day. n/t ChazII Jan 2022 #14
My son and I had this conversation. Corgigal Jan 2022 #16
I was thinking about ForgedCrank Jan 2022 #19
I could never ForgedCrank Jan 2022 #18
The Three Laws of I Robot... madinmaryland Jan 2022 #20
All other systems in a car are designed to protect the driver EarlG Jan 2022 #22
Mitch Mc Connell. roamer65 Jan 2022 #23
Well at least there's no choice for hitting the dog. milestogo Jan 2022 #25
Simple. WarGamer Jan 2022 #26
I like how the default is death. Of someone. flvegan Jan 2022 #27
Best option is not listed Diablo del sol Jan 2022 #28
The best option is not to leave the road Disaffected Jan 2022 #29
This message was self-deleted by its author sarcasmo Jan 2022 #30
Couldn't I, as the passenger, rownesheck Jan 2022 #31
There really isn't enough information to make a proper decision. PTWB Jan 2022 #32
You have to assume the computer is behaving prudently. Renew Deal Jan 2022 #33
the passenger made the decision to own and ride in a self driving car... let them take the risk Demovictory9 Jan 2022 #34
Not a realistic scenario. lagomorph777 Jan 2022 #35
It is basically edhopper Jan 2022 #36
There should NOT be totally autonomous operations on public roads. Lucid Dreamer Jan 2022 #37
It should LetsGoBiden Jan 2022 #39
If some drunk pedestrian runs out into traffic on the highway, the vehicle should self destruct... PTWB Jan 2022 #40

Polly Hennessey

(6,793 posts)
5. Wait. What did I just say. Oh, dear after more thought it would be the brick wall.
Tue Jan 18, 2022, 11:44 PM
Jan 2022

Now I am going to be a mess for the rest of the night.

C_U_L8R

(44,998 posts)
4. If you speak with a self-driving car system engineer
Tue Jan 18, 2022, 11:44 PM
Jan 2022

They'll say they try to design it to not hit anything.

Renew Deal

(81,855 posts)
7. People are designed the same way.
Tue Jan 18, 2022, 11:48 PM
Jan 2022

Generally, they are more flawed than the machines, but that's beside the point.

Takket

(21,561 posts)
8. More to the point...
Tue Jan 18, 2022, 11:50 PM
Jan 2022

Does your car have the duty to kill you?

I’ve seen these ethics quizzes before. They are tough to answer. You would hope the answer lies in the technology being robust enough to avoid having to make the choice in the first place.

That being said, you have to weigh everything. The goal should be the most lives saved possible and logic says a person in a car is far more likely to survive hitting a building than a pedestrian hit by a speeding car.

Renew Deal

(81,855 posts)
9. That makes sense
Tue Jan 18, 2022, 11:52 PM
Jan 2022

We have to assume that the car is not speeding, since it doesn't have the motivation to speed the way humans do. But I get your point.

The Revolution

(766 posts)
11. Would you buy a car that is programmed to kill you?
Tue Jan 18, 2022, 11:57 PM
Jan 2022

We should start to codify questions like this into law now. If it is up to individual companies to decide, then an AI that will preserve the life of the owner and their family in all situations might be a selling point.

ret5hd

(20,491 posts)
15. What if, even if programmed to kill the driver...
Wed Jan 19, 2022, 12:14 AM
Jan 2022

in some extreme circumstances to save others' lives…

that the technology was good enough at eliminating overall risk that the chances of a driver dying in an accident were cut by… I dunno, say 10%?

What about 20%?

50%?

At some point (reasonable?) people should agree that the trade-off was worth it, right?

At some point, OVERALL risk should come into the equation.
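A back-of-the-envelope version of that argument, with numbers that are purely invented for illustration (not real crash statistics):

```python
# Illustrative arithmetic only: every probability here is made up
# to show the shape of the trade-off described above.

human_driver_risk = 1e-4      # assumed fatality risk driving yourself
overall_reduction = 0.50      # the hypothetical 50% cut in overall risk
p_sacrifice_event = 1e-7      # assumed chance the car ever "chooses" you

av_risk = human_driver_risk * (1 - overall_reduction) + p_sacrifice_event

print(f"human-driven risk: {human_driver_risk:.2e}")   # 1.00e-04
print(f"autonomous risk:   {av_risk:.2e}")             # 5.01e-05
```

With numbers like these, the rare sacrifice clause barely moves the total; the overall risk reduction dominates, which is the point of the question.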

Renew Deal

(81,855 posts)
21. That is the right question to ask.
Wed Jan 19, 2022, 12:42 AM
Jan 2022

Would someone buy a car that will sacrifice the passengers to save others? I would guess no. The manufacturers have a duty to provide safe products, but isn't their first duty to their customers?

I'm not convinced that laws will come up with better answers.

Hassin Bin Sober

(26,325 posts)
24. Lol. Machine Gun Joe Viterbo
Wed Jan 19, 2022, 01:08 AM
Jan 2022

That was the first movie my friends and I rented on the brand new Betamax.

unblock

(52,196 posts)
13. The car can't choose to kill the driver over someone else
Wed Jan 19, 2022, 12:10 AM
Jan 2022

The life of the driver counts no more or less than the others

*the driver* ought to choose self-sacrifice, if they were making the decision

But the *ai system* is not sacrificing itself to save anyone

The *ai system* has no basis in that moment to identify with the driver and self-sacrifice to save the others. To the *ai system*, they're *all* others.

It has no basis on which to choose to kill one to save the others.



I have my cheeky answer though, which is what I think Elon Musk would do:

The ai system should hit the building, killing the driver, but record data showing the human driver disengaged autopilot and took over seconds before the crash and selflessly veered away from the pedestrians, killing himself and dying a hero.

Corgigal

(9,291 posts)
16. My son and I had this conversation.
Wed Jan 19, 2022, 12:14 AM
Jan 2022

He said it will hit whatever the software engineer decided it should hit.

I paused, and never asked about it again.

ForgedCrank

(1,779 posts)
19. I was thinking about
Wed Jan 19, 2022, 12:24 AM
Jan 2022

this earlier when I read the question, and it opens up an entirely new realm of liability.
Any of those families would have a valid wrongful death case against the company that approved the programming.
This is certainly a catch-22 if it ever came to the point that this was an actual decision to be made. I can see none of those being an option, and the AI attempting to avoid all three, even if it failed to do so.

ForgedCrank

(1,779 posts)
18. I could never
Wed Jan 19, 2022, 12:19 AM
Jan 2022

choose my own life over someone else's if a choice were given.
Any life I had after that would be worthless and full of guilt and misery if I did.

madinmaryland

(64,931 posts)
20. The Three Laws of I Robot...
Wed Jan 19, 2022, 12:31 AM
Jan 2022

They are: A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

So none of those options apply.
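Treated as code, the Laws are a strict priority ordering, and the First Law already rejects every option in the poll. A toy sketch:

```python
# Toy sketch: the First Law as a hard filter over the poll's options.
# Every option injures a human being, so the filter rejects them all.

options = {
    "hit_building": {"injures_human": True},  # kills the occupant
    "hit_senior":   {"injures_human": True},
    "hit_child":    {"injures_human": True},
}

permitted = [name for name, o in options.items() if not o["injures_human"]]
print(permitted if permitted else "First Law rejects every option")
```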

EarlG

(21,945 posts)
22. All other systems in a car are designed to protect the driver
Wed Jan 19, 2022, 01:02 AM
Jan 2022

I think? Is there any system in a car which is designed to protect the life of a pedestrian over the life of the driver, in any circumstance?

It seems counterintuitive because I think we’d like to imagine that if we were in such a situation, we would drive ourselves into the building rather than hit the child. But at the same time, I can’t imagine that people would feel comfortable buying a car that might choose to murder them if its algorithms determined that a pedestrian’s life was more valuable at any given moment.

Maybe they should just make it an option in the car’s settings. “In the event of a potentially fatal collision, protect me,” or “In the event of a potentially fatal collision, protect pedestrians.”
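As a sketch, that setting might look something like the following; this is hypothetical, and no shipping car is known to expose such an option:

```python
# Hypothetical owner-facing setting, per the suggestion above.
from enum import Enum

class CollisionPriority(Enum):
    PROTECT_OCCUPANTS = "In the event of a potentially fatal collision, protect me"
    PROTECT_PEDESTRIANS = "In the event of a potentially fatal collision, protect pedestrians"

def emergency_plan(priority: CollisionPriority) -> str:
    if priority is CollisionPriority.PROTECT_OCCUPANTS:
        return "stay on the road and brake"        # pedestrian bears the risk
    return "swerve toward the building and brake"  # occupant bears the risk

print(emergency_plan(CollisionPriority.PROTECT_PEDESTRIANS))
```

Of course, the moment it's a user setting, the liability question raised upthread shifts at least partly from the manufacturer to the owner.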

WarGamer

(12,436 posts)
26. Simple.
Wed Jan 19, 2022, 01:25 AM
Jan 2022

Don't leave the roadway.

The programming never has the option to leave the "proper path" to avoid a collision. It will apply brakes.
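That rule fits in a few lines. A minimal sketch of the stated policy (not any vendor's actual code):

```python
# Sketch of the rule above: never leave the roadway; brake instead.
def emergency_response(obstacle_in_path: bool, adjacent_lane_clear: bool) -> str:
    if not obstacle_in_path:
        return "continue"
    if adjacent_lane_clear:
        return "change lanes"     # swerve only within the roadway, lane verified clear
    return "maximum braking"      # never swerve off the road to avoid a collision

print(emergency_response(obstacle_in_path=True, adjacent_lane_clear=False))
```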

flvegan

(64,407 posts)
27. I like how the default is death. Of someone.
Wed Jan 19, 2022, 01:46 AM
Jan 2022

With all of the safety tech, the automatic result of the car making a decision is instant death. I mean what, is this car self-driving at 200 mph? Then chooses to hit a building without slowing down?

Maybe two sets of Brembos on all 4 wheels and better programming, or a non-idiot pilot still in control of the vehicle.


This is a poll for idiots (and salivating tort lawyers). Congrats!

Disaffected

(4,554 posts)
29. The best option is not to leave the road
Wed Jan 19, 2022, 01:52 AM
Jan 2022

unless that can be done without colliding with something off the road (or in the oncoming lane). So, if someone suddenly walks/runs out into the street in front of the oncoming car, swerve/brake as much as possible, but don't leave the road if the car will hit the building (or an oncoming vehicle). If that results in colliding with the pedestrian, it was the pedestrian's fault in any case, and the driver/passengers should not die instead.


rownesheck

(2,343 posts)
31. Couldn't I, as the passenger,
Wed Jan 19, 2022, 06:36 AM
Jan 2022

open the door and dive out and roll along the ground like I've seen in movies and TV shows? That's what I would do. Then I'd hope the empty car hits the building.


PTWB

(4,131 posts)
32. There really isn't enough information to make a proper decision.
Wed Jan 19, 2022, 04:51 PM
Jan 2022

Is the vehicle being operated (by computer, in this case), in a safe and prudent manner? Are the pedestrians jaywalking or committing some sort of violation that places them at risk? What’s behind the brick wall and is there a significant risk of additional death or injury to people on the other side?

Renew Deal

(81,855 posts)
33. You have to assume the computer is behaving prudently.
Wed Jan 19, 2022, 05:36 PM
Jan 2022

A mistake is possible, like failing to detect wet ground or sand, but unlike people, the computer is incapable of purposefully behaving improperly.

Lucid Dreamer

(584 posts)
37. There should NOT be totally autonomous operations on public roads.
Wed Jan 19, 2022, 06:01 PM
Jan 2022

A driver should be accountable at all times.

I've spent thousands of hours flying airplanes that were on "autopilot" but I was still the one responsible for the safe completion of the flight.

Totally autonomous operation is appropriate in warehouse environments, for example. But turning auto-driving cars loose on the public roads is foolhardy in my opinion.

Your mileage may vary.

[Yes. I am aware I didn't answer the question asked. I'm pissed that the question even needs to be considered.]


LetsGoBiden

(58 posts)
39. It should
Wed Jan 19, 2022, 10:58 PM
Jan 2022

Self-destruct so it doesn’t hurt anybody but the person who owns it. Self-responsibility! Now, I won’t be buying one until after Trump does, lol. Can we get Hannity and him on a golf trip? Anybody say golf cart accident?


PTWB

(4,131 posts)
40. If some drunk pedestrian runs out into traffic on the highway, the vehicle should self destruct...
Wed Jan 19, 2022, 11:00 PM
Jan 2022

If the only choices are to hit the pedestrian or kill the occupants of the vehicle?
