Oh boy this is gonna be a ride. Literally.

You are currently reading a thread in /g/ - Technology

Thread replies: 255
Thread images: 34
File: Screenshot_2015-10-28-14-41-46_1.jpg (124 KB, 1080x1004)
Oh boy this is gonna be a ride.

Literally.
>>
Or rather, it would just stop, like any collision-detecting car does
>>
Should a plane explode itself before it crashes into a building?
>>
>>52575954
If a crash can't be avoided, the car should always aim to protect its passengers.
If I'm paying thousands of dollars for a car, it had better not try to kill me.
>>
Doesn't matter. Overall those cars would do less harm than those driven by dumb teen girls while texting
>>
In B it should slam the brakes and try to dodge the pedestrian
>>
>>52575984
The plane should go out of its way to crash into a building. Preferably the most office-looking one if it's daytime, and the biggest apartment-looking one if it's nighttime.

I for one welcome our AI overlords.
>>
>>52575954
What if the group of people is a group of terrorists who want to kill the passengers? All they would need to do to accomplish their goal is stand in the path of the car.
>>
>>52575954

This scenario will not actually occur. The car will detect potential danger far enough in advance that it never has to make the decision. Self-driving cars are orders of magnitude safer than human-driven cars.

But let's answer the question anyway; it depends on a number of factors. Are we to say that each human life is equal, or are some lives more valuable than others? Should a limousine carrying the president crash in order to save the lives of a handful of random passers-by? Should the person who's using the car get a say in whether or not the car does this? Should they be able to pay more for self-preservation functionality?

Because people are fucking dumb as fuck there'll be a massive backlash against self-driving cars and they will be extremely heavily regulated, meaning we won't even get a chance to answer many of these important questions.
>>
File: Good cigarette.jpg (27 KB, 364x366)
>>52576019
>kill retarded rich kiddies by just standing in front of their car
>>
>>52575964
>>52576021
>>52576012
What if the brakes aren't working?
>>
>>52575954
The self-driving car would slam the brakes way before any of these situations happened
>>
>>52575984
Why would shotgunning a building with tons of formerly-plane-now-shrapnel be any better?
>>
>>52576038
Pretty sure the car would have picked up on that way before
>>
>>52576038
The person who jaywalked gets killed, and whoever is at fault for the brakes failing gets manslaughter: the owner for knowingly not getting the brakes fixed, or the mechanic for a faulty repair
>>
>>52576041
It's a fucking hypothetical, why can't you understand that?
>>
>>52576038

It will have onboard sensors to detect if the brakes aren't working and ensure it doesn't get itself into that situation.
>>
>>52576056
>>52576052
>>52576050
OK, let's just say the brakes were working fine before but failed just as the situation in OP's pic started
>>
>>52576053

Not him, but we're talking technology here. Unless it's a real-world concern take it to /pol/ or some shit.
>>
Is a $1 million/year board director passenger's life more precious than 10 niggers crossing the street?

What about 10 white children?
>>
>>52576067

>>52576074
>>
>>52576044
Because metal shrapnel can't make sweet memes
>>
That question is totally irrelevant; it has nothing to do with self-driving cars.
Regular drivers also have to make such decisions when faced with the same situation; it's not something new.
>>
>>52576079

>10 niggers

Yes.

>10 white children

Depends on the neighborhood.
>>
>>52576091
But then the drivers are sentenced and judged according to how they acted and what their reasoning was.
Are you going to sentence a car to manslaughter?
>>
Both the driver and pedestrian(s) should die if we are being utterly fair. Otherwise it should be based on RNG on who lives or dies.
>>
are you autistic STEM faggots capable of a simple thought experiment, or is the ability to abstract such a hard concept
>>
>>52576067
>>52576041
>>
>>52576131
>But then the drivers are sentenced and judged accordingly on how they acted and what was their reasoning.
lolno, all they have to do is claim that they had a "medical condition" and they get cleared, even if they kill a whole family.
>>
>>52575986
>>52576019
This covers why the AI will preserve the car and its passengers first, unless there's an untapped market for selling products to people who are suicidal, but too lazy/chicken to an hero themselves.
>>
>>52576131

You can sentence and judge the people who programmed the car, and there's plenty of precedent for this too. In virtually every industry where processes are automated, this shit already fucking happens. It is literally no different whatsoever.
>>
File: 1443438220163.png (365 KB, 720x769)
>>52576155
Just install airbags on the front of the car and deploy them when it detects something unavoidable. Pretty sure they're already working on this, or I may be a genius.
>>
>>52576155

Yes, but when we want to have a thought experiment we go to an appropriate board to discuss it. I'm a math/philosophy double major before you say anything else.
>>
>>52576151
>it should be based on RNG on who lives or dies
Good fucking god sometimes I forget to be grateful enough that /g/ is in his mother's basement and not out ruling the world
>>
>>52576155
You nigger
Our entire job is to make sure the kind of situation you're describing doesn't happen
>>
>>52576190
Anon did say 'fair', not 'reasonable'. Those words sometimes intersect, but not always.
>>
IMHO the car should alert the passenger and possibly hand over control, even if it can brake. If it can't, just keep on the road and let da world burn. If people would stop crossing the road carelessly, this isn't going to happen.
>>
Just add this decision as an option in the settings for the car that the owner can change. Also make the owner set the setting when they get the car.

Problem solved
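For illustration only, a sketch of what such an owner-set option could look like, including the "make the owner set it when they get the car" part. All names here are invented, not any real vendor's API:

```python
from enum import Enum

class CrashPolicy(Enum):
    """Hypothetical owner-facing setting; names and values are invented."""
    UNSET = 0                  # factory default: owner hasn't chosen yet
    PROTECT_OCCUPANTS = 1
    MINIMIZE_TOTAL_HARM = 2

def can_engage_self_driving(policy):
    # Refuse to self-drive until the owner has explicitly made the choice,
    # i.e. "make the owner set the setting when they get the car".
    return policy is not CrashPolicy.UNSET
```

The point of the UNSET default is that the liability question gets answered by the owner at purchase time rather than by the manufacturer.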
>>
IF IT CAN TURN 90 DEGREES IT COULD STOP INSTEAD

THE GREATER GOOD IS KILLING NO-ONE
>>
Utilitarianism is a faggot concept that's faulty to the core.

It needs to stop being a self-driving car at that moment, so the driver can make the decision
>>
>>52576339
Who would possibly set it to something other than save the passenger?
>>
>>52576329
You're retarded.
Let's just wake the passenger up and- oh wait I crashed already.

Maybe the car should do a strawpoll on the people in the car to see what it should do
>>
>>52576067
You still have something called "engine braking" but you of course know shit about youre and ameriblubber driving autotranny kek
>>
File: ethical-dilemma.jpg (27 KB, 506x267)
>people taking the picture literally and not as a simplified model of the problem
You are literal autists
>>
Such a car shouldn't go so fast that it can't stop properly anyway; it should anticipate things long in advance. If it can't brake in time, it's surely going too fast to turn, so better to continue straight ahead. Also, I don't think a collision with a wall is deadlier than with a person.

Regardless, you're an idiot if you want your car to drive itself
>>
>>52576056
>the word "sensors" means "magic", Star Trek taught me that
>>
>>52576012
>slam the brakes to dodge
nigga do you even know how to drive.

fyi you cant fucking turn when you lock up the brakes
>>
>>52575954
Since self-driving cars don't break any traffic rules and have a better reaction time, these scenarios will be close to impossible.

This scenario is interesting, but irrelevant.
>>
The car shouldn't kill the driver, because who would buy a car which would rather kill you than someone else?
>>
File: Anti-lock_braking_system_logo.png (12 KB, 128x128)
>>52576377
pic related was created for this exact fucking reason
>>
File: 1444990773256.png (117 KB, 1024x749)
>>52576365
>One side has Linus and John Carmack
>Other side has Stallman and Terry Davis
>>
>>52576365
the problem is that OP is always a troll in these topics and these "hypothetical" scenarios are bullshit because read the fucking thread you cunt
>muh scenario
doesn't happen
>b-but muh scenario
doesn't happen
>b-but p-please imagine muh scenario
doesn't happen
>b-but...
go fuck yourself

this situation will never happen
>>
>>52576377
>>fyi you cant fucking turn when you lock up the brakes
thats why abs doesnt lock them :v)
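A toy sketch of the idea: estimate wheel slip and back off brake pressure before the wheel locks, so the tyre keeps grip and the car can still steer. The 20% slip threshold and 50% back-off are invented for illustration and bear no relation to a real ABS controller's tuning:

```python
def abs_brake_pressure(wheel_speed, vehicle_speed, requested_pressure):
    """One step of a toy ABS loop. A wheel turning much slower than the
    car is moving is about to lock up, so release some pressure; the
    caller reapplies on the next cycle. Thresholds are made up."""
    if vehicle_speed <= 0:
        return requested_pressure          # stationary: nothing to modulate
    slip = (vehicle_speed - wheel_speed) / vehicle_speed
    if slip > 0.2:                         # wheel nearly locked
        return requested_pressure * 0.5    # release to regain grip
    return requested_pressure              # grip is fine: brake as requested
```

Real ABS pulses this release/reapply cycle many times per second per wheel, which is exactly why a braking car can still turn.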
>>
>>52575954
An autonomous vehicle should never make a decision resulting in the passenger's death/injury.

This scenario will never happen: in an area with so many pedestrians, the vehicle would have to adjust its speed to ensure a minimal safe braking distance.

But if MUH SCENARIO happens, then the vehicle should calculate the optimal way to ensure minimal injuries to the pedestrians, i.e. slam the fucking brakes, aim the front of the car where there's the least amount of them, and pray for the best.
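The "aim where there's the least amount of them" fallback is just an argmin over candidate paths. A toy sketch, where the per-path pedestrian counts are hypothetical planner output rather than anything a real system produces this simply:

```python
def least_harm_trajectory(predicted_hits):
    """Given one predicted pedestrian-impact count per candidate steering
    path (all evaluated under maximum braking), return the index of the
    path that hits the fewest. Inputs are invented for illustration."""
    return min(range(len(predicted_hits)), key=lambda i: predicted_hits[i])
```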
>>
>>52576399

i feel pity for such a small mind, tbqhfamilia.
>>
>>52576396
I'd run over stallman happily but davis makes it difficult

linus is a bit of a cunt but I don't know what the world would do without him
carmack just seems like a genuinely nice person so I'd have to run over stallman and davis desu
>>
>not programming your car to run over as many jaywalkers as possible
you could probably even sue them because they damaged your car.
>>
>>52576352
Assuming the passenger is awake, of course. You're retarded. Automatic cars shouldn't be fully self-driving until people figure a way out of these situations.
>>
>>52575954
Yeah, car, pls crash into a wall endangering my life instead of running over all the retards who cross the road without looking, like fucking sheep
>>
>>52576131
What are you talking about? Roads are governed by a code of conduct, and self-driving cars will respect this code. If a pedestrian doesn't follow the code and gets killed, it's entirely his fault; again, it has nothing to do with self-driving cars
>>
>>52576423
The thing with all these scenarios is that they're one-in-a-million occurrences. It's a way of worrying about the small stuff when thousands of people die every month to human-driven cars. A 9/11 happens every month and people are fretting about the one-in-a-million accidents.
>>
>>52576342
Fuck off Shirou.
>>
File: 1452188610266.jpg (45 KB, 320x320)
>>52576430
>le edgy tripcunt look at how edgy iam pls XD look
>>
File: 1445397266559.jpg (145 KB, 819x441)
>>52576428
>Killing Davis

You a CIA nigger?
>>
>>52576452
I couldn't kill carmack anon he's just too adorable
>>
>>52576435
There's no way out of this. You can't have an AI make a decision like that; it's not a legal entity, it has no rights or obligations, and it can't be tried in a court of law.

It literally cannot make this decision morally/ethically. It will have to be made by the actual driver.

So when shit hits the fan, the automatic driving car defaults to the DRIVER, and then he makes the decision so he can be tried in a court of law (if needed)
>>
>>52576449
Literally, this problem doesn't arise, since the rules of driving are pretty fucking formally defined; code the car to follow said rules. Why do we even care if some nigger jumps in front of the car and gets killed because he's a nog?
>>
>>52576454
Good luck killing god's little protégé
>>
>>52576460
Fucking idiot.
>>
The car should just spontaneously combust if OP was in it.

Greater good achieved.
>>
>>52576466
Please do elaborate why ?
>>
File: 1200px-Multi-Track_Drifting!.png (656 KB, 1200x866)
>>52576365
>>
>>52576463
He is doing god's work with holyC tho
>Allows "5<i<j+1<20" instead of "5<i && i<j+1 && j+1<20".

imagine doing collision shit with this
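For what it's worth, Python chains comparisons the same way HolyC does, while standard C forces the pairwise expansion quoted above (C would parse `5 < i < 20` as `(5 < i) < 20` and compare a 0/1 result against 20). A quick sketch of the equivalence:

```python
def in_range(i, j):
    """HolyC/Python-style chain: true iff 5 < i, i < j+1, and j+1 < 20."""
    return 5 < i < j + 1 < 20

def in_range_c_style(i, j):
    """The pairwise expansion that standard C forces you to write."""
    return 5 < i and i < j + 1 and j + 1 < 20
```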
>>
>>52576425
ok here's another scenario
let's imagine the Pope is sitting in a self-driving car
with him Adolf Hitler, John Oliver and Donald Trump
and they play chess against each other while flying to the Moon
in the background you can hear a live concert of the Beatles over the radio
should the car kill all its passengers because there are people in the Milky Way, even though the car has sensors on all brakes (including the engine brake; I mean, the engine is clearly working and thus the engine brake works, that's how cars work) and it's not written in javascript?

the scenario in the OP is shit and is only a meme
it will never happen
>>
>>52576009
They're already involved in shitloads of accidents because they perfectly obey traffic laws.
They're rarely at fault; other drivers keep crashing into them.
>>
>>52576458
I meant figure out a way so that no one needs to (or can) cross the road at random locations.
>>
>>52576507
What does it matter if they're not at fault?
What we need is fewer humans; they don't follow the rules.
>>
File: farage strikes again.png (1 MB, 1094x1176)
>>52576396
>>52576479
>>
>>52576021
Yes, the car will predict the futah!!

It will know everything in advance and prepare for anything in advance, because it's controlled by a computah and computahs can predict the future and never make mistakes!!
>>
Why do people think this would ever be a problem?
If it can stop before hitting a parked car, it can stop if there is a group of people in the way.
If a group of people all decide to jump in front of a car, swerving to the side would be a human response, not something you would program into a car.
Stopping as fast as it can would limit the damage to both the car and the people in front.
Driving to the side would potentially hit a lot more people.
>>
>>52576507
>shitloads
not even a small percentage of actual accidents that happen daily
not worth mentioning
also you know who else is involved in traffic accidents and not at fault?
almost 50% of all people involved in traffic accidents are not at fault

that statistic is just shit
>>
>>52576535
because an entity without rights or obligations can't make such a decision.
>>
>>52576523
Literally yes.

God doesn't play dice.
>>
>>52576365
jam it back and forth just as the train passes so the front goes one direction and the back the other, everyone is saved
>>
>>52576539
but I as an owner allow my car to make such a decision and I take all responsibility for it
>>
>>52576557
You can't delegate your legal status like that to a non-entity before a court of law.
>>
>>52576565
and I wouldn't you cunt holy shit
read again what I wrote

what do you not understand when I say something like "I take all responsibility for it"?
>>
I think the bigger question is going to be who is liable when someone is inevitably killed by a self-driving car. It might be in every manufacturer's contract that you are solely responsible for what happens.

Also, what about laws regarding the distraction of the driver? Are you free to use your phone or eat a sandwich while your car drives you around? That's going to be hell for police forces to deal with if they don't know that the car is self-driving when they see you doing something dumb.
>>
>>52576539
But you are assuming this situation will ever occur.
Have you ever driven a car?
You are never in a situation where your options are killing yourself or other people.

The only option in such a situation is to stop the car, and if the brakes fail, stopping the car is still an option, even if it damages the car (which would be broken anyway if the brakes aren't working)
>>
If I'm Einstein and the crowd is a bunch of sandniggers, the greater good is plowing through them
>>
>>52575984
Should someone shoot a man before throwing him out of a plane?
>>
>>52576021
I'm sorry, but what? The president is just another faggot, like the rest of us. He's no more special than me or you. His job is literally to make sure we, the people are taken care of. If anything, he should be the one to do the sacrificial limo driving, not random people. Here's my question to you: are you okay with giving up your life for the life of the president in such a scenario?

No disrespect to the president, but I'm certainly not. I'm still young and have shit I want to do (75% of which is masturbating, but still).
>>
>>52576580
those cars will have logs you won't be able to manipulate
like planes
>>
>>52576594
In regards to what?

I'm talking about a malfunction occurring: the car doesn't stop and you hit another car. Is the manufacturer solely responsible? We are surely going to see some kind of injury involving a malfunctioning self-driving car.
>>
>>52575954
>letting your car to drive itself
>any year
>>
The car should abide by the owner's best interest. Sorry, you're one variable too many, and a self-driving car earns its cost in these situations
>>
>>52576056
Those sensors are rendered useless in the slightest bit of rain/snow/ice/fog. They're not some sort of magical device you put into a car and go "well, I'm totally safe now". You're also forgetting that code can have errors and bugs in it. Think about how complex something like Windows is. Now look at how many bugs it has; you can basically quadruple that (if I'm being generous) to get to where this shit will be. Also, electronics malfunction all the time, so what happens when the chip that is used to relay the info of a malfunctioning component also malfunctions while going 100 km/h on the highway?


You die, that's what.
>>
>>52576594
This. Planes and trains require someone to be at the controls at all times, and the pilots are responsible for being ready to operate manually at a moment's notice. That's probably what they'll do legally until the technology reaches some next-level shit years from now.
>>
File: sfdfds.jpg (26 KB, 302x302)
>>52576095
>Depends on the neighborhood.
>>
Car should eject the wings and take off
>>
>>52576173
No, you can't sentence the people who programmed the car.
There is no law imposing that kind of software standard.
>>
>>52576592
lmao
>your/my life
>vs president

We're worthless compared to the President. For the stability of the nation, you/I ought to die rather than the President. Shit is NOT fun when a president dies.
>>
>>52576540
go back to your grave, Einstein, you lost.
>>
>>52576680
Yes, you can actually, but the right to sue the person who made such a thing has an expiration date (5 years; 12 years is the longest I can think of)
>>
File: 1419186257417s.jpg (4 KB, 124x125)
>>52576590
Well at least you can talk
>>
>>52576458
>>52576539
That's a really fucking stupid way of resolving the problem. One of the main selling points of computer controlled vehicles is that they can process information and react quicker than an average human; why defer the decision to someone who will react slower, even slower than usual since they're likely not as aware of the current conditions as the driver of a regular car?
>>
>>52575954
If I'm in the car, no.

If I'm in front of the car, yes.

Otherwise, who gives a shit?

The car should probably engage boosters and fly over the pedestrians at this point, but the geniuses at Google X don't want to work on that problem. Driving, being something that humans are capable of, and enjoy doing, is CLEARLY the problem we need to be solving here.

In all seriousness, however, the car should never crash into a wall, because it has no way of knowing what's beyond the wall. You could very well end up killing more people that way.
>>
File: lit trolley problem.png (984 KB, 3180x2088)
>>52576365
>>
File: social responsibility.jpg (112 KB, 500x687)
I tell you what, I'm not stepping into any car that would kill me over anyone else. Problem solved.
>>
>>52576837
I was waiting for this.
>>
>>52576495
With human-driven cars there have been situations where a human has had to decide between two outcomes in which different people would die. What makes you think this will be any different with self-driving cars?
>>
File: sick loops.jpg (84 KB, 958x572)
>>52576365
>not posting the better version
>>
How are A and C different?
>>
>>52577222
are you blind?
>>
>>52575954

The car wouldn't be going fast enough to do anything beyond hitting the brakes
>>
File: ctf.jpg (23 KB, 569x428)
I honestly don't see why this is even a debate.

Assuming the car perfectly follows the law and does everything perfectly, this kind of scenario would be 100% caused by the mistake the pedestrians made when they decided to jaywalk or jump in front of me.

Why the fuck should I die if I did nothing wrong, just because some retard put himself in a dangerous situation?

An example of this would be a train...
As you already know, it's very hard for a train to brake in a very short amount of time and without sliding many meters further.
Since trains only run on their tracks, in case someone is about to get run over, it would be absurd to implement some technology to derail the train (putting the lives of the passengers and other bystanders at risk), just because some retard decided to put himself in that situation.
Even if the train only carries the conductor and he's a scumbag with a terminal illness, and the pedestrians are the top 100 smartest and gifted kids in the world, and the passenger's life is clearly less valuable than that of the pedestrians, it would be an injustice to kill him because of the decision of the kids (or whoever put them there).

It's not a matter of "whose life is more valuable", but more of a "who deserves to die in that situation".

Also not only this is an extremely rare occurrence, but nobody would buy a car programmed to kill its passengers.
>>
>>52577311
Also, the trolley dilemma is a completely different situation, since the people tied to the tracks are all "equally" deserving to die, so it's better to only kill one instead of many, because (unless specified) they're all the same.

The other dilemma about throwing a fat guy from a bridge to save the workers is also a moot point.
Unless the fat guy is responsible for the situation, there is no reason to sacrifice him for the sake of those people who aren't there because of him.
It's the same as the self-driving car problem... Why on earth would you kill someone innocent to save the lives of people who put themselves in that dangerous situation (or got put there by someone else who's not the fat guy)?
It's ridiculous.
>>
>>52576790
>That's a really fucking stupid way of resolving the problem

they don't have any rights or obligations; they can't be held legally accountable

>quicker than blahblah
not the point. they can't make that decision since they don't have a will of their own and can't be held accountable, nor can you delegate legal power to such a non-entity

even a dolphin would have more chance of being tried in a court of law and found responsible than an AI
>>
>>52575954
Wouldn't it benefit society to kill off jaywalkers?
>>
>>52576050
Ice?
>>
>>52578097
An automated car will handle a road ice emergency better than 99.9999% of all drivers. The only people that do better are the two dozen drivers who are on the level of professional rally drivers.
>>
>>52576371
yes, it would require at least class 7 magic for a machine to sense it is not actually slowing down
>>
>>52578124
And again, not the point. The point is that it has no legal status, no rights or obligations, has no responsibility, and can't take part in any of this.
>>
>NO
Its main purpose should be to protect its passenger; others can be classified as non-important
>>
>>52578062
Pretty obviously the maker of the car would be responsible.
>>
>>52575954
>Would you buy a car that kills you instead of the protesters you were trying to run over?
>Neither would we, so buy our [model]!
>>
>>52576472
Not trump guy, but why the fuck are you tripping right now? The only reasons to trip are 1. you are OP, so inherently you are a faggot, and 2. you have a meaningful comment/question that requires others in this thread to verify who you are. And in case 2 you only trip for that comment/question/follow-up that needs a trip; otherwise you post as anon.
>>
>>52576038
hand over control to the human inside
>>
A self driving car should always save the driver no matter what. It is the driver who gives money to the car manufacturer.
>>
>>52576590
FOR YOU
>>
>>52575954
It's funny to think about when you consider all the people who buy SUVs because they "feel safer", i.e. if they are involved in a collision with another car, the people in the other car will die, not you.
>>
File: 12374687235.jpg (23 KB, 600x606)
>>52578775
The market sees the truth once again.
>>
>>52575954
Car should follow the letter of the law exactly. Never deviate from the road. Attempt to stop in the most effective but safe way.
>>
The AI should disengage and leave the decision to the driver, leaving responsibility with the driver.
>>
>>52576067
>Ok lets just say the brakes were working fine before but failed just as the situation in OP's pic started
The car can perform a downshift and bleed a huge amount of speed.
>>
>>52576356
>You still have something called "engine braking" but you of course know shit about youre and ameriblubber driving autotranny kek
As a euro poor, you would be unaware that modern automatics can downshift, applying an engine brake.
>>
>>52578872
>The AI should disengage and leave to decision to the driver.

The driver won't be paying attention.
That's why self driving cars are stupid to begin with.

Use them on the motorway.
Don't use them in small city streets.
>>
>>52578888
I am unaware. Which cars do this automatically?
http://forums.whirlpool.net.au/archive/2385286
>>
>self driving cars
You guys here take this shit seriously? xD
>>
>>52578899
It should be the opposite, dumbass.
Humans on the motorway where you don't have to pay much attention, and AI in the street where the lightning-fast reflexes of the car keep your slow-ass brain from causing accidents.

In this particular case the AI is literally superior in every way to the human brain, and the more self-driving vehicles are on the road, the safer the roads will be.
>>
>>52576356
You are retarded. Learn about cars before you open your mouth.
>>
>>52579006
you can already buy them
>>
>>52579046
People don't pay attention on the motorway because it's boring.
Boring tasks are best left to a computer.
It's generally also the biggest portion of the trip.

Driving through the city is stimulating enough to stay focused.

And good luck waiting for self-driving cyclists, pedestrians, playing kids, dogs, etc.
>>
>>52579046
Except when the AI makes a mistake it can't be held legally responsible, because it's a non-entity with no will of its own comparable to a human's.

It can't make such decisions according to basic morality and ethics.

Trains and aeroplanes both default to the driver when shit hits the fan for the same reason
>>
>>52578931
The difference between the old-as-fuck post you linked and the situation being described is that a self-driving car COULD automate an engine brake, considering it has control over all parts of the car.

A normal human doesn't have direct control over an automatic transmission and thus can't perform a downshift unless they are already braking.
>>
>>52576180
Volvo does that
>>
>>52579114
>Driving through the city is stimulating enough to stay focused.
So is motorway driving.

But I'd rather not have to do it. I could be working, eating, playing games or shitposting on 4chan instead of looking at highway for 3 hours straight. And I could be doing the exact same while driving through city traffic.

Oh, and if I don't want to pay for parking I can just tell the car to fuck off home and pick me up again when I'm ready.

>And good luck waiting for self-driving cyclists, pedestrians, playing kids, dogs, etc.
They already "drive" themselves you fucking idiot.
>>
>>52579120
>Except when the AI makes a mistake it can't be held legally responsible
Of course it can. The company that made it is held responsible, just like with any commercial machine that causes death.

>It can't make such decisions according to basic morality, ethics.
Of course it can.

> Trains and aeroplanes both default to driver when shit hits the fan for same reason
No, they are just cheaper and safer than being automated for the time being.
>>
>>52579114
It doesn't matter. Obviously AI should be used in both situations, since it's far superior to the human option.

>>52579120
That's a non-issue. There are a lot of dangerous automated things that kill people already. I don't see how that's different.
Also, these cars perfectly follow the law, and that's a one-in-a-billion case.

Plus, morals and ethics aren't an issue if the car makes no mistakes.
See:
>>52577311
>>
>>52579201
>Of course it can
No it can't.

>The company that made it is held responsible
Companies are different legal entities before a court of law than private citizens; the same laws don't apply, and a crash is a civil issue (unless the car malfunctioned, but that's not the same: even if the AI made the same decision as a driver would, it still could not be tried, because it's a programmed blob, a non-entity)

>>52579202
>there are a lot of dangerous automated things that kill people already. I don't see how that's different.
Because private citizens != companies.
Because a car crash is a civil issue.
Because even if it DID NOT malfunction, the legal system and ethics would still have to be applied, and they COULD NOT be, since it's a fucking A.I.
>>
>>52576592
Ew, I can smell your ego from here
>>
>>52579234
>No it can't.
Of course it can. Morals and ethics are just rules. Computers are good at following rules. There's nothing hard about this at all.

>Companies are different legal entities in face of court of law than civil people
AIs are not people; they are machines made by a company. The company is responsible for them not killing anyone.
>>
File: Movie_poster_i_robot.jpg (29 KB, 256x350)
>>52575954
Literally this movie
>>
>>52578724
And what if Travis the Millennial Dipshit is too busy tipping his fedora on Reddit to react to the prompt for manual control?
>>
>>52575954
If the machine chooses /a/ it's the dumbest machine on the planet.
>>
>>52579234
I don't understand your point Anon.
Are you saying we shouldn't have self-driving cars because of these problems?

They're easily solvable by regulation. Of course some laws will have to be added/changed because of this, and if there's any issue with the legal system, it will be amended to cover self-driving cars.
>>
>>52579265
>There's nothing hard about this at all
Yes there is, when you crash and you need to follow the insurance procedure and then maybe go to court if there's a claim from the other party.

>The company is responsible for them not killing anyone.
Only if the device was malfunctioning, which isn't the case here since crashes happen.

and A.I can't be held responsible
so it can't have the right to drive
>>
>>52579285
Then he deserves to die.
>>
File: babby double smile.jpg (47 KB, 720x720)
>>52576479
>>
>>52577311
THIIIIIIIIIIIIIIIS
>>
How often do human fucking beings face that situation? Practically never. A good enough self driving car will not let a bunch of people surprise it like that.

It certainly shouldn't be going so fast that it could be in a position where people going at walking speed could collide with it.
>>
>>52579296
>Yes there is when you crash and you need to follow the insurance procedure and then maybe go to court if there's a claim for the other party.
And what's the problem with this? If the AI was at fault, then the company that made it has to pay out.

>Only if the device was malfunctioning, which isn't the case here since crashes happen.
Nonsense. If the AI breaks the law, then the company is liable. Just like if I buy a toaster and it blows up in my face. Maybe the designers didn't really care if the toaster blew up sometimes; maybe it's not a "malfunction" to them. Doesn't matter, they are liable by law. If a crash happens and it was because the AI broke the law or acted in an irresponsible way, then the company is liable for the damages. If the crash happens and the car didn't break the law and acted as well as any human could be expected to act, then it is not at fault, and neither would a human driver be.

>and A.I can't be held responsible
The company can. That's like saying companies can't sell automatic toasters, because if a toaster explodes and kills someone, it cannot be held liable because it is not a person. The company is held liable.
>>
>>52579376
>If the AI was at fault, then the company that made it has to pay out.
Because of your fucking ignorant assumptions:
1. Crashes never happen.
2. If the AI crashes, it's a malfunction, so you can sue the company instantly.
The A.I. would need to be able to handle crashes (they happen), but again it *can't*.

>Nonsense. If the AI breaks the law, then the company is liable
Yes, if it was clearly malfunctioning. Like doing things it wasn't made or supposed to do in its daily use. This goes for every automated thing these days in general, but crashes happen. They aren't a malfunction of the car; they're a malfunction of the driver (if the car itself was functional and didn't have any malfunction from the factory state leading to this)

>and the car didn't break the law and acted as well as any human could be expected to act, then it is not at fault, and neither would a human driver.
So nobody is responsible? Great logic and application there. The A.I. can't make this decision because it can't be held responsible and thus has no rights. Until A.I. gets a position in ethics and law, it can't make these decisions.
>>
>>52575954
They are jaywalking and you are not responsible for such imprudence
>>
>>52579434
Your assumption that an AI can't deal with a potential accident is just as ignorant.

what do you think we're solving captchas for?
>>
>>52579148
You should look up how an automatic transmission works. It hurts reading your posts - at least get a trip that I can filter your shit.
>>
>>52579434
If the accident is caused by someone else, then it's his fault.
If it's caused by a mistake made entirely by the AI, then it's the fault of those who made the defective AI. Just like it happens with all dangerous machinery.
>>
>>52579458
It can't. It could crash just like a human and it could still not be held responsible because it has no rights or obligations. It can't be tried.

So then you'd be suing the 'driver' that 'did not drive' the car which gets laughed out of court.
>>
>>52575954
People don't walk in the middle of a highway. And on streets where there are pedestrians, the self-driving car respects following distances and the speed limit, allowing it to stop before hitting the pedestrian.
>>
>>52579434
>1. Crashes never happen.
??
>2. if AI crashes, it's malfunction so you can sue company instantly.
are you talking about the AI program crashing? or car crashing? If the AI software itself crashes the computer, then the company is liable. Obviously the company will want to make this happen very rarely, and have insurance to be able to pay out for the rare times that it does. Computer crashes can be made incredibly unlikely when there is a need for it.

>they are malfunction of the driver
but the driver in this case is an AI made by the car manufacturer. So they will be liable. I'm not sure what you're trying to say.

>The A.I can't make this decision because it can't be held responsible and thus has no rights
An AI is just a piece of software a company makes. If it fails to follow the law, then the company is sued. AI rights have nothing to do with this at all.
>>
>>52579434
>Because of your fucking ignorant assumption of
>1. Crashes never happen.
They do, but it's almost always 100% human error.

>2. if AI crashes, it's malfunction so you can sue company instantly.
YOU can't sue the company, but the people that were harmed can.
This isn't a problem of the person behind the wheel, it's a problem of the vehicle itself.

>The A.I would need to be able to handle crashes (they happen) but again it *can't*.
Car A.I.s have been proven to be WAY more adept than humans in potential crash scenarios and avoiding them.
The biggest problem in the transition between nobody having self-driving cars and everyone having them is going to be the idiots who think they know better.
Once everyone has them, the roads will become vastly safer because the cars will be able to talk directly to each other about the maneuvers they are making.
>>
File: 1453523075848.jpg (32 KB, 500x500)
>I'm better than a self driving car
>>
>>52579500
>So then you'd be suing the 'driver' that 'did not drive' the car which gets laughed out of court.
It's the company that made the AI that gets sued. Just like if an automated robot today killed someone by behaving unsafely, the company that made it would get sued.
>>
The car should always sacrifice the people outside the car for the safety of the passenger. It's better to protect people who buy your cars than random people on the street.
>>
>>52579526
Pretty much stick shift drivers.
>>
File: 5hWBWGw.gif (4 MB, 500x281)
>>52579500
Nobody is talking about suing the car, dumbass.

Look at this gif. It's an automated machine used to cut stone. Imagine if a bug caused it to act uncontrollably and it ended up cutting off someone's head.
What would happen then?
We would find out THE PERSON who's responsible for that malfunction and take them to court.
>>
File: 1452247685368.png (276 KB, 424x412) Image search: [Google]
1452247685368.png
276 KB, 424x412
>Mfw I get a job at Google and "accidentally" train the cars to target minorities
>>
>>52579487
>If the A.I ever once crashes it was the manufacturer's fault.
Jesus Christ, don't you get that car crashes happen daily for various reasons? If you assume every crash by an A.I. is a malfunctioning A.I., then the idea is dead on arrival and you can't ever hope to have A.I.-driven cars.

The A.I. could crash 100% just like a human driver would crash, and it'd still create a fucking huge black area in ethics and law. It is a non-entity of 1s and 0s that can't be held responsible because it has no rights or obligations.

>>52579518
>YOU can't sue the company, but the people that were harmed can.
You don't sue the fucking car company when you crash a normal car. You sue the fucking person that was responsible. It is a civil matter and you are mixing two fucking huge, different aspects of law.

Sure, if the car crashed because it had fucked-up brakes straight from the factory, you can then sue the company (after you've been sued by whoever you crashed the car into)

>Car A.I.s have been proven to be WAY more adept than humans in potential crash scenarios and avoiding them.


LITERALLY
DOES NOT MATTER

HOW BINARY IS YOUR THINKING

1. It has no rights
2. It has no obligations
3. It can't be held responsible
4. Non-driver passenger can't be held responsible
5. In a car crash scenario (which happens) this would cause a massive problem.
5A. CAR CRASHES HAPPEN DAILY AND THEY AREN'T ALL CAUSED BY MALFUNCTIONING CAR DESIGN, WHICH YOU SEEM TO IMPLY

>>52579543
>Just like if an automated robot today killed someone by behaving unsafely, the company that made it would get sued.
So this is your proposal? Sue the companies? That's a brilliant idea: change a normal occurrence in modern driving into something you can instantly sue the company over.

I MAKE COFFEE

I GET A GOOD CUP OF BLACK COFFEE

I SUE THE COMPANY

B R A V O
R
A
V
O
>>
>>52579575
Stop talking about stone cutters. We are talking about an A.I. making a decision on how to deal with a crash situation on a city street and how that would create an ethical and legal black hole.
>>
>>52579580
>If the A.I ever once crashes it was the manufacturer's fault.
I did not say that you complete moron.
I said that in the rare case in which it's the AI causing the incident, and not other reasons, the manufacturer is held accountable for the mistake made by the machine.
>>
>>52575954
I'll bet something like this could be made law:

If a self-driving car gets into a dangerous situation where people in the car and outside the car may get injured or killed in an accident, then the people in the car should be protected as the top priority. If the safety of the people in the car can be assured beyond a reasonable doubt, then the safety of people outside the car is the top priority, but not at the expense of the safety of the people inside.

If something like this pic happens and the car kills 20 people to save everyone inside the car, then tough shit for them. The focus should be on preventing something like this from happening in the first place.
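A rule like that could be sketched in code. A minimal sketch of the proposed priority (the maneuver names, risk numbers, and the "reasonable doubt" threshold are all hypothetical, purely for illustration):

```python
def choose_maneuver(maneuvers):
    """Pick a maneuver under the proposed rule: occupant safety first,
    then minimize expected harm to people outside the car.

    Each maneuver is a dict with (hypothetical) estimated outcomes:
      occupant_risk  - probability of serious harm to people in the car
      bystander_harm - expected number of bystanders seriously harmed
    """
    OCCUPANT_RISK_LIMIT = 0.01  # made-up "beyond a reasonable doubt" cutoff

    # Step 1: keep only maneuvers that protect the occupants.
    safe_for_occupants = [m for m in maneuvers
                          if m["occupant_risk"] <= OCCUPANT_RISK_LIMIT]

    if safe_for_occupants:
        # Step 2: among those, minimize harm to people outside.
        return min(safe_for_occupants, key=lambda m: m["bystander_harm"])

    # No maneuver protects the occupants: fall back to least occupant risk.
    return min(maneuvers, key=lambda m: m["occupant_risk"])

options = [
    {"name": "brake straight", "occupant_risk": 0.002, "bystander_harm": 1.0},
    {"name": "swerve left",    "occupant_risk": 0.004, "bystander_harm": 0.2},
    {"name": "hit the wall",   "occupant_risk": 0.900, "bystander_harm": 0.0},
]
print(choose_maneuver(options)["name"])  # "swerve left"
```

Note that under this rule the wall is never chosen as long as any occupant-safe option exists, which is exactly the "tough shit" outcome described above.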
>>
>>52579603
I've only met two people who did not understand the concept of metaphor and analogy.
One was a literal retard (with a helmet and everything), and the other is you.

I'm starting to see a pattern.
>>
>>52575954
the car would just brake and stop
>>
>>52576038
There is always the emergency/parking brake, and the car can also switch to a lower gear and use engine braking.

Brakes just don't completely fail randomly like in the movies.
>>
>>52579621
>I said that in the rare case in which it's the AI causing the incident
Your A.I. could be purely innocent of the incident, because, you know, ACCIDENTS USUALLY HAVE AN INNOCENT SIDE BUT STILL NEED TO BE TRIED IN A COURT OF LAW.

?!?!? THERE ARE TWO SIDES TO A LEGAL MATTER: THE PLAINTIFF AND THE DEFENDANT
>>
>>52579580
>That's a brilliant idea, change a normal occurrence in modern driving to something you can instantly sue the company over.
Go on, tell me what the problem is with this. I can wait.
>>
>>52579643
We're obviously talking about a situation in which not killing anybody is not possible.
>>
>>52576387
If your car has to use ABS, it means you're using 100% of your traction for braking and the system is preventing you from going over that. Turning would require additional traction to perform the turn, taking away from the car's ability to stop, ABS or not.

This is why hard braking, even with ABS, invokes either a slide or an understeer condition. You have to balance both, and ABS is only going to prioritize one of them for you.
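That traction trade-off is usually modeled as the "friction circle": combined braking and cornering acceleration can't exceed μg. A rough sketch (the μ value is an assumed figure for dry asphalt, not a measured one):

```python
import math

MU = 0.9   # assumed tire-road friction coefficient, dry asphalt
G = 9.81   # gravitational acceleration, m/s^2

def max_lateral_accel(braking_accel):
    """Lateral (turning) acceleration still available while braking,
    under the friction-circle model: sqrt(ax^2 + ay^2) <= mu*g."""
    limit = MU * G
    if braking_accel >= limit:
        return 0.0  # all grip spent on braking; steering input just slides
    return math.sqrt(limit**2 - braking_accel**2)

# Braking at the limit leaves nothing for turning:
print(max_lateral_accel(MU * G))        # 0.0
# Braking at ~70% of the limit still leaves ~70% of grip for turning:
print(round(max_lateral_accel(0.7 * MU * G), 1))  # 6.3 (m/s^2)
```

Which is why "brake AND dodge" is not free: every m/s² spent slowing down shrinks what's left for the swerve.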
>>
File: Wittgenstein.jpg (164 KB, 902x902)
>>52579642
I've read one person who flat-out laughed at metaphors in arguments. He was a very wise man.
>>
>>52575954
>self-driving cars will be anything more than a fad.
>>
>>52579660
You don't think that taking something that happens daily in life and turning it into 'LOL SUE THE COMPANY INSTANTLY' would be a big no-no to companies?

>Drive a car
>Crash happens because of the driver
>SUE THE COMPANY

No matter if the driver is AI or human, your proposition is ignorant.
>>
>>52579693
>would be big no no to companies?
Why would it? Companies get sued every day. They wouldn't even need to be sued, they'd just have to pay for the damages they caused. They'd only need to go to court to dispute the damages. Companies won't care so long as they still make a profit in the end.

>Crash happens because of the AI
>SUE THE COMPANY
Makes total sense, and it's what happens today.
>>
>>52579580
You seem to be highly angry and have ignored the point of my post, which is:

Car A.I.s are highly unlikely to get into the situation as described in the OP, or any other crashing scenario. They are proven better drivers than humans at the absolute worst because they can react faster and have vastly more information at hand than a human would at any point during driving.
If a car A.I. does end up in a crashing scenario, it is far more likely a fault of the car's mechanics failing than the A.I. failing to react.

Car crashes do happen everyday, which I agreed with you in my post, but you also didn't read that they are by far and away human error and very rarely mechanical failure.
Remove human error from the problem and you suddenly remove 99.9% of crashing issues. The other .1% is removed by manufacturers being forced into tighter testing restrictions, and having the A.I. do self-checkups and giving the driver strong suggestions or outright taking you to a local mechanic so they can be fixed before they become a problem.


As a side note, I think you should take a rest from this board, it's clearly hurting your brain to think deeply about these problems.
>>
>>52579724
>Companies get sued every day
Yeah, by having their actual products malfunction.

You don't sue the fucking factory that made the car when the driver crashes the car.

Or you could pass that as a retarded law and then have car companies vanish from your country.
>>
>>52579658
How the fuck can you not understand that we're talking about a specific situation in which the accident is caused entirely by an error of the AI?
If the accident is caused by another car crashing into it, a brick falling on it, or a fucking spaceship shooting lasers at your self-driving car, then it won't be the car's fault, and the legal system will prosecute whoever made the mistake that caused the accident.

IF the accident is ONLY caused by a bug in the AI (let's say failure to identify someone crossing the street and running him over), then it's the fault of whoever is responsible for the programming of said AI because it's a mistake in giving the machine the instructions to properly act in that situation.

The car is just a tool, and of course it's innocent, but if the car acts in an unpredictable manner and someone gets hurt, it's the fault of whoever got the car to act in such way in the first place.
>>
>>52576190
Would you like to play a game?
>>
>>52577311

At least in the U.S. pedestrians have always legally had the right of way. A self-driving car which adheres to all of the existing regulations would then need to take this into consideration, or the regulations re-written to give them the right of way instead.

That is why this is in debate. The "greater good" is really what human drivers are expected to follow legally, but we all know that's in conflict with our own priorities. Now with self-driving cars we might not have that control anymore, and that is hard to accept.
>>
>>52579749
>You don't sue the fucking factory that made the car when driver crashes the car.
of course you do if the factory made the driver.

>Or you could pass that as retarded law and then have car companies vanish from your country.
People will be happy to pay more money for self-driving cars. Insurance to cover accident payouts will be a small portion of the car cost.
>>
>>52579678

Just because someone smart didn't like one thing, it doesn't mean that such thing is never valid.

You failed to understand the point of my example and now that you got called out on it, you resort to a pathetic appeal to authority.
>>
>>52579775
In the US, if a jaywalker comes running out from between the parked cars and I run him over because it would be impossible to stop, am I in the wrong?
>>
>>52575984
jet fuel can't melt steelbeams
>>
This is funny.
The obvious action (stopping the car) is not shown as an option.
You learn that stuff when you are in middle school.
If I let go of an object, it will hit the ground.
That is a pretty confident prediction.
You do not need many samples to know the direction and speed of a moving object, so predicting where a person will be is not that hard.

But we don't even need to be this precise.
The cars are already equipped with sensors that can detect the distance to an object and just having an object in the way means the car should either stop or slow down.
They already do this when there is a car in the way.
On a highway, the car will match the speed of the next car, avoiding a collision.
In the city, it will stop if there are people in the way.
We don't need to agonize over these situations, as driving is not some super complex task that can't be solved in the next few years.
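That "stop or slow down" behavior boils down to comparing sensor range against stopping distance, roughly d = v·t_react + v²/(2a). A minimal sketch (the deceleration, reaction time, and safety margin are assumed figures, not from any real car):

```python
def stopping_distance(speed_ms, reaction_s=0.1, decel_ms2=7.0):
    """Distance needed to stop: distance covered during the (short, machine)
    reaction time plus the braking distance v^2 / (2a)."""
    return speed_ms * reaction_s + speed_ms**2 / (2 * decel_ms2)

def should_brake(obstacle_range_m, speed_ms, margin_m=5.0):
    """Start braking once the obstacle is within stopping distance plus margin."""
    return obstacle_range_m <= stopping_distance(speed_ms) + margin_m

# At 50 km/h (~13.9 m/s) the car needs roughly 15 m to come to a stop:
v = 50 / 3.6
print(round(stopping_distance(v), 1))             # 15.2
print(should_brake(obstacle_range_m=30, speed_ms=v))  # False
print(should_brake(obstacle_range_m=18, speed_ms=v))  # True
```

As long as the car's speed is capped so that stopping distance stays inside sensor range, a pedestrian "surprising" it is not physically possible.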

And as for blame when there is an accident?
Why not keep the rules the same as they are, the owner of the car must ensure it is driven by someone who has a license (human) or the owner will be to blame and if someone is sitting in the car, that person will be to blame for not stopping the car or taking it to service or whatever.

This has never been a problem for humans, why would it be a problem for a computer?
>>
>>52579833
The only intelligent post in this thread thus far.
>>
>>52576399
No, asstard, the situation will happen. If it has a 1/100,000,000,000 chance of happening, then in 50 years, when self-driving cars are the norm, it will happen once a month or less worldwide.

This question is intended to address situational ethics, and how a machine should evaluate human value.

Personally, I think that if the car is in a situation where either the owner or a bystander is killed, it should choose the owner.

Risk of driving, bitch. People die every day, you're not special. If you die, no matter how, we will move on fine without you and quickly forget that you ever existed.
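The back-of-envelope arithmetic behind that claim holds up at roughly that order of magnitude, assuming a worldwide trip count (the trips-per-day figure below is a pure guess, not a sourced statistic):

```python
# Back-of-envelope: an event that is absurdly unlikely per trip still
# happens regularly once the trip count is large enough.
p_per_trip = 1e-11       # the 1/100,000,000,000 chance from the post
trips_per_day = 1e9      # assumed worldwide daily car trips (rough guess)

events_per_month = p_per_trip * trips_per_day * 30
print(round(events_per_month, 2))  # 0.3 -> roughly one event every few months
```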
>>
>>52579745
>Car A.I.s are highly unlikely-
You can't vouch for that. You can't *know* that. And it doesn't matter: it is the argument in the OP, it is there, we talk about it. And car crashes happen daily. Someone is responsible, someone is innocent, or maybe both are responsible. But because of the nature of A.I. (it has no rights or obligations and can't be held responsible, duh) it can't be tried in a court of law.

>They are proven better drivers than humans at the absolute worst because they can react faster and have vastly more information at hand than a human would at any point during driving.
Doesn't matter. They still can't be held responsible because they have no rights. They cannot be part of any legal system as own entity, like humans can.

>Remove human error from the problem and you suddenly remove 99.9% of crashing issues
Remove human error and you don't have anyone driving the car, because if you have no rights, you have no obligations, and you can't drive a car. Because to drive a car, you must be 18 years old, have a driver's licence, be without any warnings in that time, etc.

>How the fuck can you understand that we're talking about a specific situation in which the accident in caused entirely by an error of the AI?
What error? OP poses a question: how should an A.I. decide how to react in a crash? And my answer is it can't. It can make the exact same choice as a human and it still won't change the issue.

>IF the accident is ONLY caused by a bug in the AI (let's say failure to identify someone crossing the street and running him over), then it's the fault of whoever is responsible for the programming of said AI because it's a mistake in giving the machine the instructions to properly act in that situation.
Yes, I'm 100% sure suing a car company over daily, common occurrences won't bankrupt anyone or scare them away from the market with this law. Not a sustainable solution
>>
>>52579809

Only if you can't prove that:

- You weren't driving too fast for conditions
- You weren't driving faster than the posted speed limit
- Your judgement was not impaired
- You did make reasonable attempts to avoid hitting the pedestrian

Even if you think you hit all those points on the nose, if the pedestrian attempted to sue you they might very well win so long as they can prove they were not impaired.
>>
>>52579833
Right, but what if a car was in the situation of deciding whether it kills someone or not? Let's say a Google car. Should it be allowed to decide based on the Google score of the person? While this question seems easy to answer, what about a Bing car? Or a Yahoo car? If a car can decide over your future, it should rather be a fucking Transformer, bro
>>
>>52575954
>muh etikz
I threw up
>>
>>52576155
>implying the majority is STEM here
>>
Literally IF A.I. enters a danger situation

just default back to human driver.

Every little legal problem solved.

Why was that so hard
>>
>>52579861
You are assuming several types of sensors fail, the brakes fail, and the car is invisible, yet it can still change direction.
This is very unlikely, and even if it happened, the car shouldn't "evaluate the value of human life"; it should just stop or go straight.
If the big group of people saw this car coming at them, they would jump to the side, as most humans have a desire to not die.
If the car swerves to the side, it will hit all the people who got out of the way.

THIS WILL NEVER HAPPEN
>>
>>52579526
literally anyone with a riced out honda
>>
>>52579927
>You're about to get roasted by a semi
>In the last 2 seconds before getting plastered, over the sound of blaring truck horn you hear a ding as the car hands over manual control
>It takes you one second to realize you have to react
>By the time you make a decision it is already too late.
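It's easy to put numbers on that greentext: at highway speed the car burns through a lot of road during a surprised human's takeover. A quick sketch (the 1.5 s reaction time is a common ballpark assumption for a distracted driver, not a measured value):

```python
def distance_during_takeover(speed_kmh, human_reaction_s=1.5):
    """Meters traveled while a surprised human notices the handover prompt,
    re-orients, and actually starts to act (~1.5 s is a rough ballpark)."""
    return speed_kmh / 3.6 * human_reaction_s

# At 120 km/h the car covers ~50 m before the human even begins reacting:
print(round(distance_during_takeover(120)))  # 50
```

So a "2 seconds before impact" handover means the decision window is already gone before the human's hands are on the wheel.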
>>
File: Free Hat.webm (728 KB, 1280x720)
>>52579876
So if some asshole does this, I could get in trouble?
>>
>>52579970
If you have 2 seconds to avoid getting roasted by a semi, not even A.I. will help, since you'd need to change physics on the fly
>>
>>52579973

Yeah, which is why it is worth investing in a dash cam like that guy did so you can prove when someone did something foolish or attempted insurance fraud.

>>52579993

But then what constitutes an emergency? Emergencies don't just conveniently always give you 15 seconds to decide on how to deal with the situation.
>>
>>52576365
man i would love to have a car that can pull several G
>>52576038
air brake failures like that don't exist; air brakes are designed to fail safe
>but what if tire will get punctured
run-flats are pretty common
>but what if software will decide to go for WOT
ignition is cut the moment you press brake pedal (at least it should be)
>but what if ECU will fail
there's another one to take over tasks

>>52576053
philosophy majors love wasting time on "fucking hypothetical" questions with no connection to the real world.
Now could you fucking tell me what happened to my order? i've been waiting for like 50 minutes
>>
>>52579970
This.
Also, even if you were 100% alert before shit hit the fan, you would still react orders of magnitude slower than the car.

This "switching to manual in case of accident" is just some dangerous bullshit to shift liability to the driver, making everything far more dangerous, just so it's easier to blame someone.

It's literally a retarded idea with only huge downsides and only one upside (it's easier to decide who to blame), which is solvable by regulation anyway.
>>
>>52580033
edgy
>>
>>52580024
Your regulation choice was to sue companies for something that happens hundreds of thousands of times a day and then flood the court with person vs. company cases, while driving down the companies that manufacture cars.

The simple solution is to keep the entire fucking thing disabled until we create A.I. comparable to the human mind
>>
>>52580020
>Yeah, which is why it is worth investing in a dash cam like that guy did so you can prove when someone did something foolish or attempted insurance fraud.
I meant in a situation where it is perfectly clear what happened, not in a case where the driver is wrongly accused of misconduct because there is no proof to the contrary.
Would I still risk trouble if the whole thing was filmed?
>>
>>52579887
Stopping the car is still the only option.
If the car is unable to stop fast enough to avoid killing people, it should not be on the road.

If we are talking about >>52579973
then the car should still stop.
It would be a bit more dangerous as there is no person inside the car who can supply aid to the guy who got hit but if the car could call 911 with an automated message, that would be better than 90% of people anyway
>>
>>52575954

"Greater good"sounds like dirty communist talk to me.
>>
>>52580059

You never implied it was filmed, but you would be in significantly less trouble if you had it on film. When you *don't* have it on film, it gives the pedestrian and their lawyer room to dramatize the whole situation. Without hard evidence on your part, it makes it hard to confirm otherwise.

Sometimes you luck out and some witnesses blame it on the pedestrian for you, as well.
>>
>>52580045
>edgy meme
Go back to reddit.
>>
>>52580056
>Your regulation choice was to sue companies for something that happens hundreds of thousands of times a day
No, I didn't say that.
I said sue the company if the accident has been caused by the car's AI.
If somebody jumps in front of my self-driving car like in >>52579973 then it's not the company's fault.
If the car decided to disable the brakes and someone ends up dead, then it's the company's fault for having programmed a defective car that caused the accident.
If the accident hasn't been caused by a decision made by the AI, then it's not the company's fault.
>>
They also need to add a prioritization algorithm that values people's lives in this order:
1. Black
2. Jew
3. Transgender
...
13932823. White heterosexual man
>>
>>52580056
>Simple solution is to have the entire fucking thing disabled until we generate A.I. brilliant enough to human mind
?
Current AI is already superior to the human mind in the driving area.
>>
File: 1438379810919.jpg (14 KB, 480x360)
>>52580095
>being offended by a single word
ok bro
>>
>>52580092
>You never implied it was filmed
Sorry, my fault.

>but you would be in significantly less trouble if you had it on film. When you *don't* have it on film, it gives the pedestrian and their lawyer room to dramatize the whole situation. Without hard evidence on your part, it makes it hard to confirm otherwise.
>Sometimes you luck out and some witnesses blame it on the pedestrian for you, as well.

I see...
But in case you had it on film, what would happen? Would they just close the case right away, or would you still risk something?
>>
>>52580088
The greater good is a utilitarian argument, which is a very steep slippery slope.

>>52580096
>I said sue the company if the accident has been caused by the car's AI.
Car crashes happen daily. In droves. I can't even guesstimate, but probably in the millions? That'd flood the car company; they couldn't handle that many lawsuits, and neither could any court (and this would go through court, since it involves a civil person and his rights)

You'd practically kill the government's problem-solving system and the car company.
>If the accident hasn't been caused by a decision made by the AI, then it's not the company's fault.
? Then nobody is at fault? Good lord, you could just crash around freely and collect insurance money.

>>52580116
Yeah, congrats, it can do fast math; it doesn't pass as a human being though. It doesn't have rights or obligations.
>>
>>52580105
You're kidding, obviously, but I'm afraid this is the future that awaits us.

Too many things we believed to be absurd and "too far", and now we see them in place. The SJWs are winning and it wouldn't surprise me one bit if they pushed for this.
>>
File: sim1.gif (98 KB, 250x441)
>>52575984
yes
>>
>>52580145
Chances are they would drop it. Only your lawyer could really tell you what level of fucked you were depending on circumstances.
>>
>>52576021
>Self-driving cars are orders of magnitude more safe than people-driven cars.
This doesn't matter though. People will care a lot more about randomly being killed by a big metal box on wheels they have no control over, when they're being told it's completely safe, than about some retard who falls asleep at the wheel. It's not good enough for them to be statistically safer.
>>
>>52580146
>Car crashes happen daily. In droves. I can't even guestimate, but probably in millions? That'd flood the car company, they couldn't handle that amount of sues, and neither would any court (and this would go through court, since it's civil person and rights related to him)
>You'd practically kill the governments problem solution way and the car company.
This depends on what kind of regulation will be in place.
Also it wouldn't take very long for people to understand that these cars record everything and that if the car isn't at fault it would be a matter of seconds to prove.

>? Then nobody is at fault? Good lord, you could just crash around freely and collect insurance money.
Now you're just trolling.
Did I say it's nobody's fault?
It's obvious that if the car didn't cause the accident it's not the car's fault, BECAUSE IT'S SOMEONE ELSE'S.
It would be the fault of whoever caused the accident.
Are you retarded son?

>Yeah congratz it can do fast math, doesn't pass off as a human being though. Doesn't have rights or obligations.
And you plan to give rights and obligations to machines once they're advanced enough?
>>
>>52580105
take picture
detect face
average color of face
value of life = (red * green * blue) * hair length/boob size
>>
>>52580211
Ah, alright. Thank you
>>
>>52576590
H O T H E A D
>>
>>52580247
>It would be the fault of whoever caused the accident.
A.I. doesn't have rights or obligations, can't be responsible.
The passenger was a passenger, can't be responsible.
Making a law which makes the company responsible for something that happens hundreds of thousands of times a day would gas the governmental problem-solving system and drive down car companies (plus, CRASHES do happen, and unless the car was legitimately defective, it has never been the company's fault)

>And you plan to give rights and obligations to machines one they're advanced enough?
If in some near future A.I. becomes advanced enough to compare to the human mind in its complexity. One that can develop its own moral code, think on its own, understand an image of itself, understand itself, see itself, etc., a million other tests, then yes. What else?

>Also it wouldn't take very long for people to understand that these cars record everything and that if the car isn't at fault it would be a matter of seconds to prove.
And it'd still cause the vacuum in legality and ethics. It's not about whether it's better. That literally does not matter
>>
>>52576523
but anon, computers do exactly what they're programmed to do
>>
>>52579820
Jet fuel can't melt dank memes.
>>
>>52580033
Wait a sec
I recognize the second picture, it was a photo of a hungarian train or something
>>
File: 100_pics.jpg (40 KB, 485x307)
>>52580366
>>52580033
>>
The car makes the attempt to save as much life as it can, but the value of the passengers' lives is above all others. If someone must die, it should be someone other than the passengers, in whichever situation gives the fewest deaths.

Or would YOU throw your life away in a situation like this? No, you wouldn't slam your car into a wall to save other people. Of course you're going to TRY not to run over other people, but if even with your best attempts you send a couple of niggers airborne, so be it.
>>
>>52575984
Have you seen the movie Executive Decision?
>>
>>52580333
>A.I. doesn't have rights or obligations, can't be responsible.
I KNOW YOU RETARD.
Stop making this argument. I didn't say that the car is held responsible.
I said that the person who is responsible for the car's behavior (AKA the manufacturer) is the one held responsible.

>Making a law which holds the company responsible for something that happens hundreds of thousands of times a day would clog the legal system and drive car companies into the ground (plus, crashes DO happen, and unless the car was legitimately malfunctioning, it has never been the company's fault)
But it doesn't happen thousands of times a day.
You're counting all accidents, but you have to take out those not caused by the AI's decisions.
Accidents caused by the AI's decisions would be so rare that they're really not an issue at all.

>>52580333
>If in some near future A.I. becomes advanced enough to compare to the human mind in its complexity (one that can develop its own moral codes, think on its own, understand an image of itself, understand itself, see itself, and pass a million other tests), then yes. What else?
If we make humanoid robots that can be exactly like us, then they will behave like us, and depending on their place in society they might have rights and obligations, yes.
But this isn't what we're talking about.
We're talking about simple machinery here, and just like every other piece of dangerous automated machinery currently in use, we can't postpone its deployment until we have perfect AI.
Especially since not deploying them would keep them confined to laboratories, so there would be no incentive to finance their development, effectively stopping their progress.

We're talking about normal AI here, and I don't see how a car's AI would be different from that of a self-driving subway train like those that already exist in many cities.

(cont.)
>>
>>52580333
>>52580490

>And it'd still cause the vacuum in legality and ethics. It's not about if its better. That literally does not matter
Why?
Once people realise that suing a perfectly-driving automatic car over an accident they themselves caused is completely useless, nobody (except a few literal retards) will do it.
The "flooding the government with lawsuits" problem will fix itself.
>>
>>52580490
>I said that the person who is responsible for the car's behavior (AKA, the manufacturer) is the responsible one.
Yes, flood the legal system with complaints, flood the company with requests, and drive down both systems.

Good solution.

>But it doesn't happen thousands of times a day.
Car crashes happen a fuck ton of times a day, and even now most legal systems (around Europe, for example) are slow.

>You're counting all accidents, but you have to take out those not caused by the AI's decisions.
What? If it wasn't the self-driving A.I. that caused it, it was a malfunction (and another request) or another A.I. driver (another civil-vs-computer case, another legal case, another request for the company).

How thick are you

>we are talking about normal AI
which can't do these tasks because it has no rights or obligations to do them.
>>
>>52580540
It hasn't fixed itself yet, and that's without the hundreds of thousands of civil-vs-company cases that A.I.-calculated crashes would cause.

>Once people will realise that suing a perfectly-driving automatic car for the accident that they caused
Who are 'they'? What are you even speaking of?


>>52580490
>I don't se how a car's AI would be different that that in a self-driving subway train like those that already exist in many cities.
Except they're not self-driving, at least not here; every time someone is trying to kill themselves, the brakes come from the driver.
>>
I still drive a manual transmission. I doubt Europe would ever use "self-driving" cars.
>>
>>52580545
>Yes flood the legal system in complaints, flood the company with requests, drive down both systems.
This won't happen for the reasons I talked about here >>52580540 and in the rest of this post...

>>52580545
>Car crashes probably happen a fuck ton of times a day and even currently most legal systems around Europe for example are slow.
Total car crashes don't matter here.
We're talking about who would be held accountable for a crash caused by the AI's decision. Every other crash type is irrelevant because it would just be solved normally, as it would be the direct result of a man-made decision.

>What? If it wasn't a self driving A.I. that caused it, it was malfunction or another A.I. driver
No you fucking moron. It's the opposite.
If the accident is caused by the mistake that the AI made, then it's malfunction.
If the AI drove perfectly and an accident happened, then it's something else depending on the accident.
Could be a drunk driver going too fast, could be a bridge falling on top of it, could be a jaywalker like here >>52579973, etc.
As long as the self-driving car drove perfectly (and they always do), no accident will occur unless something that doesn't depend on the AI happens (like one of the examples I made).

>which cant do these tasks because it has no rights or obligations to do them.
Pic related.
Of course it can do it, because countless tests proved so, and in the rare case in which a bug causes them to act unpredictably, they will be treated exactly like every other dangerous automated machine (which, guess what, lacks rights and obligations too, and yet they have been used for decades).
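The fault-attribution rule being argued in this post can be sketched as a tiny decision function. This is a hypothetical illustration of the argument only; the inputs and category strings are assumptions, not any legal standard:

```python
# Hypothetical sketch of the liability rule argued above:
# a wrong AI decision is a malfunction (manufacturer's problem);
# otherwise whoever caused the accident is liable.

def attribute_fault(ai_drove_correctly, external_cause):
    """external_cause: e.g. 'drunk driver', 'jaywalker', or None."""
    if not ai_drove_correctly:
        # A wrong decision by the AI is treated like any defective machine.
        return "manufacturer (malfunction)"
    if external_cause is not None:
        # The AI drove perfectly; the external party is at fault.
        return external_cause
    return "no accident"

print(attribute_fault(True, "jaywalker"))  # jaywalker
print(attribute_fault(False, None))        # manufacturer (malfunction)
```

The point of the rule is that "the AI decided" and "the AI is liable" never coexist: either the decision was wrong (a product defect) or it was right (and fault lies elsewhere).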
>>
File: WellSaid.png (29 KB, 325x355)
WellSaid.png
29 KB, 325x355
>>52577311
>>
>>52575954
I'd like to add that crashing into the pedestrians is more dangerous to them than slamming into the wall is to the passengers: the pedestrians get hit directly by a car, while the passengers have seatbelts, airbags, crumple zones, etc.
>>
>>52580607
>It hasn't fixed itself yet and this is without hundrends of thousands of cases of civil vs. company caused by crashes A.I. calculated.
Does that really happen that much? Do you have a source for that claim?

>Who are 'they'? What are you speaking of even?
The people who made the mistake which caused the accident that happened despite your self-driving car driving perfectly.

>Except they're not self-driving, not at least here, everytime someone is trying to kill themselves the breaks come from the driver.
You don't know what you're talking about.
Some are literally driverless. There is no operator or driver inside. Just a big window at the front.
Here is the one in Turin:
https://www.youtube.com/watch?v=chyr0dxTdbc
>>
>>52580803
Who's going to pay for the damages?