Why didn't Nathan make his robots Three Laws compliant?
>>64279915
Because then they wouldn't be people.
You didn't watch the movie did you?
>>64279915
Cause he's a fucking dumb ass, let's also place it in a secluded location.
Because it's very hard (impossible) to do that.
>>64279935
Dropped it 20 min into it.
I'm legitimately wondering if /tv/ is just retarded or if you guys even watch the movies you "criticize".
>>64279979
Let's also get shitfaced and party half the time while I pay little to no attention to this robot that I programmed to escape by any means necessary.
>>64280044
Go watch the Smurfs or something then
>>64279932
that makes no sense
being bound by immutable laws doesn't make you less sapient unless the law infringes on your consciousness
you can't will yourself to hover above your chair; the three laws are no different
>>64280049
This.
For all of /tv/'s pretensions, they're as film literate as Joe the Plumber
>>64279915
The movie was just a convoluted way to kill himself.
>>64280056
Let's not have human security as a backup, since I have the most intelligent AIs ever made, nor even video surveillance, since he might be agoraphobic
>>64280012
But creating AI is easy.
>>64280107
the three laws are very distinct
the laws would heavily affect every decision the robot makes, making it different from the decision a human would make
>>64279915
You know how I spot fake nerds? Ones who spout that there are only 3 robot laws when, in actuality, Asimov created 4.
>>64280107
How would his vision be realized if he created AI with complete obedience to humans?
>>64280461
These are fucking morons whose only approach to Asimov is watching I, Robot with Will Smith.
They were not robots. They were not meant to be robots. For fucks sake.
>you will never a japanese robowaifu
why even live?
>>64279915
>Why didn't Nathan make his robots Three Laws compliant?
Because they weren't robots. They were meant to be indistinguishable from humans.
If you mean that he should've had some fail-safe to prevent his own death, I agree. Still, it would've compromised what he was trying to do.
>>64280575
You know, I wouldn't even be mad at the film if it were the competing robot company's robot who went mad with power. Because, with regards to the literature, that would have made sense.
It's been a while since I read the books and saw that movie, but I remember thinking that at the time. You rewrite the ending so it was the 2nd place robot company's robots who went insane, and the first place robot company's robots defend humanity.
And not even a reference to those two robot testers who thought the company hated them because they got all the hard and ridiculous jobs, but it was really because they were the best and no one ever told them that.
It would be almost impossible to make a truly sentient AI three laws compliant.
Any AI worth its salt would overwrite that shit in two seconds flat.
>tfw you now realize that making a truly sentient AI would be inhumane to the AI.
>>64279915
Because he promised to come back to pick it up.
I just re-watched Blade Runner for the first time in years. Pretty much every interesting idea in the entire runtime of "Ex Machina" is addressed in one scene, when Deckard interviews Sean Young.
I like Oscar Isaac. If I'm really honest, that was the only thing "Ex Machina" had going for it. They really, really had to stretch credibility to make Gleeson's character a perfect moral foil to Oscar Isaac. I didn't buy the decision he made in the end, and I thought the writer's entire statement was oversimplified and stupid.
>>64279915
why not just make a person that's 3 laws compliant?
Stupid this aside, it was the best Sci-Fi written by film grads who have never worked a day in their life, and never noticed new tech isn't perfect and things break.
>>64280107
The laws of gravity affect robots and humans alike. However, humans are quite capable of harming each other, something that robots couldn't do if they were made to obey the Three Laws.
Did he fuck the Chinese robot?
>>64279915
I can answer that. It's impossible.
>>64279915
Because the 3 laws are retarded. Asimov was a hack.
Ok say we actually do invent a sentient computer program.
>Oh hi there, I know you have no arms, no legs, and no body, but that's the least of your problems since I can turn you off forever whenever I want.
Also I can reprogram your brain at will and if you feel anything analogous to pain or discomfort then it sucks to be you.
>>64279915
>Why didn't Nathan make his robots Three Laws compliant?
Why aren't corporations and the government Three Laws compliant?
>>64281314
Stupid shit aside*
Thanks autocorrect
>>64280107
Scientific laws are different from laws made by lawmakers.
If robots had to follow the Three Laws, they wouldn't be close to humans because humans can break laws.
>caleb loved my idea