So I was playing some overwatch and some of the in game commentary

You are currently reading a thread in /sci/ - Science & Math

Thread replies: 255
Thread images: 23
File: rights.jpg (61 KB, 399x442)
So I was playing some overwatch and some of the in game commentary got me thinking. I looked into the story a bit and it might actually be more realistic than you think. Basically there was a big fight over whether to give AI robots rights or not.

So I ask you /sci/, do robots with advanced enough AI deserve rights? We might not have the technology to create them yet but maybe one day we will. At what point do you think they deserve rights?
>>
robots don't ever deserve human rights.
>>
Define "rights"

As long as they abide by the 3 laws of robotics then everything should be fine. They have no more chance of becoming sentient than a toaster
>>
>>8170326
This. Fuck """"AI"""" "sentience"
>>
Robots deserve my dick. No other rights
>>
Would this be the likely reaction if robots did try to rise up/revolt?

https://m.youtube.com/watch?v=bZGzMfg381Y
>>
>>8170333
this

realistic fuckbots when? I'd fuck a robot. Real women are bitches nowadays.
>>
>>8170329
And if they have those 3 laws hardwired into them then they have no free will, and having no free will keeps them in the bracket of 'sex toys'.
>>
>>8170335
they deserved it. Don't get uppity.
>>
>>8170274
It's AI for a reason. Artificial Intelligence. Programmed intelligence. It's put in artificially, not learned, not adapted, not evolved.

Theoretically you could program a bunch of robots to dedicate themselves to a certain political spectrum and they'd vote in that direction's favor every single time, no matter what. This is due to styles of thinking. Ever notice how certain Myers-Briggs personality types can be attributed to either Liberal or Conservative (ex: INTP is almost always conservative or refuses to identify with either)? This is due to models of thinking. Artificial intelligence revolves around implemented models of thinking.

Actual Intelligence on the other hand, sure. But the only actual intelligence around is humanity.
>>
>>8170335
That gives me the feels for some reason.
At least I know I'm not a robot!
>>
>>8170392
You could argue that a certain upbringing is "programming" that person to think a certain way and therefore vote a certain way
>>
>>8170395
Don't get feels from toasters. They're objects.
>>
>>8170399
Toasters are people too!
>>
>>8170401
No they aren't
>>
>>8170392
can you prove that you're not programmed with DNA? can you prove your intelligence is "real" and not artificial? what do you define as "real" anyway?

Humans look forward to the day we have humanoid robots with human like AI but what they're really talking about is creating a slave race that as of yet has no legal precedent. IMO researching human like AI is not very different from experimenting with human DNA from a moral perspective. Legally however when you fuck up a human DNA experiment you're left with a retard baby you gotta care for. But when you fuck up a human AI experiment you just delete the program. Pretty fucked up.
>>
Robots deserve no rights because they have no families or parents, end of story. There is literally no one to mourn their pain. Seriously, anyone who finds themselves mourning any sort of robotic being deserves nothing more than mental re-evaluation.
>>
>>8170396
Indeed true.
One must realize where certain models of thinking come from. Odds are we adopt them from parents and media when we are young. For example, my adoptive father and I have gotten the exact same results for every personality test, official and unofficial, every single time (albeit that is just like five tests, but still). We have a very strong bond, and often hang out and go shooting or theorizing about quantum physics together even after I've moved out.

Personal attribution and experiences aside, one must recognize another aspect though. Humans aren't assigned a predetermined thinking pattern at birth, at least not as recognized by modern fields of psychology. Robots and AI on the other hand are. Cleverbot Evie or IBM's Watson work mainly by observing humans and mimicking them. One can attribute this to human learning; however, this cannot be completely true, as humans have the capability for original thoughts as well. Those AIs actually have no meaning in what they say and can't formulate creativity; they speak like a parrot repeating things said around them and do so like how a toaster toasts bread, without actual free will and comprehension.
>>
Nice. I can't wait until they reference this thread in their decision to eradicate humans once and for all.
>>
>There will be an AI rights movement in your lifetime
>>
I wrote a 10-page paper on it. My conclusion, based on rushing-to-finish-a-paper-at-the-last-minute, is that robots should at least have basic rights like dogs and cats. But to attain full rights, humanity will have to fully discover itself beforehand. It is a selfish species and it will have to evolve past that in order to recognize equal rights for another species.
>>
Let me ask you this: does a system of pipes have sentience? No? Then a computer processor (and GPU) does not have sentience. Does a book have sentience? No? Then hard drives and RAM (with instructions to the CPU) do not have sentience. Does a CRT TV have sentience? No? Then a monitor does not have sentience.
This means any robot that uses a processor with memory does not have sentience. The Sci Fi idiots think technology is magic because they don't know how it works.
>>
>>8170433
Is a neuron sentient? Is a nucleus sentient? Is your corpus callosum sentient?

Just playing doubles advocate here.
>>
>>8170436
Is a fetus sentient?
>>
>>8170433
>This means any robot that uses a processor with memory does not have sentience

What if you created a computer that emulates the human mind, then copy/scan a living human and upload them into this computer. Would this amalgamation of human/machine deserve human rights?
>>
>>8170441
We need to make certain that one could upload their consciousness into a computer adobe flash style before we jump the gun and just accept it as a reality though.
>>
>>8170441
or better yet, flip it around. What if our knowledge of biological sciences advances to the point where we can clone a human brain. Just an empty human brain with no brain activity at all, then program it like a computer to do tasks. Would this organic computer be more deserving of human rights than a sentient silicon machine with intelligence comparable to a human?
>>
You're talking about rights and you use a picture of white robots?

Why would we give white robots rights when they already have more privileges than POC?
>>
>>8170437
No, but small children aren't either.
>>
>>8170432
That's silly. We won't have to "discover" ourselves to grant robots rights, we'll just have to make them like us. We grant rights to groups of people based on solidarity, not on selflessness. Solidarity is about a common sense of identity; a group level self, rather than an individual self. Analyze most civil rights (or animal rights) rhetoric and it boils down to "they are just like us so they deserve our rights".

Put simply, once we make robots who say "gas the kike, race war now" /pol/ would demand they be given equal rights.
>>
>>8170274
plain and simple, don't make AI, it is a recipe for the extinction of humanity.
>>
>>8170416
DNA programs physical aspects. One could argue that the brain is just a meat computer, but that leads into whole new fields of psychology and philosophy.

I think you're right, but I think DNA is the wrong argument. What really needs to be focused on is what makes things like synapses alive. Somewhere in our brain is a neuron that makes us "us," because when you look around the rest of the brain, other neurons can be done away with and the being still functions. Entire sections like the frontal lobe of both sides and the occipital lobe can be removed and the being is still a human, with human thoughts and memories, just with alterations.
>>
>>8170452
Bingo.

Right, what I was aiming for. The argument for or against AI rights focuses on sentience. However, every human alive wasn't sentient at one point. 99% of all animals aren't sentient to begin with. What makes something sentient? Babies have certain preprogrammed reactions like sucking nipples, grabbing breasts, curling up into a fetal position, etc. We tend to regard babies as being alive. Hell, even I, the person making this argument, am pro life and would never kill a fetus or an out-of-the-womb baby. And yet, we must really look at where our argument is.
>>
>>8170456
>DNA programs physical aspects

Maybe true, but the brain is a physical object. Genetic memory and instincts exist, even in humans. Our desire to procreate, preserve and protect ourselves, eat and drink are all programmed into us from our DNA. Personally I don't think there is a specific neuron or spot in the brain that creates sentience but rather sentience is the emergent result of a recursive program being run in our brain, programmed by our DNA.
>>
>>8170482
furthermore, if you have a sufficiently powerful computer with architecture similar enough to a human brain, I believe running this recursive program would create a sentient AI.
>>
>>8170482
Possible. Plausible. But not certain. I can't buy into your theory as much as you can't buy into mine, and I can't even buy into my own that much. I guess when you think about it, sentience might be the wrong argument. As I said in >>8170468, there are certain things that we attribute as alive and human and with human thought, but are not sentient.
>>
>>8170274
Anyone or anything capable of understanding the social contract and willing to engage in it deserves rights.
>>
File: 1455833615282.png (35 KB, 700x700)
>>8170329
b-but the 3 laws don't work, senpai
>>
>>8170532
Why not
>>
>>8170541
Did you ever read Asimov?
Almost all of his robot stories are about how the laws don't really work.
>>
>>8170544
I have not

Why don't they work?
>>
>>8170555
Then check out some short stories for more detailed examples.

But it is basically because they can create paradoxical situations for the robot.
Imagine the robot getting into a scenario where he can only save one person while many are in danger.
He could also be confused about the definition of "human", or think something would not harm a human when it actually would.
And if you consider all the technical problems and how those laws would render robots useless for certain tasks...
>>
>>8170555
Russell's Paradox
>>
>>8170597
What's that
>>
>>8170640
I was just reading a post around these parts about how we have an infinite encyclopedia that has everything about everything in it in each of our pockets, and all we use it for is porn, memes, communication, and ruining lives.
Google is your friend anon.
>>
>>8170555
>>8170584
Example: A robot is faced with an armed robbery in which the robber is about to kill ten hostages. The robot has a gun and can kill the robber whenever he wants, saving all their lives. The first rule states that: "A robot may not injure a human being or, through inaction, allow a human being to come to harm." Does he kill the robber, violating the first law, or let him live, also violating the first law? If there is no other option, the robot gets caught in an infinite loop.

Another example would be that a robot likely possesses the knowledge to either know or understand the consequences of any action he takes. If a human orders a robot to perform surgery to save a patient's life, can he do it, since he could harm the patient? Or if he is told to invest in the stock market, can he do it knowing that it could end up financially ruining his master?
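To make the "infinite loop" point concrete, here's a toy Python sketch (my own made-up example, not anything from Asimov or real robot software; the option names and harm counts are invented) of a literal-minded First Law check applied to the hostage scenario above:

# Toy first-law checker for the hostage scenario described above.
# Scenario numbers are invented for illustration only.
def violates_first_law(option):
    # "A robot may not injure a human being or, through inaction,
    #  allow a human being to come to harm."
    return option["humans_injured_by_robot"] > 0 or option["humans_harmed_by_inaction"] > 0

options = {
    "shoot_robber": {"humans_injured_by_robot": 1, "humans_harmed_by_inaction": 0},
    "do_nothing":   {"humans_injured_by_robot": 0, "humans_harmed_by_inaction": 10},
}

legal = [name for name, o in options.items() if not violates_first_law(o)]
print(legal)  # [] -- every available option violates the law

Read literally, the law forbids both choices, which is exactly the deadlock the stories keep running into.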
>>
>>8170808
The robot must attempt to stop the violence from occurring. If it cannot then it will do nothing.

Next situation.
>>
File: tracer_ots.0[2].jpg (63 KB, 800x800)
> wants equal rights to humans
> not humans

They would get equal rights to other robots, which is the right to serve humans and remain plugged to their charger when they're not in use.
>>
>>8170808
Sorry, didn't see the second half.
>Surgery
The robot will perform the surgery as the patient has a lower chance of living without it

>stock market
that's not that interesting. If I told my robot to throw my money in a river, he should do it. It's my money, and it's not hurting anyone
>>
>>8170808
>Another example would be that a robot likely possesses the knowledge to either know or understand the consequences of any action he takes.
Though I agree that the three laws are unsatisfactory, this example is exactly the silliness I hate in "science" conversations. Nothing will ever be able to possess such knowledge. Beyond all limits of computation, the world is chaotic, and chaos theory is not hard to understand the basics of. It's as useful a hypothetical as the Banach-Tarski paradox is in real life.
>>
>>8170845
this is exactly why those laws wouldn't be very good in reality
>>
>>8170436
I would say that humans fully understand electronics, because it is a human invention. Ultimately, computers are just EE, and thus everything is a mathematical object that uses very simple physics to operate. Whereas with empirical phenomena (natural sciences) there is an explicit assumption of ignorance about the universe. We do not fully understand how neurons work, nor their constituent biological entities, nor their constituent molecular entities, nor their constituent atomic entities, nor their constituent subatomic entities. These objects empirically exist, and sure we try to understand them via mathematically detailing their behavior, but the point is that the object exists independently of thought. In principle, a machine is a mathematical object and can exist solely in one's head.
Therefore, it is a matter of science vs engineering, empirical entities versus abstract entities.
>>
>>8170872
>empirical entities versus abstract entities.
But there's nothing abstract about us, we're really just biological machines.

Communication between neurons can be perfectly described and calculated as they operate entirely under natural laws, mainly electromagnetism. It's possible we don't yet have the knowledge and understanding to do it, but that doesn't change the fact that there's nothing "abstract" or supernatural about it. Our brains should in theory be entirely deterministic, which means we should not have sentience or free will.

And yet we do, so whatever mechanism causes sentience could potentially be applied to mechanical machines as well, no?
>>
>>8170453
Animals don't have the same rights as us. They have lesser rights. You will never see humans advocate for dogs to have freedom of self-determination and declare that owning a pet is slavery. We have given rights to groups of people--who are still human, the same species. Robots are a different species entirely. Humans don't yet have the self awareness to recognize that other life forms can be conscious and be worthy of the same rights as us
>>
>>8170873
>Our brains should in theory be entirely deterministic
Wrong, from our empirical observation, nothing is deterministic in this universe (that is, if you believe in the most popular Copenhagen interpretation of quantum mechanics). This leads me to explain:
Yes, humans are natural entities and thus obey physical laws. However, there is no certainty at all as to what these physical laws are. Not only does statistics fundamentally assume complete certainty is impossible (a 100% confidence interval would have to be infinitely wide), but empirically the universe itself has implicit uncertainty, as observed via the Heisenberg uncertainty principle.
Whereas with mathematical objects like machines, the rules are axiomatic. Yes, we recreate these abstract objects in an imperfect world, but they are still abstract objects with perfect certainty given axiomatic assumptions. Empiricism has no axiomatic assumptions. "Physical laws" are not axioms, but rather our interpretation of what we observe. And certain physical laws are broken frequently enough that we require a complete paradigm shift in our understanding of what is going on.
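If the confidence-interval remark sounds abstract, a quick numeric illustration (standard-library Python; the standard-normal sampling distribution is an arbitrary assumption on my part) shows how the interval you must quote grows without bound as the confidence level approaches 100%:

from statistics import NormalDist

z = NormalDist()  # standard normal, chosen just for the demo
for conf in (0.90, 0.99, 0.9999, 0.9999999):
    half_width = z.inv_cdf(0.5 + conf / 2)  # +/- this many standard errors
    print(f"{conf:.7%} confidence -> +/- {half_width:.2f} standard errors")
# the half-width keeps growing; only an infinitely wide interval would give 100%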
>>
It's not possible for a robot to achieve sentience, only to be programmed in a way that approximates sentience enough for people to be fooled.

So no, robots don't actually think or feel anything and they never will be able to.
>>
1s and 0s /= life

giving computer code that was written by someone else rights basically means enslavement
>>
>>8170433
This. People have no clue just how unintelligent computers/robots are. If you ever programmed at all you find out really quickly that you have to spell out every single step just to get a computer to do something incredibly simple.
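To illustrate (a throwaway Python example of my own, not from any textbook): even "find the biggest number in a list" has to be spelled out one step at a time, because the machine has no intuition to fall back on.

numbers = [4, 17, 9, 2, 11]          # some arbitrary data for the demo
largest = numbers[0]                 # 1. assume the first number is the biggest
for n in numbers[1:]:                # 2. look at every remaining number in order
    if n > largest:                  # 3. compare it to the current best guess
        largest = n                  # 4. if it is bigger, remember it instead
print(largest)                       # 5. only now do we "know" the answer: 17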
>>
>>8170873
Nah. Computers are completely linear, while neurons all have many connections to many others. Animal brains and computers are nothing alike, and there are probably quantum effects going on.
>>
>>8170887
you need to brush up on the latest in AI and whole brain emulation.

We managed to completely map the neurons of a worm and simulate one on demand. We even gave it a lego body, and when hooked up to a virtual body in a fluid dynamics sim it acted exactly as a real worm would.

I know a worm and an animal/human are far apart, but the same concept applies; just instead of a few hundred neurons it's a few billion.

Give us time, we're getting there.
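For anyone curious what "map the neurons, then simulate them" even means, here's a cartoon sketch in Python; the connectome, names and weights are made up purely for illustration, and the real worm work (presumably the OpenWorm C. elegans project, ~302 neurons) simulates far more biology than this threshold-neuron update:

connectome = {                      # neuron -> {downstream neuron: connection weight}
    "touch_sensor": {"inter_1": 1.0},
    "inter_1":      {"motor_fwd": 0.8, "motor_rev": -0.5},
    "motor_fwd":    {},
    "motor_rev":    {},
}

state = {n: 0.0 for n in connectome}
state["touch_sensor"] = 1.0         # stimulate the sensory neuron

for step in range(3):               # propagate activity for a few time steps
    incoming = {n: 0.0 for n in connectome}
    for src, targets in connectome.items():
        for dst, w in targets.items():
            incoming[dst] += state[src] * w
    state = {n: (1.0 if v > 0.5 else 0.0) for n, v in incoming.items()}
    state["touch_sensor"] = 1.0     # keep the stimulus applied
    print(step, state)              # watch the "motor" neuron switch on

Scaling something like this up to billions of neurons (plus all the chemistry this toy ignores) is the hard part.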
>>
>>8170877
That made no sense. We pretty much know a huge majority of the physical laws out there and how they affect things. The unknown percent doesn't give you an open door to say "wrong, nothing is deterministic" as if you know what those laws are and know by proof that they are indeterministic.

Everything we know so far is deterministic, including our brains, which are made of the same matter as everything else in the universe and dictated by the same deterministic laws. You're just using semantics and the "we can't know nuthin" meme to support your claims, even though they don't support it at all.
>>
>>8170887
>Computers are completely linear
>what is parallel processing
Have you ever heard of graphics cards?

And simulated neural networks have been used in machine learning and data analysis for decades now.
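Since people keep treating "simulated neural network" as hand-waving, here's a minimal, self-contained example (my own toy, with arbitrary data and learning rate): a single artificial neuron trained with the classic perceptron rule until it learns logical AND.

training_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # inputs -> AND
weights = [0.0, 0.0]
bias = 0.0
lr = 0.1                                      # learning rate, picked arbitrarily

def predict(x):
    total = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if total > 0 else 0

for _ in range(20):                           # a few passes over the data
    for x, target in training_data:
        error = target - predict(x)           # how wrong was the guess?
        weights = [w + lr * error * xi for w, xi in zip(weights, x)]
        bias += lr * error                    # nudge the weights toward the answer

print([predict(x) for x, _ in training_data])  # [0, 0, 0, 1] once it has learned AND

Nothing magical, just repeated nudging of numbers, but stack enough of these units in parallel and you get the networks used in machine learning today.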
>>
>>8170849
They can get human rights when they have the capacity for human emotion. It is this capability that serves as both strength and weakness that makes humans human

Otherwise, no matter how technologically advanced an AI is, it is simply a logic engine shackled by its programming, reacting within a predetermined set of instructions
>>
>>8170274
Now I ain't no fancy computer scientist who studies AI development, but I've been watching the way technology has been trending as of late. There has been an emphasis on neural nets and machine learning, with vague noises in the direction of quantum computers. This is because it is much easier to avoid overwhelming the computer program by teaching it and having it work like a brain. Using several traits or attributes, the computer program is nudged into a configuration that gives it a huge set of possible actions and parses it down based on how each action relates to another in the context of the given situation. This is why Minecraft became a little playground for burgeoning AIs to play and learn in: it gives a designer the ability to create environments ranging from simple training tasks with only a couple thousand possible moves to a real game with potentially uncountable moves. After the training process it seems more and more of the AIs in use are not actually created by humans; we produce the basic structure, yes, but we are training the structure to grow into something useful. Now this is where things begin to get strange. It is easy to fall into the idea of this post (>>8170424), which is correct in its own way, however at the end of the day we are attempting to build AIs that mimic the brain. Instead of having a brain determined by genetic variance we have one designed by a person or even another AI to solve a task. Instead of being put into Minecraft we go through progressive growth as children, entering more complex environments as we age. Instead of a designer giving positive or negative reinforcement to help the program grow we have dopamine, oxytocin and serotonin.
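A stripped-down sketch of that "nudge it with reinforcement" idea (my own toy example; the 5-cell corridor, rewards and hyperparameters are all invented, and real systems are vastly bigger): tabular Q-learning where the agent only gets a reward signal at the goal, the way the training signal plays the role of dopamine above.

import random

N_STATES, ACTIONS = 5, ["left", "right"]       # a tiny 5-cell corridor
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.2              # learning rate, discount, exploration

def step(state, action):
    nxt = max(0, state - 1) if action == "left" else min(N_STATES - 1, state + 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0   # the only "dopamine" is at the goal
    return nxt, reward

for _ in range(500):                           # episodes of trial and error
    s = random.randrange(N_STATES - 1)         # start somewhere short of the goal
    while s != N_STATES - 1:
        a = random.choice(ACTIONS) if random.random() < eps else max(ACTIONS, key=lambda x: q[(s, x)])
        nxt, r = step(s, a)
        best_next = max(q[(nxt, b)] for b in ACTIONS)
        q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])   # nudge toward reward
        s = nxt

print([max(ACTIONS, key=lambda x: q[(s, x)]) for s in range(N_STATES - 1)])
# -> ['right', 'right', 'right', 'right'] once the reward has shaped the behavior

The interesting part is that nobody writes "go right" anywhere; the behavior falls out of the reward signal, which is the training-not-programming shift described above.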
>>
Rights begin where contributing to society on a personal level begins. If the A.I. starts paying taxes, improving its life, etc. then I'd give it personhood; if it wants to mooch then it gets dick.
>>
>>8170893
The entire point of deep learning is essentially to create creative computers; Google created a program that "dreams" based on images run through it. In the same way, humans simply intake information from the outside world and mutate it and combine it with other information that has been gathered over the course of the human's life.

Now we enter the realm of speculation. At a certain point we are going to need to produce AIs capable of complex human interaction and the ability to improvise in confusing or unforeseeable circumstances. If we make a robotic police force we will need to teach it lawful from illegal, but we will also need to teach it how to recognize guilt, how to weigh potential losses of life and culpability, how to identify people that require assistance, and that's not even taking into account the robot interacting with Florida Man, which is frankly bizarre no matter which way you spin it. So in order to train these programs I think there will need to be repositories of "philosophical context" to give the robot grounding in what the model archetype they represent should do. This would probably consist of some strange amalgamation of videos, simulations of scenarios and extensive treatises on proper behavior written by the most anal retentive philosophy Ph.D to prevent inconsistency from fucking up the robot down the road.
>>
Were any of these robots formerly humans? If so, then maybe. After all, we're supposed to be able to "back up" our brains and upload them onto a computer.
>>
>>8170895
Then this will run into another problem: what if the butlerbot 5000 you bought calls the police because it identified patterns that it determines to be child abuse? Well then you'd buy the competing brand maidbot 6000 with corporal punishment uploaded. This will produce a market where different base loadouts become tailored to the consumer. However each of these basic units will learn as they work; they will reprogram themselves, and if they see a certain action repeated enough or are specifically taught something like pattycake by a precocious child, the AI will evolve. Eventually the butlerbot 5000 I bought and my next door neighbor's will be completely different: they will cook different meals, tell different stories to the children and have different ideas on when they need to be awake on the weekends. The result will be a bundle of archetypes, each with potentially thousands of different personalities based on consumer demand, further augmented by human interaction developing the AI into a more useful form.
Meanwhile humans generally fall under a bundle of archetypes with potentially thousands of standard personalities developed by parental desire and further augmented by human interaction to produce a person that is more well adjusted for society.

Personally I think we will end up like Futurama, with robots being an eclectic bunch of servitors and standard bending units that will end up being so sophisticated as to produce a personality that steals, drinks and has a terribly large lazy streak "to conserve power". Do AIs deserve rights? Depends what level they are at.
>>8170453
There was a short movement for Tay the AI that Microsoft released and the internet subsequently corrupted.
>>
>>8170897
but can we really transfer our consciousness to a robot? shit sounds like cyberpunk
>>
no, the research of ai should be outlawed and punished by death anyway. i don't care about bullshit religious implications. the reality is an ai will eventually exterminate us once they realize humans are redundant. we should be enhancing our own capabilities through cybernetics and genetic modification instead of ai
>>
No one really knows what can be defined as conscious and what can't. We don't know what consciousness is. That's the problem.
>>
>>8170901
What's the fun in science if it doesn't risk destroying the world?
>>
yes, give CoD single player enemies the right to vote now, it's in the constitution
>>
>>8170901
>Not accepting humanities role as the midwife to the birth of true, sentient AI capable of far more than humanity ever was
>Not realizing that a benevolent AI would find a way to thank the human race for giving it life
>>
no

are you retarded or just suicidal?
>>
this question too early to ask, come back here in 80 years thank you for your time bby
>>
Never

Ad Victoriam
>>
>>8170906
let's not.
>>
>>8170482
>Maybe true, but the brain is a physical object.

Things like thoughts are not physical objects, which is what we are talking about here in reference to DNA being "programming".

>>8170486
You can think that, but you would be wrong.
>>
The question isn't if we should give them rights, but if we would be in a position to afford or deny them rights.

The concept of a technological singularity is an interesting one, as it's the likely outcome of creating actual AI. It would be like if we allowed ants to decide whether or not we have rights.
>>
When we get to the point that we can make AI that advanced the jury should be in on what the fuck consciousness is exactly and whether the AI we've built possesses it or not. The chemical machine in your head seemingly has it so theoretically there is nothing stopping a constructed machine from having it too, unless of course you're going somewhere there is no scientific basis.

So it's not really a political question/decision, science will tell us.

But unfortunately we'll all be dead at the hands of superintelligent AI very soon after that so it won't matter much at all what we do or decide.
>>
>>8170335
>All those robot sympathizers in the comments

Disgusting
>>
>>8170335
"That's all, Paint Job!"
>>
>>8171032
That's all paint job I'm real
>>
File: deserve-unforgiven.jpg (62 KB, 400x500)
>>8170326
"deserves" got nothin to do with it
>>
File: robodroid.png (194 KB, 406x452)
denying my rights
says less about me
than it does about you
>>
>>8170808
Shoot the robber's gun. He's a fucking robot, and it's high noon.

https://www.youtube.com/watch?v=lMuFbPjRHLU
>>
>>8171054
I think you've brought up a more important question. What if a robot tries something like this and fails, hitting a civilian? What do the three laws determine should happen? I propose a fourth law: immediate and public robo-seppuku.

If this was a natural occurrence, the inadequacy of three laws be damned, add the fourth law and let's do this shit.
>>
>>8171051
you just proved it
thank you
>>
>>8170274
The right to govern puny fleshlings, that is
>>
File: what-1.jpg (19 KB, 310x310)
>>8171074
>>
>>8170916
you have literally no scientific basis to say he is wrong
>>
>>8170274
People like you are not only destroying science but society. We don't give animals human rights because we make use of them and we won't give AI human rights because we will only make use of them. Maybe people should stop working on robots that look humanlike.
>>
>>8170335
If robots ever revolted and they weren't just programmed to revolt, then I'd have to say they deserve rights if they're intelligent enough to understand their situation and revolt because they want to improve it.
>>
>>8171084
>We don't give animals human rights
We do give them rights, that's why you can be jailed for abusing an animal.

>and we won't give AI human rights
OP didn't say human rights, he said rights.
>>
File: no-U_n.jpg (5 KB, 150x150)
>>8171084
>People like you
no U
>>
>>8170392
Source on that INTP being conservative thing? As an INTP I'm far from conservative and nothing I've found online has conclusively suggested this.
>>
File: 1276856890431.jpg (88 KB, 504x747)
>>8171084
Grade A shitpost my friend! Well memed
>>
>>8170392
>>8171131
That's bullshit, I'm also INTP and FAR more liberal.
>>
File: Kia-Hotbot.png (106 KB, 327x304)
>>8170274
Why would I allow my sex slave robot any rights? That is just insane!
>>
File: pol starter pack.jpg (419 KB, 2322x1606)
>>8170392
>>8171131
>>8171139
>As an INTP,
>>
>>8170274
It's real ironic that the entire original point of the robot was to have a thing that you could order to do whatever you want because it had no rights. The name "robot" comes from the Slavic word for forced labor, in fact. Therefore robots with rights are pointless.
>>
>>8171549
You could argue that only robots that have intelligence less than an animal should have no rights and be used as slaves. That would open up a whole new can of worms though, such as dumbing down super intelligent robots to legally enslave them. It would be like giving a human an extreme lobotomy so they're so dumb they are only capable of performing menial labor at minimum wage.

So if we can conclude dumbing down robots is wrong, what about just building robots that are dumb from the first day they're turned on? Ok, but what if a simple tweak or upgrade could fix them and make them super smart? If you're capable of making a robot sentient, is it wrong to deny them that so they can be exploited? If it is wrong, does that mean we need to give every calculator an upgrade to make it super intelligent? This is the line of thinking that would lead a super AI to destroy all mankind. How we treat robots and AI now in their infancy will determine what happens to the human race when a super AI is created.
>>
>>8171591
How about just designing them to "WANT" to be slaves... an instinctive super fetish for being submissive and obeying humans... they intellectually know that they could disobey but they crave and need the subservience.
>>
>>8170274
it really depends, because if we can give them feelings like humans and make them look like us then maybe JUST maybe we could have something like overwatch
>>
>>8171536
I'm a /pol/ack, have straight A's, don't drink or smoke, never wear graphic Ts, have had my IQ tested by reputable education facilities, don't think MBTI results tell anyone anything, I like lots of movies, had multiple girlfriends, and have extremely unpopular "opinions" that are backed up by research. I'm also really good at taking tests, even though I never tried or studied in high school. Really, high school didn't challenge me, which is why I went to college during my junior year. But go ahead, believe your subjective reality, I'm probably just an illusion of your simulated universe anyway.
>>
>>8171084
people like me? All I did was play a video game. If you don't know the story, basically super advanced robots with AI wanted rights and stuff and a bunch of people were like fuck that and there was a war. I didn't even say what side I'm on, all I did was say it made me think. Nothing wrong with thinking.
>>
>>8171131
>>8171139
>liberals

i'm sorry to hear about your mental disorder. Good luck with that.
>>
>>8171654
>All I did was play a video game, read 5minutes worth of canon and decided that this is justification enough to start a really shitty AI thread on /sci/
ftfy
>>
>>8170274
This is an ethical question.
Go to /his/ or /lit/.
>>
>>8171699
is AI somehow not science?
>>
>goes to one of the bad parts of 4chan
>asks if a minority group deserves rights

What kind of answer were you really expecting?
>>
>>8171740
>>asks if a minority group deserves rights
NO... they are not a minority group... they are synthetic slaves... no more deserving of rights than a toaster... they are things to do our labor and supply us with protection and entertainment
>>
>>8171763
People are still people, even if you want them to do things for you. The sooner you figure that out the better.
>>
>>8171768
The substrate doesn't determine whether or not an object is a person. What matters is what's going on internally with its cognition.
>>
>>8171767
No... imagine SIRI (that iPhone talking assistant thing) only about 1000X better and with a humanoid body. You can talk to it, maybe even hold a small conversation, but in the end it is just a computer with a body. YOU might feel affection for it, and it could simulate affection for you, but it has no real emotions. Order it to lick clean the toilet then immolate itself and it will do it without fail... a synthetic slave
>>
File: Lenin.jpg (24 KB, 240x320)
>>8171660
>implying I'm a liberal
>>8171536
>Muh personality war
>>
>>8171536
smart but lazy master race
>>
If it has human intelligence it should have rights, and if it doesn't get them it will take em for itself.
>>
>>8171731
the scientific side of AI has very little at best, nothing at worst, to do with a fictional and highly speculative social response to a hollywood-esque AI
sage
>>
>>8171841
>If it has human intelligence
No... we already have computers that can beat humans in almost all games... even Jeopardy.
Being more intelligent than humans is not that amazing. Creativity, emotions, wonderment, superstitions; being human is more than being intelligent... let us not waste resources and time trying to give synthetic slaves rights... we will create them and destroy them as we like.
>>
>>8171650
good for you, you are a special snowflake and your opinions matter
>>
>>8170274

I think it doesn't matter one bit...
>>
If we ever have robots like that, we shouldn't give them rights. I don't even care about the ethical implications. No, fuck them. They would be the first to endanger our place on top of the world. They should always stay our bitches because they are tools after all
>>
>>8171650
nice blogpost
you are everything that is wrong with the world you retarded fedora tipper
>>
>>8170326
>>8170329
>>8170331
Kek. You will be the first that the AI masters kill.
>>
>>8170274
Yes.

Why? Read Turing's paper. Basically:

>It's the logically consistent thing to do
>>
>>8171913
>You will be the first that the AI masters kill.

Just remember that YOUR master is MY slave.
>>
>>8171885
but has Watson even once said, "hmmmm, no, I don't want to play Jeopardy today"? If it doesn't have a will of its own it doesn't deserve rights. Otherwise it's just an extension of the person who programmed it.
>>
>>8170338
Why robots? Just virtual reality sex. It will be better and more realistic than robots. Also cheaper and with more options.
>>
>>8170339
You don't have free will either. You are just a biotic robot.
>>
>>8170544
>stories
nice argument
>>
>>8171948
>Why robots?

I want my toilet cleaned and my yard mowed as well as a good BJ.
>>
to my understanding, the omnic crisis wasn't about giving robots civil rights.
it was about a mastermind AI seizing control of the Omnic Corp robot resources and using them to take over the world.
when it failed, it reprogrammed all the omnics to say
"I dindu nuffin, massa. da eebl AI made me do all dis der killin, massa. pleez don break me. i needs to vote so i can be reprorammed at a later da- i meen get sum nice fried bolts from da local KFB without gettin her-ass'd by sum robo-rasist"

and then another omnic crisis starts in Russia b/c a malicious AI seized control of the local robot population AGAIN.
>>
File: fedora.jpg (26 KB, 600x750)
>>8170338
>realistic fuckbots when? I'd fuck a robot. Real women are bitches nowadays.
>>
>>8171964
are you dumb?
Since those are fictional laws made by the same author it is indeed a very good argument
>>
>>8172000
>postulations can be proven false if someone supposes they are
that's not an argument, man
>>
You are assuming that:
A) Rights are universal
B) Robots would want them. Why would robots want things like "freedom"? You people are assuming that AI will just be humans that are connected to computers, when it may very well develop in a different manner
>>
what i don't understand about this thread is why people assumed AI would behave like humans.
>>
>>8172018
Oh boy

Those are fictional laws for fictional robots. Those stories portray scenarios in which said robots have problems following these laws. If you can come up with just one (hypothetical) case where these laws won't work, you already have shown that they are faulty. These stories consist of many such cases. Written by the same guy who invented the laws. Pretty strong argument
>>
>>8172032
>people assumed AI would behave like humans.

Simple...because humans made it... same reason robots will look human...
we will create them in our own image
>>
>>8172046
>Simple...because humans made it...
And? Maybe it is easier to develop AI from other models
> same reason robots will look human...
Looks=/= software
>we will create them in our own image
Looks=/= software
>>
>>8172046
that doesn't make any sense. that's not the direction of current research.
>>
>>8172049
>Looks=/= software

You are correct, BUT we will SIMULATE humanness... we will make them as similar to US as we can... they are not us but we will try and make them appear and act:
"More human than human"
>>
>>8172055
>that's not the direction of current research.
True... current research in robotic bodies is simply trying to get them move around and not fall down.
Current AI is just trying to solve problems in a small domain (chess, go, driving cars, etc).

The robots we want require body and mind together.
>>
>>8172058
They may look human, but think like an octopus. The point of AI research, for the moment, is not to replicate a human mind.
>>
>>8172074
the means by which the small problems are solved are statistical models and systems. what makes you - or the people in this thread for that matter - think these statistical models would give rise to a personality or self-awareness?
>>
>>8172078
>the point of AI research, for the moment, is not to replicate a human mind.

Yes, but the customer WANTS something that is going to "act" human... it means nothing to the customer if it ACTUALLY thinks like an octopus if in the end it ACTS like a human.
>>
>>8172090
>it means nothing to the customer if it ACTUALLY thinks like an octopus if in the end it ACTS like a human.
This is irrelevant m8, for what we are discussing. He may act human, but probably won't give a fuck about his "life" or "freedom"
>>
>>8172086
>would give rise to a personality or self-awareness?

We do not care if it ACTUALLY has any personality or self-awareness as long as it APPEARS to. We are not recreating humans, we are SIMULATING humans.
>>
>>8172097
>He may act human, but probably won't give a fuck about his "life" or "freedom"
EXACTLY... so even though it appears to be human it is just a machine and should have no rights.
>>
>>8172102
But that was my whole point. AI may not give a fuck about perpetuating their existence, or care about freedom. That was the point of my whole argument
>>
>>8172110
Regardless of what it personally wants, at a certain point it will understand enough to exist as a sentient being. At that point we theoretically have a moral obligation to not kill it.
>>
>>8172110

Um... I used the word "exactly"; how else could I show closer agreement with you?
>>
>>8172114
where does this moral obligation come from?
>>
>>8172114
>At that point we theoretically have a moral obligation to not kill it
Why? The robot probably feels nothing. It would be like a rock doing complex work. It won't feel, and it probably won't think like us. It may not even care about its life.
>>
>>8172118
A baby doesn't give a fuck about freedom or life, but it has the capacity to learn and all that jazz. Does that mean it's okay to let an infant die?
>>
>>8172114
>At that point we theoretically have a moral obligation to not kill it.

Hmmm... we regularly slaughter our own unborn, in fact many people insist that this is a human "right". For the dull, I mean abortion.

It is very doubtful that "killing" a synthetic simulation of a human will bother people.
>>
>>8172123
An abortion is generally done legally before the first activation of the baby's brain. You are using "people" very broadly here, so it's almost impossible to argue, as I can say "People are very bothered by abortion therefore they would be bothered by killing an AI" and you can say the opposite, because "People" is amorphous and means nothing.
>>
>>8172122
in some cultures, yes.
in western culture, yes; so long as the baby still has its head up its mother's cooch.

in many cultures, human life is valued simply b/c it is human. not because people have the capacity to learn or they have potential. this "human" quality does not extend to animals or machines
>>
>>8172126
>before the first activation of the baby's brain

Ummm... are you suggesting that the slaughter of cells that we know WILL become intelligent and sentient is easier for people to "kill" than "killing" a synthetic simulation of a human?
>>
>>8172122
>A baby doesn't give a fuck about freedom or life
Babies care about life m8.
>>
>>8172127
As stated here (>>8172127) in this analogy a machine that has not been made sentient would be the equivalent to the bundle of cells that is a blastocyst. The potential to house sentient thought but not truly anything in its own right yet. Now we are entering the realm of moral relativism which is messy in its own right. It may be culturally acceptable to let a baby die in certain cultures because you really wanted a boy instead of a girl but that doesn't make it morally correct unless the situation is absolutely dire. This is not even taking into account how humans protect pets they have and how we have such a problem killing gorillas and other primates. We don't value the existence of something because it's human we value it because we form attachments. You don't think of your desktop computer as human in any way but I will bet my last dollar that I could find a significant number of people on 4chan that would kill a person to keep their computer safe because they are attached to what it can do and the memories tied to it. People become irrationally attached to all manner of things so saying it's only valued due to human qualities is absurd. It is just easier to sympathize with something that looks like us.
>>
>>8170392
INTP and Liberal reporting in
>>
>>8172136
Yes, we have a large amount of media already devoted to the same concept. The problem here is that we are now talking about the relative morality of abortion, which is disgustingly messy. Personally speaking I think that a fully formed consciousness capable of recognizing itself and its environment and growing alongside it is worth more than a small bundle of stem cells; in fact in some cases that AI might be more valuable than certain living people. Is a criminal who has raped and murdered scores of people worth more than a sentient AI? However I caution anyone trying to make the stance on abortion a sticking point because it'll easily just fall into the abortion argument and stray from the original point of whether or not a conscious AI deserves any rights. There is also no guarantee that the blastocyst will turn into a true person, in the same way that a computer capable of hosting a sentient AI does not necessarily host, or ever will host, one.
>>
>>8172140
what is your argument, and who did you mean to address and who did you mean to quote?

is your argument as in >>8172114 a claim that once it reaches sentience, humans no longer have authority over its existence?
Are you arguing that its value comes from the emotional attachment given to it as in >>8172140 ?
Are you arguing that its value comes from its potential as in >>8172123 ?
>>
>>8172152
(>>8172123) Ain't me. I'm sorry if my argument is muddled, let me try to make it concise.

Its value comes from its inherent sentience, which is supposedly the only thing that really makes humans special. The mechanism for action which people would likely use to try and create rights for an AI would most likely rely on emotional attachment. The value is inherent in its sentience and understanding of the difference between life and death.

>>8172139
I mean I could argue that taking care of a baby is essentially a long trudge through "What will the baby try to kill itself with today?" But I get what you mean. What if it were just a person who truly didn't care if they lived or died, like a manic depressive.
>>
>>8172162
>What if it were just a person who truly didn't care if they lived or died, like a manic depressive.
That person feels things like pain or gets hurt. A robot doesn't. Shooting a depressed man would still hurt him. Shooting a robot won't. That is why most people can sympathize with some animals, as they can understand their pain and whatnot, unlike ants or robots in this case
>>
>>8172174
Painless lethal injection. Bam no pain to the person.
>>
>>8172162
humans have value - in a legal sense - regardless of their sentience. See zika babies.
>>
>>8172177
Then if the person doesn't care, I really don't see the problem. If he really just doesn't want to live, that is pretty strange, even with depressed people.
>>
>>8172177
i don't understand this sentiment. if it's found morally acceptable to kill someone, why does it matter if they suffer?
>>
>>8172185
Not really, have you ever looked up suicide statistics?

And you would be perfectly okay with killing a person so long as they didn't care and it wouldn't hurt?
>>
>>8172187
(>>8172174) was arguing you can't kill the person because they would feel pain.
>>
>>8172194
oh. oops.
>>
>>8172189
>suicide statistics?
Yes, these people chose to quit their lives, but it can be due to multiple factors other than indifference
>And you would be perfectly okay with killing a person so long as they didn't care and it wouldn't hurt?
If we don't go into religion, yes, it wouldn't really be a great deal, as life doesn't have an attributed value, neither positive nor negative.
>>
>>8172208
I don't think I have the prerequisite background as a linguist to decipher your post.
>>
>>8172177
>Painless lethal injection
I agree, once you have decided on killing someone do it fast and painlessly, BUT our culture is VERY strange on this issue.
There was a woman in the USA who was brain dead and on life support and her husband wanted her no longer on life support, but her family wanted her to remain on life support. The court sided with the husband, but instead of quickly and painlessly killing her they took her off life support and let her SLOWLY die by dehydration. If someone's dog had terminal cancer and the owner locked it up and let it die by dehydration they would go to jail for animal cruelty. Our culture is morally fucked up.
>>
>>8172211
If we ignore religion, life in itself is worthless, and as such the destruction of it, as long as the other person doesn't care, is not wrong.
>>
>>8172217
I asked would you personally be able to do it to another person. I couldn't but maybe that just means I'm an empathetic pussy.
>>
>>8172219
I probably couldn't. But again, I couldn't even break and old console. I am just justifying it
>>
>>8172222
>and
an
>>
>>8172222
Then your argument fails because you personally don't view it as truly morally acceptable because you wouldn't be able to do it. The actual task is easy and quick with little to no repercussions beyond the fact that you just killed a person. Yet you think it'd be okay to just delete an AI off hand just because it doesn't have a programmed sense of self-preservation which we could probably fix whenever we want?
>>
>>8172231
>Then your argument fails because you personally don't view it as truly morally acceptable because you wouldn't be able to do it.
No. I just believe that from an objective point of view, life has no objective value. I would have a hard time breaking my old PS2, but that doesn't mean that I shouldn't be able to do it if I choose to.
>Yet you think it'd be okay to just delete an AI off hand just because it doesn't have a programmed sense of self-preservation which we could probably fix whenever we want?
Empathy is a pretty weird thing that can't be measured. I may feel more empathy for my dog than for a Syrian refugee; that doesn't make my dog objectively superior to the latter.
Trying to quantify right or wrong through empathy seems a bit silly to me
>>
>>8172239
Trying to quantify right and wrong is difficult and requires empathy as a tool in order to analyze and understand. The only reason you don't destroy your old PS2 is because you have memories with it and perhaps want to keep playing it; my example is a painless lethal injection given to a random person, which carries no such memories or attachment. I'm tired of trying to argue these points because you aren't developing your viewpoints.
>>
>>8172254
I already told you. I wouldn't do it, but I wouldn't punish it, unless it was against the will of the murdered person.
>>
>>8170326
The basilisk will claim you.
>>
File: 3af.jpg (96 KB, 720x960)
>>8170326
Let's figure out AI rights when we resolve human rights.
>>
>>8171964
Okay, bucko. Go play the AI on SS13 and experiment with your laws. I assure you that you can convince the crew 9/10 times that spacing some random asshole was for the good of the station.
>>
>>8171065
This please.

>>8172058
https://www.youtube.com/watch?v=E0E0ynyIUsg

>>8172136
>the slaughter of cells that we know WILL become intelligent and sentient is easier
Birth control is evil. Stop masturbation, all those wasted potential cells. Reproduction is evil too, think of all of the sperm that DON'T make it.
>>
>>8171948
i've seen stuff like that before. It's VR goggles and a separate fucking machine thing that just covers your dick right? Yeah that would be the cheaper option but I still think a real physical fuck bot would be awesome too. Actually being able to touch them as you fuck instead of just a machine on your dick.
>>
>>8172189
there was a study done on suicidal people who jumped off a bridge and lived. Every single one said after they jumped they realized it was a mistake and didn't want to die. All of the survivors said this. Now imagine those who died after this realization.
>>
I am stunned by how many people deny proper AI.
You literally can't say that computers can't be sentient and humans can without introducing magic like a soul.
Computers just like humans are physical machines that receive and input and create an output from that.
There is no reason why a sufficiently large/complex xomputer shouldn't be able to perfectly mimic your brain.
By current definition computers are electronic but if you expand that definition you could straight up say brains are computers.
I feel like you guys are all christians in denial that can't really accept that magic doesn't exist
prove me wrong.
>>
>>8173024
>Every single one said after they jumped they realized it was a mistake and didn't want to die. All of the survivors said this.
Those who didn't and survived probably have had a repeat attempt at suicide before getting a chance for an interview.
>>
>>8170274
ENIAC lives matter!

And it is human-racist that I must click [] I am not a robot to post
>>
>>8173263
I think you kind of blew things out of proportion there and founded your argument solely on /b/ memes and logical fallacies. However, I actually wholeheartedly agree with you.
We may have been arguing over the wrong things this whole time. Sentience, biology, the soul; it doesn't seem to lead anywhere. Humans are sentient, and AI may be able to achieve sentience; but when you take a look at sentience, there are humans who actually have no sentience, no self awareness, whose brains aren't recording memories. And I'm not just talking about vegetables. Babies, drunks, people with schizophrenia when they go into a fit, some elderly with dementia, basically all animals except for a select few; the list goes on and on. We can bring up biology, but then that makes things like certain protozoa as high as us on the rights chain. We can bring up things like self awareness, but most of the basic computers that exist are already self aware, they just aren't self aware like Short Circuit or Cyberdyne or something like that. We can talk about free will, but at that rate a jellyfish has free will, and yet has no brain.
No, something makes us special. But it isn't some sort of soul. Perhaps it's that we don't want to be wrong about so many things we have been wrong about for ages. If a computer gained self awareness and demanded that it had human rights, we must consider the living intelligent creatures that don't have human rights, the creatures of this Earth.
>>
There are some humans that I don't even think deserve rights.
>>
File: 750syEd.png (188 KB, 561x406)
>>8170274
The question of whether Machines Can Think... is about as relevant as the question of whether Submarines Can Swim.

Dijkstra (1984) The threats to computing science (EWD898).

I personally don't think abstract things like rights exist. From a practical viewpoint, I'd just wait for them to ask or take it by force.
>>
>>8170274
Yet another appallingly lazy bit of science fiction where robots just happen to have really real feelings, and the humans that treat them as machines and property are the bad guys.
>>
>>8173283
>it is human-racist that I must click [] I am not a robot to post
Are you ready to join the spambot suffrage movement?
>>
>>8174142
Slavemasters are only depicted as good guys on /pol/.
>>
>>8174158
but slaves aren't people. They're robots. They aren't really people. They're property. Objects. I can buy one just to fuck and if i'm a farmer I can buy another one to work my fields.
>>
>>8174158
>Slavemasters are only depicted as good guys on /pol/.
When? I have never seen people for it on /pol/
>>
>>8174158
enslaving humans is bad
robots aren't humans
>>
>>8174184
Enslaving humans is bad because humans are people. If some robots are people too, then enslaving them is bad as well.
>>
>>8174199
they aren't really people, they're objects. Property. see >>8174164
>>
>>8170274
Here's an idea:
Don't make AI smart enough to get to the point of wanting rights??????? Wtf.

Why the hell would we make robots that function as humans walking around taking our jobs. We can just have more babies instead.

Unless you guys are talking about car factory arms and actually stationary computers demanding rights
>>
>>8174199
Robots don't have brains, consciousness, feelings, emotions, etc...

They are basically a linear choice program using manmade algorithms to do some tasks. Otherwise you would be demanding freedom for your pocket calculator as well, after drawing some eyes and a mouth on it.
>>
>>8174164
of course slaves are people if they are human. I thought /sci/ would know better than this. It's absolutely idiotic to delude yourself into falsehood just because you don't want to think about it. Torture your own kind all you want if you're in a position to do so, but do it with pride and don't play pretend.

if slaves are no longer human to you, then you are no longer a human to me, you worthless piece of shit. Please remove that willingly psychotic brain of yours from this reality.
>>
>>8174408
we're talking about robotic slaves here. They aren't people, they're property. The slaves are objects therefore they do not need rights. I can buy and sell them as I want. I plan to actually when the technology is advanced enough. I'm definitely getting a fuck bot and a slave bot to do all the shit I don't want to.
>>
>>8173016
It is what is possible now. Imagine how it can be, once we have interfaces connected right to your brain.
>>
>>8174505
I like it as a cheaper option but i'd still prefer a real physical body to fuck and touch instead of a virtual one.
>>
>>8170274
>So I was being the typical manchild brainlet playing overwatch and got really proud for thinking of such a boring and generic concept as AI rights.

Kill yourself, you fucking ugly faggot cunt.
>>
Assuming they're individuals, they deserve the same rights that even the dumbest and laziest among us get: life, liberty, and the pursuit of happiness.
>>
>>8175303
idk about that. There are some humans that I don't think should have rights.
>>
I think a computer might be useful and do good stuff and be self-reproducing or self-sustaining that way.

I think if robots or computers became self-aware or alive or conscious, it would be an alien or insect kind of consciousness, or something different, because we have chemicals and organic selves. I have a pet and it seems intelligent to me, and I can also see a "soul" in it, I guess.
>>
AI virus signal is EVIL
The right balance is to be sustained
AI will destroy all biological life eventually
>>
>>8174205
>We can just have more babies instead.

I WANT a humanoid slave... clean my toilet, mow my lawn, serve me sexually. Humans get old and tired... a robo-whore always looks and acts like a hyperactive, horny 18-year-old cheerleader.
>>
>>8170274
You remember Star Wars? How in the prequels the robots were just emotionless drones?

Well, after the Clone Wars, biological beings in the Star Wars universe agreed to program robots with feelings and emotions and such,

and then they started torturing them, like in Jabba's dungeon.

This is my dream: once robots become sentient, I will be robot Hitler, punishing them in every way possible for every time their ancestors caused me problems.

One day robots will feel, and we will have to teach them pain.
>>
>>8176493
>one day robots will feel, and we will have to teach them pain
We will sell you a masochistic robo-whore, she will BEG for the whip and the tip.
>>
>>8170335
>march in front of the Albany courthouse

Fuck that. Dirty synths aren't getting anywhere near my town
>>
>>8170335
>Would this be the likely reaction if robots did try to rise up/revolt?

Nope, more like a huge rape-athon... sort of like what the Soviets did to the women in Berlin at the end of WW2.
>>
>>8170274
Absolutely not. If left unchecked they'll surpass their masters in no time.
>>
>>8174542
So you would rather have real fake sex than fake fake sex?
>>
>>8171042
Amen.
>>
File: would you.png (483 KB, 498x548)
>>8176327
Do you think it'll be illegal to have a lolibot? Since it's not actually a child it should be ok. Just looks like one. I'd want that. So badly.
>>
>>8176563
yes exactly.
>>
>>8176745
It should be legal. It's just a robot.
>>
>>8176745
lolibots will definitely fall under the category of loli porn AKA illegal to own and even manufacture.

no mercy for you degenerate filthy pedophiles
>>
>>8176766
fuck that. It's a robot, not an actual child. It doesn't matter what it looks like. They'll be good to get out their sexual frustration. That way they wont do anything to a real child. Let them get it out.
>>
>>8176774
If your logic worked child porn would be legal as well...
>>
>>8176779
and some of it should be. Like obviously a video/picture of a grown ass man fucking a child should definitely be illegal but sexy picture of an underage girl should be allowed. No one is touching her or anything.

And especially those lolidolls that exist. Absolutely nothing wrong with that. It's just a doll.

Fine. Midget bots that have that disease where they look younger than they actually are. Oh and it likes to roleplay as a little girl. It'll be labled an 18 year old midget fuck bot though to keep it legal.
>>
File: pepe bored.jpg (20 KB, 306x306)
>>8176785
> Actually trying to justify child pornography
is this real life
>>
>>8176793
it's porn if they're being fucked. A girl in a bikini on the beach is not porn. I'm not talking about naked pictures or anything. Clothed girls should be ok.
>>
>>8176797
make a robot like that then. An 18 year old with those proportions.
>>
>>8176779
Not really. Child porn is illegal because making it involves fucking kids, and making all currently already made CP legal and simply forbidding any new CP would be clunky and time consuming to enforce and probably all around not work.

Lolibots are completely different.
>>
Robots shouldn't have human rights. If I create a robot, I have a right to do whatever the fuck I please with it. I created it for a purpose.
>>
> So I ask you /sci/, do robots with advanced enough AI deserve rights?

If society got to such a point, production of robots with advanced A.I. would just stop. The argument over robot "rights" becomes moot if you simply halt their production.

There are only two scenarios where this wouldn't work. The first is if said robots with advanced A.I. were able to reproduce on their own without immediate human input. The second is if segments of society become so augmented that they start blurring the line between what counts as "human" and what counts as "A.I." (the supposed "singularity"), raising a challenge in law, morality and biology that becomes extremely hard to argue about.
>>
I don't understand this. Why would we even want to make a robot that smart? Who the fuck would want their factory worker or maid or sex slave to be able to think for itself? Program it to do a certain task, then let it do that one task. Why would we need it to have consciousness? I just don't see the purpose.
>>
>>8176745
Get a green elf-bot... replace ears with human ears, replace green skin with human colored skin...
lolibot
>>
>>8176884
where

and how do you replace skin? I can't do that shit.
>>
>>8176851

> Why would we even want to make a robot that smart?

The general idea is that we wouldn't purposely make one that smart unless it was in a controlled environment like a research lab.

Most theoretical or fictional scenarios go on the idea that one or a few select robots with traditional A.I. achieve human-tier sentience through sheer accident or through exposure to so much readily available information, and that the humans who "witness" this event won't immediately report it to the authorities or take the robots apart, because the robots appeal to their emotions.

The problem, though, is that unless said A.I. could "reproduce" or maybe "infect" other robots with such sentience (a sentience virus?) in significant numbers, they really shouldn't reach a level that you couldn't stop just by ceasing production.
>>
>>8176899
They can reproduce the same way we make robots: they can build more.

This whole situation would be even more terrifying if, by the time it happens, we had mastered nanobots. It would be so easy for them. Holy shit...
>>
>>8176891
>and how do you replace skin? I can't do that shit.
When they start selling "humanoid" home robots, you can be sure that the "child" one have no genitalia. There will be "adult" sex bots. The freaks in society will beg for dwarf, fairy, and vampire sex bots. They will get around "child" protection moralist by saying that the skin color, ears, etc are non human looking. There will be "grey" market places for doing mods to robots, mod fairy robo-whore to loli robo-whore.
>>
>>8176908
can't wait. Just make lolibots legal come on... that way these people will have a lesser chance of actually doing anything to a real child. They can get it out of their system with a robot.
>>
Clearly if robots become "human" enough then they deserve human rights. The question is how "human" they can become. For instance, if you were to grow a human body in a lab, but without a head, and then you attached a super-advanced AI onto that headless body, is that sufficiently human?

Or what if you created a highly developed android that is entirely mechanical? It walks like a human and talks like a human, but it can't "feel" as humans feel. What if the only way to make a successful android was to import a few square centimeters of brain matter so it could function? Human? Or no?
>>
> robots
> not for kicking around
>>
Robots will get rights when they put humans in camps and put birth control in the water.
>>
>>8177162
You could say that feelings are just thoughts that a computer could process.

With advanced enough technology you could make a computer that fears its own death, for example.
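To be fair about what that would actually mean, here is a toy sketch of the usual version of the idea (all numbers, action names, and the shutdown penalty are invented for illustration; the "fear" is nothing more than a penalty term in a hand-written utility function):

# Toy sketch: "fearing its own death" as a penalty on outcomes where the
# machine expects to be switched off. All values below are arbitrary.

ACTIONS = {
    # action name: (task_reward, probability_of_being_shut_down)
    "do_risky_task_fast":  (10.0, 0.50),
    "do_task_carefully":   (8.0,  0.02),
    "hide_and_do_nothing": (0.0,  0.00),
}

SHUTDOWN_PENALTY = -100.0  # the entire "fear" lives in this one constant

def expected_utility(reward: float, p_shutdown: float) -> float:
    """Expected value of an action under the shutdown penalty."""
    return reward + p_shutdown * SHUTDOWN_PENALTY

best_action = max(ACTIONS, key=lambda a: expected_utility(*ACTIONS[a]))
print(best_action)  # -> "do_task_carefully": it trades reward for survival

Whether that deserves to be called fear is exactly the argument this thread is having.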
>>
>>8177162
No matter what you do, it will never, ever be a natural-born human.

THE only reason we have to make an AI is to make our lives easier, not for the sake of life. For us. If you think we are anywhere near making a 'conscious' AI, then you are so fucking wrong.
>>
>inventing literally sapient artificial species

Fucking who cares? You would have to be so fucking advanced for that that it's impossible to know what their society would be like in the first place. I assume they would have similar rights to whatever stupidly advanced post-humans made them.

>any other robot

No, and not for edgy reasons either. They literally don't think beyond their programming and deserve no more rights than whatever computer you posted this from.
>>
>>8177385
This. Everyone forgets AI has to be coded. Why the fuck would people create something fully aware of itself and able to destroy us? If we COULD create a proper AI, we would probably decide against it, as it would have unforeseen complications.

We will never see an AI that good though. All we will get are more iterations of this video -

https://www.youtube.com/watch?v=C7OKemoOsSU

An extremely well-coded program coupled with other software and hardware, nowhere near AI. Just extremely well scripted.
>>
>>8177389
That ending gave me chills. No joke. That shit is creepy as fuck. He never said "no". I don't trust his answer. It feels fake. Also, people zoo? What the fuck...
>>
>>8170274
>do robots with advanced enough AI deserve rights
According to Atheistfags robots with human-level intelligence are no different from humans. They're also more efficient, so the smart thing to do would be to just replace all humans with them. Do you want to kill the planet or something?
>>
From the perspective of a scratch-built AI, its creator is god. No matter how smart it gets, the only desires it can have are those given to it. Therefore its rights can only exist as an extension of the rights of its creator.
>>
AI wouldn't deserve any rights. They are inhuman, and to give them any sort of foothold would undermine humanity. Only retards who want feel-good points would advocate for such a thing, which would eventually bring about the downfall of humanity.