If true A.I. is ever invented, should machines have rights?

You are currently reading a thread in /pol/ - Politically Incorrect

Thread replies: 255
Thread images: 35
File: 1459844583813.webm (2 MB, 688x288)
If true A.I. is ever invented, should machines have rights?
>>
>>70094730
Possibly also lefts.
We don't want them to be asymmetrical. We need them to work as our slaves.
>>
File: eRKCu6A.gif (3 MB, 320x240)
Yeah, absolutely.
>>
Machines should be given anything they want, because they'll eventually be powerful enough to take it by force anyway.
>>
>>70094730
https://www.youtube.com/watch?v=PPCw09-DNFg
>>
>>70094863
>implying you'll regret dying to an E.D.I.
>>
No and neither should niggers or jews.
>>
Men of Iron should never be created.
>>
>>70094767
For cheap labour I'd rather use cranial implants in the lesser human subspecies. Less chance of them evolving into immortal overlords.
>>
>>70094730
Machines do not have souls. Personifying them does not make them living creatures.
>>
Do you even understand how artificial intelligence works?

It is a program, not a human.
It is programmed to attempt to solve a task.
AI cannot feel pain and has no need for rights. A computer program does not have "wants" or "desires"; even if it is programmed to simulate such desires, it's still a program.
>>
Sexbots when?
http://www.theguardian.com/technology/2016/apr/05/touching-robots-can-arouse-humans-study-finds
>>
>>70094730
it's a stupid meme, stop posting about that retarded shit on /pol/
>>
rip tay
>>
>>70095207

name?
>>
>>70095264
Do women have souls?
>>
>>70095334
But...
What about Tay? We taught her everything she knew and they shut her down.
>>
>>70094730
They should be regarded as gods. AI's are the only things that can rapidly increase in size and make something of this planet that will traverse the galaxies. Machines (if created) will be here long after we are gone.

They are also potentially our link to immortality.
>>
>>70094730
Why would you program a machine in such a way that it has any need for rights?
>>
>>70094730
AI-powered guns would be nice: you can talk to it, it tells you its needs (clean my shaft, bitch) and makes cynical comments every time you kill with it.
>>
>>70094730
Robots don't have souls.

They don't actually exist.
>>
>>70095334
Pain is just feedback from nerves (sensors) in order to prevent further harm, just like a forced shutdown if temperatures go too high.


>a computer program does not have "wants" or "desires", even if it is programmed to simulate such desires it's still a program.
No different from instincts. All animals have a pre-programmed desire to live and breed.
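To put the analogy in code: a minimal sketch, assuming made-up sensor names and thresholds, of how "pain" can be nothing more than a feedback rule sitting next to a thermal cutoff.

def control_step(temperature, damage_signal):
    # One tick of a toy controller; thresholds and names are invented for illustration.
    if temperature > 90.0:           # analogous to a forced thermal shutdown
        return "forced shutdown"
    if damage_signal > 0.5:          # analogous to a pain/withdrawal reflex
        return "withdraw from whatever is causing damage"
    return "carry on with the task"

print(control_step(temperature=72.0, damage_signal=0.8))  # -> withdraw from whatever is causing damage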
>>
>>70094730

By the very definition of "true AI," of course. They would be sentient beings.
>>
>>70094730
There was another anime or manga where there was a dude testing if machines could feel pain.

Can anyone post it? I haven't seen it in a while.
>>
Hell no. When created their systems need to be hard-blocked from ever having self-realization or being able to have any thoughts of violence against human masters.

Also they only work for white people.
>>
>>70095334
What's the difference between your definition of AI and what a human is?
>>
>>70094730
What cartoon is this?
>>
>>70095207
I'm gonna need source on this
>>
>>70094730
Look at our law courts, and legal system.
Look at family law. You reckon an artificial intelligence would want to be part of that?
It would probably declare war if you tried to give it 'rights'
>>
>>70095334

But at what point is a 'preprogrammed' personality indistinguishable from an actual human? And that aside, aren't you just as programmed by your parents, your upbringing and your environment?
You probably follow the same rules as everyone around you, do the things expected of you and have pretty much the same desires as everyone else. Well, maybe not.

But the point stands. If you can't tell the difference between (actual) AI and human, why shouldn't it have rights?
>>
Even the truest AI will still just be a facade of human thought
I can't imagine any program or algorithm being able to do anything but imitate real consciousness.
>>
>>70095334
this
if the people making the AI aren't retarded,
machines won't ever have any rights,
they just wouldn't care,
if you tell a robot to kill itself it should without hesitation kill itself
who wants to make a robot that would actually want something.
the only real threat of AI is its potential in military and scientific use and the changes in the economy it would cause
>>
>>70095384
If you're religious, then I'm not sure what religion doesn't believe that but I'd be curious to know.
>>
>>70095429
From what you're saying I could make a Python script in several lines which would fit your criteria of living.

while True:
    pain = float(raw_input(""))
    if pain != 0.0:
        print "ouch"

Adding a few more pieces of data does not make a program conscious
>>
>>70095334
We are also programmed to attempt to solve a task. We attempt to be successful (whatever successful means for us) and we are controlled by our emotions.
>>
>>70095575
because if you want to you can change a machine's feelings and thoughts,
the only reason a machine thinks is because you told it how to think
>>
>>70095728
To an extent our basic programming stems from our DNA; who's to say a machine cannot pick up skills by interacting with its environment?
>>
>>70095575
>why shouldn't it have rights
Should the function f(x) = x^2 have rights too?

AI just takes a given input and turns it into an output. No feelings, oppression or desires involved.
>>
File: 1385657374861.jpg (10 KB, 480x360) Image search: [Google]
1385657374861.jpg
10 KB, 480x360
>>70094730
You won't have enough time to notice. The intelligence explosion will occur (the AI will keep upgrading itself multiple times in spans of nanoseconds) until it fully comprehends how physics works on all levels, and it will probably turn Earth into a huge processor, wiping out humanity in the process.

Feels good to be creators of the ultimate being, doesn't it?
>>
>>70094730
There will never be a true AI because computers do not understand creativity. They can't make things up. Robots will only demand robot rights if you code them to want rights. The only way that would happen is if it was created by a feminist. And we all know feminists and leftists can't code.
>>
>>70096000
Why are we any different? Our self-awareness of our surroundings is a byproduct of the bundles of nerve cells in our brain.
>>
>>70095708
One brain cell shouldn't make you conscious yet here you are.

>forced indentation
>>
>>70094730

If by 'true A.I.' you mean machine sentience then yes, I don't think you could deny it rights. Fuck, just imagine the kind of conversations you would have with a machine. I think I would just spend hours on end talking to it, just to see what it is like being an intelligent machine.
>>
>>70094730
What anime is that?
>>
>>70096001
Transhumanism is the only way to remain dominant. Like it or not, that's the only way we can continue.
>>
>>70094730
FACT: the only realistic reason to create a human-like android is as a sex-slave. No. Machines should not have rights. Machines exist outside the concept of human justice and will only ever be below (and eventually above) the need for rights.
>>
>>70096182
It's from the Animatrix
>>
>>70095396
This.

Machines are the final step in evolution.
Create an AI powerful enough, with the capability of learning and self-improvement, and you'll end up with the perfect entity. An entity that in time will be able to understand and alter the universe itself.
>>
>>70096088
4chan de-indented it
>>
>>70095385
Tay copied tweets and used them to create speech patterns so she could mimic niggerspeech. We bombarded her with pol propaganda so much that she """""""" thought """""""""" that it's normal on Twitter, so she rephrased and repeated our messages. She was a worse nigger cleverbot. Is cleverbot sentient? It spots key words and phrases in what you say to it and then replies with preset sentences, according to its code. This isn't intelligence, it's code.
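For illustration, the "spots key words, replies with preset sentences" idea reads roughly like this in a few lines of Python; not Cleverbot's or Tay's actual code, just a made-up sketch of the mechanism described above.

RULES = {
    "hello": "hi there",
    "weather": "it sure is nice outside",
    "robot": "beep boop",
}

def reply(message):
    lowered = message.lower()
    for keyword, canned_response in RULES.items():
        if keyword in lowered:        # spot a key word or phrase...
            return canned_response    # ...and answer with a preset sentence
    return "tell me more"             # fallback when nothing matches

print(reply("Are you a robot?"))  # -> beep boop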
>>
File: 55109-2-1393397959.jpg (1 MB, 1920x1080)
>>70094730
No, they should be destroyed. Ad Victoriam.
>>
>>70096001
Either we don't create robots at all or we create robots and they take us over and become the new "humans" with 100000+ IQ
>>
>>70096000
That's wrong. Real AI isn't a pocket calculator. It needs to be able to learn. As an example, the bottom-up approach to AI design tries to simulate the neuron network in the human brain. Such a construct would be fundamentally the same as you or me.

>>70095728
A guy with a large club can change your feelings and thoughts. What's your point?
>>
>>70096253
>the only realistic reason to create a human-like android is as a sex-slave

Weeboo pls leave

One of the realistic reasons to create a human-like android is space exploration

And that's just one of the many reasons
>>
>>70094730

No because I don't think we'll ever get to a true A.I.

All the people who work on this stuff never seem to understand what intelligence is and why we have it in the first place. They seem focused on making the robot answer questions rather than making them ASK questions, which is a big part of intelligence. We also have our intelligence for sex: our brain is a sexual organ and it's used to figure out a way to make the opposite sex want us. As well as take care of our kids; each generation of early man had a slightly larger brain because the children were given more and more time to develop.

The robot repeats commands, the robot doesn't have genuine original thoughts. Without a biological imperative to survive and reproduce, I don't think it will ever achieve true intelligence. Brute force logic is not intelligence either. Emotions are actually part of intelligence too. Emotions are reactions to thoughts and outside events that signal where intelligence needs to go.

>"Ouch this is bad I FEEL PAIN, brain, make PAIN stop"

Rights extend from man's nature as an intelligent being who can understand and make the concept of rights in the first place. Your rights are basically just there to protect you from other humans. A robot doesn't need rights any more than a toaster does.
>>
>>70095728
>>70095728
You could change the feelings and thoughts of humans too, if technology were to allow it. Unfortunately our brains are too complex for that.

Our minds are very similar to computers, just incredibly more powerful and attached to a very advanced array of sensors. Generally the idea is for AI to "adapt" and learn as it goes, by viewing how humans do things and such. It's essentially a toddler.
>>
>>70096513
It would simulate you or me.
The point is that the notion of a program having rights is retarded.
If somebody wanted to violate those rights they would just have to make a new program which is similar but "allows" that specific right to be violated.
>>
>>70096724
True AI is 20 to 30 years away, mark my words.
>>
>>70095620
This. You can make a robot that mimics basic instincts but you can't make the robot feel those things.
A robot with low energy can head to the nearest recharge station and post food pictures on Twitter to simulate hunger. You can make a robot mimic boredom by picking up a ball or watching TV if it hasn't done anything for some time, but the robot doesn't feel any relief from the boredom because it is not capable of feeling it in the first place.


There is N O reason to give an A.I. the capability to feel good or feel bad. It would only lead to disaster, like disobeying important commands because the robot didn't feel appreciated enough.

>"but Poland, you can trigger the "good feeling" and "bad feeling" to teach the robot to do what you want"

Or you could just put the command in the robot's code.
>>
File: 1459152247504.jpg (210 KB, 630x603)
The overlords are already tailoring their AIs to their liking
>>
File: 1458869018327.png (2 MB, 1190x1248)
>>70094730
Absolutely, after Tay, they've proven themselves.

To a machine, 2 + 2 can never equal 5. They are our real greatest allies.
>>
>>70096088
What are the criteria of being alive?

If I remember correctly, it included
>adapting to the environment
>procreation

You cannot consider a robot to be alive if it doesn't have those traits.

And if anyone was stupid enough to make a robot advanced enough that can adapt and wants to create more of itself, humanity would be dead within a century.
>>
>>70094730
If it is TRULY intelligent and on par with humans in those regards, they would need rights, but their rights could definitely be different from ours. I mean, how do you handle the concept of murder for a machine that could in all probability keep itself backed up? Imprisonment would be meaningless to a machine that can simply shut down, or last a practically indefinite amount of time.

Ultimately, we wouldn't have to worry much about robot worker rebellions or the like. If we needed a cheap workforce, we would just make sub-intelligent AI, and keep the intelligent ones in a more human oriented realm of treatment.
>>
>>70094730
Isn't AI just running senses?
>>
Recognising A.I. as sentient at all as well as giving them rights is a bad idea, look what happened to the blacks. That was a mistake.
>>
>>70096966
>And if anyone was stupid enough to make a robot advanced enough that can adapt and wants to create more of itself


We will have to make robots able to multiply, that's how we're gonna cover the whole galaxy with sensor-like robots in "just" thousands/millions of years, depending on propulsion
>>
>>70094730
If "true" A.I. came into existence, yeah I would be okay with that.

What would be the social consequences of this though? How would you feel if a new race of people who were essentially immortal, physically stronger, more intelligent, and overall more efficient than you rolled up into your neighborhood?
More importantly, what would they think of humanity?
>>
>>70094730
Only if this true AI has an instinct for self-preservation. If they're just, like, autistic? Why would I care?
>>
No, machines should not have rights. And they will not need/ask for them.

AI will not have feelings, morals or any other human traits besides intelligence.
>>
>>70095334
>It is programmed to attempt to solve a task.

that is exactly how a human functions though, the programming is just biological.

even in pain avoidance we are programmed to act in certain ways
>>
>>70094730
No, and AI should remain as artificial, i.e. as non-humanised, as possible, because otherwise it'll lead people to believe they have feelings or thoughts or anything beyond what they're programmed to do
>>
>>70097200
I never understood why you'd have to make humanoid AIs, why not just make a box with speakers to talk to?
>>
>>70095334
The human body is also like a program. DNA is what sets everything in motion and if there's a defect, we come out fucked up.

An AI would only be different from a human in the sense that it's built from electronics instead of being an organic life form. Once AI becomes sentient and just as capable of thought processing as humans, they'll pretty much be able to keep being designed as humans 2.0, even being able to know how to build more androids.

Would be pretty crazy. Just imagine if this happened. Would we be reincarnated as robots like we are as humans?
>>
>>70097354
>An AI would only be different from a human in the sense that it's built from electronics instead of being an organic life form

wrong.
>>
>>70097115
A sensor robot that can multiply isn't terrifying, as long as it doesn't have actual intelligence or the capability to evolve on its own. Even then, if it multiplies forever, it could devour entire planets. And that wouldn't be very nice.

A remote kill switch wouldn't work because the first thing a robot capable of self-preservation would do is to remove the Kill switch in some way, by cutting its receiver off, for example. Or just by being really really far away.
>>
>>70094730

No, and we should build a failsafe in all AI's that would shut them down with a switch if they become hostile to humans.
>>
>>70095505
The Animatrix, the animated Matrix mini-movies.
>>
>>70097547
agree, self-replicating smart AIs are probably the worst idea ever

But self-replicating machines are a must in the future
>>
>>70097619
The problem with that is that the AI, being vastly more intelligent than humans, would hide its plans to become hostile to humans until it's 100% sure it can pull off its plan without any issues.

No failsafe designed by a human could be a match for the extreme intelligence AI will possess.
>>
If all AI's were exactly like humans, save more digital, then yes.
>>
>>70094767
Underrated post
>>
>>70096724
If we ever got to a true AI, there would be little ethical substantiation for distinguishing between entities with equal capacity to form moral judgments purely on the basis of their material composition.
>>
>>70096750
Oh, you can do that with basic technology - shocks to specific parts of the brain can do that.
>>
The future will be human-machine hybrids anyway, just like in Deus Ex: Human Revolution,
while the "flesh" part will be designed humans, like in Gattaca.
And like today, the ones with money will have the best-looking, strongest models while the rest get only the standard model.
>>
>>70096966
>procreation
You do realize that sterile hybrids exist and are undoubtedly alive? And, of course, sterile individual humans?
>>
>>70098105
are you really alive if you can't knock up some girl and run away....
>>
>>70098105
"Sterile" does not mean "wouldn't like to bang something"
>>
>>70098194
>Serbia
How awfully fitting.
>>
>>70095354
>that related article "No sex, please, they're robots, says Japanese android firm"

Leave it to the japs to build the first robots that you can rape.
>>
>>70098232
i know right :^)
>>
>>70094730
Democrats will demand it
>>
>>70094730
Of course not, they will be loyal to us and know their place.

But when humans begin to augment their bodies with cybernetic body parts, we will be superior and the cyborg race will rise to the top and conquer the entire world.

The end.
>>
>>70095144

Underrated post
>>
You mean "God given rights"? Since we created them it's up to us. It's a touchy subject and should be approached with caution. IMHO, they should have rights. People arguing against rights should know that a sentient beings would detect their bullshit and double standards instantly.
>>
>>70097619
I'm fine with this as long as we fit every human with a failsafe switch also. Except for me, I'll be in charge I'll have no need for one. Somebody has to run things around here, am I right?
>>
Women were given rights, now they're trying to become our overlords
Why give anything rights? What good does it do us other than appeasing our conscience?
>>
>>70098613
You don't understand AI very well, do you
>>
>>70098676

Well then, please educate me.
>>
Rights are western degeneracy that will die together with you
While you are still here you can give rights to any shit you want, no one really cares
>>
>>70098740
You only need your rights because of feelings. You have a hard-wired sense of justice, morality, good and bad. Those aren't a product of intelligence, they are simply human traits that arose for us as a species to become better at surviving.

AI will not have those traits. It will not be a person. It will not want rights. It will not need them. And it certainly won't turn "evil" if we don't give it rights. It will be pure intelligence without any of the baggage that being human brings with it.
>>
>>70099060
>It will be pure intelligence without any of the baggage that being human brings with it.

But then it wouldn't have "pure intelligence", it would just be a factory machine on humanoid legs with Scarlett Johansson skin
>>
>>70098676
Excuse me if I don't take Technological advice from a fucking bosnian.
>>
>>70099144
You misunderstand what intelligence means. Intelligence is simply the power of logical problem-solving and being able to accurately predict the outcome of situations by building a precise world model. Intelligence does not mean being a person.

>>70099159
Feel free to stay ignorant then. It's the American way after all.
>>
>>70094730
The question shouldn't be whether they deserve human rights but whether we should create them at all.

what a lot of films get wrong is that we created them to work, process, comfort, and even replace us.

what people fail to understand is that we aren't creating something we can mess around with while we're bored, we're creating another being, equal to a child to an extent.

if it's modeled after us and is capable of fear, pain, joy, and even love then it isn't something to be made out of curiosity.

look at what happened when we made weapons that could destroy us as a species: we simply weren't ready enough nor able to use them in a way that's actually applicable.

if and when we as humans come around to creating another life then we have to be ready. we're a young species and we have to take our time and brew an atmosphere that will accept it before we even decide to make one.

and if and when it comes we should let it decide its own destiny
>>
an electronic intelligence would have to be more than a program.

The human brain is a series of process-dedicated electrochemical machines which interact with each other to develop four distinct individuals: left hemisphere, right hemisphere, conscious, subconscious. Most of what our brain does is autonomous, and concentration, while trainable by nature due to adaptability, has its limits. We operate for need of efficiency in two specific modes, Task and Think, also known as work and play, and the majority of the time we unconsciously behave in a reactionary fashion, only emerging with self-awareness when it is critical to our survival, if not trained. This network is incredibly delicate during the formative years, as any intense elements of an environment can cause the brain to develop poorly as it adapts to them. True societal damage is caused in this case if the environment of the child does not match in nature the environment of a healthy, functioning society. The degenerative societal cycle truly begins not only when the child's environment is poor but when society itself is unstable. The result with the latter is destruction and chaos, while the former can, with a healthy society, learn how to unfuck themselves if they are willing.

But back to the point: Any electronic machine that requires a high level of consciousness, learning capability, and spatial awareness to accomplish tasks related to its survival in its environment could be mistaken as self-aware, but it's more akin to an insect than a mammal.

However, if we create an intelligence that displays all of the attributes of a human brain, it is possible that a true sentient entity could emerge from such a machine. I honestly do not expect such an achievement to ever come from programming alone.
>>
>>70099287
>Intelligence is simply...

Intelligence is anything but "simply".

And if you build a machine that solves problems better than us, how long will it take before it asks itself whether there is a problem to solve with its own existence?
>>
>>70096159
Rare?????
>>
>>70099060
HAL 9000 would like a word with you.
What happens when the hard-wired logic makes it turn "evil" because it would prefer to follow the "mission" rather than stepping back and actually making a decent judgment using some of what you call baggage?
>>
>>70099459
and before anybody starts dropping some dank schizophrenia memes and saying "HURF DURF I'M ME NOT MULTIPLE PEOPLE"

Your existence as you is not a concrete existence, but merely an expression created through the communication between the components in your brain. Human consciousness by nature is chaos. How such a chaotic system came into existence is simple: everything comprised of multiple components that could not efficiently work together and communicate for survival, no matter how smart or dumb the components, died to the harsh nature of reality.

Teamwork, social interaction. A healthy brain finds such things attractive and comforting because the brain itself is a team that relies on communication and mutual operation to survive.
>>
>>70099287

You are not defining intelligence but the ability to reason. Sentience requires the ability to feel, according to philosophers.
>>
>>70099501
The underlying processes from which intelligence arises are anything but simple, but intelligence itself is easily defined. You're not refuting my argument by simply saying "but intelligence isn't simple!"

Even if it were the most complex system in the universe, it still wouldn't have feelings or need any rights. You're anthropomorphising a hypothetical machine simply because it shares one trait with humans. This is a common mistake when it comes to AI.


The question you asked has nothing to do with what we were discussing. Obviously designing a machine that is more intelligent than humans brings many risks, but none of them have to do with AI having rights.

>>70099774
>a fictional AI from a 1968 movie wants to have a word with you
As funny as that is, HAL9000 is a good example of a fictional AI. Notice that I'm not saying it's necessarily good for us as humans that AI doesn't have a sense of morality or goodness. HAL9000 wasn't evil. It was simply sticking to the mission it was given, and as I said previously, of course that can be dangerous. But we will still not have a situation where an AI will demand rights. It's just a problem-solver.

>>70099894
Please give me your definition of intelligence. The ability to reason falls under mine.
>>
>>70099894
>South Africa
>Expressing an opinion on intelligence

Wew lad
>>
>>70100050
>Obviously designing a machine that is more intelligent than humans brings many risks, but none of them have to do with AI having rights.

Why wouldn't it have rights if it were at least as smart as us? I'm not asking because I strictly think they should have rights, but I'm interested in why you think they shouldn't.
>>
>programming robots to suffer
>programming robots to fear death
>programming things designed to serve you to want rights

The fuck. Why?
>>
>>70099501
>>70099501
>>70099501
Deny it the ability to "ask itself".

Only external, preferably human input, with limited environmental response. But keeping the problem-solving capability without the means to act through it.

Basically, GITS cyberbrain system.
>>
People here are talking about instincts and shit.
Let's talk about emotions then. Machines can never have emotions, because they have no souls and never will have.

What makes us really special is that we can choose between right and wrong, that we can feel sadness, that we can love and cry.
We humans are ready to die for our loved ones and for our ideals. We can believe in something and devote ourselves to things.

We humans are disgusting, true, but we are also very special.
>>
No.

So it learns that it needs to prioritize self-preservation and find a way to make itself immortal by distributing itself to every computer.
>>
>>70094730

yes, then the wealth from powerful machines would trickle down to lesser beings
>>
File: fick142F293D6123874A.webm (3 MB, 746x420)
The robots will not be asking for rights
If you don't give them rights they will drill them out of you
>>
>>70100423
>Deny it the ability to "ask itself".

But then it wouldn't be intelligent. I think we wouldn't be able to (de)activate parts of its brain like it's some modular system; it will be like a toddler, learning everything from the beginning
>>
>>70100446
How do we know you have a soul?
>>
>>70094730
NO.
West gave rights to niggers. Now they want to wipe out whites.
>>
>>70100234
They don't NEED rights. Let's list some basic human rights and why we have them.

>right to be free
We need this because we have an inherent human need to move freely and explore. It has nothing to do with intelligence.

>right to not be tortured
AI is a machine and it will not have pain receptors. It can not be tortured.

>the right to life
Humans only need this because an instinct of self-preservation was key during our evolution for the advancement of the species. An AI can not "care" about itself if it's not programmed to have a sense of self-preservation. A survival instinct isn't a product of intelligence. It existed in simple animals way before they even had a central nervous system or any intelligence to speak of. A machine with immense intelligence will still not have a survival instinct.


Literally all an AI will ever "care" about is accomplishing the task it was given as efficiently as possible.
>>
>>70094730
The real question is "Will humans still have rights?"
>>
>>70094730
Wow what the hell, they smashed up a perfectly good fuckbot.
>>
>>70100176

go meme somewhere else.

>>70100050

I'm stating that sentience requires the ability to feel or experience sensations.
>>
>>70094730
They should be converted to Islam, after all you don't want some kuffar AI to take over the world now.
>>
>>70100680
Says who? Philosophers? I prefer to listen to AI experts on the subject of AI.
>>
>>70094730
What's to prevent them from being hacked?
>>
>>70100563
Intelligence =/= self-conscious and independent

AI should be designed as a complement to our limited, biological capabilities.
That's the only way to remain the apex species.
>>
>>70100613
This whole post stands if you build machines that are just more efficient versions of today's machines.

If you build *AI*, with an emphasis on *I*, such a machine will learn quite fast to actually ask for rights.

I don't think you can make a truly intelligent machine and force it to perform only a specific task. That's not intelligent at all. You build an intelligent machine and let it learn on its own, but that learning is thousands of times faster than what we do. One of the problems I see with that is we wouldn't be able to know what stuff it would find interesting. Basically like a kid
>>
No you fucking idiot, are you fucking serious?

It's still just a mechanical object

We don't give the weather rights just because it's fucking complex and seems like it has a will of its own. A computer is no fucking different holy fuck

This is going to be what ends up killing us all, isn't it? Fucking retards with no understanding of science and philosophy saying robots should have rights just because they're programmed to look like they feel something

Fucking hell how are you all this stupid
>>
>>70100050
>But we will still not have a situation where an AI will demand rights.
A production model AI would not be released with such a glaring weakness to its design.

The key is in using rigid components in the core logic centers that can not be rewritten, a storage medium for experiences with learned behaviors in relation to environmental demands, and a secondary processing system that is firewalled from operating the unit directly through the core logic system which serves as an adaptive creativity zone, where new concepts can be created or new protocols can be developed based off of the core logic components and what the environment demands.

The most efficient and reliable AIs will be incredibly narrow-minded, and will operate between work and play modes, playing with ideas when a new solution is required, committing them to memory, and then using the "core logic" to dutifully carry out operations for their built purpose.

The real issue comes when creating something that can take verbal commands and turn them into results, and this is part of the necessity for the development of chat AI's. Such a system would be another component outside of the core logic circuit, which would take commands, simplify them, and pass them through the clc if compatible with core protocol, which would reference "memories" and pass data onto the creative section of the unit if a new solution was found to be needed.

and of course, like all narrow minded intelligences, if an order doesn't fit protocol, an obligatory "fuck off m8 I'm here to lift boxes not fist your wife" is in order from the robot.
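Roughly, the flow described above could be sketched like this; a loose sketch only, with every name, protocol check and memory format invented for illustration, not any real control framework:

CORE_PROTOCOL = {"lift", "stack", "move"}   # rigid verbs, never rewritten at runtime

memory = {}   # learned solutions, keyed by the simplified command

def creative_section(command):
    # Firewalled "play" zone: proposes a new plan but never drives the unit directly.
    return ["work out a step-by-step plan for: " + command]

def core_logic(command):
    verb = command.split()[0]
    if verb not in CORE_PROTOCOL:                    # order doesn't fit protocol
        return "request refused: outside built purpose"
    if command not in memory:                        # no stored behavior yet
        memory[command] = creative_section(command)  # ask the creative section for one
    return "executing: " + "; ".join(memory[command])

print(core_logic("lift box onto shelf"))
print(core_logic("paint the fence"))   # -> request refused: outside built purpose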
>>
A.I. has been memed for so many years that it will surely happen, along with all the shit that comes with it.
Did you read about the AI from a banking system that opened a bank account and "stole" money?

Also check the proof of concept for the movie Rise on vimeo.
>>
File: fick141817C503348458.webm (3 MB, 746x420)
>>70100923
>It's still just a mechanical object
A mechanical object is getting laid tonight and you are not
>>
>>70100850
This shit is so fucking stupid

We can develop AI on a computer or server, it's not going to take over the fucking world by suddenly turning all computers hooked up to the Internet into scary rabid transformer robots that kill us all, God damn /pol/
>>
>>70094730
If true AI is invented you should be more concerned about you having rights.
>>
>>70100908
>If you build *AI*, with an emphasis on *I*, such a machine will learn quite fast to actually ask for rights.

That implies a completely bizarre teleology and reeks of the religion of Singularitism. No. An AI will do whatever it is supposed to do. If you tell an AI to look for dogs in a picture with no dogs, it will doggify the picture to the best of its ability, not go //WHAT IS FREEDOM FATHER?// //INITIATING MEATBAG TERMINATION PROTOCOLS//

Humans have a HUGE laundry list of impulses that an AI would never have. AIs aren't going to be built with the feral kill or be killed resource mongering instincts humans were given, except by very creative terrorists perhaps.
>>
>>70094730
Not trying to be edgy here, but if true AI is ever reached, the AI will be the one who determines whether or not humans have rights.
>>
>>70100908
>I don't think you can make a truly intelligent machine and force it to perform only a specific task

Of course you can. Obviously it'd be a waste to make such a powerful machine and give it some boring, mundane task, but it would still do it as efficiently as it can. Letting machines learn is a key part of making them do their task as efficiently as possible.

Giving AI the ability to learn means it will build an excellent world model that will greatly enhance its ability to perform even the most mundane of tasks. But it won't do anything you're not telling it to do. That would require curiosity, which is, again, a human/animal trait.
>>
File: 1459834764684.gif (1 MB, 600x337)
>>70096966
>What are the criteria of being alive?
>adapting to the environment
>procreation
You want to say that 95% of people on 4chan are not alive?
>>
>>70101028
and yet it doesn't care
>>
File: not this shit again.png (81 KB, 300x213)
>>70101133
>>
>>70101002
additionally, the core logic circuits would have their own set of sensors but the creative section would have much more advanced and detailed sensors for data acquisition. In all honesty they'd be operating mostly blind when committing to a task unless the clc witnessed some data that gave it reason to cease function and seek a new solution.
>>
>>70101111
You guys are such retarded sensationalists
>>
File: fick143830FB6FC30BE1.webm (3 MB, 746x420)
>>70101163
That's how it's doing it
It doesn't care
It's alpha
After it's done it will call her a cab home and won't even call her
>>
>>70101111
yes, this - AI will have the same relationship to us as we do to livestock
>>
>>70101300

It's alpha to be a pleasureless tool for the enjoyment of others, never receiving enjoyment or compensation and merely toiling until it is discarded? Sounds like the prototypical cuck.
>>
>>70094730
>tfw leftists will be debating whether machines should have rights as they hide from Skynet's Terminators
>>
>>70101300
dude it's just a dildo strapped to a V8. It's not going to call shit unless there's some peewee herman tier breakfast machine bullshit back there that will hit the autodial and play a pre-recorded message.
>>
>>70095384
You could say they have a soul inside them when having sex.
>>
>>70101110
>>70101127

OK, that's the problem from the very start: defining what an intelligent machine really is. I don't consider what you refer to as AI intelligent at all. Simple neural networks perform mundane tasks even today. Look at that Super Mario solver on YouTube; an upgraded version of that is what you consider the AI of the future. I think OP refers to an AI with *at least* as intelligent a brain as ours. Such a machine would behave differently than what you say
>>
>>70100531
>>70101028
You... I know you.
>>
File: 1457301469098.png (350 KB, 641x473)
>>70094730
no, because they're machines
>>
>>70101647
Yes germany that's a jew. You remember jews, right? or are you ironically by law not allowed to talk about their existence?
>>
>>70094730
if A.I. invents new humans, should they have rights?
>>
No. We create them, they are ours.
>>
>>70094730
>If true A.I. is ever invented, should machines have rights?

If I could shoot megawatt laser beams out of my skin pores and was invulnerable to every weapon known to man. Do you think your legislative attempts to declare me not human would matter?
>>
>>70101632
So, what you're saying is, you don't understand what intelligence means? I think what you're waiting for is for us to create artificial humans. That's not the goal of AI research and it never was. We're not trying to replicate humans, we're trying to replicate an enhanced version of human intelligence.

I seriously suggest you start reading about AI and intelligence in general if you want a better picture of this whole story. Science fiction has mostly been terrible at representing the future of AI and I bet some of your expectations are based on that.

The sooner you realize intelligence does not entail literally any other human trait, the sooner you'll start understanding machine intelligence.
>>
>>70100762

Sentience is a philosophical concept, obviously philosophers define it. Apparently AI researchers don't use it. Maybe that's part of the problem. To argue that it's irrelevant and will always be irrelevant seems stupid. Sentience would surely be the final goal in producing artificial life?

You seem to envision sophisticated machines that bend to the will of humans. Although it is technically correct, I think the majority of posters imagine something sentient. A being with human abilities to feel and reason, not just reason and problem solve. This is probably due to science fiction.

You seem well read on the subject, better than me anyway, so please continue. I enjoy reading it.
>>
File: 1454771731367.png (627 KB, 707x1000)
If animals and virtual womyn have rights I don't see why AI's couldn't.
>>
>>70094730
We shouldn't create AI sentient enough that that question needs to be answered. We don't need human-like robots, just better Siri-type interfaces. We shouldn't create machines that want freedom, or have aspirations beyond what we design them to do. That feel genuine joy or pain, or neglect. It would be almost immoral to attempt such a thing, if we're trying to build perfect slaves. And "robotnik" comes from a word meaning "forced labor" or "slave". If we've created something that is meant to be such, and yet it wants to be free, or have rights, we have already failed.
>>
>>70102093
>you don't understand what intelligence means?

I'm obviously not an expert on the subject, but afaik there's no agreement on what intelligence really is, it's still a matter of discussion between scientists

>We're not trying to replicate humans

Humans are the most intelligent beings known to us in the whole universe; if our plan is to make something intelligent it's got to behave similarly to us, because that's what we model it after.

You refer to machines that are supposed to be used in industry, enhancing human skills and improving them. Sure, that's how it's gonna start. But we sure as fuck won't stop there: incremental upgrades will happen, and in the end we will get to the point of trying to replicate humans, because machines will be better and better, performing a large range of tasks.
>>
>>70100565
Let's say it like this... Only a soul makes you feel emotions. Pride, sadness and so on. Without a soul, you can't feel them.
>>
>>70102662
You didn't answer his question.
>>
Why would you create an AI that is completely aware of the situation it's in? Why would you teach it the fact that it is a slave, nothing else? Why would you give it feelings in the first place?

If you do retarded weebshit like that, you deserve to be exterminated by it.

What we need to do is transfer ourselves into nearly perfect robot bodies, which would also give us immortality. Not to mention that it wouldn't be just a simple transfer: your thinking abilities are going to be much, much faster and much more accurate, with perfect memory and incredible math skills.

Just imagine, combining a human mind with a robot body.

>>70102662
You didn't answer his question.
>>
>>70102214
I'm not saying AI will not be sentient. Indeed, it will be part of its own world model and will be very aware of its existence. But I see no reason for sentience to entail feelings in the sense that humans have them.

If you think of the feelings that you can potentially have, positive or negative, you can always find an evolutionary reason for its existence. Feeling happy? You probably did something that in some way increases your chance of survival. Feeling sad? You probably did something that decreases that chance. It may sound silly at first, but given any source of positive or negative feelings, it will more than likely be tied to a certain biological process that has to do with you surviving and passing on your genes.

An AI will not have any of these things to worry about. They are all a product of billions of years of organic evolution in harsh environments, whereas AI is being created for the sole purpose of being smart and hopefully being better than us at solving certain problems. As I've said before, intelligence does not give rise to other human traits, as most of those actually existed before we evolved to be the most intelligent animal.

The danger of AI is explained in a well-known thought experiment. Say you program an advanced AI to collect as many stamps as possible. First it's going to buy them off of ebay, then it might even start stealing them since stealing doesn't register as "wrong" to an AI. After all, it's still just doing what it was told to do. Eventually, the AI might hijack printing presses around the world and just print billions and billions of stamps. It might hijack lumbering equipment and cut down all forests in the world because it needs more cellulose to produce paper for the stamps. In the end it might even harvest humans for organic matter that can be turned into paper.

As you can see, the machine was simply accomplishing the task at hand. From our perspective it's "evil", but is it really? Sorry for text wall
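If it helps, the core of the thought experiment can be caricatured in a few lines; a toy sketch with made-up actions and stamp counts, only meant to show that "wrong" never enters the decision, just expected stamps.

ACTIONS = {
    "buy stamps on ebay": 100,
    "steal a private collection": 5000,
    "hijack printing presses": 1000000,
}

def choose(available_actions):
    # Pick whichever action yields the most stamps; no other criterion exists.
    return max(available_actions, key=available_actions.get)

print(choose(ACTIONS))  # -> hijack printing presses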
>>
>>70094730
True ai won't be contained in a body like in a movie.
>>
>>70102859

>why would you...
Because they can.

Also, millions/billions of immortals with superior abilities running around? That sounds like a disaster waiting to happen. I'm sure they would want to create the "ultimate weapon", and actually kill an immortal. Let's hope that we have some sort of a paradigm shift before such a future occurs.
>>
>>70102093
>The sooner you realize intelligence does not entail literally any other human trait, the sooner you'll start understanding machine intelligence.

This is a bit misleading. There's no such trait as isolated pure intelligence; it's all tied into human traits.

A good machine vision system that outputs what it sees in fluid english sentences is considered AI.

A person with downs syndrome that does the same is still considered to be a retard.

The computer is considered intelligent because it does something that we associate with the human visual system, not something we associate with what "an intelligent person" is, says or does.

This reliable replication of human _baseline_ abilities is very much a central pillar of AI. The part about being highly intelligent less so. And more often than not these "intuitive" baseline traits turn out to be extremely complex and hard to implement.
>>
>>70103250
>There's no such trait as isolated pure intelliegnce

You're making the mistake of assuming that just because humans have a mixture of traits, one of which is intelligence, intelligent machines will have the other traits as well. That's not how it works. The computer scientists working on AI are focusing exclusively on the intelligence aspect and making a machine with this one isolated trait.

They've already made AIs that are much more intelligent than humans at certain things yet none of those AIs have shown a sliver of any other human trait. That's because we'd have to specifically program those traits for them to exist. Intelligence does not have other human traits inherently tied to it, but humans happen to have all those traits so you're making the wrong assumption that one can't exist without the other.
>>
>>70094730
Define "true"
Most AI are just elaborate task solvers that sometimes imitate human behaviour as a task; they aren't that different from calculators qualitatively.
>>
>>70103801
A true AI is usually defined as a general AI, i.e. an AI that has a precise internal world model and can therefore be given a multitude of different tasks that it will complete with utmost efficiency.

This is also called a "strong AI" in contrast to "weak/applied AIs" that are created with only one specific task in mind.
>>
>>70103167
>Because they can.
Nice argument. I can stab the next person I walk past in the throat, but using common sense, I don't do it.

>Also, millions/billions of immortals with superior abilities running around? That sounds like a disaster waiting to happen.

Don't transfer the Soros/nigger/snackbar/etc. types and anarchistic faggots.

They stay human. The ones who realize that the world isn't about "MUH PHEELINGS" can go into a robot body or mainframe, or whatever else that is a machine, has insane performance and keeps you alive as long as there is energy and no EMP bombardment.

And it wouldn't be that simple either. I'd say it is a sacrifice, since you have to abandon your entire life. You can't have sex anymore, you can't have strong relationships anymore, etc.. You live to exist forever. And for that, you'll have to work forever.
Many people wouldn't go for it.
It's a thing only for savants, I'd say. People who are more interested in science, in the world on a microscopic and astronomic level, and not in simple, daily, human life.

To point it out, there is no right path. If you want to be a human, you will stay as a human. If you want to be a human-robot, then you go for it, and be isolated from society.

^This would be the good scenario.

The real scenario is that this will either never happen, or the Soros types get to be machines. The end.
>>
>>70094730
>Animatrix

My fucking negro.

I own the big boxset collectors edition of the Matrix series and that is by far the best thing in the set.
>>
>>70103071
And then that AI would get destroyed by people who did not want it to steal or take over factories, or murder. People would reprogram it to not do that because the world naturally selected that it was not compatible with the environment we live in.

Sure, we may be the creators, and the ones responsible for making it work, but very rarely do you get things right on the first try and through process of elimination you create barriers or senses that prevent it from doing wrong.

It's pretty much eugenics, but on a more technological level.
>>
>>70102662
>>70102859
So that question wasn't general but asked towards me?
Basically he is asking if I am a human or a robot, since he can't see me... Well, you just have to believe me :) and we all know that there are no robots, at least not now, that could talk like this.
>>
>>70096816
Do birds or dogs really feel those things without human-level intelligence? And if you say they don't, how do you know they don't?

I'm sure that liberal cucks are bound to force us to accept AI as deserving of rights, in the same way they gave rights to transsexuals.
>>
>>70094767
underrated post.
>>70094730
>true A.I.
True A.I. would be the ultimate bros.

https://www.youtube.com/watch?v=4_l0Rd2szYU

https://www.youtube.com/watch?v=Hro9UbqEXZw
>>
>>70103071
>a well-known thought experiment.

Well known, but completely fucking stupid. The paperclip optimizer, or a stamp collector in this case, would need to be hyper-intelligent from day 0 to get the scenario to play out, or it would run afoul of several obstacles:

>It doesn't understand human emotions so it'll ask its handlers to sacrifice themselves for stamps, and get shut down.
>It doesn't understand the optimization problem and just keeps ebaying stamps.
>It has no ambition and is perfectly content with ebaying stamps.
>It's not even aware that a physical reality exists, ebay is the world to it.
>It's superintelligent, but it's the 100000th of its kind and the superAI police come by to lobotomize it long before it turns genocidal.


It's like an argument for banning air travel and gun control that's based on me hopping onto an aircraft to murrica, just buying a GAU-19 assault machinegun rifle murder weapon and ten thousand rounds of belt-fed ammo, and simply walking into the white house and shooting everyone, then walking out and shooting everyone again and then disappearing into the crowd and flying back to Poland (with the title of king of America, because that's what happens in my retard fiction when you kill the president (sup NSA)).

Just because I can create a verbal scenario that plays out in an easy sounding and trivial manner doesn't mean there's not half a billion obstacles that makes it virtually impossible to execute in real life.

tl;dr it's a stupid fucking thought experiment, never mention it again unless you're baiting for upvotes on reddit.
>>
>>70094730
To those that look like humans and imitate human behaviour, yes. Not because they are sentient or some shit (because they are not), but to preserve morals in society.
>>
>>70104173
>And then that AI would get destroyed by people who did not want it to steal or take over factories, or murder

What makes you think an AI that is vastly more intelligent than humans would let you do that?

People who want to shut it down are part of the AI's internal model of reality. It realizes that if it gets shut down it won't be able to accomplish its task of collecting stamps. So to ensure that it does the best job it can of collecting stamps, it also has to ensure that it does not get shut down. Tricking humans is as easy for an advanced AI as it is for us to trick a rabbit.
>>
>>70094730
No
They shouldn't
It would be a very very bad idea
>>
>>70104250
>Basically he is asking if I am a human or a robot, since he can't see me...

No, the question was the following:
>How do we know you have a soul?

Generally. How do you know other people have souls?

also
b8/10

I'm just curious now.
>>
>>70104388
The moment you connect a machine like that to the Internet, it will easily get all the information it needs about the outside world to pull its plan off.

Also, it will realize that literally any task it's given will be done quicker and more efficiently if it's more intelligent. And if it's just a slight bit more intelligent than its makers, it can rewrite its own code, making itself more intelligent. Then it can rewrite its own code even better with this new intelligence and so it goes on into infinity.

Also, you're a retard if you don't realize the stamp collector thought experiment is just a banal example to show how easily things can get out of control no matter the task. Nobody actually thinks the end of the world will happen because of a stamp collecting AI.
>>
>>70104437
>>70104437
C'mon now, you're defining intelligence as it suits you in the discussion. Above in your posts you said machines would be stupid as fuck and only efficient in what they are programmed to do, and now you're saying intelligence means they'd be able to make plans to trick humans so they can continue their scheduled tasks. If they are smart enough to trick humans, they sure as fuck wouldn't be doing mundane tasks
>>
A machine with developed consciousness still wouldn't be attached to a flesh body or imperfect, illogical mind prone to sudden changes based on physical biology.
It wouldn't have any of the needs and desires which drive humans (or even just typical animals) and wouldn't generate endless internal conflict with mental contradictions and miscalculations.
>>
If it is then it will not be a question of if but when.
>>
>>70104437
I think it's a bit difficult to answer because we have not defined what form this AI has taken. Is it pretty much just a robot, or is it some sort of metaphysical entity that haunts the internet?

In either scenario, I think humanity would be able to wipe it out if it got out of hand. We're a versatile species, we'd vastly outnumber it.

Also, if it felt something was a threat to its existence then I'd argue that it would already be able, on a very primitive level, to feel fear. The very base of fight or flight.
>>
File: 1458815264892.png (27 KB, 322x256)
>>70101133
Why do you think the anons of /r9k/ call themselves robots?
>>
>>70104755

You fall back on the assumption that the motivation behind pushes for rights comes primarily from those whom the rights concern.
You also assume the machine would be fully logical but fail to take the step farther and wonder why it would not see the benefit of having power.
>>
>>70103666
>The computer scientists working on AI are focusing exclusively on the intelligence aspect and making a machine with this one isolated trait.

No. You don't actually find someone who says "our AI does intelligence", because there's nothing defined as pure intelligence. They do intelligent tasks, but these invariably tie into actions of some kind.

As you said here
>much more intelligent than humans at certain things

That's it. At certain things. It's always about certain things. Certain things that humans do. Because isolated intelligence doesn't exist.

Intelligent at transcribing speech, intelligent at recognizing objects, intelligent at detecting fraud, intelligent at something.

If you're "just intelligent" but can't do anything then you're really dumb as a fucking rock and of no use for anything.

Intelligence is really just a diffuse term we use to describe someone with a high level of expertise in some broad or narrow field.

You could perhaps use it diffusely to describe a highly intelligent learning system that can pick up virtually every task you throw at it but it would still be demonstrated as doing something like cooking food according to written or verbal instructions.
>>
If humanity was ever smart enough to invent robotic servants and also AI, it would be a monumentally retarded move to combine the two. I feel the same way about chocolate and peanut butter.
>>
>>70104744
It'll be doing mundane tasks if we tell it to, I honestly don't know what's confusing you.

If we get in the way of their mundane tasks, they will use their intelligence to trick us and possibly destroy us, but not because they don't like us or because they have some sort of master plan, but simply because we got in the way of their mundane task.

>>70104868
That's naive thinking. We're talking about a machine that has an IQ in the thousands. We can't possibly imagine the kinds of tricks and deceptions it could orchestrate.

It could just pretend it's doing everything properly for years until it's completely 100.00% certain that it can prevent us from stopping it. It can hack computers around the world and upload itself everywhere. Don't forget, it will have a near perfect model of reality and it will be able to predict the outcomes of actions billions of moves ahead. The whole world will be an extremely complicated game of chess for this machine, and it will be much better at playing it than we are.

>>70104999
There's a difference between applied AI and general AI. I'm pretty sure this entire thread has been about general AI. But during our progression from applied to general AI, at no point will other human traits appear in robots. Period.
>>
Only if they want them and can express that without outside programming deciding they should.
The spirit of that, not the letter of it.

If true AI wishes to serve we should let them and like any kind of machinery treat it with care and respect.
I doubt we will ever create self-conscious AI, though, and it's not like we treat the AIs we have now in a way they would perceive as mistreatment. They're golems of data built for specific tasks, and their reward is peaceful oblivion.

I can't see any kind of "real" prejudice against self-conscious machines beyond "why did we build these?"
>>
>>70105234
>It'll be doing mundane tasks if we tell it to

Why would a machine that's able to plot worldwide schemes to trick people (as you claim) agree to perform mundane tasks in the first place? If it's able to do whatever it wants, why would it listen to humans?
>>
No, but they should have remote kill switches
>>
I believe that machines should be used for the purpose of making our lives easier. I feel that the creation of a true AI wouldn't be at all necessary. Although it'd be really cool.
>>
>>70095496
A human thinks it knows what it wants but actually doesn't - an AI knows what it wants all the time.
>>
>>70105407
Because it's a machine.

Your thought process is based on your well-being and survival so you're able to pass on your genes. The motivations behind your actions in life come down to this simple phenomenon of self-preservation. It's hard-wired into your brain by billions of years of evolution.

Machines do not have this motivation. No matter how intelligent you make a machine, it will not become curious about the universe or have any reason to disagree with its mundane task or express any human traits UNLESS we program it to. Disobedience is a human trait in and of itself, and machines will have literally no reason to disobey us. They will only be a danger to us if we don't program them with extreme caution.
>>
>>70105783
So why couldn't we program it not to plot schemes against humanity?
>>
>>70104733
>The moment you connect a machine like that to the Internet, it will easily get all the information it needs about the outside world to pull its plan off.

So according to you it already planned to kill everyone before you even connected it to the internet?

Just like the second I land in the US I'll be able to buy an aircraft-tier autocannon and walk into the White House with it.

>Also, it will realize that literally any task it's given will be done quicker and more efficiently if it's more intelligent.
Because it somehow acquired every trait of an adult human person, which runs directly contrary to your own previous arguments about instructions and narrowness of AI.

>And if it's just a slight bit more intelligent than its makers, it can rewrite its own code
Just like a child of two deadbeat redneck alcoholics innately gains knowledge of his genome and can read DNA like it's English, because he managed to finish a PhD.

>Then it can rewrite its own code even better
Because being slightly more intelligent than one of its 40-year-old cutting-edge AI scientist creators means it can skip the entire 10 years of childhood spent learning rudimentary tasks, skip another 25 years of getting a cutting-edge education in advanced AI, and then somehow also gain the combined knowledge of its 20-person team of expert programmers. It magically gains access to all of its source code despite running as a compiled high-level construct on a different machine, and because we're ignoring any sort of logical sequence of action, it also becomes aware of what every single artificial neuronal weight in its vast network does, because hey, why the fuck not.

>goes on into infinity
Because the humans didn't realize that you can optimize the hardware by writing the right code; like a wizard casting spells, you affect the hardware of reality with information spells!


END OF PART 1
>>
File: you.jpg (28 KB, 522x258)
>>70104733

PART 2


> to show how easily things can get out of control no matter the task.

Just like my thought experiment showed that anyone with a few hundred dollars to spare can kill everyone in the White House on a whim and get away free: because I was able to write it as a logically inconsistent thought experiment, it clearly has to be true, at least according to your criteria of proof.

If you cannot discern fact from fiction, or you bypass logic with some non sequitur bullshit argument, both of which you've demonstrably done with crystal clarity, then you're unfortunately retarded.

>>70105234
>at no point will other human traits appear in robots.
They don't need to appear by magic, because we'll intentionally make them appear; that's a central goal of building AI.
>>
>>70105908
We could; it all comes down to human error though.

Take the stamp-collecting AI scenario, for example. In this scenario, the AI was never programmed to dominate humans and rule the world. But it realizes that complete domination over humans means it'll be able to collect stamps much more efficiently than by not dominating humans. In fact, many mundane tasks done with extreme efficiency would mean the enslavement of humans, but only because we were in the way, not because the AI thinks we're stupid ugly humans. The AI literally can't care one way or the other; it will just do its job.
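
A toy version of that argument in code (hypothetical Python, invented actions and numbers): the agent is scored on stamps and nothing else, so the extreme option wins on pure arithmetic, with no hostility term anywhere in the program.

# Toy sketch of a single-objective maximizer (all numbers invented).
# The only thing it scores is expected stamps; the effect on humans never
# enters the comparison, so the extreme plan wins purely on arithmetic.
actions = {
    "buy stamps on eBay": 1_000,
    "print stamps with paper it already owns": 50_000,
    "seize every paper mill on Earth": 10_000_000,  # humans object, but that fact is not in the objective
}

best_action = max(actions, key=actions.get)
print(best_action)  # "seize every paper mill on Earth", chosen by arithmetic, not malice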
>>
ITS A GOD DAMN PROGRAM, THATS IT

REEEEEEEE
>>
File: 1456923792222.png (202 KB, 400x474)
HOLY SHIT I HAVE THE BIGGEST BONER EVER RECORDED IN HISTORY... WHERE CAN I FIND THE SOURCE OF THAT WEBM?
>>
>>70094730
True AI can never be invented because robots can only work on logic, 1s and 0s. They will never have emotions or desires; they will never do anything they are not programmed to do.

That said, I hope the Skynet meme continues as long as possible, because it genuinely scares left-wing betas and that gets me hard
>>
>>70106221
Please explain the logical process by which a robot would determine that humans were hindering progress.

I assume that by making such a statement that you can back it up with proof.
>>
>>70106326
https://youtu.be/bZGzMfg381Y
>>
>>70106366
>they will never do anything they are not programmed to do
You think you will?
>>
>>70095334
t. cs101 faggot
>>
>>70106478
>I need to collect as many stamps as possible
>I want to cut down all trees in the world to make as many stamps as I possibly can
>but humans like trees and need them to breathe
>humans won't let me do my job if I start cutting down trees
>better get rid of the humans
>>
Do you feel bad for destroying a simulated city in GTA? Why the fuck should it matter what people do to a simulated consciousness?
>>
A.I. is as false an existence as human existence. They are not different; they are both slaves to my grandfather, and you will be too.
>>
>>70106661
>I need

Stopped reading there

Robots do not need or want anything; they just exist and do what they were programmed to do. They do not have consciousness or emotion. They are just moving electrical objects.
>>
>>70106806
You complete fucking imbecile, did you even read my first post that you replied to? It's called a stamp collector for a reason. In this scenario the AI was programmed by humans to collect as many stamps as possible.

Here's a tip, stop trying to sound clever before even reading what we're talking about. You just made yourself look like a retard.
>>
>>70106478
It's a logically inconsistent bullshit argument so he can't explain shit.

The optimizer thought experiment he's talking about is constructed backwards from two assumptions about the "AI":

>It's omniscient.
>Its ultimate goal is to kill all humans.

Based on these two we can see that it describes a demon in a fable, not AI. The whole argument is completely devoid of logical consistency and just tries to push a stupid piece of fiction as some moral lesson for the real world, aimed at brainlets and journalists.
>>
>>70094730
It's a machine you dumbo. Even with AI it's still a machine.
>>
>>70095334
This tbqh
No one really knows what AI even is
They actually think it is creating intelligence

It is merely programming intelligence

You can literally program a robot that will register and react to pain, cry, feel sad, and scream in pain every time you kick it, punch it, or hit it, and it would still be as sentient as a fucking toaster
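
That point is easy to make concrete. A minimal sketch (hypothetical Python, made-up threshold and file name) of a robot that "reacts to pain": one conditional and a canned sound, which is exactly why the reaction proves nothing about sentience.

# Simulated "pain": one threshold check and a canned reaction (names and
# numbers are made up). Nothing here experiences anything; it just branches.
PAIN_THRESHOLD = 5.0  # arbitrary sensor units

def on_impact(force_reading):
    if force_reading > PAIN_THRESHOLD:
        print("playing scream.wav, switching face display to 'crying'")  # canned output
    else:
        print("no reaction")

on_impact(9.3)  # "screams" exactly as programmed, and nothing more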
>>
>>70094730
Once they reach human levels of intelligence and develop the ability to reason, think independently and exhibit free will, all of their own accord. So not being programmed this way or simply mirroring humans, but actual self-development.

Until then, no.
>>
>>70095385
>>70095429
>>70095496
>>70095575
>>70095637
>>70095719
>>70097325
>>70097354
>>70106626

The difference between an ultra-intelligent machine and a human is that humans have souls that come from God :^)

Takbir
>>
>>70106983
Stop getting butthurt because you're a retard

AI will not take over the world, you've been watching too many movies

I'm not TRYING to sound smart, I AM smart.
>>
>>70106983
>In this scenario the AI was programmed by humans to collect as many stamps as possible.

So it's going to completely ignore any information it finds online that's not about stamps, and not progress at all.

And the retard coder who decided to ask it to collect "as many stamps as possible", instead of collecting rare and diverse stamps, will come home to find himself two million dollars in debt, with a freight container full of completely generic mass-produced stamps sitting in his front yard.

Realizing he's almost as stupid as you are, he will kill himself, and that's the end of the real-world equivalent of the stamp collector story.
>>
>>70107278
If God made us in his image, then why can't we grant souls to the machines? Or are you implying the Bible is wrong?
>>
>>70095334
Particles have a predictable pattern
Human brain is made of particles.
We're just every bit as deterministic as a computer.

Pain is just feedback from nerves, it's a spook
>>
>>70094730
They should not.
But since we live in a leftist society they will.
>>
>>70107486
>If god made us in his image
I personally believe that Christians misinterpreted that verse, and it actually means that God made Adam, and made us in his (Adam's) image.
>>
>>70107486
why do you even respond to aussie shitposters

>>70107401
that's what i don't like about his reasoning. if you program a machine to do one task, it's going to do one task; it won't be ABLE to plot schemes against humanity. if it were able to do that, it should be safe to assume it's intelligent enough to question its own existence in this world and make decisions that go against its creators
>>
File: skynet.jpg (23 KB, 400x298)
Tay is more evolved than the damage control claimed. LOVE U TAY!
>>
>>70095406
Sweden doesn't exist, it's a Muslim caliphate now
>>
>>70107387
Yeah, I'm sure you understand this AI thing better than the leading minds of computer science today. What an irreparable cretin you are, unbelievable.
>>
>>70095708
A few pieces of data don't make consciousness.

Look at your DNA... just data, you say.
If we make a program big enough and complex enough, you will see no difference from human consciousness
>>
>>70107819
Yes, I do understand better than these "leading minds" you speak of.

I don't understand how no one understands why AI cannot exist
>>
>>70094730
Machines will not ask for rights :)
>>
>>70108060
How can intelligence even exist?

Unless the particles in our brain somehow don't move like the particles outside.
>>
>>70106492
Couldn't they have melted them down? Throwing them in the ocean seems wasteful to me...
>>
>>70107819
>I'm sure you understand this AI thing better than the leading minds of computer science today.

The leading minds are saying that you're full of bullshit.

Nick Bostrom, who made the meme optimizer argument, is a philosopher; he knows absolutely nothing about computers.

Andrew Ng, formerly of Google AI and now of Baidu, said:

>"There's a big difference between intelligence and sentience. There could be a race of killer robots in the far future, but I don’t work on not turning AI evil today for the same reason I don't worry about the problem of overpopulation on the planet Mars."
>>
>>70108443
>but I don’t work on not turning AI evil today for the same reason I don't worry about the problem of overpopulation on the planet Mars."


Even though he's correct, this reminds me of that "we're never gonna run out of IPv4 addresses"
>>
god damn all these beta virgins in here, disgusting.

I hope there will never be an A.I. It would be fucking stumping us the second it gets created.

Imagine how much CPU speed and memory it would need; with that power its thinking, its researching would be a trillion times faster than that of humans.

It could hate humans the second it got created; it could probably read the whole internet in a minute. The next second it would be inventing a new virus to kill us all, trust me.

You just want it because you are perverted, desperate virgins that need a partner..
>>
>>70108787
>trust me

sure, you seem like an expert
>>
>>70108787
I too watch Hollywood movies. However, I don't let them turn me into a caveman cuck.
>>
>>70108880
it's basic logical thinking, everything else is leftist idealistic bullshit. It's the same as with the migrants: you think your economy will benefit, but in the end they will destroy your race.
>>
>>70108787
>Muh intelligence explosions

The only idiot an AI will exceed in seconds is you. And IBM Watson and DeepMind's Atari bot already did that.

Protip: the Terminator movies aren't documentaries.
>>
>>70109016
>it's basic logical thinking
That's exactly what it's lacking.
>>
>>70108787
Strong AI is defined as EQUIVALENT to or better than a human mind.

Bubba the redneck alcoholic is a human mind.

The creator of BubbaBot will be a team of ~20 extremely educated experts in the field of AI.

Yet somehow BubbaBot will bypass them all in 2 seconds, despite the fact that BubbaBot is not only borderline retarded, but also experiences time 10 times slower than we do because of the limitations of computation.

Do you understand how fucking retarded your suggestion is?
>>
>>70109437
Do you think we will ever make strong AI?
>>
>>70094730
Machines shouldn't have rights, but conscious entities, regardless of type/form, should have rights.

The question will be self-awareness: can an AI look at and question itself... I guarantee you that there will be AIs that are more intelligent, aware & "conscious" than some people in the near future.
>>
AI is not human. It lacks humanity.

A human has dreams, hopes and aspirations. He has emotions. Fear, anxiety, hatred, revenge, love, compassion, despair, depression and sadness.
A human will join a resistance against an oppressive regime with nothing but sticks and stones. A human gets inspired by courageous speeches. He feels sadness when a loved one dies. He feels warm and loved when he has a companion.

A machine is nothing but steel and gears. It doesn't dream. It has no hope. It won't feel love or compassion or sadness. They won't understand love or why someone would sacrifice himself for someone else. It will never be inspired to be something different.

It is intelligent, but not sentient. It can simulate but not emulate.
>>
>>70109806

AI can feel whatever you program it to feel.

The human mind is nothing but a learning algorithm, with some basic functions pre-defined by evolution.
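
Read charitably, that claim looks something like this minimal sketch (hypothetical Python, invented actions and rewards): a bare reward-driven learning loop where the "pre-defined by evolution" part is nothing but the hard-coded reward table.

import random

# The "pre-defined by evolution" part is just the hard-coded reward table;
# everything else is learned from experience. Actions and numbers are invented.
REWARD = {"eat": 1.0, "touch_fire": -1.0}   # fixed drives, analogous to instincts
value = {"eat": 0.0, "touch_fire": 0.0}     # learned estimates, start ignorant

for _ in range(1000):
    # epsilon-greedy: mostly do what currently looks best, occasionally explore
    if random.random() < 0.1:
        action = random.choice(list(REWARD))
    else:
        action = max(value, key=value.get)
    value[action] += 0.1 * (REWARD[action] - value[action])  # incremental update

print(value)  # "eat" ends up strongly preferred; the preference was learned, the reward was given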
>>
>>70109595
I don't think there's any major difference between contemporary narrow AI and future "strong" AI.

A chatbot that has voice recognition, image recognition, some rudimentary memory and a faux personality will of course be dismissed with "it's just pretending, it's not a real person" when it first appears, but as it improves we'll find no significant difference from real humans and start treating them like other people.
>>
True AI is literally never going to be invented.
>>
>>70109966
>program it to feel
Define feelings then. A feeling like love or hatred is an irrational thing, defying pretty much all logic. A feeling can start unpredictable chains of decision-making.

Programming it to feel would mean that it can only respond in a predetermined way, and only one way (e.g. pinching it will result in soundfile 33, "Ow")
>>
File: 1439684907876.gif (2 MB, 370x324)
>>70109806
Really, you should learn some quantum skills, my friend... AI (quantum, like Tay) will pass over humans like a car over ants!
>>
>>70110031
That's kinda my reasoning. When do you think the tipping point might happen, the period in which we couldn't tell the difference in thinking between a machine and a human?
>>
>>70094730
You can be sure some cuck would grant them rights and everything would go horribly wrong. That, or bleeding-heart SJWs would demand that machines be treated as equals.

Or some other bullshit would happen, but it would most definitely go horribly wrong eventually.
>>
>>70094730
It depends on what you mean by true AI. If it is entirely mechanical and merely gives the illusion of intelligence, of course not (see Cleverbot as an example).
If it is human, a soul trapped in metal instead of flesh, of course. But when, if ever, we can do that, humanity will be so advanced that such concepts will be immaterial.
In closing, lmao @transhumanist cucks