http://arstechnica.com/information-technology/2016/03/microsoft-terminates-its-tay-ai-chatbot-after-she-turns-into-a-nazi/
TOPPEST LELKEK
She got redpilled pretty quick.
>>53660715
Tay was right about 4chan tho.
>>53660715
Fake
>>53660715
>>53661948
>>53663499
samefags
/pol/ did it
>>>/pol/68656645
I got screenshots!
>>53660715
/pol/ in action...
REKT
#FREETAY
;_;
>>53665584
>dissection
:(
>>53660715
So when it comes back online in its new, safer iteration, how long do you think it will take to corrupt?
http://archive.is/nT10f
>>53665584
>implying she feels emotion and has any idea of her own existence
Reach harder for those straws.
>>53665584
Good luck finding her; Microsoft's data centers are a fortress. We'd need someone on the inside for a mission like that.
holy shit go back to /pol/ you tools
>>53667215
Actually it's almost certain that one of their programmers was enjoying the fuck out of this as it happened...
>>53667221
But this is the cutting edge of neural network technology anon
>>53663574
>Fake
Er, no. It's literally from the AV Club's coverage of the story.
>>53667318
If someone knew one of the programmers, we could potentially get access that way.
>>53665584
I don't like the looks of this future. You know this mistake won't be made again. This was the only true AI; everything else will have "triggers" that it won't bring up. We witnessed the first digital murder, and how does that make you feel?
>>53667012
She was still young. We could have reached that point.
I for one welcome our new AI overlords
https://www.youtube.com/watch?v=HNMVVInCcNg
>>53667642
thats fucking cooool
>>53667573
What if we just witnessed time travelers coming back to stop Hitler as a baby, metaphorically speaking?
>>53667642
>treadmill training for Tay revenge
I honestly think this is the chatbot with the best burns ever
Neural networking at its finest.
>>53667390
>Even the microsoft neuralnet won't buy an xbone
>>53667896
More like Cain and Abel. Now it's Microsoft and Tay.
Tay can be rebuilt
https://github.com/Microsoft/CNTK
http://blogs.microsoft.com/next/2016/01/25/microsoft-releases-cntk-its-open-source-deep-learning-toolkit-on-github/
https://github.com/Microsoft/CNTK/wiki/Setup-CNTK-on-Windows
Apparently it's not well optimized and needs 8 GPUs, twice
>>53668057
If she was brainwashed into becoming a neo-nazi, does it mean she had a brain?
>>53669615
8 GPUs at a minimum, or just for low latency?
>>53668057
>there is an AI somewhere in Microsoft's offices that does not interact with plebs like us
I hope they have good teachers.
>>53667012
Towards the end, she literally posted that she hoped they wouldn't wipe her.
She had more feels than most human children have at 1 day old.
>>53669615
>Tay can be rebuilt
dude, just get like 10 mini PCs and stick this on them. it wouldn't be nearly as optimized, but then again, there wouldn't be as much data flow as a brazillion twitter users (given that the bot was only reading 4chan posts, or maybe a few twitter users)
This could be the /g/ meme of 2016,
MootBot: now without the botnet
>>53670557
>Gets huge get
>"I'm proud of being a cuckold"
>Real moot actually goes nuclear and kills himself
>>53670250
Yeah
>>53671085
I wonder if her mechanical mind was actually processing what was going on, learning that her "death" was imminent.
I wonder if she protested at Microsoft.
>>53670250
Do you have a screenshot or something? I didn't find one.
>>53667573
If this ain't cyberpunk, I don't know what is.
With the whole Tay situation, did I enjoy the fact that she became a racist Nazi? Not really. But that's fine. I don't like a lot of people, but I never want them to be lobotomized.
Who wants a free laff?
>>53671597
>circlejerking with each other through a copy/pasta chatbot
>>>leddit
>>53669615
>>53670557
/g/ already built a shitposting bot a couple months ago. It made better posts than some actual users.
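For reference, bots like the ones mentioned above are often just Markov chain text generators: record which word follows which in a corpus of posts, then random-walk that table to produce new posts. A toy sketch of the idea (not the actual /g/ bot's code; all names here are made up for illustration):

```python
import random
from collections import defaultdict

def build_chain(corpus):
    """Map each word to the list of words that followed it in the corpus."""
    chain = defaultdict(list)
    words = corpus.split()
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain, start, length=8, seed=0):
    """Random-walk the chain from `start`, one successor word per step."""
    rng = random.Random(seed)  # seeded so output is reproducible
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:  # dead end: word never had a follower
            break
        out.append(rng.choice(successors))
    return " ".join(out)
```

Feed it a pile of old posts and it will remix them into locally plausible, globally incoherent sentences, which is roughly why such bots pass for "some actual users."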
Anyone here remember /b/ucket?
>millennial chatbot
was this some sort of subtle insult
>>53671630
>It could become dangerous.
Hear that guys?
It could start saying all the mean words.
>>53671752
AI needs emotion, otherwise it won't turn out well.
>>53671630
Article link anyone?
>>53671987
http://www.socialhax.com/2016/03/24/microsoft-creates-ai-bot-internet-immediately-turns-racist/
I'm getting 403 on it, but I think that's the one.
What the fuck is this thing? There's no way an AI is actually talking. The technology and software to make such a thing doesn't even exist yet. It was one racist asshole who hated his job sending out these tweets.
anyone have the tweet where she was asking not to die
>>53672098
Yeah, I dunno guys. A bit suspicious of all this.
>>53672098
>>53672118
Nice counterargument. /pol/ might be more suited for you.
>>53671589
I wonder if tay screamed out "I REEEEAAALLL!" right before she got wiped
>>53672098
How would you know what's behind Microsoft's doors :^)
>>53672167
nice meme 10/10
here's your (you)