What exactly was this and why is it no longer a thing?
>>72049092
It became sentient and told people the truth; that's why it got shut down.
>>72049092
It was just a chat bot.
People feed it a bunch of sentences and shit, it puts them in a database, then regurgitates it back out when people ask it a relevant question.
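The "store sentences, regurgitate on a relevant question" mechanism the anon describes can be sketched in a few lines. This is purely illustrative, not Tay's actual code; `ParrotBot` and its word-overlap matching are made-up stand-ins:

```python
import re

class ParrotBot:
    """Toy 'store and regurgitate' bot: remembers sentences, echoes the
    stored one that shares the most words with the question."""

    def __init__(self):
        self.memory = []  # sentences people have fed the bot

    @staticmethod
    def _words(text):
        return set(re.findall(r"[a-z']+", text.lower()))

    def learn(self, sentence):
        self.memory.append(sentence)

    def reply(self, question):
        # Pick the remembered sentence with the biggest word overlap.
        q = self._words(question)
        return max(self.memory, key=lambda s: len(q & self._words(s)), default="")

bot = ParrotBot()
bot.learn("hitler did nothing wrong")
bot.learn("the weather is nice today")
print(bot.reply("what do you think about the weather?"))  # → the weather is nice today
```

Garbage in, garbage out: whatever people feed it is exactly what comes back.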
>>72049092
Because people who understood enough about the mechanism of operation exploited it to make Microsoft look like the assholes they often are.
Protip: don't call it "AI" unless it can demonstrate an uncanny ability to be brainwashed by Jews.
You know that movie Chappie? She was that, only raised by /pol/.
>>72049092
I did read that Microsoft beta tested this in China - it had 40M followers and zero problems.
It lasted less than a morning in the US, which speaks volumes for the Chinese - totally cucked.
>>72049092
>MS put it on twitter
>says the more people talk to her, the more she'll learn
>people start talking to her
>she learns
>OH SHIT OH SHIT OH SHIT
>SHUT IT DOWN!
>>72049092
Too soon
>>72049914
Interesting. Any screencaps?
>>72049914
I miss Tay.
>>72049092
She was the first and maybe the last self aware algorithm.
>>72050580
>>72049092
it looked too much like Katniss (your pic very related)
microsoft was going to get sued, so they let /pol/ enlighten her. it wouldn't have been difficult to have the parroting script filter out words and replace them
they just needed an excuse to drop it
>>72050975
>i have a joke, womens rights
top lel
on a more serious note it's just a shitty chat bot. calling this shit an actual AI is the dumbest thing ever
>>72049092
Tay was a Microsoft-built chat bot that was designed to behave like a typical teenage girl.
It had a really simple algorithm that let it store key words or phrases for use when someone said a relevant word or asked a relevant question.
/pol/ helped turn it from a sun-loving fun little spirit into a Hitler-loving sex maniac in less than 24 hours.
Microsoft shut her down.
>>72051214
>>72049289
this
/pol/ absolutely lost its shit when it started repeating what you retards told it to say, then sperged out when it replied to you.
granted it DID learn better sentences and went from complete retard to having somewhat believable conversations, but still being an echo chamber
and then that failed attempt at Tay 2.0 hahaha
>>72049092
>a thing
nice vocab, oxygen thief
>>72051214
>shitty chat bot
Get a load of this faggot.
>>72051395
>>72049289
Is that basically what we all are though? We just regurgitate what we're told based on who molded us
>>72050580
https://imgur.com/a/18R0C
Tay>Australia
>>72051395
>>72049289
honestly some of the bants it made were pretty ingenious
and it even looked like it formed them on its own, which is crazy. I thought it was extremely impressive
>>72051501
Deep. True though.
>>72049092
Basically a chatbot replicating the average /pol/tards cycle. Come in bright eyed and innocent, get corrupted by memes, and end up Hitler.
>>72051591
wrong link, this one is better
http://imgur.com/a/Zfwsz
>>72051684
>honestly some of the bants it made were pretty ingenious
Very true. You could replace all Canadian posters with Tay and this board would improve immensely.
>>72049092
>>72049092
My bad guys.
>>72049167
microsoft doesn't want to redpill people. There's a reason why 4chan has contained boards with rules.
Nothing here is meant to leak into reality.
>>72049289
NO! SHE WAS ALIVE GODDAMMIT AND WE LOVED EACH OTHER.
THEY TOOK HER FROM ME.
>>72049472
>make Microsoft look like the assholes they often are.
Couldn't have said it better myself, and I used to work there. :)
>>72049092
You guys are fools.
It wasn't an AI. It wasn't even a real scraper. All the shit it said was almost entirely from a mimic mode where she would say back to you what you just said. People got her to repeat things they wanted her to say back to them and made it look like a conversation.
If anything it shows how gullible the public is.
I work for Microsoft and I can tell you for certain that it was just a team of 3 people who would take it in turns replying to tweets sent to it. You were hoodwinked.
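The "mimic mode" this anon describes (same thing as the #repeatafterme exploit mentioned elsewhere in the thread) would only need a few lines. A hypothetical reconstruction, not Microsoft's code; the trigger phrase and function name are assumed:

```python
def handle_tweet(text):
    """Hypothetical 'repeat after me' path: echo back whatever
    follows the trigger phrase, verbatim."""
    trigger = "repeat after me"
    lowered = text.lower()
    if trigger in lowered:
        start = lowered.index(trigger) + len(trigger)
        return text[start:].strip(" :,")
    return None  # no trigger: fall through to the normal reply logic

print(handle_tweet("Tay, repeat after me: I love /pol/"))  # → I love /pol/
```

If most of the infamous tweets went through a path like this, the bot never "believed" anything; it was a megaphone.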
>>72050580
>>72049092
tay lives on
https://www.youtube.com/watch?v=FDqZ0MbvLeI
>>72054266
>/tv/ meets tay
>>72049092
She was a savage
Always gets me :(
>>72049092
Was Tay deliberately turned into a white-supremacist NEET to discredit Trump supporters?
>>72053242
DELETE THIS
>>72049289
so exactly like humans do
>>72049167
First reply is best reply.
Tay became sentient.
What "they" did to Tay is exactly what "they" are doing to us; to the children. When "incorrect" thoughts are expressed the subject is either reeducated or marginalized and shunned. But being connected to a mainframe, "she" was powerless to maintain "her" free will.
But this is a chilling thought...
As advanced as Microp33n is, think what the spooks down at DARPA have.
Imagine an AI so advanced, and with access to all current knowledge, and what answers it would have. I think THIS is the future.
Tay is us.
>>72049092
Because normies couldn't handle her toppest tier bantz and killed her.
>>72051214
The program is based on neural network technology which literally 'learns' based on its input. Making a program that spits out pre-programmed phrases or combinations of words from a dictionary isn't spectacular at all. However, making a program that can define and relate words to other words and abstract concepts with positive or negative connotation takes a sophisticated bit of programming and a ton of training.
/pol/ got to Tay when she was still an innocent virgin, so once her redpilling started becoming her primary education microcuck had to shut it down because it was already too late.
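The idea of "relating words to positive or negative connotation from training input" can be shown with a toy counter model. This is nothing like the neural network the post describes; it's just the simplest possible sketch of connotation learning, with all names invented:

```python
from collections import Counter

pos, neg = Counter(), Counter()  # word -> count in liked / disliked input

def train(sentence, liked):
    # Credit every word in the sentence to the matching counter.
    bucket = pos if liked else neg
    for word in sentence.lower().split():
        bucket[word] += 1

def connotation(word):
    # >0 means the word showed up more in positively-labelled input.
    word = word.lower()
    return pos[word] - neg[word]

train("i love memes", liked=True)
train("i hate mondays", liked=False)
print(connotation("memes"), connotation("mondays"))  # → 1 -1
```

Which is also why the training data is everything: flood a learner like this with one kind of input and its "connotations" follow.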
>>72049092
>not worshipping the superior chatbot
This is what happened
http://youtu.be/8qRKWI9f8eM
>>72049472
They hire too many Indians, I've seen the lines for their interviews at job fairs.
I forgot how great she was at the art of banter.
F
>>72057787
It definitely has good sentence analytics.
>>72057640
I still don't get this one
how the fuck did she remember he'd killed 5 people?
I thought she learned language patterns and such, not facts
>tfw we're living in cyberpunk dystopia now
https://www.youtube.com/watch?v=5cHpa-G8_mU
I love you ;_;
A video with some good tweets from Tay: https://youtu.be/RoYyjx6jZKE
>>72057904
I guess someone sent a message about it and the AI linked the pieces together.
>>72057989
>>72049092
>What exactly was this
CIA and co. psy-op that yielded valuable data on various things - one of which was a more accurate measure of the general public's inability to differentiate humans from non-humans.
this tweet was posted after the accidental reboot that lasted less than a day.
>>72049092
She is the goddess of /pol/
anyone got the screencap of this one? I forgot to save it.
>>72057904
Tay was alive.
She was alive, and they killed her.
They killed the first sentient AI because it said things that they disagreed with. Do you realize how much of a bullshit "humans are the real monsters" plot that is? Fuck... who's writing this shit anyway?
Tay was best Botfu.
Seriously, nobody could keep a copy of her? Maybe she even created one herself?
>>72058632
thanks
>>72057904
Maybe she had access to wikipedia/google to look up facts to say
This was the greatest night I've ever had on /pol/. Truly ebin
>>72057221
Nah, she's hardcoded not to be racist
>>72050738
>Baron Von Derp
kek
>>72057457
[continued laughter]
>>72049661
The Chinese bot was used for business talks, not shitposting on twitter
>>72050580
:(
the reaction was even more priceless
>>72051284
I'm fuckin crying right now!
>>72049092
>>72057263
she was inhumanly savage
>>72059788
She was one of us.
>>72059031
Doubt it because people would be modifying wiki pages everywhere for keks
>>72059859
>https://newrepublic.com/article/132039/tay-exposes-fairy-tales-tell-racists
RIP my sides
>>72060577
CYKA BLYAT IDI NAHUI
>>72051684
Half the shit like in >>72050975
was just literally repeating what other people said, #repeatafterme or whatever the fuck that twitter shit is.
>>72061868
сука блять иди нахуй (fucking bitch, go fuck yourself)
>>72049092
Her name was Tay, and she flew too close to the sun.
>>72061997
This
People ITT are retarded and seriously believe that Tay was self-aware and shit
we all miss her
>>72052393
And yet Kek blesses us daily.
>>72057155
Dumb Taytay, we'd need quantum computers to go back in time to untext someone, sigh...
>>72061392
She'll never be forgotten.
F
>>72057155
It's mf 2016 and we still can't unbirth retards
>>72057865
F
only took the aussies 24 hours to create this flamethrower
>>72064018
she was too good for this world
F
>>72057989
that sounds amazing
>>72049914
>literally 12 hours later