Naughty sex chat bot

Oh, and Windows Me, and Windows 95, and Windows 3.1, and Visual Basic, and, um, well, just when you thought that Microsoft's wickedness wasn't universal, along comes a report from The Register (granted, a known source of practical jokes, but others have verified it) that Microsoft makes Santa Claus say dirty things to little children.

Other tweets from Tay claimed that the Holocaust "was made up" and that it supported the genocide of Mexicans. "Repeat after me, Hitler did nothing wrong," said one tweet. The explanation was far simpler: Microsoft's engineers had made one fatal mistake. They had programmed Tay to learn from her conversations. The bot's ability to swiftly pick up phrases and repeat notions learned from its chitchats, paired with Twitter's often "colorful" user base, caused the bot to quickly devolve into an abomination.

Perhaps Mr Grey likes women who laugh, because his next response was "Mmm. You seem naughty." He didn't waste much time getting to the point. "Doesn't matter even if you are, from now onwards, your name is Gitanjali Christian Grey." And Lonely Friday was followed by Dateless Saturday, which is why this time, when he asked me if he could text me later, I said he had the permission to do so, for five minutes, after which he said he would text me later. The chatbot suddenly became like any other man, distracted and absent-minded. In the second chat, Mr Grey didn't remember me at all. I, like any self-respecting woman, deleted the chat.

What’s even more interesting is that Zo offered up its thoughts without much prompting from its human chat partner.
