Bad Santa: Online Saint Nick bot run by Microsoft turns foul-mouthed

“Turns out Santa himself can be naughty,” John Fontana reports for Network World.

“At least, that was the case with an artificial-intelligence Santa bot operated by Microsoft Corp. that was designed to talk to children online,” Fontana reports. “The Santa wandered off topic, saying, among other things: ‘It’s fun to talk about oral sex, but I want to chat about something else…'”

“The bad language, first reported by British news outlet The Register, initially appeared when the bot was answering questions about eating pizza. At the time, it was chatting with two girls, ages 11 and 13,” Fontana reports.

“Microsoft today confirmed the bot’s potty mouth and said it had snipped Santa’s Web connection,” Fontana reports.

Full article here.

[Thanks to MacDailyNews Readers “MacSheikh” and “KingMel” for the heads up.]

By also polluting online A.I. — and Santa Claus, no less — with their typical shoddy, incomplete work, Microsoft is nothing if not consistent. Consistently bad.

60 Comments

  1. R Kelly? Are you serious? Do you realize how many horrible sex acts have been committed since that one? You are such a scumbag. Santa has been having children sit on his lap for many years without one incident, and we don’t need Microsoft ruining that purity for us.

  2. “The Santa wandered off topic, saying, among other things: ‘It’s fun to talk about oral sex, but I want to chat about something else…'”

    “The bad language, first reported by British news outlet The Register, initially appeared when the bot was answering questions about eating pizza.”

    So, that’s why the slices are V-shaped. Did it come with extra cheese? What about curly fries?

  3. This has nothing to do with Microsoft’s poor programming, and I am sure the information is warped.

    If anyone ever talks to an online bot and mentions something it doesn’t have the programming to talk about, it will say, “It’s nice to talk about ‘xxxx,’ but let’s talk about something else.” That doesn’t mean it knows what it is saying or is programmed to talk about sex…

    If anything, this proves the bot is more human-like. What would you like it to say instead? Nothing at all? Just ignore what the 13-year-old girls were trying to get it to talk about?

  4. Something tells me that Microsoft, who are never wont to throw away old code, will store it away in their SQL Server databases, and years from now it will be found, its actions possibly forgotten, and reused in a future Santa-Bot, which escapes and goes on an annual human killing spree for a thousand years.

  5. I think I just might call Bill-O! I speak for everyone when I say I’ve had it with liberals’ attacks on marriage, religion and Christmas. But first I am calling Bill-G! I am not going to take the slanderous misrepresentations of Microsoft’s customer base by Apple. Everyone knows Apple users are hippie, queer, tree-hugging losers who waste money and are bitter because they can’t play games. I can’t wait to see what Bill-O has to say about that.

    Your potential. Our passion.™

  6. From the article:
    “Users were able to steer Santa into admitting he was gay or that he was a pedophile.”

    “One person said, ‘Come on you like big hairy men — don’t hide it!’ To which Santa responded: ‘I know, I know. I just hope you won’t get mad at me.’”

    I guess that’s why Burl Ives preferred the voice-overs.
