The toxic potential of YouTube’s feedback loop

Guillaume Chaslot for Wired:

From 2010 to 2011, I worked on YouTube’s artificial intelligence recommendation engine — the algorithm that directs what you see next based on your previous viewing habits and searches. One of my main tasks was to increase the amount of time people spent on YouTube. At the time, this pursuit seemed harmless. But nearly a decade later, I can see that our work had unintended—but not unpredictable—consequences. In some cases, the AI went terribly wrong.

In February, a YouTube user named Matt Watson found that the site’s recommendation algorithm was making it easier for pedophiles to connect and share child porn in the comments sections of certain videos. The discovery was horrifying for numerous reasons. Not only was YouTube monetizing these videos, its recommendation algorithm was actively pushing thousands of users toward suggestive videos of children.

Unfortunately, this wasn’t the first scandal to strike YouTube in recent years. The platform has promoted terrorist content, foreign state-sponsored propaganda, extreme hatred, softcore zoophilia, inappropriate kids content, and innumerable conspiracy theories.

MacDailyNews Take: We’ve already seen the effects of echo chambers in televised and printed “news” and on Twitter, just two examples of where feedback loops can quickly become toxic.


    1. Ignorant and racist and not funny.

      I was doing some videotaping with a rapper in San Diego and he had a show in Tijuana. Club X.
      Before the show we interviewed Bootsy Collins in his room in San Ysidro. Earlier in the day, we had interviewed David Faustino (Bud in Married With Children) in Beverly Hills or Hollywood. Where one ends and the other begins is not my thing. Did you know he is a rapper? Or was.

      Anyway, while Ice-T (Banned in San Diego) was performing, I was chatting with Trick and he wanted to know what my Rap handle would be. I couldn’t think of anything.

      Thanks for the pointer. Since I was a computer guy, algorithm would have been fitting.

    2. Oh yeah. After the show, after we had left, the Tijuana cops stopped Trick and found a ‘joint’ in his car. I was in another car, so it was a real zoo and a confusing few minutes. Trick had to be bailed out.

  1. If you haven’t worked with the general public, you would be amazed at how stupid they are. And that experience for me was 35 years ago. I can’t imagine how much worse it is now.
    That’s what you have pushing the “trending”.
    It would be quite an experiment: give people an IQ test, send only those above a certain level to a duplicate YouTube, and see how much the “trending” differs.

  2. On YouTube, I watched one interview of the Stranger Things cast, then for the next 3 days I was bombarded with literally hundreds of different videos about the Stranger Things cast. The problem with YouTube’s algorithm is that it gives you the SAME TOPIC over and over. You can take the same media concentration and apply it to political news stories, etc. YouTube is feeding each user self-indulgent content. YouTube is not encouraging you to experience a variety of media and content. Instead, they have created an algorithm, similar to a Vegas slot machine, which makes you keep clicking.
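The slot-machine dynamic the commenter describes can be sketched as a toy feedback loop. This is purely illustrative; the topic names, scores, and boost rate are assumptions, not YouTube’s actual system. The point is that when watching a topic raises that topic’s estimated engagement, and the recommender always picks the highest-scoring topic, one topic quickly dominates everything it suggests.

```python
import random

def simulate(topics, watched_topic, steps=50, boost=0.1, seed=0):
    """Toy engagement-maximizing recommender (hypothetical sketch)."""
    rng = random.Random(seed)
    # Start with equal estimated engagement for every topic.
    scores = {t: 1.0 for t in topics}
    history = []
    current = watched_topic
    for _ in range(steps):
        # Watching a video raises its topic's score -- the feedback loop.
        scores[current] += boost * scores[current]
        # Recommend the highest-scoring topic, with a small chance
        # of exploring something else.
        if rng.random() < 0.05:
            current = rng.choice(topics)
        else:
            current = max(scores, key=scores.get)
        history.append(current)
    return history

topics = ["stranger_things", "cooking", "news", "music"]
history = simulate(topics, "stranger_things")
print(history.count("stranger_things"), "of", len(history),
      "recommendations were the same topic")
```

Even with a 5% exploration rate, the first topic watched compounds its score each round, so the vast majority of recommendations collapse onto it.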

  3. People are so predictable, and that is why algorithms work so well. As for stupid: when a few social media tech companies were called before Congress, the tech guys successfully hid behind “but it was the algorithms,” as though the companies had nothing to do with the programming, and the congressmen and women just believed them!

    Stupid, predictable, and gullible!

Reader Feedback
