Search results

  2. YouTube's algorithm more likely to recommend users ... - AOL

    www.aol.com/news/youtube-algorithm-more-likely...

    The study noted that YouTube’s recommendation algorithm “drives 70% of all video views.” ... YouTube recommended videos that pushed voter fraud claims to Donald Trump supporters.

  3. YouTube moderation - Wikipedia

    en.wikipedia.org/wiki/YouTube_moderation

    When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints. [35] [38] When users search for political or scientific terms, YouTube's search algorithms often give prominence to hoaxes and conspiracy theories.

  4. Algorithmic radicalization - Wikipedia

    en.wikipedia.org/wiki/Algorithmic_radicalization

    YouTube's algorithm is responsible for roughly 70% of users' recommended videos and drives much of what people choose to watch. [20] According to a 2022 study by the Mozilla Foundation, users have little power to keep unsolicited videos, including hate speech and livestreams, out of their suggested content. [21] [20]

  5. YouTube algorithms push eating disorder content to teen girls ...

    www.aol.com/youtube-algorithms-push-eating...

    The report, titled "YouTube's Anorexia Algorithm," examines the first 1,000 videos that a teen girl would receive in the "Up Next" panel when watching videos about weight loss, diet or exercise ...

  6. YouTube algorithm suggests videos about disordered eating to ...

    www.aol.com/news/youtube-algorithm-suggests...

    YouTube’s algorithm is recommending videos about disordered eating and weight loss to some young teens, a new study says. ... One in three videos recommended to the simulated 13-year-old girl ...

  7. Alt-right pipeline - Wikipedia

    en.wikipedia.org/wiki/Alt-right_pipeline

    The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics.

  8. YouTube's algorithm pushes right-wing, explicit videos ... - AOL

    www.aol.com/news/youtubes-algorithm-pushes-wing...

    YouTube's algorithm recommends right-wing, extremist videos to users, even if they haven't interacted with that content before.

  9. Rabbit Hole (podcast) - Wikipedia

    en.wikipedia.org/wiki/Rabbit_Hole_(podcast)

    YouTube's content recommendation algorithm is designed to keep the user engaged as long as possible, which Roose calls the "rabbit hole effect". [5] The podcast features interviews with a variety of people involved with YouTube and the "rabbit hole effect". [6] For instance, in episode four Roose interviews Susan Wojcicki—the CEO of YouTube. [2]