When.com Web Search

Search results

  1. YouTube's algorithm more likely to recommend users ... - AOL

    www.aol.com/news/youtube-algorithm-more-likely...

    YouTube's algorithm more likely to recommend users right-wing and religious content, research finds ... “YouTube’s recommendation system is trained to raise high-quality content on the home ...

  2. Algorithmic radicalization - Wikipedia

    en.wikipedia.org/wiki/Algorithmic_radicalization

    Algorithmic radicalization is the concept that recommender algorithms on popular social media sites such as YouTube and Facebook drive users toward progressively more extreme content over time, leading to them developing radicalized extremist political views. Algorithms record user interactions, from likes/dislikes to amount of time spent on ...

  3. YouTube moderation - Wikipedia

    en.wikipedia.org/wiki/YouTube_moderation

    YouTube has suggested potential plans to remove all videos featuring children from the main YouTube site and transfer them to the YouTube Kids site, where there would be stronger controls over the recommendation system, along with other major changes to the recommended feature and auto-play system on the main YouTube site. [128]

  4. Research finds pattern of YouTube recommending right ... - AOL

    www.aol.com/research-finds-pattern-youtube...

    YouTube’s algorithm frequently recommends right-leaning and Christian videos to users who have not previously shown interest in those topics, according to new research released Tuesday. The ...

  5. Rabbit Hole (podcast) - Wikipedia

    en.wikipedia.org/wiki/Rabbit_Hole_(podcast)

    YouTube's content recommendation algorithm is designed to keep the user engaged as long as possible, which Roose calls the "rabbit hole effect". [5] The podcast features interviews with a variety of people involved with YouTube and the "rabbit hole effect". [6] For instance, in episode four Roose interviews Susan Wojcicki—the CEO of YouTube. [2]

  6. YouTube's recommender AI still a horror show, finds major ...

    www.aol.com/news/youtubes-recommender-ai-still...

    For years YouTube's video-recommending algorithm has stood accused of fuelling a grab bag of societal ills by feeding users an AI-amplified diet of hate speech, political extremism and/or ...

  7. Collaborative filtering - Wikipedia

    en.wikipedia.org/wiki/Collaborative_filtering

    The user-based top-N recommendation algorithm uses a similarity-based vector model to identify the k most similar users to an active user. After the k most similar users are found, their corresponding user-item matrices are aggregated to identify the set of items to be recommended (a minimal sketch of this procedure appears after the results list).

  8. You influence recommendation algorithms just as much as ... - AOL

    www.aol.com/influence-recommendation-algorithms...

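The user-based top-N procedure described under the Collaborative filtering result above can be sketched in a few lines of Python. This is only an illustrative sketch under assumed inputs: the toy ratings matrix, the cosine-similarity measure, and the values of k and n are demonstration choices, not details taken from the Wikipedia article or from YouTube's actual recommender.

```python
# Illustrative sketch of user-based top-N collaborative filtering.
# All data and parameter values below are assumptions for demonstration.
import numpy as np

def recommend_top_n(ratings: np.ndarray, active_user: int, k: int = 2, n: int = 3) -> list[int]:
    """Return up to n item indices recommended for active_user.

    ratings: user-item matrix; rows are users, columns are items,
             0 means "no interaction".
    """
    active_vec = ratings[active_user]

    # Cosine similarity between the active user's vector and every user's vector.
    norms = np.linalg.norm(ratings, axis=1) * np.linalg.norm(active_vec)
    norms[norms == 0] = 1e-12                      # avoid division by zero
    similarities = ratings @ active_vec / norms
    similarities[active_user] = -np.inf            # never pick the active user themselves

    # The k most similar users (the neighbourhood).
    neighbours = np.argsort(similarities)[::-1][:k]

    # Aggregate the neighbours' rows, weighted by similarity, to score items.
    scores = similarities[neighbours] @ ratings[neighbours]

    # Do not re-recommend items the active user has already interacted with.
    scores[active_vec > 0] = -np.inf

    ranked = np.argsort(scores)[::-1][:n]
    return [int(i) for i in ranked if scores[i] > 0]

if __name__ == "__main__":
    # Rows: 4 users; columns: 5 items (e.g. videos). Values are toy watch/like signals.
    toy_ratings = np.array([
        [5, 4, 0, 0, 1],
        [4, 5, 1, 0, 0],
        [0, 1, 5, 4, 0],
        [5, 0, 0, 0, 0],   # the active user: has only interacted with item 0
    ], dtype=float)
    print(recommend_top_n(toy_ratings, active_user=3, k=2, n=3))
```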