YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest

New research shows that a person’s ideological leaning might affect what videos YouTube’s algorithms recommend to them. For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories and otherwise problematic content.

  • dexa_scantron@lemmy.world · 1 year ago

    They don’t do it to everyone. Some users get put in test groups that get a ‘nice’ algorithm that doesn’t try to make them angry, so YouTube can measure the effect on its revenue.
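
    If that’s accurate, it would amount to a standard holdout (A/B) experiment: deterministically bucket a small slice of users into a different recommendation variant and compare revenue between arms. Below is a minimal sketch of that idea, purely for illustration; the experiment name, holdout percentage, and revenue figures are made up and are not anything documented about YouTube.

```python
# Illustration only: how a platform *might* assign users to a holdout arm
# that gets a different recommendation algorithm, then compare revenue.
# Names, percentages, and amounts here are hypothetical, not YouTube's system.
import hashlib
from collections import defaultdict


def assign_arm(user_id: str, experiment: str = "calm_recs", holdout_pct: float = 0.01) -> str:
    """Deterministically bucket a user; a small holdout gets the 'nice' algorithm."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 10_000          # stable bucket in [0, 10000)
    return "nice" if bucket < holdout_pct * 10_000 else "default"


# Hypothetical per-user ad revenue events, aggregated by experiment arm.
revenue_by_arm = defaultdict(float)
for user_id, ad_revenue in [("alice", 0.12), ("bob", 0.30), ("carol", 0.05)]:
    revenue_by_arm[assign_arm(user_id)] += ad_revenue

print(dict(revenue_by_arm))   # compare revenue between the 'nice' and 'default' arms
```

    Hashing the user ID keeps assignment stable across sessions, which is what lets a per-arm revenue comparison be attributed to the algorithm difference rather than to which users happened to be sampled each day.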