YouTube Extremism And The Long Tail

Zeynep Tufekci, the insightful scholar of sociology in the internet era, argued over the weekend that YouTube is unwittingly radicalizing some of its viewers through the videos it automatically recommends next.

She was watching Donald Trump rallies while conducting research, sitting through clip after clip, when eventually she noticed “autoplay” videos “that featured white supremacist rants, Holocaust denials and other disturbing content.” Then she watched a bunch of Hillary Clinton and Bernie Sanders videos. Soon, “I was being directed to videos of a leftish conspiratorial cast,” she wrote, “including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11.”

The pattern held across other topics:

Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons. It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm.

It promotes, recommends, and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

She posits that, in Google’s effort to keep people on its video platform as long as possible, “its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with—or to incendiary content in general,” and adds, “It is also possible that YouTube’s recommender algorithm has a bias toward inflammatory content.” She believes we are witnessing “the computational exploitation of a natural human desire: to look ‘behind the curtain,’ to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.”
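Her hypothesis can be made concrete with a toy sketch. This is purely illustrative, not YouTube's actual system: assume a recommender that ranks candidate videos only by predicted watch time, and an engagement model in which content slightly more intense than the current video holds attention longer. Under those two assumptions, escalation falls out of the optimization with no one designing for it:

```python
# Hypothetical sketch of an engagement-maximizing recommender.
# All names, scores, and the "edginess" label are invented for illustration;
# nothing here reflects YouTube's real models or data.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    edginess: float      # 0.0 = mild, 1.0 = extreme (illustrative label)
    base_quality: float  # engagement unrelated to intensity

def predicted_watch_time(current: Video, candidate: Video) -> float:
    """Assumed engagement model: viewers stay longer on content that is
    somewhat more intense than what they just watched."""
    escalation = max(0.0, candidate.edginess - current.edginess)
    return candidate.base_quality + 2.0 * escalation

def next_up(current: Video, candidates: list[Video]) -> Video:
    # Pure watch-time maximization: no penalty for extremity,
    # so the biggest step up in intensity wins.
    return max(candidates, key=lambda c: predicted_watch_time(current, c))

catalog = [
    Video("Jogging tips", 0.1, 1.0),
    Video("Marathon training", 0.4, 1.0),
    Video("Ultramarathon survival", 0.8, 1.0),
]

# Starting from the mildest video, the ranker skips the moderate option
# and jumps straight to the most extreme one, because the larger
# escalation yields the higher predicted watch time.
print(next_up(catalog[0], catalog[1:]).title)  # → Ultramarathon survival
```

The point of the sketch is that "up the stakes" behavior needs no explicit bias toward extremism in the code; a correlation between intensity and engagement in the objective is enough.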
