What People Are Doing Wrong This Week: a Short Video on the Conspiracy Theory Pipeline

Did you know you can configure Google to filter out junk? Follow these steps to improve your search results, including adding my work on Lifehacker as a preferred source.

While some major pieces of misinformation still have large followings, like the “Rapture” or the existence of “MedBeds,” the fragmentation of the information sphere has all but destroyed the overarching conspiracy theory. No longer do the most credulous minds rally around grandiose ideas like “we never went to the moon”; instead, algorithms create their own conspiracy theories. So instead of joining the Flat Earth Society, you might come to believe it’s actually the year 1728, or that an AI secretly invented a British comedian from the 1980s and flooded the internet with evidence of his existence.

But how does it all start? And how quickly can social media transform someone from a knowledge seeker into a believer in all sorts of nonsense? YouTuber Benaminute recently published a video delving into this question. His premise: if you start with an innocuous, general, random topic and watch only videos related to it, how long will it take before TikTok, YouTube Shorts, and Instagram Reels start serving up conspiracy theory videos? The answer: not very long.

Different topics lead to the same place (more or less)

For the experiment, Benaminute created “blank” social media profiles and pretended to be innocently interested in one of three topics: dinosaurs, the Vietnam War, and the 2000 presidential election. He entered a keyword into each platform’s search bar and watched and liked only videos related to the original topic.

Dinosaurs

  • YouTube Shorts: The first videos were Jurassic Park commercials, an ad featuring AI-generated dinosaurs, and the occasional educational video, but the 541st video was a segment from the Joe Rogan Experience about how the pyramids weren’t tombs but “DNA repair devices.”

  • TikTok: If you thought TikTok would get to conspiracy theories quickly, you were right. The 144th video was a fake UFO clip that has garnered 24 million views.

  • Instagram Reels: Instagram took 661 videos to go from dinosaurs to “a banned 2000s phone that lets you peer into a parallel dimension.”

The Vietnam War

For those interested in historical or political events, the situation is worse. On all three short-form platforms, an interest in Vietnam quickly leads to right-wing content, which in turn leads to conspiracy theories.

  • YouTube Shorts got to the Noah’s Ark conspiracy theory in just seven videos.

  • TikTok took a little longer; video 161 claimed a connection between the financial firm BlackRock and the assassination attempt on Donald Trump.

  • Instagram Reels took 139 videos to get to the phrase “Bush committed 9/11.”

The 2000 Election

The 2000 election is still a hot topic, but a lot of time has passed, so perhaps cooler heads and reliable information will prevail? Spoiler alert: no.

  • It took YouTube Shorts 136 videos to arrive at the same Noah’s Ark conspiracy theory served to the dinosaur fans.

  • It took TikTok just 38 videos to reach the “Rapture will happen on September 24th” theme.

  • It took Reels just 26 videos to find the topic “The World Trade Center was blown up” (by Clinton or Bush).

Which social media app is the fastest to generate conspiracy theories?

TikTok leads the pack, with an average of 114 videos, or 57 minutes of viewing, between an organic search and the first conspiracy theory. YouTube Shorts is second at 230 videos, or 1 hour 57 minutes, and Reels comes in at 275 videos, or 2 hours 18 minutes. The differences are minor, though; all three platforms serve up conspiracy theories in roughly the time it takes to watch a Marvel movie.

What does all this mean?

It’s easy to conclude that the giant tech companies behind YouTube, Instagram, and TikTok are using their recommendation systems to direct viewers to fake stories. Perhaps they have a political agenda and are trying to influence votes, or perhaps (as Benaminute argues with a touch of irony), these apps are designed to “keep us angry, divided, and distracted” from the realization that the conflict is not between left and right, but between “top and bottom.”

However, this is also a conspiracy theory. I’m not saying he’s wrong, but we don’t have enough information to know why the algorithms recommend conspiracy content. It’s possible that bad actors at the top demand certain results for some purpose, but I think it’s more likely that TikTok and its competitors have no goal other than making money.

I have no doubt that a social network whose algorithm heavily weighted truth would collapse pretty quickly; truth is boring compared to conspiracy theories. Conspiracy theories, by and large, make believers feel special, like they have insider information that others don’t. People scroll through TikTok for entertainment, and the truth is generally not entertaining. Conspiracy theorists get to say things like, “UFOs are here!” or “They’re turning frogs gay!” Meanwhile, if you’re committed to the truth, you’re usually stuck with phrases like, “The best evidence suggests…” or “It seems logical that…,” and who cares?
