Down the Rabbit Hole


Ella Rosewarne

During the pandemic, Hazel Derry spent countless hours scrolling through media sites and stumbling across popular and dangerous conspiracy theories. “If someone sees something, they might just take it to be the truth,” Derry said. “[We’re] so impressionable and it can change someone’s trajectory.”

The glowing screen lit up Hazel Derry’s dark room, the ‘truth’ flickering before her eyes. At the beginning of quarantine, during her freshman year, Derry was barely sleeping.

The pandemic saw an uptick in internet usage, especially among young people. As popular platforms such as TikTok and Instagram continued to receive traffic, America’s youth had more and more contact with misinformation. Derry spent hours skimming through YouTube and TikTok, watching videos recommended to her. In the midst of quarantine, Derry stumbled upon a QAnon rabbit hole: PizzaGate.

PizzaGate is the most popular QAnon conspiracy theory. The conspiracy states that a group of evil politicians and celebrities run an underground pedophilic sex-trafficking network that secretly controls the country. A 2020 NPR/Ipsos poll found that 17% of the American public believed that PizzaGate had factual merit.

Looking back, Derry realizes that “it was really, really, really crazy.”

Why have so many outlandish theories, such as the ones spread by QAnon, recently come to the forefront of our elections and our political discourse? The exponential growth of social media and easy-to-use technology is one culprit, as it helps conspiracy theorists—and anyone with a particular point of view—gain easy access to large swaths of people, who spread ideas amongst themselves. Twitter, Instagram, TikTok and Facebook algorithms feed the flame: they tailor our pages to us, exposing viewers to posts and ideas they have shown interest in, in hopes of creating stronger global communities.

As high school students, we find ourselves at the mercy of the Internet; every day we are exposed to new information—Instagram infographics, news headlines, TikTok videos—telling us about a new global issue, natural phenomenon or unbelievable statistic. How do we decide whether what we read is true, or whether our chosen content creators are just telling us what they know we want to hear?

Into this vacuum come conspiracy theorists. Some of the most successful and well-known conspiracy theories are largely baseless, or provably false. For example, QAnon, an alt-right group where many conservative theories are born, is based on the belief that Donald Trump is secretly saving the US from a group of evil, elitist Democrats. These beliefs have fueled a radicalization made real in events such as the January 6 insurrection, the break-in at the Pelosi home and the kidnapping plot against Michigan Governor Gretchen Whitmer.

The first video Derry watched was on TikTok; it was a short video describing a lesser-known conspiracy. Derry didn’t think much of it and continued scrolling. But after the initial video, more and more similar videos popped up. The more she watched, the more she believed what the videos were saying.

“A lot of [the videos] were totally not credible at all,” Derry said. “But then some of the PizzaGate or QAnon videos would actually come up with fake statistics and documentation. It was really crazy how easily I got sucked into it.”

The successful spread of a conspiracy theory relies on finding a vulnerable audience—users who are relatively naive and/or don’t have the time to conduct their own research on an issue.

“[I watched the videos] during quarantine,” Derry said. “I was bored out of my mind and I wasn’t sleeping. I was not thinking critically or coherent[ly] at all. So I was like, ‘oh, [the theories] are totally real!’”

Due to the compounding algorithms of social media sites, supporters of extreme ideologies find themselves surrounded by like-minded thinkers, comfortable and unopposed, despite the fact that their beliefs have little basis in fact. These streamlined algorithms are not only designed to match individuals to like-minded people but also to keep the viewer interacting with the media. The more outlandish the content, the more attention it attracts; in a study spanning more than a decade of Twitter data, data scientist and researcher Soroush Vosoughi found that tweets containing false information were 70% more likely to be retweeted than tweets providing truthful information. The algorithm learns that extreme views perform well, and as users spend more time on the platform, they are fed increasingly radical content.

“There’s an addictive aspect of [the videos] because you can kind of come up with a conspiracy theory about anything,” Derry said. “[That’s] what’s harmful about it because [the theories] are not based in logic.”

Derry’s belief in these theories was rooted in her vulnerable age and mental state. Derry and other young people are particularly susceptible to these algorithms. A 2018 study by HopeLab found that 93% of teens between the ages of 13 and 17 were active social media users. Every moment a user spends on social media increases the data collected by the algorithm.

False information is everywhere—from exaggerated content to a call to arms over an extreme conspiracy. The internet has provided the primary platform for these theories, allowing ideas to easily reach millions of people. In 2019, the FBI characterized fringe conspiracy theories as a domestic terrorism threat. With the increased accessibility of the internet, false information will continue to grow and spread, affecting our politics, our worldview and our lives, unless we develop strategies for detecting misinformation.

“There need to be more ways to fact-check things on the internet,” Derry said. “Make sure you know that your information is coming from a reputable source. Also, just simply think, ‘does this make sense?’ There [also] really have to be bigger restrictions on what people can say on social media. The community guidelines should be a lot stricter. Because anyone can stumble upon something like that. And they might not know what to do with the information.”