Astroworld tragedy is 'perfect storm' for conspiracies as 'dangerous TikTok algorithms force Gen Zs down rabbit hole'

THE viral crackpot conspiracy theories surrounding the Astroworld tragedy illustrate how so-called "tech-savvy" Gen Zers are increasingly vulnerable to misinformation on social media, an expert has warned.

Jennifer Stromer-Galley, a professor studying social media platforms at Syracuse University, told The Sun the commonly held belief that Generation Z has a greater understanding of social media than older demographics is "fiddlesticks."


"Just because people who grow up with these technologies does not mean in any way that they're more savvy, knowledgable, or wise to how it all works," she said.

"You know, we were saying the same thing about millennials with the World Wide Web and the kind of the early incarnations of social media. Some see it as, 'Well, they grew up with this. They must understand how this works,' and they don't.

"Put it this way: Just because you grew up around cars and drive a car doesn't mean you understand fundamentally how it works or how to fix it – social media is no different."

In fact, Stromer-Galley believes that younger social media users may even be more susceptible to fringe conspiracies than their millennial or baby boomer counterparts.

This, she explains, is because humans are innately "drawn to sensational content", and teenagers even more so because "the frontal cortex [of their brains] is still developing, so they tend to be more impulsive and therefore more drawn to sensationalism and more likely to believe it."

SATANIC PANIC AT ASTROWORLD

The professor's comments come after a myriad of conspiracy theories circulated on social media platforms such as TikTok and Twitter in the wake of Travis Scott's ill-fated Astroworld concert last Friday in Houston, Texas.

Nine people died and hundreds of others were injured when the crowd of around 50,000 people suddenly surged towards the stage.

While the cause of that surge remains under investigation, videos ludicrously claiming the tragedy was actually a large-scale "satanic blood sacrifice" orchestrated by Scott have been racking up millions of views online.

Others have bizarrely suggested the crowd had been bewitched by a devilish spell moments before the crush.


Apparent believers of the unfounded claims have seized on several far-fetched "proofs" to support their wild narratives, including supposed demonic symbolism in the design of the stage and in promotional material for the event.

One of TikTok's most popular videos on the subject, which racked up a staggering 23 million views, showed a hologram of a winged bat-like creature at the beginning of the show, surrounded by fire.

Captioned, "Not even 40 seconds in," the video's comments section was rife with conspiracy theories about satanic presences and perceived occult symbolism.

“Look at the symbolism!! A DOVE, a symbol of the human soul, ON FIRE?!?! They knew what they was doing. This is pure evil,” one of the top comments, which has 77,000 likes, reads. 

A second clip that had upwards of 800,000 views showed a photo of the stage, highlighting eight flaming pillars.

"For those saying this wasn’t satanic. 8 pillars of flames and 8 people dead," the caption read.

PERFECT STORM

Stromer-Galley described the tragedy as the perfect storm for conspiracy theorists.

She said Travis Scott's celebrity status, his energetic performances, and his tendency to instruct his audiences to "rage", combined with the infernal imagery on stage, make it "unsurprising that you would see some of these stories come out."

While some TikTokers might be playfully indulging in the theories and sharing them simply to boost their "micro-celebrity status", in an online universe where it's often difficult to discern sincerity from insincerity, Stromer-Galley warns that "one person's play can quickly become another's passion or real world view."

"I think that's a challenge with any of this misinformation stuff because I really do think that some people or creators of what we would call misinformation are just doing it because they're trying to increase clicks," she said.

"There are people who are creating this content specifically because they know that people can't help but watch it.

"We're attracted to controversy, we're attracted to these crazy stories of demonic rituals, and the event is a hot topic of conversation right now – everyone's talking about it."



Stromer-Galley continued: "So if you're going to monetize something or capitalize on something [to amass a larger following] you're going to focus in on a current event that's already got a lot of people's attention.

"But for ordinary social media users, for instance, the 15 and 18-year-olds on TikTok watching these clips, they don't necessarily know that.

"They're more likely to believe it to be real, especially if they think their friends might also believe it because they're very social at this age and get a lot of kind of chemical positivity when they're connected with other people.

"Young people, on average, spend an immense amount of time online. And so they kind of create this reality for themselves. That gets reproduced, reinforced until it looks real – whether or not it really ever was."

DIGITALLY NATIVE, BUT DIGITALLY NAIVE?

In 2018, around half of American teenagers reported being online "nearly constantly" and 59 percent said they consider YouTube to be their preferred learning method over books and in-class teaching.

A similar poll conducted by POLITICO/Morning Consult last year found that, during the election cycle, Gen Z looked to YouTube, Instagram, and TikTok for updates on the presidential race, rather than traditional news outlets or text-based media.

A total of 59 percent of respondents said they used YouTube; 53 percent said Instagram; 43 percent said Facebook; and 32 percent said TikTok.

Comparatively, 40 percent said they used TV news for election updates and only 21 percent reported reading newspapers.

The problem with photo and video platforms such as TikTok and Instagram is that the origins of information can be so easily obfuscated.

Some studies show that Gen Zers are better at spotting fake news stories or misinformation than their parents.

However, because they are inundated with a quantity of information not seen by earlier generations, even those who are generally adept at discerning fact from fiction may not have the time or desire to do so.

Other studies have found that Generation Z, colloquially known as zoomers, is actually no better at spotting falsehoods online than millennials or boomers.

A 2015 Stanford University survey found that 82 percent of middle schoolers couldn't tell the difference between an advertisement and a news story.

The survey also found that over 30 percent of middle schoolers surveyed considered a fake news story more credible than a real one.

A follow-up study conducted four years later by the Stanford History Education Group yielded similar results, finding that 96 percent of high school students failed to question the validity of an unreliable source in a story about climate change.

More than half also believed a video purporting to show ballot stuffing in the US, even though all of the clips were actually filmed in Russia.

'TOXIC' TARGETED ALGORITHMS

Stromer-Galley believes it's a myth that zoomers, simply because they have grown up with social media, have a better understanding of its inner workings than other generations.

While they may be digitally native, they're still digitally naive, she said.

And one of the biggest threats posed to Gen Z when it comes to misinformation is the targeted algorithms of the sites they glean information from, Stromer-Galley added.

TikTok, like Instagram, shows videos algorithmically, rather than chronologically.

This means that users won't see videos or images in the order they're posted, but rather when an algorithm, informed by their engagement habits, deems them interesting enough to surface.

Therefore, anyone who engaged with a post about an Astroworld conspiracy theory is likely to see similar posts more frequently as they keep scrolling.
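
To make the mechanism concrete, here is a minimal, hypothetical sketch in Python of an engagement-weighted feed ranker. The topic labels, weights, and scoring rule are invented for illustration only; this is not TikTok's or Instagram's actual system, which is proprietary and far more complex.

```python
# Minimal illustrative sketch of an engagement-driven feed ranker.
# NOT any platform's real algorithm: the topics, weights, and scoring
# formula here are invented purely for illustration.
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    topic: str          # hypothetical topic label, e.g. "conspiracy" or "music"
    popularity: float   # baseline engagement signal (views, likes, shares)

@dataclass
class User:
    # Per-topic affinity, learned purely from this user's past engagement.
    affinities: dict = field(default_factory=lambda: defaultdict(lambda: 1.0))

    def engage(self, post: Post) -> None:
        # Every like, share, or rewatch nudges affinity for that topic upward;
        # this is the feedback loop described above.
        self.affinities[post.topic] *= 1.5

def rank_feed(user: User, candidates: list) -> list:
    # Score = baseline popularity weighted by the user's learned affinity,
    # so topics the user has already engaged with float toward the top.
    return sorted(
        candidates,
        key=lambda p: p.popularity * user.affinities[p.topic],
        reverse=True,
    )

if __name__ == "__main__":
    posts = [
        Post("a", "music", popularity=100),
        Post("b", "conspiracy", popularity=60),
        Post("c", "conspiracy", popularity=55),
        Post("d", "sports", popularity=90),
    ]
    user = User()
    print([p.post_id for p in rank_feed(user, posts)])  # ['a', 'd', 'b', 'c']

    # Two engagements with one conspiracy clip...
    user.engage(posts[1])
    user.engage(posts[1])
    # ...and related posts now outrank more popular, unrelated content.
    print([p.post_id for p in rank_feed(user, posts)])  # ['b', 'c', 'a', 'd']
```

Even in this toy version, a couple of engagements are enough to push related clips above more popular, unrelated content – the rabbit-hole effect Stromer-Galley describes.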


Stromer-Galley called targeted algorithms a "huge problem", highlighting how they can force teens down a rabbit hole of misinformation.

"There algorithms are trained on our user behavior. So every time I engage with an ad or every time I click on a story, or I read somebody's post and like it, it that further reifies that behavior.

"So what happens with things like Astroworld, or other types of sensation content – because we're hardwired to not look away from things that are more extreme, more emotional or more violent – is that the algorithm picks this up and then reproduces so we get more kind of extreme content, more conspiracist information, more emotional content, more violent content.

"That is potentially, I think, incredibly toxic for everybody. But it's especially toxic for teens and younger people who are so connected to their phones."

A potential remedy for the problem would be to introduce more randomized content to the timeline of social media apps, Stromer-Galley said.

However, doing so would run contrary to the platform's or tech company's imperative to make money, she added.

"You have to question whether someone like TikTok or Facebook would really want to do things like randomizing content more or allowing people to drop back into a chronological newsfeed rather than an algorithmic news feed – because ultimately they lose.

"There's a reason they do it the way they do because it keeps people on the platform."

CALLS FOR CHANGE

Stromer-Galley pointed out that some platforms, including Instagram and TikTok, provide users with the opportunity to request to see less of a certain type of post or ad by manually clicking "See less of these posts" or "This is not relevant to me."

Users can also report or flag content they deem harmful or misleading.

However, the problem with these features, she says, is that they require users to be proactive when so much of social media consumption and interaction is passive.

"Furthermore," Stromer-Galley added, "the act of actually clicking on the button to say 'no, I don't want to see this anymore' once isn't actually going to fix it.

"You have to do it over a series of times and days to actually see a change on your tailored algorithm."

Substantial changes are unlikely to be implemented short of breaking up big tech firms or comprehensive government regulation, Stromer-Galley believes.

The long-term implications of failing to intervene remain unclear, she said, but she theorized that the rise of micro-targeted online content could sow more division in wider society.

"One of the things I've been contemplating is that what we're really seeing is a return to feudalism or tribalism of sorts.

"We're increasingly consuming less mass content – such as mass media – and instead consuming micro-targeted content to people like us.

"This is creating smaller information and interest communities. And the kind of global mass connection that we had to people who are different from us seems to be dissolving. 

"So polarization is increasing, resulting in an 'us vs them' narrative, where 'I've got my people, you got your people, you've got your beliefs, and your ideology and your facts. I've got my beliefs and ideologies, my facts and we're just going to be at war with each other because we can't even agree on the basic facts.'"

'US vs THEM'

Conspiracy theories can drive a similar chasm between people online and in the real world, Stromer-Galley continued.

"One of the key characteristics of conspiracies is that they're elaborate. There's a lot of story behind them … and those details help people to explain the otherwise unexplainable or bizarre chain of events.

"And it becomes incredibly difficult to counter because it becomes almost ideological, and very 'us vs them'.

"Just telling them the story isn't convincing isn't enough because really all this is about storytelling.

"The only way to correct misinformation is to provide a more compelling story that still fits with the ideology, and that's hard to do."

The satanic panic conspiracy theories swirling around last Friday's Astroworld tragedy spread uninhibited for several days before TikTok announced on Wednesday that it was taking action to remove them, citing a breach of community guidelines.

As recently as Tuesday, searching "Astroworld" on the app would bring up "Astroworld demonic" as the second suggested search term.

Even after TikTok's announcement, misspelled or similar phrases, such as “atroworld demonic,” “astroworld conspiricy,” and “astroworld portal to hell,” were still visible in the suggested searches bar.

PAST FAILURES

TikTok has a long history of failing to curb conspiracy theories on its platform.

Last summer, the platform gave rise to the infamous Wayfair sex trafficking conspiracy theory, which suggested the online furniture retailer was secretly smuggling children in containers for its high-priced orders.

Various QAnon conspiracies and election fraud claims have also proliferated on the app in months prior.

TikTok has not yet responded to a request for comment from The Sun.

