*To protect her privacy, the advertising professional mentioned in this story will be referred to as Source A.
Algorithmic radicalization, the idea that social media companies push radical content, is not a new concept. But with the election and ever-deepening polarization between political parties, fears about it, real or imagined, are only growing.
Citizens across the country use social media as a news source. However, some research suggests that as the content users see grows more radical, so do their ideological stances, which in turn shape the content recommended to them: a constant, self-reinforcing feedback loop.
Performance Marketing Director Source A, who previously worked for Meta, currently creates advertisements on various social media platforms. She believes that extreme content goes viral because it stops users’ fingers mid-scroll. This finger-stopping content is so valuable in marketing that the companies Source A has worked for have spent hundreds of millions of dollars trying to achieve it on behalf of their clients.
Like Source A, Lynbrook High School library media teacher Amy Ashworth has noticed that content meant to evoke an emotional response gets more likes, regardless of whether it’s factually correct. Most recently, Ashworth has noticed an influx of misinformation about Hurricane Milton, with some social media creators spreading outdated or false information to garner more likes. For this reason, she never takes the news she sees on social media at face value and always double-checks it against a reputable news source. She believes that algorithms tend to push content that users enjoy or engage with, causing them to spend more time on the platform and creating an echo chamber. Posts that challenge users’ worldviews, on the other hand, are suppressed and hidden.
“The algorithm is like a waiter at a restaurant coming to you and asking, ‘What do you like? What do you not like about this? Is there anything that I can change?’” Ashworth said. “They’re trying to give you what they deem as the best experience.”
In addition to divisive content and echo chambers, algorithms have been shown to promote harmful content because of the high engagement such posts receive. According to an article by the BBC, social media companies show violent and misogynistic content to teenagers, especially young boys, and turn a blind eye to the harm this content poses. A study by University College London found that these algorithmic processes spread hateful ideologies and misogyny, embedding them in youth culture.
In line with this finding, Ashworth believes that the combination of social groups and social media can affect one’s worldview and the development of one’s beliefs. When the Israel-Hamas war broke out, social media posts pushed her son, a student at California Polytechnic State University, toward supporting one side. He felt torn between maintaining an unbiased point of view and supporting his friends, many of whom were of Middle Eastern descent. In response, he began to “recalibrate and recenter” his viewpoint and to be conscious of the content he interacts with, in order to view events from a less biased perspective.
Yale University Assistant Professor of Sociology Daniel Karell says that social media, considered alone, correlates only weakly with increases in political violence. He agrees that algorithms promote sensational content, but cautions against blaming them wholly for radicalizing people. Rather, “economic inequality, cultural grievances and a sense of cultural and political loss, cable news and other media, and the rhetoric of political leaders,” along with radical and insular social media circles, can cause real-world extremist and terrorist activity. Karell believes social media may attract most of the blame because its rise coincided with the rise in political violence: it is simpler to blame the algorithm for real-world problems than to address the more complex causes.
“Consuming sensational or extremist content does not mean that someone is going to commit political violence,” Karell said. “After all, most people select into consuming content — that is, they seek out the content they want. For example, racists seek out racist content.”
In line with Karell’s point that radical people seek radical content, junior Khushi Chetty cites her father, who consumes political content on YouTube, as an example of someone who seeks out content matching his previously held political views. But while his viewpoint informs the kind of content he consumes, she believes the content also further shapes his opinions, an ongoing self-reinforcing loop. She has noticed her father’s political stances drift ever farther from the center because of social media. Chetty, who says she doesn’t currently hold strong opinions on political or social topics, believes that if she began seeing left- or right-leaning content on her social media feed, she might start to believe it.
The promotion of viral misinformation by algorithms, combined with the insular “filter bubbles” that form when algorithms prioritize content users already agree with, makes an ideal breeding ground for users to self-radicalize. To Chetty, this has potential implications for the political state of America.
“To a certain extent, it may have influenced the election, because there’s a lot of people who are easily influenced by what they see on social media,” Chetty said. “I do think it could definitely alter the opinions that they previously had based on new information they find on social media. Especially with biased videos, they could definitely impact people’s perceptions of big events.”
Like Karell, Ashworth agrees that polarization doesn’t necessarily contribute to offline violence. However, she says social media has strong ties to real-world political events because of its instantaneity and interconnectedness. Because it allows people to plan events and coordinate gatherings, she believes offline violence is more likely to happen at a larger scale. She also says that emotionally impactful events feel far more prevalent because of social media.
“Before, you didn’t hear about the event until after it had happened, maybe a week later,” Ashworth said. “Wherever you are in the U.S., all of a sudden you have all of this news that’s popping into your feed because that information is readily available and consumable. That information push is so instantaneous that something that was once a small bubble is now a massive bubble, and it has a ripple effect in larger groups of people.”
As the school’s library media teacher, Ashworth teaches research and media literacy skills, and she holds strong convictions that all people, especially high school students, should learn to recognize content that is harmful to believe and that could drive them to insulate themselves from outside perspectives.
“Once you’re in your little bubble, you tend to not want to leave your bubble,” Ashworth said. “That’s a polarizing effect, and since these algorithms are feeding you information you’re staying on one side, whether you realize that you are or not, without taking anything else into account. I would recommend that people just take it with a grain of salt and say, ‘OK, well, this is what I’m hearing from these people. But I’m going to investigate myself.’ Very few of us actually investigate for ourselves.”