Graphic by Sophia Ma
The Facebook Papers are a set of internal documents revealing the inner workings of and disputes within the social media giant, which sparked a series of investigations into the practices and methods the company employs.
Released by whistleblower Frances Haugen in late October, the documents provide details alleging that Facebook “intentionally hides vital information from the public” and prioritizes “profits over people” through its platforms. As an example, Haugen suggests Facebook propelled political polarization by not taking down posts that violated its policy. The effects of this materialized during the Jan. 6 insurrection, which Haugen claims Facebook knew about beforehand and sat idly by.
The company allegedly doesn’t take down posts that “almost certain[ly] violated Facebook’s … policy,” citing free speech as a rationale, with Mark Zuckerberg, Facebook’s CEO, defending this decision by claiming Facebook’s “position is [one] that should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies.” This internal discrepancy has led to disputes among employees and prompted Haugen to ask: although “Facebook did not invent partisanship [or] polarization, what choices did Facebook make to expose the public to greater risk than was necessary?”
A key component of Facebook’s appeal is its content, which is specifically tailored to each user by a detailed algorithm that tracks the user’s digital movement on and off the app in order to cater to their interests. This personalization enhances the user experience, but can also fuel radical ideology by feeding users content that aligns with their existing views. School-based therapist Richard Prinz sees this tactic as a way to “take [users] down rabbit holes,” increase app engagement and drive up profits.
“If you keep watching Fox News they’ll keep feeding you Fox News so you [stay] on [the] internet and on your device,” Prinz said. “The more they can keep you on their device, the more advertising they can [get]. They’ll feed you what you’re watching; if you like looking at cute little dogs, then they’ll keep feeding you cute little dogs.”
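The feedback loop Prinz describes can be illustrated with a minimal sketch. This is a hypothetical toy recommender, not Facebook’s actual algorithm: it simply ranks candidate posts by how often their topic already appears in a user’s viewing history, so each view makes similar content more likely to surface next.

```python
from collections import Counter

def recommend(history, catalog, n=3):
    """Toy engagement-driven recommender (illustrative only): rank
    catalog items by how often their topic appears in the user's
    viewing history, reinforcing whatever the user already watches."""
    topic_counts = Counter(item["topic"] for item in history)
    # Items matching the most-viewed topics rank first, so every
    # additional view narrows future recommendations further.
    return sorted(catalog, key=lambda item: -topic_counts[item["topic"]])[:n]

# A user who mostly watches dog videos keeps getting dog videos first.
history = [{"topic": "dogs"}, {"topic": "dogs"}, {"topic": "news"}]
catalog = [{"topic": "dogs", "id": 1}, {"topic": "news", "id": 2},
           {"topic": "cats", "id": 3}]
print([item["id"] for item in recommend(history, catalog)])  # → [1, 2, 3]
```

The self-reinforcing ranking is the “rabbit hole”: content outside the user’s established interests (here, “cats”) sinks to the bottom of the feed.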
This constant feedback loop also leads to increased app usage which, if prolonged, can harm users. According to a study by MentalHealth.net, 81% of online teens use some type of social media and 77% of online teens use Facebook. The study also found that 10% of Facebook users display “disordered social networking,” also known as internet addiction. Social media addiction is especially challenging for teens because it can severely impact their development.
Prinz also believes excessive internet usage can lead to a lack of communication skills and decreased empathy, which he says dehumanizes interpersonal interactions. Prinz saw this in a student who used the internet to “vent and take out his sadness or his anger on others and feel better about himself,” leaving the student unable to properly interact with others in person.
“It goes against our human nature,” Prinz said. “We’re brought up by closeness, by touch and contact with people. [With] the internet, even though sometimes it feels like we can connect, we’re not — it’s not physical, face-to-face.”
The Papers also reveal that Facebook conducted an internal investigation into the adverse effects its platforms posed to teenagers, finding that another of its social media platforms, Instagram, exacerbated body image issues for one in three teenage girls. Junior June Wang says she isn’t surprised by this finding and sees Instagram as a place where the “ideal body type” of being skinny is magnified due to the glorification and exposure on these platforms. After starting to use Instagram more actively, Wang felt she was “definitely more conscious of [her]self and [her] physicality,” something she wasn’t as aware of before.
However, when senators sent a letter in August requesting the results of the study, Facebook declined, claiming it was hard to determine how much screen time was “too much,” and withheld the statistics until Haugen shared the information. During a hearing after the leak, Senator Richard Blumenthal called the research a “bombshell; it is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children and that it has concealed those facts and findings.”
According to a study conducted by the Royal Society for Public Health, Instagram and Facebook are the social media sites most adverse to teenage mental health. This is because teens are prone to “seeking validation from not only their friends, but from complete strangers. However, the pressure to be socially accepted can prove too much for some, which in turn, can lead to low self-esteem.”
Facebook’s platforms are also known to enable a comparison mindset among users due to the increased “expos[ure] to other people’s profiles,” propelling an unhealthy outlook on their own lifestyles. School financial specialist Calvin Wong, who deleted Facebook in 2018, thinks that on Facebook “you definitely only see a side of people’s life.”
The app’s wide-ranging features allow users to edit, filter and post only the appealing aspects of their lives online — attributes most prevalent on Facebook’s platforms due to its main feature [of] photo sharing — creating false narratives and glorified versions of their lives. Junior Ved Pradhan believes this altered representation can end up harming users by making them feel more “ashamed.”
Wong personally felt this while on his social media feed, which led to “endless searching time looking at other people’s profiles and get[ting] caught into the comparison game.” Deleting Facebook had a “positive” impact on him, and he doesn’t “miss it.” He thinks those who face this issue need to regulate their own screen usage, which will keep a “good positive balance.”
Prinz agrees and also thinks that society needs to be more aware of social media companies’ intentions and harms. To him, their “motivation is for profit” and they’re “always marketing to make money.” He believes users need to abstain from or cut back on their social media usage, forcing a shift in power through regulation.
“It takes people saying ‘No, I’m not going to keep watching this stuff if it does this [or] supports these things,’” Prinz said. “There are some checks and balances but it doesn’t seem to come until there’s some form of protest and then these companies finally [change]. It seems that there needs to be a different motivation than greed. It has to be more of [an] altruistic one, [such as] wanting the best for people instead of just money for shareholders. But that’s [going to] come from us.”