The flames of a long-standing controversy have flared up once again following reports dissecting the societal impact of Facebook, titled "The Facebook Files". The core story is that internal Facebook documents reveal the tech giant holds concrete data showing how the platform causes social and psychological harm to its users. Despite the evidence, Facebook has done remarkably little to tackle these problems. As the cherry on top, some of Facebook's measures to counteract the platform's negative effects appear to have backfired.
Separate Review Process for Elite Users; Contradicts Facebook’s Public Statement
Facebook's CEO, Mark Zuckerberg, had previously stated publicly that the platform treats all users equally, regardless of social status or fame. However, internal documents have revealed that Facebook built a program titled "X-Check" that exempts elite users from some or all of Facebook's regulatory policies. This elite user base consists of celebrities, influencers, politicians, journalists, and more.
The social media platform has measures in place to enforce rules that regulate the content posted by users. The "X-Check" system grants full or partial immunity to elite individuals, letting them get away with breaking Facebook's own rules. Some elite users are even "whitelisted", meaning they are completely immune from Facebook's regulatory actions.
A devastating example of this exemption occurred in 2019, when star Brazilian footballer Neymar was able to post nude photos of a woman who had accused him of rape to millions of his followers. The dangers of this "whitelist" are apparent, especially considering that whitelisted users have spread false news, preached that vaccines are dangerous, and protected pedophile groups.
Instagram: A Threat to Mental Health
Instagram is heavily dominated by fitness influencers who flaunt their seemingly perfect bodies and envy-worthy lives. This type of content is so prevalent and heavily promoted on the platform that its unattainable standards have become ingrained as the "norm" among regular Instagram users, creating serious mental health issues, especially among young women. Research conducted in 2020 and reviewed by the Wall Street Journal found that 23% of teen girls were unhappy with their bodies due to the content they saw on Instagram. Facebook's own research on Instagram's impact reached similar conclusions, with a recurring theme of complications such as depression and eating disorders among women. In one of its own presentations in 2019, Facebook stated that Instagram makes 33% of teen girls feel worse about their bodies.
The issue has far more ominous implications still: 13% of suicidal teens in the UK and 6% of suicidal teens in the US attributed their suicidal thoughts to Instagram.
“Family & Friends” Algorithm Backfires, Hostility on the Rise
Facebook had previously revamped its algorithm to curate users' news feeds so that the content they viewed would not spark hostile exchanges or create division along political, social, and religious lines. However, news portals and Facebook's content creators simply abused the new algorithm to go viral by producing more divisive content. Since these are mostly businesses that thrive on engagement, they deliberately create and promote provocative content to draw strong reactions from the platform's audience.
The new algorithm allocated weights to posts based on the number of likes, comments, reactions, reshares, and so on. Posts with higher weighted scores were displayed more. This created a loophole: posts that provoked anger tended to be reshared more, and because reshares carried high weight, they got more exposure. This opened the doors for false information, incitement to violence, and toxicity.
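The mechanics of that loophole can be illustrated with a short sketch. The signal weights below are invented for demonstration only; the actual values used by Facebook's ranking system are not public. The point is simply that when reshares and comments are weighted far more heavily than likes, a provocative post can outrank a well-liked but calm one.

```python
def rank_score(post):
    """Score a post by weighted engagement signals.

    The weights are hypothetical: they merely encode the idea that
    reshares and comments count for much more than likes.
    """
    weights = {"likes": 1, "comments": 15, "reshares": 30}
    return sum(w * post.get(signal, 0) for signal, w in weights.items())

posts = [
    {"id": "calm",  "likes": 500, "comments": 10, "reshares": 5},
    {"id": "angry", "likes": 100, "comments": 40, "reshares": 60},
]

# Sort the feed by descending score. The provocative post wins
# despite having far fewer likes, because its reshares dominate.
feed = sorted(posts, key=rank_score, reverse=True)
print([p["id"] for p in feed])  # -> ['angry', 'calm']
```

Under these assumed weights, the "angry" post scores 2500 against 800 for the "calm" post, which is exactly the dynamic the Facebook Files describe: engagement-optimized ranking rewards whatever provokes the strongest reaction.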
Drug Cartels and Human Traffickers Identified on the Platform, Lack of Action from Facebook
Facebook employees were able to identify accounts of dangerous individuals, including drug lords and human traffickers who were using the platform to conduct illegal activities. This even included hiring and paying hitmen!
Despite this terrifying finding, the accounts of drug cartel leaders were allowed to keep posting on both Facebook and Instagram.
Facebook employees further raised concerns about how the platform is used in developing nations, where a massive user base is exceptionally vulnerable to the harms of social media. Human traffickers from Middle Eastern countries were also identified on Facebook, trapping women and forcing them into slavery or prostitution. Despite these accounts being identified, Facebook did not take the action required to stop these illegal activities.
Part of the problem is what Facebook chooses to prioritize. The Facebook Files revealed that Facebook merely responds to dangerous content once it surfaces publicly, taking it down without altering the system that allowed it to go up in the first place. Facebook's key priority has always been retaining users and acting in the interests of its investors, which bred negligence in regulatory and security measures. Some countries did not even have appointed moderators who spoke the local language to identify criminals. In fact, Facebook has been lax in regulating dangerous content backed by certain governments so that it could retain authorization to operate in those countries.
COVID-19 Vaccine Efforts Sabotaged by Anti-Vax Activists; Facebook’s Partially to Blame
The Facebook Files have a tendency to uncover internal information that contradicts what Facebook promises to the public. When the pandemic arose, Facebook vowed to push 50 million users to take the COVID-19 vaccine. The company's CEO even bragged in a press release that the platform had connected more than 2 billion users, nearly two-thirds of its entire user base, to authoritative COVID-19 information sources.
In reality, the way Facebook's algorithm handled vaccine-related content instead amplified the efforts of anti-vax activists rallying against vaccination. Facebook's own testing of the site's content revealed that 41% of English-language comments on vaccine-related posts discouraged users from taking the vaccine.
The findings of the Facebook Files have triggered outrage from officials and the general public alike. Lawmakers, government officials, and influential figures have criticized Facebook's unimpressive role in combating the dangers of social media and have called for more research into its detrimental effects.
The Facebook Files have highlighted the platform's core problem. Facebook is built on content created and shared by its users. The reactions this content gets, and the events it sets in motion, are so difficult to control that regulating false news becomes a formidable challenge. On top of that, Facebook either failed or simply chose not to use the tools it has to enforce its own regulations, which only made the situation worse.
Now there is much to speculate about. Is Facebook's model inherently flawed and unsafe? Is Facebook hesitant to enforce its rules because of business interests? What is Facebook's plan for the future? The chaos is apparent.
For more updates, stay with Markedium.