Internal Documents Suggest That Facebook Policies Were Driven By Political Bias
According to reports, leaked internal Facebook documents have revealed that politics has been playing a significant part in the company's decision-making process.
Over the past week, multiple news outlets have been reporting on findings from a cache of leaked Facebook internal papers. Lawyers for Facebook whistleblower Frances Haugen provided the majority of the redacted business records to the United States Congress, the Securities and Exchange Commission and a group of news organisations.
According to a report by The Wall Street Journal, some of these internal documents have revealed that politics has been playing a significant part in Facebook's decision-making process.
In June last year, after George Floyd, an African American man, was killed by a Minneapolis police officer on the street, a Facebook employee wrote on the company's racial-justice conversation board, “Get Breitbart out of News Tab”.
Breitbart, or Breitbart News, is a syndicated American news website founded by the late American journalist Andrew Breitbart. Facebook's News Tab is a feature that collects and promotes articles from a variety of sources. However, according to internal papers supplied to the WSJ, Facebook ignored the employee's comment.
Following Floyd's death, massive Black Lives Matter rallies swept the country, with millions of people demanding attention to issues of racial justice and police brutality.
Despite employee protests, the social media company decided to leave problematic content about the protests from Breitbart—a site popular with followers of former American President Donald Trump—on its News Tab.
The unnamed Facebook employee posted screenshots of some of the headlines from the news website on the message board, such as "Minneapolis Mayhem: Riots in Masks," and "Massive Looting, Building in Flames, Bonfires!"
The WSJ report stated that according to the written conversations on Facebook’s office communication system, the anonymous employee then wrote that these articles were “emblematic of a concerted effort at Breitbart and similarly hyperpartisan sources (none of which belong in News Tab) to paint Black Americans and Black-led movements in a very negative way”.
A company researcher noted in the same conversation that any steps aimed at removing Breitbart could face internal resistance because of the potential political fallout, adding: “At best, it would be a very difficult policy discussion”.
According to a spokesperson for the social media giant, the company bases its decisions on the specific content published on Facebook, not on the Breitbart site as a whole, and the material Breitbart posted to Facebook met the company's requirements, including compliance with its anti-misinformation and anti-hate-speech policies.
However, many Republican party members have claimed that Facebook discriminates against conservatives.
Although the documents examined by the WSJ did not establish whether bias plays a role in the company's overall decisions, they demonstrated that staff and their supervisors argued over whether and how to constrain far-right publishers, with more senior employees frequently acting as a check on agitation from the ranks. As per the report, these records did not reference similar discussions about left-wing media.
Additionally, other records show that Facebook's management team has been so concerned about avoiding accusations of bias that political considerations are frequently at the forefront of its decision-making.
Employees at the company have repeatedly agitated for the firm to take action against far-right websites, as evidenced by a large number of internal message board posts. The report claimed that in many cases the staff had framed their arguments around Facebook's enforcement of its own rules, alleging that the platform was giving far-right publishers a pass in order to avoid public backlash.
At the time, one employee said: “We’re scared of political backlash if we enforce our policies without exemptions.”
According to the employees, Facebook offered Breitbart and other conservative publishers preferential treatment, allowing them to avoid penalties for spreading false material or hate speech.
Citing a research firm’s report, the WSJ highlighted that right-wing sites are consistently among the best-performing publishers on the network in terms of engagement, which is one of the reasons why some on the left criticise Facebook, claiming that its algorithms favour far-right content.
However, in response, Facebook spokesman Andy Stone said: “We make changes to reduce problematic or low-quality content to improve people’s experiences on the platform, not because of a page’s political point of view.”
“When it comes to changes that will impact public pages like publishers, of course we analyze the effect of the proposed change before we make it,” Stone added.
The claims against Facebook—regarding the link between politics and the company’s decision-making procedure—are the most recent in the "Facebook Files" series published by the WSJ.
Earlier this month, Facebook whistleblower Haugen claimed that while CEO Mark Zuckerberg "never set out to make a hateful platform," the corporation has failed to appropriately handle hate on the network and makes decisions that benefit the company rather than the public.
The social media behemoth has consistently rebuffed charges made in the Facebook Files that it tolerates hate and misinformation and allows illicit activity like drug trades, human trafficking, and cartel operations to go unnoticed.
In March this year, the company said that it would no longer promote specific Facebook groups that spread misinformation and hate in users' feeds.
Separately, in September, a series of WSJ reports citing Facebook’s internal documents revealed that the American social media giant is not only aware of its platforms' harmful effects on users but has also failed to address them.
One report on Facebook-owned Instagram highlighted the platform's impact on teens. According to the article, a slide deck prepared by Facebook's own researchers pointed out that the app is harmful to mental health, with one slide from 2019 stating: "We make body image issues worse for one in three teen girls".