

Facebook Under Scrutiny After Internal Documents Reveal How It Overlooked Platform’s Negative Effects On Users

  • As per reports about Facebook-owned Instagram, the company's own researchers' slide deck claims that the app is harmful to mental health; one slide from 2019 said: "We make body image issues worse for one in three teen girls."

Bhaswati Guha Majumder Sep 17, 2021, 07:27 PM | Updated 07:27 PM IST
Facebook-owned Instagram causes negative effects on users

A series of reports citing Facebook's internal documents has revealed how the American social media giant is not only aware of its platforms' harmful consequences for users but has also failed to address them.

All the reports were published by The Wall Street Journal this week. One report about Facebook-owned Instagram highlights the platform's impact on teens. According to the article, Facebook's own researchers' slide deck claims that the app is harmful to mental health; one slide from 2019 said: "We make body image issues worse for one in three teen girls."

Similarly, another slide stated that youngsters blame Instagram for "increases in the rate of anxiety and depression ... This reaction was unprompted and consistent across all groups". According to WSJ, one such presentation showed that 13 per cent of British users and 6 per cent of American users who expressed suicidal thoughts blamed Instagram for their wish to die.

These examples are noteworthy because, in publicly arguing that there is little association between social media use and depression, Facebook has frequently cited external studies rather than its own researchers' conclusions.

Meanwhile, Karina Newton, head of public policy at Instagram, said in a post on 14 September: "The Wall Street Journal published a story today about internal research we're doing to understand young people's experiences on Instagram. While the story focuses on a limited set of findings and casts them in a negative light, we stand by this research."

Newton stated that while Instagram can be a place where individuals have "negative experiences," it also gives marginalised people a voice and allows friends and relatives to keep in touch. Additionally, she said that the internal Facebook study proved the company's dedication to "understanding complex and difficult issues young people may struggle with, and informs all the work we do to help those experiencing these issues".

Notably, a 2017 study by the United Kingdom-based Royal Society for Public Health and Young Health Movement, which surveyed almost 1,500 teens and young adults, found that Instagram is the worst platform for mental health and wellbeing. It noted that while the photo-based platform was praised for its ability to foster self-expression and identity, it was also linked to high levels of anxiety, sadness, bullying and FOMO, or "fear of missing out".

Now, Facebook is being questioned by American lawmakers from both parties and both chambers of the United States Congress about how its services affect the mental health of teenagers and children. US senators said this week that they are "in touch with a Facebook whistleblower and will use every resource at our disposal to investigate what Facebook knew and when they knew it—including seeking further documents and pursuing witness testimony."

Facebook CEO Mark Zuckerberg has stated several times that his company is a neutral platform that treats all of its billions of users equally. But another WSJ report, on the company's "whitelisting" practice (a programme that permits politicians, celebrities and other public figures to flout the platform's rules), revealed a 2019 internal assessment that called the trillion-dollar company out for misrepresenting itself in public.

According to WSJ, the internal review said: "We are not actually doing what we say we do publicly." It also added that "unlike the rest of our community, these people [included on the whitelist] can violate our standards without any consequences".

Andy Stone, a Facebook spokesperson, told WSJ that even though the criticism of the practice was justified, it was "designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding".

In 2018, Zuckerberg stated that a modification to Facebook's algorithm was made to boost connections between friends and family while reducing the quantity of professionally produced content in their feeds. Staffers cautioned, however, that the change was having the opposite impact, according to records published by the WSJ. As per the report, a team of data scientists said: "Misinformation, toxicity and violent content are inordinately prevalent among reshares."

Moreover, they wrote: "Our approach has had unhealthy side effects on important slices of public content, such as politics and news."

The following year, according to the WSJ, one Facebook data scientist wrote in an internal memo: "While the FB platform offers people the opportunity to connect, share and engage, an unfortunate side effect is that harmful and misinformative content can go viral, often before we can catch it and mitigate its effects."

However, in an interview with the WSJ, Lars Backstrom, a Facebook vice president of engineering, said that, as with any other optimisation, there will be ways for it to be misused or exploited, and "That's why we have an integrity team that is trying to track those down and figure out how to mitigate them as efficiently as possible".

Not A Good Year

Considering the current situation of Facebook, 2021 hasn't yet proven to be a good year for the social media giant.

Earlier in 2021, reports said that the Irish Data Protection Commission had begun an investigation to find out whether Facebook broke the law when it came to the data leak that was exposed in April.

The authorities said: "DPC launched an own-volition inquiry pursuant to section 110 of the Data Protection Act 2018 in relation to multiple international media reports, which highlighted that a collated dataset of Facebook user personal data had been made available on the internet. This dataset was reported to contain personal data relating to approximately 533 million Facebook users worldwide."

As reported, based on the information supplied by Facebook Ireland to date, the DPC believes that one or more provisions of the GDPR and/or the Data Protection Act 2018 may have been, and/or are being, breached in relation to Facebook users' data.

In June, the European Commission said that it had launched a formal antitrust investigation into whether Facebook broke EU competition rules by utilising advertising data, namely from advertisers, to compete with them in marketplaces where the American tech giant operates, such as classified ads.

More recently, an investigation by ProPublica has raised questions about Facebook's claim that it upholds privacy as a key feature. According to the probe, even though Facebook has stated on numerous occasions that it does not read messages shared between users, WhatsApp, which the tech behemoth owns, has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore, where they look through millions of pieces of content from users.

"Seated at computers in pods organised by work assignments, these hourly workers use special Facebook software to sift through millions of private messages, images and videos. They pass judgment on whatever flashes on their screen—claims of everything from fraud or spam to child porn and potential terrorist plotting—typically in less than a minute," the report claimed.

In response, a WhatsApp spokesperson told Business Insider: "Every day WhatsApp protects over 100 billion messages with end-to-end encryption to help people communicate safely. We've built our service in a manner that limits the data we collect while providing us the ability to prevent spam, investigate threats, and ban those engaged in the worst kind of abuse."

"We value our trust and safety team who work tirelessly to provide over two billion users with the ability to communicate privately," the spokesperson added.
