
Don’t Let Facebook Updates Decide Your Emotions

  • Recognising the danger that Facebook poses and working on ways to reduce its impact needs urgent attention.

Prithwis Mukerjee | Jun 30, 2017, 06:33 PM | Updated 06:33 PM IST
Can Facebook meddle with, or otherwise influence, human minds en masse? (Carl Court/Getty Images)


Facebook is the mythical 800-lb gorilla of the media world that, as the original joke goes, “sits down wherever it wants to”. With 1.2 billion pairs of eyeballs on it every day, it has an audience greater than any American, European or Asian TV news network, newspaper or online news portal. This immense reach also makes it the most effective medium of entertainment. In societies where it has crossed a critical threshold of penetration, it has become the most potent mobilising force in politics, and all this eventually translates into Facebook being one of the most valuable companies in the world.

We know that information is power. We also know that power corrupts and absolute power corrupts absolutely. Should we be wary of Facebook? Consider the following.

In Isaac Asimov’s iconic Foundation series of science fiction novels, the villain, a mutant psychopath called the Mule, uses popular musical concerts as a medium to transmit subliminal messages to an unsuspecting audience, messages that demoralise the population and break its resistance to the Mule’s political hegemony.


Can a mass media platform be used to meddle with, or otherwise influence, human minds en masse?

As an early adopter and ardent evangelist of social media, I had always thought that platforms like Facebook and Twitter were an excellent replacement for television and newspapers as channels for current news and diverse views. But after getting drawn into a series of unintentional and inconclusive spats and flame wars with strangers with whom I have little in common, exchanges that left both sides as unconvinced of the other’s point of view as ever, I am sceptical. Was the price I was paying for these “free” channels far too high, in terms of the irritation and anger they generated in an otherwise placid and cheerful person like me? Was this my fault? Was I not savvy enough to handle this new medium, just as an earlier generation is psychologically uncomfortable shopping on Flipkart or using an Android smartphone? How did the evangelist in me morph into a social media Luddite, ranting against a technology? Was it just me? Or is this feeling universal?

In an April 2017 article in the Harvard Business Review, summarising their peer-reviewed study, Holly Shakya and Nicholas Christakis established what I had recently come to believe, namely, that ‘The More You Use Facebook, the Worse You Feel’! This is paradoxical, because social interaction is a necessary and healthy part of human existence, and many studies have shown that people thrive when they have strong, positive relationships with others. But when real-world physical relationships are replaced by digital, virtual ones, the situation changes. The authors measured well-being (through self-reported life satisfaction, mental and physical health, and body-mass index) and Facebook usage (through the number of likes, posts and clicks on links) across three waves of data from 5,208 users over two years, and concluded that overall well-being was negatively associated with Facebook usage, with the results particularly strong for mental health.
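For readers curious about the shape of such an analysis, here is a toy sketch in Python of the study’s panel design. The data below are simulated, with the negative effect deliberately built in so that the output matches the direction the authors report; the variable names and magnitudes are assumptions for illustration, not the study’s actual dataset or model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated panel: three survey waves, Facebook activity vs well-being.
# Purely illustrative; the real study used 5,208 Gallup panel
# respondents and far richer controls.
rng = np.random.default_rng(42)
n = 600
df = pd.DataFrame({
    "likes_clicked": rng.poisson(30, n),  # proxy for Facebook usage
    "wave": rng.integers(1, 4, n),        # survey wave (1, 2 or 3)
})
# Build in the direction of the reported effect: more usage, lower well-being.
df["life_satisfaction"] = 7 - 0.03 * df["likes_clicked"] + rng.normal(0, 1, n)

# Regress well-being on usage with wave fixed effects.
model = smf.ols("life_satisfaction ~ likes_clicked + C(wave)", data=df).fit()
print(model.params["likes_clicked"])  # negative, echoing the study's finding
```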


Moreover, the study showed that the decline in well-being is strongly tied to the quantity of Facebook usage, and not just to the quality of interactions, as was believed in the past.

While the authors offer no explanation for this negative association between well-being and Facebook usage, it is not difficult to see why it exists if we consider what shows up on your newsfeed. Depending on how much your friends, and the pages you have liked, have shared, there could be upwards of 2,000 items that Facebook could show you; but since this would lead to an uncomfortable information overload, the number actually shown is possibly as low as 200. This selection, or curation, is performed not by any human editor but by an artificial intelligence (AI) program that is designed to maximise benefits for Facebook.

Since it is in Facebook’s interest to stimulate conversations, its AI will obviously select items that provoke a user to react, just as visitors at a zoo throw stones at the animals instead of allowing them to rest in peace. Hence, while placid and informative items will not be totally ignored, there will always be a slight bias towards items that provoke a reaction. For example, a Hindutva follower (and Facebook knows our preferences to the last detail) will be shown more items on minority appeasement, knowing full well that these are more likely to trigger a torrid response, and an equally torrid counter-response, than pictures of flowers and birds. Of course, this bias is neither obvious nor in-your-face. You will still see the usual quota of bland, feel-good quotes and pictures of friends holidaying in Goa or Singapore. Which is fine, except that you just might feel a tad disappointed at being stuck in messy Mumbai instead of Goa, which is yet another reason to feel a bit sore with yourself! And since nobody posts about their problems, this feeds the depressing belief that everyone except you is happy.
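To make the idea concrete, here is a minimal sketch, in Python, of what such engagement-biased curation could look like. Everything in it (the field names, the weights, the feed-size cut-off) is a hypothetical illustration of the principle described above, not Facebook’s actual, proprietary ranking system.

```python
# A hypothetical sketch of engagement-biased feed curation.
# All fields, weights and thresholds are illustrative assumptions;
# the real ranking system is proprietary and far more complex.

FEED_SIZE = 200  # show roughly 200 of the ~2,000 candidate items

def engagement_score(item, user):
    """Score an item by how likely it is to provoke a reaction from this user."""
    score = item["predicted_interest"]          # baseline relevance to the user
    if item["topic"] in user["hot_button_topics"]:
        score *= 1.2                            # the "slight bias" towards provocation
    return score

def curate_feed(candidates, user):
    """Keep the top-scoring items: placid posts survive, provocative ones float up."""
    ranked = sorted(candidates,
                    key=lambda item: engagement_score(item, user),
                    reverse=True)
    return ranked[:FEED_SIZE]
```

The point is not the exact numbers but the structure: a small multiplicative nudge applied to provocative topics is enough, over thousands of feed refreshes, to change the emotional texture of what a user sees.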

In fact, playing with Facebook users’ emotions and deliberately trying to modify them is the subject of a very controversial paper, “Experimental evidence of massive-scale emotional contagion through social networks”, published in the June 2014 issue of the Proceedings of the National Academy of Sciences (PNAS) by members of the data science team at Facebook. For the purpose of this paper, the Facebook team deliberately introduced a certain bias in the nature of items included in users’ newsfeeds and observed the impact on their subsequent behaviour.

To quote the authors, “In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
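The mechanics of such an experiment are simple to sketch. Below is a toy Python version of the feed manipulation described in the quote; the crude word-list sentiment() function is a stand-in for the LIWC word-count software the actual study used, and the 10 per cent omission rate is an assumption for illustration.

```python
import random

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "hate", "terrible"}
OMISSION_PROB = 0.10  # assumed fraction of matching posts silently dropped

def sentiment(post):
    """Crude word-list stand-in for the LIWC-style counting the study used."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filtered_feed(posts, suppress="positive"):
    """Randomly omit some posts of the suppressed emotional tone from the feed."""
    return [p for p in posts
            if not (sentiment(p) == suppress and random.random() < OMISSION_PROB)]
```

A user in the “reduced positivity” arm of such an experiment would simply see the output of filtered_feed(posts, suppress="positive") and, as the study found, would begin to post less positively in turn.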


While there is no evidence of any deliberate evil intent as yet, the fact that its AI-based news selection service can detect and tamper with the emotions of users is a big red flag because, as noted earlier, Facebook touches more people than any newspaper, television channel or news portal and so can mould the emotions of a significant part of the global population.

While Facebook has been targeted for being a channel, or firehose, for fake and unsubstantiated news, the real danger lies in its ability to tamper with our emotions and, as the HBR article reports, make all of us feel angry, frustrated, jealous and upset with the world around us. Can we do anything to mitigate this unfortunate state of affairs?

At a personal level, one could reduce the amount of time spent on the platform, but since Facebook is an addiction like tobacco or alcohol with similar withdrawal symptoms, this may not be a feasible solution for everyone.

What users could ask for instead is greater transparency in the algorithm, the procedure used to determine what they do or do not see. If I want to see posts about birds and flowers, I must not be shown pictures of stone-pelters in Kashmir. A rudimentary version of this already exists: you can indicate the kinds of posts you want to see less of. But a more direct method would go a long way towards restoring the sense of choice we have with newspapers and TV, where we can read or ignore specific items of news and views.
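What that more direct method might look like can be sketched as a hypothetical user-controlled filter, an explicit rule the reader sets and can inspect, rather than an opaque engagement score; the topic labels here are assumptions for illustration.

```python
def apply_user_preferences(feed, blocked_topics):
    """Drop items whose topic the user has explicitly opted out of.

    Unlike an opaque engagement score, this rule is fully visible to,
    and set by, the user."""
    return [item for item in feed if item["topic"] not in blocked_topics]

# A user who wants birds and flowers, not stone-pelting, might set:
# curated = apply_user_preferences(feed, blocked_topics={"violence", "conflict"})
```

The virtue of such a rule is that it is legible: the user can see it, change it or delete it, which is precisely the transparency the current curation lacks.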

Social media is here to stay, and Facebook, with its unassailable reach and immense clout, is something that, like the monsoon rain, we have to learn to live with. However, recognising the danger that it poses and working on ways to reduce its impact needs urgent attention.
