
Does ChatGPT Have Anti-Hindu Bias? GIGO Principle Applies: Garbage In, Garbage Out

  • When it comes to Christianity and Islam, ChatGPT chooses political correctness over fact, but is not so circumspect when it comes to Hinduism.

R JagannathanJan 12, 2023, 01:20 PM | Updated Mar 07, 2023, 11:43 AM IST
Will ChatGPT be kind to Hindu sensibilities?



OpenAI’s new ChatGPT, which has taken the artificial intelligence (AI) world by storm by registering over one million users within five days of its release on 30 November 2022, is widely touted as a game-changer.

ChatGPT is a dialogue-based chatbot that can make sense of normal human language; the GPT in its name stands for Generative Pre-trained Transformer.

Since it can understand human language almost as well as most humans, it can effectively replace humans in many functions once it is improved and finalised for commercial release.

So impressed are some users and investors that the company already believes it can be valued at $29 billion, and many also see it as a potential Google killer. That it may kill many more jobs than it creates has not yet caught the attention of policy-makers.

One tech blog has this to say about why ChatGPT is so impressive. Quoting a study by Stanford University, it says that the current version of the bot, GPT-3, “leverages 175 billion parameters, (and has been) tested on 570 text gigabytes. This means that the technology has been trained to analyse large amounts of data, so much so it can predict the word that comes next in a sentence.

"This explains why ChatGPT can construct cohesive sentences and produce human-like paragraphs, while translating texts from one language to another.”
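What “predicting the next word” means, and why the result can only echo the training text, can be illustrated with a minimal sketch. This toy bigram counter is an assumption for illustration only, not GPT-3’s actual architecture (GPT-3 is a 175-billion-parameter transformer), but the dependence on the data fed in is the same:

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count, for each word in a tiny training text,
# which word most often follows it. GPT-3 does something far more
# sophisticated, but both can only reflect what their training data contains.
corpus = (
    "the model predicts the next word . "
    "the model learns from the data it is fed . "
    "biased data produces biased predictions ."
).split()

# Tally how often each word follows each other word in the training text.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the word most frequently seen after `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # the most common successor of "the" in this corpus
```

A model trained only on skewed text will, like this toy, reproduce that skew in its predictions; scale changes the fluency of the output, not its dependence on the data.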

And that can be a problem. When answers sound like those a human could give, bias is inherent, because the datasets used will be skewed towards the languages in which they were compiled, and also towards the regions from which they were most sourced.

Two cultural biases are inevitable in such AI constructs, even if they are based on natural language processing (NLP).

Since the bot itself has been developed in a Eurocentric and Abrahamic cultural setting, it will be biased towards the cultures that Europe and the Americas are familiar with – which means Judeo-Christian and Islamic.

It will not be as kind to Hindu sensibilities, as many users found out when using ChatGPT.

A few days ago many users on Twitter pointed out how ChatGPT dealt with jokes about religious figures. It could generate a joke on Sri Krishna, but declined to do so on Jesus and Mohammed, replying, “I’m sorry, but I am not programmed to make jokes about religious figures as it may be offensive to some people.”

Though, in my opinion, the joke generated about Sri Krishna was not particularly offensive to Hindus, the very fact that the bot distinguished between religions that treat blasphemy as an offence and those that don’t means that OpenAI does not understand Hindu sensibilities at all. Or that its dataset ignores Hindu sentiment, perhaps because it depends on sources that are Hinduphobic or even anti-Hindu.

When it comes to Christianity and Islam, ChatGPT chooses political correctness over fact, but is not so circumspect when it comes to Hinduism.

As Rajiv Malhotra, author and researcher, whose latest book, Snakes in the Ganga, has been a runaway best seller, noted on Twitter the other day: “Artificial Intelligence has bias against Hindu dharma. I pointed this out in my AI book 2 years ago. Note the Open AI session below. Krishna treated differently than Mohammad and Jesus.”

(Screenshot from a chat session)

It may be fair to say that the bias is not intentional. Bias often comes from who is doing the coding, the cultural legacy or predilections of the coder, and the cultural values of ChatGPT’s key formulators; the same applies to the scores of rankings on freedom and other issues.

If you are coding ChatGPT as an American, you will be better informed about what matters to Christians and Muslims than about what matters to Hindus.

At least three western studies, by Freedom House, V-Dem of Sweden, and the Economist Intelligence Unit, show India as falling in the democratic rankings, with V-Dem of Sweden categorising India as an “electoral autocracy”, not too different from Vladimir Putin’s Russia.

When critiqued by some Indian officials, including Sanjeev Sanyal, author and member of the Economic Advisory Council to the Prime Minister, Staffan Lindberg, director of V-Dem, rejected the charges out of hand.

He made it a point to mention that the criticism was coming from government officials, as if the identity of the critic automatically negates the arguments made. Lindberg also claimed that it was “racist” to point out that Lesotho ranked higher despite having suffered a coup some years ago.

Lindberg’s statement on India is interesting. “It is typical that a government like India would push back when they fall in the rankings… Before that, it was Turkey. Now, I think Erdogan doesn't really care anymore. He has imprisoned opposition journalists anyway, so nobody is there to write about it. But you know it, I know it. It is not only us and our data. It's all over. Any international ranking will say the same about India that we do." 

One should equally ask whether this statement is not borderline racist too, since it claims that all international rankings of India would say the same thing, without mentioning the fact that all these rankings emanate from the same Western cultural space.

There is no Asian or African ranking system that creates different parameters for ranking complex things like freedom and democracy. There is no adjusting for India’s extreme diversity and pluralism. Comparing monocultural nations with multicultural ones is itself problematic.

In the interview, Lindberg also declined to name who the experts on India were who did the rating, saying this could harm them if disclosed.

Even assuming naming names could put them under threat, there is no reason why the ideological orientations of the 'experts' could not be ascertained and tabulated.

If you use 10 Indian experts and nine of them belong to a particular way of thinking, how can the rankings not be biased?

If you want a counter to V-Dem’s tall claims of no biases, Salvatore Babones provides them here. He points out that the facts on which India’s demotion in the democratic rankings are based are thin and doubtful.

One of the points in the V-Dem report was the number of sedition cases filed in India after the Bharatiya Janata Party came to power in 2014. But Babones rebuts it thus:

“Whatever the merit of these cases, it should be noted that the number of sedition accusations brought has been relatively constant over time. The very source cited by V-Dem as documentation noted that ‘of the 11,000 people accused of sedition in the past decade (construed as the 11 years 2010-2020), nearly two-thirds of charges have been filed since 2014, when Modi was first elected prime minister’. A simple calendar calculation shows that the BJP was in office for less than 60 per cent of the period under consideration. In other words, the rate of filing of sedition allegations (which is in any case not under the control of the central government) has actually declined under the BJP.”

Whether it is ChatGPT or Freedom House or V-Dem, the issue is simple: even without an intent to be biased, cultural predispositions and the choice of datasets and experts are bound to affect the outcomes of allegedly ‘neutral’ efforts.

Instead of pretending that there is no bias, the creators of ChatGPT and various global rankings should work with those who are being ranked to check what they think about the choice of datasets, parameters and 'experts' before deciding on the weightages given to various parameters.

The honest researcher, when confronted with charges of bias, should offer to discuss his study parameters with his critics, not dismiss the criticisms out of hand.

They should also know the GIGO principle. The end-results of any research depend on the data fed into it. Garbage in, garbage out. GIGO for short.
