The foundations of economic theory are based on patently absurd assumptions. This needs to change. It will not happen overnight, but economists must start thinking about it.
Between 1970 and 1972, when I was studying for my Master’s degree at the Delhi School of Economics, I often felt two conflicting emotions. One was of persistent elation that I was registered in the crème de la crème of the courses that Delhi University had to offer. The other was of deep depression arising from the growing suspicion that we were being taught nonsense, especially in economic theory.
That was a time when economic theory still held sway and when econometrics had just begun to come into its own. Little did anyone know that by the time the 1970s ended, empiricism, supported by the ever-increasing robustness of econometrics, would replace theory as the core of modern economics. The demand for better data from governments and other donors, on the one hand, and the increasing supply of data on the other, had made this possible. One result is that, as the years have gone by and computing power has grown, economics is no longer a discipline requiring great intellect. A robot with a powerful laptop is usually just as good.
I have often wondered if this is something worth mourning. On balance, yes, it is. But we must also keep in mind that the theorists had overplayed their hand. The foundations of economic theory were based on patently absurd assumptions. Altogether too often, to ‘prove’ a theory, the theorists would make an assumption equivalent to saying that water does not boil at 100 degrees centigrade under normal atmospheric pressure.
However, in the social sciences, thanks to the intellectual inertia of their adherents, the decline of orthodox methods and of the received wisdom they produce tends to be very gradual. Usually it takes a shock of gigantic proportions to deliver the coup de grâce. The Great Depression of the 1930s ended the sway of classical economics, which had assumed that markets adjusted instantaneously. The oil shock of the 1970s created a demand for empirical proof backed by reliable data, rather than arcane theories based on advanced mathematics. And in 2008, the global financial crash sent economists to meditation rooms to figure out where even empiricism had gone wrong. A discipline which had been preening like a peacock in the monsoon was now in total disarray.
Meanwhile, the triumph of mindless empiricism has seen the advent of a menace called the policy economist. This tribe is to economics what ‘non-state actors’ are to security. The trouble with empiricists, as opposed to empiricism, whose importance can’t be denied, is that they can’t tell the difference between correlation and causality. I once wrote an article showing how every change of the party in government in the US after 1945 had been accompanied by a recession.
The mere existence of a data set has enabled an entire generation of economists to postulate things that do not fit comfortably into any composite theory. The correlation in that article was easy to establish; causation, in either direction, is another kettle of fish altogether. Yet, despite being warned early against it, economists keep making the classic post hoc, ergo propter hoc (after this, therefore because of this) mistake. Governments, in particular, are very prone to this type of error.
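The point is easy to demonstrate with a toy exercise. The sketch below is my illustration, not something from the original article, and the two series it generates stand in for nothing in particular: it simply simulates pairs of series that are, by construction, causally unrelated and counts how often they nonetheless display a strong sample correlation.

    # Illustrative sketch (my construction, not the author's): two series that
    # have nothing to do with each other -- independent random walks -- will
    # quite often show a large sample correlation. Correlation proves nothing.
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials, n_periods = 1_000, 75   # 75 'annual' observations, 1,000 experiments
    strong = 0

    for _ in range(n_trials):
        x = np.cumsum(rng.normal(size=n_periods))   # series with no causal link to y
        y = np.cumsum(rng.normal(size=n_periods))   # series with no causal link to x
        r = np.corrcoef(x, y)[0, 1]                 # sample correlation
        if abs(r) > 0.5:
            strong += 1

    print(f"Trials with |correlation| > 0.5: {strong / n_trials:.0%}")
    # A sizeable share of the trials, despite there being no causal link at all.

A correlation found in the data, in other words, is the beginning of an argument, not its end.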
Overall, the swings between the two extremes – economics posturing as physics so that it can use advanced mathematics on the one hand, and data-based vacuity with no roots in politics, custom, institutions or law on the other – have led the discipline into a state of meaningless debate and argument amongst economists – tarka and vitarka – where scoring petty points is mistaken for scholarship and debate becomes an end in itself.
Oscar Wilde’s quip about the person who knows the price of everything and the value of nothing – said of the cynic but routinely applied to the economist – was wrong. Today, it is the other way round.
***
The story of how demand for post-war economics was created is fascinating, not least because its roots lie in the struggle for ideological supremacy during the Cold War. The key intellectual challenge posed by communism to capitalism was that markets always delivered socially inferior outcomes. Implicit in this ideology was an even more devastating belief: that often markets did not exist at all.
For the West, then, it became something of an intellectual obsession to prove the opposite: not only that markets always exist for everything but also that they always deliver socially superior outcomes. It was not enough for people to know this instinctively and see it daily in front of their eyes. It had to be proved to the satisfaction of the intellectuals, in an intellectually acceptable manner and garb.
This led to a proliferation of ‘theorems’, ‘lemmas’ and the like. Simple common sense was dressed up in mathematical finery. A new prize was created in 1969. Its real name is the Sveriges Riksbank (Bank of Sweden) Prize in Economic Sciences in Memory of Alfred Nobel, but it was projected as a Nobel Prize. No other social science has been hyped in this manner.
Through clever research funding, economists were persuaded to ask questions that had no real answers. Growth theory and development economics – both now very passé – were excellent examples of this. Indeed, almost all of economics was like this. To give it all the garb of science, a battery of tools from advanced mathematics was deployed, as economists chose to believe that economics mimicked physics in many important ways.
But, as in Newton’s Third Law, for every explanation there has been an equal and opposite explanation. In fact, there has been more than one. And it is not just the simultaneous presence of multiple and equally valid explanations that was a problem. It was also the refusal by economists to acknowledge that, beyond a point, this was a pointless pursuit because it was never going to yield any definitive answers.
Take, for instance, quite randomly, the famous Turnpike Theorem, on which a huge number of reputations was established. A turnpike is an old English word for a highway, which connects two places by the shortest route. Using this analogy, economists said: if you want to accelerate your country’s long-run growth, think of your economy as being on a side road; you then have to decide whether or not to get on to the highway. This wasn’t a terribly hard problem to solve, but it acquired huge intellectual momentum in the 1960s.
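For readers who want the formal flavour, the result can be stated loosely as follows; the notation and wording are my own textbook-style paraphrase of one standard (Radner-type) version, not the author’s formulation.

    Suppose a planner chooses consumption c_0, c_1, ..., c_T to maximise
    u(c_0) + u(c_1) + ... + u(c_T), subject to the capital stock evolving as
    k_{t+1} = f(k_t) − c_t, with given initial and terminal stocks. Then there
    is a balanced path k* (the ‘turnpike’) such that, for any tolerance ε > 0,
    the optimal path stays within ε of k* in all but a number of periods that
    is bounded independently of the horizon T.

In plain words: if the horizon is long enough, the best route spends almost all of its time on the highway, whatever the starting point and the destination.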
Politically, the new growth theory involved giving distribution and distributive justice a much lower priority than growth. But India discovered in 1967 – when the all-powerful Congress party was given a beating in the elections – that this was not very clever if you were a democracy. The reaction to the electoral disasters that followed came in the 1970s, when India decided to over-compensate in terms of distribution. In contrast, China, without a democracy, has done well out of the Turnpike Theorem. Amazingly, the approach had been tried first by the Union of Soviet Socialist Republics (USSR) and then by Mao Zedong as well, but it had failed to work for them. It worked for Deng Xiaoping because the US, in order to bring down the USSR, decided to cooperate with China and to buy what it produced.
The Turnpike theory thus had very little to do with China’s success. Suffice it to say that growth theory has been an awful waste of time and effort because it has suffered from the key deficiency of economics: it has not been able to establish causality beyond reasonable doubt.
***
As mentioned earlier, assumptions are central to any discipline of structured thought. Thus psychologists assume that everyone is a little crazy; historians automatically assume that the historical material they use is true merely because it is old; social anthropologists assume that what is true of a part is also true of the whole. And so on.
Economics differs from these disciplines in one very important respect. It makes more assumptions than any other discipline. One of these is (or used to be) that people and societies act rationally, where rationality consists mostly of consistency. But after more than a century of using this assumption as the bedrock of their theories, economists have now begun to abandon it in order to justify the discipline’s inability to come up with explanations that stand the test of time.
They are not saying that people don’t behave rationally but only that there can be no standard yardstick by which to judge an action as rational. Thus, even though suicide bombing is regarded as completely irrational behaviour, to the bomber himself it appears to be the most rational thing to do if the reward is 72 virgins in heaven. The result is a complete overturning of the rationality assumption. What is left of the old economic theory then?
The abandonment of rational behaviour, without acknowledging that rationality has to be judged in a given context, was honoured with a ‘Nobel’ in 2002. Daniel Kahneman, who is now regarded as the father of behavioural economics, was awarded the prize. But his explanations were rooted in psychology, not context.
Kahneman and his colleague Amos Tversky basically said that you could not wholly disregard intuition. They “explored the psychology of intuitive beliefs and choices and examined their bounded rationality”. They conducted experiments using individuals and groups to validate their theories.
Bounded rationality? What in heaven is that? The term was coined by another Nobel Prize winner, Herbert Simon. People, he said, don’t have the brains to handle complex situations, so they often act in ways that are not consistent with total rationality. In other words, people are rational only some of the time.
But the question that needs asking, and which no one is asking, is this: by diluting the rationality requirement, has the subject gained or lost? Having thought about it a lot over the last two decades, I would say that, one way or another, it has made no difference.
Therein lies the real tragedy of modern economics. Nothing, or very little, that went before has made a difference.
***
In any decent university, one of the first warnings given to students of economics pertains to the problem of causality. If, say, the demand curve for something shifts upwards, how do you know which of the following caused it: an upward shift in per capita incomes, a change in tastes and preferences, or simply a generational change?
Economists, with their certitudes, often carry on as if they know for sure. The truth, however, is that there is simply no way of telling. The volume of intellectual effort that has gone into the problem of analysing causality in economics is staggering. But the discipline is no further along the road to establishing rules for causality now than it was five decades ago.
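To see why, consider a minimal textbook illustration; the notation is mine, not the author’s. Suppose demand and supply in some market are

    Q_t = a − b·P_t + u_t   (demand)
    Q_t = c + d·P_t + v_t   (supply)

and the market clears each period. Every observed pair (P_t, Q_t) then satisfies both equations at once, so a regression of quantity on price recovers neither −b nor d but an unidentifiable mixture of the two. Without information from outside the data about which curve shifted, and why, the observations alone cannot say whether demand moved, supply moved, or both.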
In spite of having been warned early, economists also often fall prey to the oldest fallacy that haunts causation, not just in economics but in all subjects: post hoc, ergo propter hoc (which, to repeat, means after this, therefore because of this). The error lies in assuming that if X happens after Y has happened, then X must have happened because Y happened.
A British nutritionist once referred me to the work of the famous British medical statistician Austin Bradford Hill, who laid down the conditions that have to be fulfilled before an association in medicine can be viewed as causal. These include strength, consistency, biological gradient (the more of the causal factor, the more of the disease), temporal pattern, specificity, biological plausibility, and the coherence of the evidence.
Now how many of the causalities in economics would fulfil such rigorous criteria? Not one, I would venture to suggest. This is because many of the so-called causal factors in economics are related in both directions: A causes B but B also causes A. In medicine, this is the equivalent of saying smoking causes cancer but cancer also causes smoking!
Superstition, it is easy to see from this, is purely a result of the post hoc, ergo propter hoc fallacy. Much of economic theory has rested on more or less the same kind of inference.
***
If you keep changing the essential foundations of a discipline, and, most of all, if you can’t define its limits, sooner or later you will lose credibility. Nowhere is this truer than in macroeconomics, which is a closed system with half a dozen or so variables, such as the gross domestic product (GDP) growth rate, the fiscal deficit, interest rates, the exchange rate, the current account, the capital account, and so on. Economics assumes these variables to be in very tight relationships with each other and arrives at conclusions that go wrong because, in fact, these relationships are very loose.
Also, at the intellectual level, macroeconomics consists of a series of conditional – if this, then that – statements. Thus, if governments borrow more, interest rates will go up. But by how much, no one can tell. If interest rates go up, bond prices will go down. But by how much, no one knows. And so on. The result is that anyone can, and does, become a macroeconomist and everyone can claim to be right all the time without actually being so.
There is another problem which professional economists will not even begin to admit. In a way, it is the central problem of macroeconomics. This is that it rests on very weak logical foundations because it uses what logicians call induction as its chief means of drawing an inference. But induction as a valid method of logical inference was discredited when macroeconomics was growing up, that is, in the 1940s.
It was Karl Popper who first asked the question that Nassim Nicholas Taleb of Black Swan fame asked half a century later: does the fact that all the swans you have seen are white mean that all swans are white? That is, could it not be that there is at least one swan that you have not seen, and that it is not white? Apply this to economics and its problem becomes clear: does the fact that all the fiscal deficits that, say, International Monetary Fund (IMF) economists have seen were associated with major crises mean that all fiscal deficits always lead to crises?
The realisation that swans are not always white is what led the United States to adopt, for itself in the current financial crisis, policies diametrically opposite to those its handmaiden, the IMF, had proposed for East Asia in 1997. This shows how macroeconomics and its policy prescriptions make a giant leap from a few observed particulars to a universally valid generalisation.
But using inductive logic is not the only flaw in macroeconomics. Even the logical structure of theoretical macroeconomics is deeply flawed because it is tautological: it makes a series of self-evident statements such as “let us bifurcate this into two” or “income tax is a tax on income”. The best example of a tautology is the famous Keynesian identity, which Keynes developed to explain how the economy worked, mainly to the UK’s Conservative politicians. It was a deliberate gross oversimplification of a very complex phenomenon.
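For reference, the identity in question is the familiar national income identity, stated here in its standard textbook form (my gloss, not the author’s wording):

    Y = C + I + G + (X − M)

That is, national income (Y) equals private consumption (C) plus investment (I) plus government spending (G) plus net exports (X − M). As an accounting identity it is true by construction; the Keynesian move was to treat G as the lever a government can pull when C and I sag.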
But from it has followed a set of standard prescriptions that are appropriate only 25 per cent of the time. Can you imagine such a high failure rate in a real science? Yet economics calls itself a science.
***
It is most alarming that, under the current framework of macroeconomics, its existing tools have become ineffective for a large variety of reasons, not the least of which are technological change, the removal of an upper bound on how much currency the US can print, and the slow disappearance of the 20th-century employment model wherein income was smoothed over a worker’s working life. Theoreticians need to address this issue, especially in regard to the role of ‘G’ (government spending) in the identity. The old ‘prime the pump’ approach is now only a way of increasing inflation and/or debt, as the Chinese economy has so ably demonstrated.
For the last few years, I have been arguing that Keynesian solutions in a democracy are a recipe for disaster in a globalised economy, especially when both labour and capital are in excess supply. A month after Lehman’s crash in September 2008, I had written all this in an article for The Hindu Business Line in which I had pointed out that Keynes had “essentially provided an intellectual basis and, therefore, political respectability, to support a policy of government spending designed to prop up sagging economies. This was a new idea. Until then, markets were supposed to equilibrate the economy while the government looked on hopefully or helplessly depending on the context.”
But massive unemployment increased the communist threat, and Western governments needed to be more proactive to prevent communist ideas from taking deeper root. Keynes, with his immortal identity, provided the economic rationale for convincing the politicians. Basically, said Keynes, the controlling variable for the level of economic activity in a country would be the level of government expenditure. Keynes thus quietly smuggled politics into economics. He said the only way to keep the economy on an even keel was for the government to “prime the pump” when aggregate demand fell below the level needed to keep employment at some notional equilibrium level.
His theory was accepted by a frightened British political class. After all, Hitler, with his massive rearmament of Germany, had already proved that the idea worked. Roosevelt had done the same thing in the US. So, by the time the Second World War ended in 1945, Keynesian theory had become the received wisdom. Governments had moved centre stage, which meant that politicians now wrote the script.
My argument in 2014 was that the role of ‘G’ in the Keynesian identity had undergone the most fundamental change imaginable. Instead of governments intervening to stabilise the real sector – Keynes’s famous ‘let them dig holes and fill them up again’ dictum – they had intervened to stabilise the financial sector. Government spending, from once being something that sought to boost investment and activate the employment multiplier, had become the grease in the wheels of finance. It is my submission that Keynesian theory needs an upgrade. But none seems to be forthcoming, because Keynes has now ceased to be the property of economics and has become an indispensable instrument of politics.
***
So where do we go from here? I think we need to discard the Western mode of economic thought which is based, at its core, on legitimising the third deadly sin of Christianity, greed. The fancy name for it is maximising utility – at the level of the individual – and maximising profit, at the firm level. The system seeks to incorporate greed into an intellectual system and, when things get out of hand, to regulate it.
Even then, in the first instance, regulation is left to the markets and, when those fail, to state intervention. The premise that it is perfectly all right for greed to drive economic activity has never been questioned. Its natural consequence has been environmental depredation, climate change and the now fashionable calls for ‘sustainable’ development.
If this is accepted, it becomes necessary to move away from one-size-fits-all economic theory. The 20th century saw two modes of economic thought being universalised: Marshallian analysis at the microeconomic level and Keynesian solutions at the macroeconomic level. Every country needs to develop adaptations of these that are best suited to its own context. That is, a ‘socialism with Chinese characteristics’ sort of approach is needed.
In their own ways, both Marx and Gandhi had pointed this out but both lost out because neither was able to go beyond appeals to human nature. A more robust intellectual framework is needed. This is not going to happen overnight but some Indian economists need to start thinking about these issues.
*The ideas in this paper first appeared in a series of articles I wrote for the Hindu Business Line in 2009. They were distilled over the previous 10 years when I wrote a weekly column for the Business Standard on economics research. I stopped that column in September 2008.
(Acknowledgement: This paper has been published in Uma Kapila, Ed. (2017) 'India's Economy: Pre-liberalisation to GST - Essays in honour of Raj Kapila' Academic Foundation, New Delhi)