For the sake of argument, let's follow the lead of behavioral economics and assume that people are not rational. Consistency requires that we apply this description to all members of homo sapiens, including those who make government policy. As the public-choice school shows, officials and members of parliaments are not angels, but the same kind of people as everyone else. If so, the cognitive errors so publicized by behavioral economists should also apply to them.
Daniel Kahneman, in his book Thinking, Fast and Slow, describes a phenomenon called the "planning fallacy": the tendency to formulate overly optimistic forecasts about the results of undertaken ventures. Although it affects companies and individuals, it also applies to politicians. Kahneman gives the example of the Scottish Parliament building in Edinburgh: in 1997 its cost was estimated at up to 40 million pounds. Ultimately, after many revisions, the building was completed at a cost of around 431 million pounds, more than ten times the estimate! Other well-known examples are the protracted construction of the new, ever more expensive Berlin airport, the IMF's overly rosy forecasts, and the Fed's optimistic projections about when it would normalize monetary policy after the Great Recession.
[RELATED: "The Problem with Prescriptive 'Rationality' in Economics" by Arkadiusz Sieroń]
This is not the only cognitive error that rulers are susceptible to. They often respond mistakenly to risk, a result of the availability heuristic. Kahneman describes the story of Alar, a plant growth regulator used to control the ripening of apples and improve their appearance. Under the influence of media reports about its supposedly negative impact on health and the resulting public concern, Alar was withdrawn from the market, despite the lack of any scientific evidence confirming its harmfulness. As a consequence, consumers began to eat fewer apples, which caused real health damage.
Similar examples of exaggerated and costly regulations can be multiplied: dignity-offending searches at airports that result from overestimation of the terrorist threat, or the abandonment of nuclear energy by developed countries despite the lack of scientific evidence that it is comparatively harmful.
Think also of self-serving bias, the tendency to attribute successes to oneself and failures to external factors. If the economy is growing, it is obviously only thanks to the government; but if it slows down, it is slowing despite the government's actions.
Another example is confirmation bias, the tendency to prefer information that confirms beliefs already held. Public debate illustrates this phenomenon perfectly. After a given policy is implemented, all subsequent government activity focuses on justifying it, ignoring the facts. The flagship example is the Vietnam War, which was a lost cause from the beginning. Nevertheless, the US government ignored uncomfortable information and continued military operations. A more modern example is drug prohibition.
Last but not least, according to the bias called "What You See Is All There Is," people tend to jump to conclusions based only on the information available to them. And isn't this precisely a description of the broken window fallacy Bastiat wrote about? Lawmakers often focus on the short-term, particular, positive effects of an intervention while ignoring its long-term, economy-wide negative consequences. A classic, and unfortunately still relevant, example is tariffs, which bring short-term benefits to domestic producers of the protected goods but hurt the economy as a whole.
It should be clear by now that simply demonstrating that people are irrational is not a sufficient argument for greater interventionism. Ironically, those who preach this view allow themselves to fall into the trap of the nirvana fallacy, which Harold Demsetz wrote about. Those who fall victim to it compare certain shortcomings of reality (market participants often make mistakes) not with a real alternative, but with an ideal: in this case, with officials and politicians who are completely rational.
Cognitive Biases of Voters
As Bryan Caplan showed in his book The Myth of the Rational Voter, citizens systematically vote for parties and programs that are not necessarily in their long-term economic interest, thus making irrational choices. Caplan distinguishes four main groups of systematic errors:
- Anti-market bias — a tendency to underestimate the economic benefits of the market mechanism;
- Anti-foreign bias — a tendency to underestimate the economic benefits of interaction with foreigners;
- Make-work bias — a tendency to underestimate the economic benefits of conserving labor;
- Pessimistic bias — a tendency to overestimate the severity of economic problems and underestimate the (recent) past, present, and future performance of the economy.
It is not difficult to find examples confirming Caplan's analysis. Suspicious treatment of the wealthiest members of society (see Oxfam's annual reports) reflects anti-market bias. Recent populist tendencies — the US-China trade war and the aversion of Brexit supporters toward immigrants — result from anti-foreign bias. The common fear of robotization is the latest incarnation of make-work bias. Fear of secular stagnation, ecological catastrophe, or income inequality results from pessimistic bias. Voters' beliefs about economics are systematically wrong.
Why is this a problem? People take part in elections and choose irrational solutions that harm society. Voting costs practically nothing, yet voters gain significant psychological benefits from it: virtue signaling, expressing patriotism or concern for the environment, or simply showing support for a given group. This is quite different from action in the marketplace, where the pursuit of gain motivates people to limit their irrationality and behave reasonably.
What is the conclusion? Simply put: since the electoral mechanism leads to irrational results, we should reduce the scope of political power and expand the scope of the market. This is not necessarily about eliminating democracy, but about ensuring that the government does not meddle in almost everything, as it does today.
This takeaway gains strength in light of recent research suggesting that the justification for the existence of the (welfare) state itself results from cognitive error. Philipp Bagus and Eva María Carrasco Bañuelos, in an unpublished paper, "The Welfare Bias," suggest that people systematically underestimate the willingness of others to help those in need. As with driving skills (90 percent of drivers consider themselves better than average), in the moral dimension there is also a universal above-average effect. "We would certainly help," many tell themselves, "but if others do not support the poor in the same way, then a government safety net is needed." The modern welfare state is therefore based on a cognitive error.
Michael Huemer goes even further in his excellent book The Problem of Political Authority, suggesting that the main reason why people favor governments is because they have strong pro-authority psychological biases or even suffer from Stockholm Syndrome. Thus, according to Huemer, not only the modern welfare state, but the political authority of the state in general, results from cognitive biases.
Conclusions: Less Policy, More Market
Behavioral economists say they have refuted the myth of homo economicus. Since individuals are irrational, they tell us, the state must manage the economy. However, the blade of this criticism can be turned 180 degrees. It rests on the belief that officials and politicians behave more rationally than ordinary people and therefore can nudge others in a socially desirable direction. But this is not the case. Hence, one cannot simply argue that because people are irrational, we need the guiding hand of the state. Rather, one would first have to demonstrate that public officials are less irrational, or more precisely, that the political system is less irrational than the market system.
Moreover, the market rewards careful decision-making, while the political system does not. Thanks to the market's reward system, the profit and loss mechanism, we are able to know which actions were appropriate and which were not, and to modify our behavior on an ongoing basis. There is an objective test of the adequacy of our activities: material losses very quickly discourage ill-considered decisions. There is no such direct test in bureaucracy. A voter who votes for a ban on migration or trade does not bear the full cost of his decision. An entrepreneur who refuses to employ a more efficient migrant worker, by contrast, bears the cost directly.
Furthermore, even if politicians were rational, we cannot assume their voters would be. And from the point of view of politicians, it is rational to cater to the wishes of voters, even when they know those wishes rest on erroneous views. In an ideal world, voters would be well informed and vote on the basis of reliable analysis. Unfortunately, in reality this is not the case, and voters' biases systematically come into play.