
Econometrics and Nonobserved Facts

Theoretical Assumptions and Nonobserved Facts

Wassily Leontief

Economics today rides the crest of intellectual respectability and popular acclaim. The serious attention with which our pronouncements are received by the general public, hard-bitten politicians, and even skeptical businessmen is second only to that which was given to physicists and space experts a few years ago when the round trip to the moon seemed to be our only true national goal. The flow of learned articles, monographs, and textbooks is swelling like a tidal wave; Econometrica, the leading journal in the field of mathematical economics, has just stepped up its publication schedule from four to six issues per annum. (Leontief 1971, 1)

And yet an uneasy feeling about the present state of our discipline has been growing in some of us who have watched its unprecedented development over the last three decades. This concern seems to be shared even by those who are themselves contributing successfully to the present boom. They play the game with professional skill but have serious doubts about its rules. (Leontief 1971, 1)

Much of current academic teaching and research has been criticized for its lack of relevance, that is, of immediate practical impact. In a nearly instant response to this criticism, research projects, seminars and undergraduate courses have been set up on poverty, on city and small town slums, on pure water and fresh air. In an almost Pavlovian reflex, whenever a new complaint is raised, President Nixon appoints a commission and the university announces a new course. Far be it from me to argue that the fire should not be shifted when the target moves. The trouble is caused, however, not by an inadequate selection of targets, but rather by our inability to hit squarely any one of them. The uneasiness of which I spoke before is caused not by the irrelevance of the particular problems to which present-day economists address their efforts, but rather by the palpable inadequacy of the scientific means with which they try to solve them. (Leontief 1971, 1)

If this simply were a sign of an overly high aspiration level of a fast developing discipline, such a discrepancy between ends and means should cause no worry. But I submit that the consistently indifferent performance in practical applications is in fact a symptom of a fundamental imbalance in the present state of our discipline. The weak and all too slowly growing empirical foundation clearly cannot support the proliferating superstructure of pure, or should I say, speculative economic theory. (Leontief 1971, 1)

Much is being made of the widespread, nearly mandatory use by modern economic theorists of mathematics. To the extent to which economic phenomena possess observable quantitative dimensions, this is indisputably a major forward step. Unfortunately, anyone capable of learning elementary, or preferably advanced, calculus and algebra, and acquiring acquaintance with the specialized terminology of economics can set himself up as a theorist. Uncritical enthusiasm for mathematical formulation tends often to conceal the ephemeral substantive content of the argument behind the formidable front of algebraic signs. (Leontief 1971, 1-2, emphasis added.)

(….) By the time it comes to the interpretation of the substantive conclusions, the assumptions on which the model has been based are easily forgotten. But it is precisely the empirical validity of these assumptions on which the usefulness of the entire exercise depends. (Leontief 1971, 2)

What really is needed, in most cases, is a very difficult and seldom very neat assessment and verification of these assumptions in terms of observed facts. Here mathematics cannot help and because of this, the interest and enthusiasm of the model builder suddenly begins to flag: “If you do not like my set of assumptions, give me another and I will gladly make another model; have your pick.” (Leontief 1971, 2)

(….) To sum up with the words of a recent president of the Econometric Society, “… the achievements of economic theory in the last two decades are both impressive and in many ways beautiful. But it cannot be denied that there is something scandalous in the spectacle of so many people refining the analysis of economic states which they give no reason to suppose will ever, or have ever, come about…. It is an unsatisfactory and slightly dishonest state of affairs.” (Leontief 1971, 2)

But shouldn’t this harsh judgment be suspended in the face of the impressive volume of econometric work? The answer is decidedly no. This work can be in general characterized as an attempt to compensate for the glaring weakness of the data base available to us by the widest possible use of more and more sophisticated statistical techniques. Alongside the mounting pile of elaborate theoretical models we see a fast-growing stock of equally intricate tools. These are intended to stretch to the limit the meager supply of facts. (Leontief 1971, 2-3)

(….) However, like the economic models they are supposed to implement, the validity of these statistical tools depends itself on the acceptance of certain convenient assumptions pertaining to stochastic properties of the phenomena which the particular models are intended to explain; assumptions that can seldom be verified. (Leontief 1971, 3)

In no other field of empirical inquiry has so massive and sophisticated a statistical machinery been used with such indifferent results. (Leontief 1971, 3)

(….) Continued preoccupation with imaginary, hypothetical, rather than with observable reality has gradually led to a distortion of the informal valuation scale used in our academic community to assess and to rank the scientific performance of its members. Empirical analysis, according to this scale, gets a lower rating than formal mathematical reasoning. (Leontief 1971, 3)

A natural Darwinian feedback operating through selection of academic personnel contributes greatly to the perpetuation of this state of affairs…. Thus, it is not surprising that the younger economists, particularly those engaged in teaching and in academic research, seem by now quite content with … building more and more complicated mathematical models and devising more and more sophisticated methods of statistical inference without engaging in empirical research. Complaints about the lack of indispensable primary data are heard from time to time, but they don’t sound very urgent. (Leontief 1971, 3)

(….) To deepen the foundation of our analytical system it will be necessary to reach unhesitatingly beyond the limits of the domain of economic phenomena as it has been staked out up to now…. To penetrate below the skin-thin surface of conventional [econometric mathematical models], it will be necessary to develop a systematic study of the structural characteristics and functioning of [economic targets, e.g., households, etc.], an area in which description and analysis of social, anthropological and demographic factors must obviously occupy the center of the stage. (Leontief 1971, 4, emphasis added.)

Establishment of systematic cooperative relationships across the traditional frontiers now separating economics from these adjoining fields is hampered by the sense of self-sufficiency resulting from what I have already characterized as undue reliance on indirect statistical inference as the principal method of empirical research. (Leontief 1971, 4, emphasis added.)

Formally, nothing is, of course, wrong with such an apparently circular procedure. Moreover, the model builder in erecting his hypothetical structures is free to take into account all possible kinds of factual knowledge and the econometrician in principle, at least, can introduce in the estimating procedures any amount of what is usually referred to as “exogenous” information before he feeds his programmed tape into the computer. Such options are exercised rarely and when they are, usually in a casual way. (Leontief 1971, 4-5, emphasis added.)

(….) The same well-known sets of figures are used again and again in all possible combinations to pit different theoretical models against each other in formal statistical combat…. The orderly and systematic nature of the entire procedure generates feelings of comfortable self-sufficiency. (Leontief 1971, 5)

This complacent feeling … discourages … attempts that would involve crossing the conventional lines separating ours from the adjoining fields. (Leontief 1971, 5)

An exceptional example of a healthy balance between theoretical and empirical analysis and the readiness of professional economists to cooperate with experts in the neighboring disciplines is offered by Agricultural Economics as it developed in this country over the last fifty years…. Preoccupation with the standard of living of the rural population has led agricultural economists into collaboration with home economists and sociologists, that is, with social scientists of the “softer” kind. While centering their interest on only one part of the economic system, agricultural economists demonstrated the effectiveness of systematic combination of theoretical approach with detailed factual analysis. They were also the first among economists to make use of the advanced methods of mathematical statistics. However, in their hands, statistical inference became a complement to, not a substitute for, empirical research. (Leontief 1971, 5, emphasis added.)

The shift from casual empiricism that dominates much of today’s econometric work to systematic large-scale factual analysis will not be easy. (Leontief 1971, 5, emphasis added.)

Possessive Individualism

Behavioural and experimental economics — not to speak of psychology — show beyond any doubt that ‘deep parameters’ — people’s preferences, choices and forecasts — are regularly influenced by those of other participants in the economy. And how about the homogeneity assumption? And if all actors are the same – why and with whom do they transact? And why does economics have to be exclusively teleological (concerned with intentional states of individuals)? Where are the arguments for that ontological reductionism? And what about collective intentionality and constitutive background rules?

— Lars Syll, RWER, Economic modeling — a constructive critique, Jan 12, 2023

The Enlightenment gave us the idea of the individual as an independent sapient being. Freed from superstition and official dogma, the individual was now in comprehensive control of her innate and acquired capabilities. Fearing the loss of moral certitude, Immanuel Kant found a substitute—it was moral to be rational and rational to be moral. The young discipline of economics embraced this newly created autonomous agent and assigned that individual the task of rendering rational—self-interested—choices. Such choices revealed true value: the rational individual could not be wrong about her choices. Individualism was now wedded to acquisitive behavior. (Bromley 2019, ix)

Max Weber’s The Protestant Ethic and the Spirit of Capitalism provided theological gloss to the emergent culture of enterprise and striving. By the middle of the twentieth century, economics had become the civic religion of the modern industrial state whose purpose was to provide ever-increasing living standards for its citizens. As with daily weather updates, certain economic data came to define life under industrial capitalism. The unemployment rate, daily movements in the stock market (including those in far-off places), and the latest perturbations in the consumer price index became the contemporary equivalent of ancient astrological sightings. (Bromley 2019, ix)

By the end of the twentieth century, industrial capitalism—with a new global reach—had given way to financial capitalism. As the twenty-first century dawned, there was yet another transition underway: managerial capitalism. The Great Recession of 2007-2009 delivered a surprising destructive shock to large swaths of the population in western Europe and the United States. The angst and anger produced by that disruption have not abated. An abrupt loss of faith in the presumed beneficence of capitalism coincided with mounting despair—and political revolts—in the Middle East, parts of Latin America, and much of Africa. Immigrants became a threat to the comfortable social compact that had defined life since the end of World War II. The political class, regardless of party affiliation, was now bereft of ideas. (Bromley 2019, ix)

The lack of comprehension is to be expected. Governing elites have been mentally nurtured on the defining fiction of modernism. The autonomous acquisitive individual was both rational and moral. The civic religion provided the necessary benediction. The destructive paralysis was abetted by a fiction within a fiction. The civic religion had managed to insist that the economy—the “market”—is a separate and quite delicate sphere of efficiency and rectitude. When held up against the self-dealing incoherence of politics, tampering with the economy can only inflict harm. Kenneth Arrow proved that social choices were inconclusive and contradictory. Only markets offered consistent clarity. Politicians must not interfere with the mystical workings of the economy. Private firms, praised as “job creators,” now comprise the sacred temples of modern capitalism. Government intervention in the market is dangerous and must be avoided. This protected realm is the fragile fount of future well-being. (Bromley 2019, x)

Ironically, the autonomous individual is now an unwitting accomplice in his own economic marginalization. Dependent on the constellation of sanctified private firms for his precarious livelihood, he is unknowingly enlisted in the self-defeating cause of laissez faire. As politicians in western Europe and the United States quake and bluster before the alleged hordes of migrants seeking a better life, their constituents—nervous victims of the abundant caprice of managerial capitalism—exhibit behavior that further confounds the anomie and paralysis. (Bromley 2019, x)

Possessive individualism both reigns and incapacitates. (Bromley 2019, x)

(….) [T]he discipline of economics—building on the created individual of the Enlightenment—has crafted a suite of concepts and practices that constitute justificationism. … [M]uch of contemporary economics is misleading in its assumptions, willful in its presentation, and contrived in its conclusions. The central idea of rationality—the rational individual and rational choice—is circular and thus a profound deception. Its purpose has been to situate economic calculation—maximizing utility—at the center of human behavior. To borrow from Gertrude Stein, I will show that there is no there there. (Bromley 2019, xi)

The popularity of contemporary economic concepts and the acquired habits of mind on those concepts will be exposed as the perverse contamination that has undermined the aspirations of the Enlightenment. Freed from the chains of ancient superstition, the individual was soon reenslaved in the service of industrial capitalism and its offspring. Today the liberated individual is a nervous striver seeking something that relentless acquisitiveness cannot provide. (Bromley 2019, xi)

Spreading the Manure

[O]ur generation’s economists will reply they did not cause the crisis – that it was not they who deregulated the financial sector to allow gamblers to swoop in, not they who demanded higher and higher returns on their investments so they could sustain their spending, not they who took out all the debt to bail out the banks, and above all, not they who designed the models and invented the complex instruments that nearly brought the world economy crashing down around us. It was the politicians who changed the rules, the mathematicians and theoretical physicists working for the banks who came up with the crazy inventions, and the rest of us who demanded more and more money. (….) Throughout the boom, no small number of economists in influential positions helped to create a climate of conviction that made it possible for the magicians to peddle their wares – and, as the documentary film Inside Job would later reveal, some of those economists did so while running lucrative side businesses consulting for the banks behind the crisis (apparently vindicating Upton Sinclair’s adage that it’s difficult to get a man to believe something when his pay cheque depends on him believing the opposite). Alan Greenspan was hardly alone in repeatedly using his pulpit to assure the faithful there was nothing to fear. When, for instance, the IMF chief economist Raghuram Rajan presented a paper to the Federal Reserve’s annual conference in 2005 warning that the new speculative instruments posed tremendous risks, Lawrence Summers ridiculed him as a Luddite who simply disliked technological change. At that, the chorus present cried Amen. Rajan later said of the experience of speaking to the neoliberal high priesthood, which included not just celebrity economists like Summers but a ‘who’s who of central bankers’: ‘I felt like an early Christian who had wandered into a convention of half-starved lions.’

(Rapley, John. Twilight of the Money Gods: Economics as a Religion and How it all Went Wrong.)

Does the story you tell of the Great Financial Crisis leave out the irrational, spreading panic that arbitrarily devalued perfectly fine mortgage-backed security assets well in excess of actual defaults, which have been higher since (during the pandemic) without causing the same crisis?

~ rsm spreading some manure

Spreading bullshit is a practice similar to the phenomenon of self-deception. Whether done consciously or not, the bullshitter has no care for the actual facts or truth, only that they can make others believe their bullshit. A case in point is the bullshit spread by RSM above, who claims that mortgage defaults were higher during the pandemic and yet caused no global crisis. His claim is false. One can shop around for the numbers (massaged statistics) that fit one’s ideological bias; the charts are all over the place, clearly showing that they have been massaged to fit a specific narrative. Yet even by more recent measures RSM’s bullshit claims are refuted:

Under the effects of the coronavirus crisis, the mortgage delinquency rate in the United States spiked to 8.22 percent in the second quarter of 2020, just one percent down from its peak of 9.3 percent during the subprime mortgage crisis of 2007-2010. Following the drastic increase directly after the outbreak of the pandemic, delinquency rates started gradually declining and reached 3.45 percent in the third quarter of 2022.

statista, Mortgage delinquency rate in the U.S. 2000-Q3 2022, Jan 13, 2023

As the evidence below shows, the prevalence of predatory lending and the mortgage defaults that followed from it were the cause of the GFC, not some irrational panic. Once the wider societal ramifications of the near collapse of the global financial system became clear, fear and panic did spread, but this was an effect, not the cause, of the GFC. RSM puts the cart before the horse because (as his comments on RWER show) he favors an unregulated securities market in which banks and investors can turn the global economy into a casino economy. Sophistry indulges in half-truths, falsehoods, and outright lies to make others believe its bullshit. In RSM’s story the shit is wearing the pants: there never was a case of predatory lending; homeowners were not conned out of stable and safe conforming loans into predatory loans inevitably destined to cause them to default; and the rising default rates were not due to the toxic debt of homeowners foreclosed on by predatory banks. They just turned in their keys out of irrational panic!

[T]he age of chivalry is gone. That of sophisters, economists, and calculators, has succeeded; and the glory of [the world] is extinguished for ever.

~ Edmund Burke (1790) cited in Bernstein (2004), A Perilous Progress, 185.

I always love that kind of argument. The contrary of a thing isn’t the contrary; oh, dear me, no! It’s the thing itself, but as it truly is. Ask any die-hard what conservatism is; he’ll tell you that it’s true socialism. And the brewers’ trade papers: they’re full of articles about the beauty of true temperance. Ordinary temperance is just gross refusal to drink; but true temperance, true temperance is something much more refined. True temperance is a bottle of claret with each meal and three double whiskies after dinner.

~ Aldous Huxley, Eyeless in Gaza (1936), in Hardcastle’s Bullshit and Philosophy.

Bullshit and Truth … [B]ullshit results from the adoption of lame methods of justification, whether intentionally, blamelessly or as a result of self-deception. The function of the term is to emphatically express that a given claim lacks any serious justification, whether or not the speaker realizes it. By calling bullshit, we express our disdain for the speaker’s lack of justification, and indignation for any harm we suffer as a result.

~ Scott Kimbrough, On Letting It Slide, in Hardcastle’s Bullshit and Philosophy.

When … [one adopts] poor methods, such as the “method” of cherry-picking facts to support a political agenda, the result is bullshit. And it’s bullshit to repeat the results not only because what is repeated is bullshit, but because the method of arriving at the opinion in question is not to be trusted. Warmed over bullshit is not merely a stale imitation of the original, but a fresh deposit that compounds the methodological faults of the original.

~ Scott Kimbrough, On Letting It Slide, in Hardcastle’s Bullshit and Philosophy.

The subprime crisis didn’t have to happen. It could have been stopped. It is the story of how unchecked greed can bring down the global financial system. Thanks to Greenspan and his ilk, predatory lending was given a green light. Under Greenspan’s watch:

“Federal regulators gave subprime lending their blessing by leaving subprime loans untouched, even though many of the loans violated the most basic tenet of lending: that no loan should be made unless the borrower can repay. Worse yet, federal regulators actively resisted using their substantial powers of rule-making, examination, and sanctions to crack down on the proliferation of virulent loans. At the same time, they gave banks the green light to invest in subprime mortgage-backed securities and CDOs, leaving the nation’s largest financial institutions awash in toxic assets (Subprime Virus).”

Predatory lending goes back thousands of years, but subprime loans that underpinned the GFC were being reported even in the early 1990s. What started as a niche market of selling risky subprime loans to the poor and credit challenged using dodgy predatory practices eventually moved from sketchy to mainstream institutions on Wall Street. Feeding the mortgage machine was the name of the game. From lenders’ perspective, the mortgage machine needed constant feeding in order to generate constant fees. Volume was what mattered.

Lenders dropped underwriting standards, conned borrowers who qualified for cheap prime loans into agreeing to costlier subprime loans, and saddled them with predatory fees on top. Countless sad examples of these predatory financial practices have been documented. These subprime loans were by design meant to default so that, upon a borrower’s default, they could be flipped: lenders targeted just such individuals with NINJA loans, knowing they would default, so they could “flip” the borrowers into more expensive loans under the pretense of helping them.

Each “loan flip” resulted in more fees for the brokers and lenders, which they tacked onto the principal. With each flip, the borrowers’ equity shrank and their monthly payments went up, until their equity disappeared and they could no longer qualify for loans. The easiest loans to flip were those that borrowers couldn’t afford in the first place. There is damning evidence of other professionals (banks, accountants, lawyers, etc.) colluding with lenders and mortgage brokers to commit outright fraud.
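The arithmetic of loan flipping is worth making concrete. The sketch below is a hypothetical illustration with made-up figures (a $200,000 home, 20 percent starting equity, a 5 percent fee rolled into the principal on each flip), not data from any source cited here; it shows how a handful of flips can wipe out a borrower’s equity entirely:

```python
# Hypothetical sketch of equity erosion under repeated "loan flips":
# each refinance rolls broker/lender fees into the principal while the
# home's value stays put. All numbers are illustrative assumptions.

def flip(principal, fee_rate=0.05):
    """Refinance the loan, tacking a percentage fee onto the principal."""
    return principal * (1 + fee_rate)

home_value = 200_000
principal = 160_000  # borrower starts with 20% equity ($40,000)

for n in range(1, 6):
    principal = flip(principal)
    equity = home_value - principal
    print(f"flip {n}: principal ${principal:,.0f}, equity ${equity:,.0f}")
```

Under these assumptions the borrower's $40,000 of equity is gone by the fifth flip (the principal exceeds the home's value), at which point no further refinancing is possible and default is the only exit, exactly the trap the paragraph above describes.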

It is easy to blame the SEC for failing in its enforcement, but this ignores the ideological (type III) bias of individuals like Alan Greenspan et al., whose ideological prejudices precluded them from actually doing their jobs and regulating the financial markets as they were tasked to do. In addition, regulatory capture was a real issue, resulting in foxes in the hen house and/or the defunding of agencies until they were understaffed and unable to properly perform their regulatory roles. Yet many so-called “liberals,” who were actually neoliberals pushing neoliberal economic doctrines about the “free market,” did just that, blaming the SEC rather than looking for deeper causes. This pernicious type III ideological bias persists to this day, with predatory finance still remaining largely unregulated in the United States.

Another frequent trope trotted out is that the government-sponsored enterprises (GSEs) Fannie Mae and Freddie Mac caused the GFC. Half-truths really are nothing more than propaganda and lies used for ideological purposes. The full truth is in the details. The GSEs eventually became private entities with shareholders, and up until Wall Street got into the subprime game they accepted only what are called conforming loans, as opposed to non-conforming loans. The difference between these two forms of underwriting is crucial to understanding how Wall Street, and the practice of putting foxes in the hen house, corrupted the role of the GSEs and the entire housing market.

A conforming loan is a mortgage that meets the dollar limits set by the Federal Housing Finance Agency (FHFA) and the funding criteria of Freddie Mac and Fannie Mae, who set the gold standard up until the crowding-out effect of Wall Street’s predatory financing practices. Conforming loans adhere to strict underwriting standards: incomes are verified, credit records checked, and all due diligence performed to assure that the borrower meets standards strict enough to qualify for the loan. Most importantly, the terms of the loan are designed so that they are not so onerous as to be a cause of default. Meanwhile, the financial industry lobbied hard to keep its default rates hidden so it could milk the cash cow of predatory lending as long as it possibly could.

So while Fannie Mae and Freddie Mac were processing only conforming loans, Wall Street and its predatory lenders were flooding the market with non-conforming loans that guaranteed massive defaults while enriching the 1% at the top and wiping out regular homeowners conned into accepting these loans, sometimes even switching from a safe conforming loan into a non-conforming predatory loan. By 2006, 80% of predatory “private label” non-conforming subprime loans were being securitized by Wall Street and given AAA ratings by the ratings agencies despite the real underlying rot that was being sold (i.e., “pass the trash”) by market makers like Goldman Sachs ([13:55_14:14], [14:54_15:27]).

Eventually the crowding-out effect undermined the ability of conforming loans to compete, driving what had before been responsible lending institutions to “get into the game,” start selling subprime loans (or, in the GSEs’ case, buying them), and join the race to the bottom. Predatory lenders and the greedy Wall Street financiers who were packaging up these toxic time bombs didn’t worry about solvency; they were too busy making profits and “passing the trash” to unwary investors around the world, who were being told by the ratings agencies that these little turds were AAA rated. In other words, the shite was wearing the pants!

The predatory banks couldn’t get enough of these toxic turds. They flooded middle-class neighborhoods with freshly minted salespersons who would canvass door-to-door selling these time bombs, often trying to get homeowners out of safe fixed-rate conforming loans and into their predatory products. “According to the Wall Street Journal, 55 percent of all subprime loans in 2005 went to people with sufficiently high credit scores to qualify for prime loans (Subprime Virus).”

We know, for they tried to do it to us. Before the subprime-driven GFC my wife and I were daily barraged by mortgage brokers seeking to move us from a 30-year fixed into one of Wall Street’s new subprime adjustable-rate loans that Greenspan crowed so glowingly about. I knew this was a shell game and a Ponzi scheme even then. We finally invited one of the poor ignorant salespersons into our home so we could witness the BS selling points for ourselves (never once intending to fall for this short-sighted Ponzi scheme being waged against middle-class Americans with little financial education). She sat there flipping pretty pie and bar charts, reciting her sales mantra (“You will save $600 a month on payments”), never once discussing the downside: when interest rates change, so does your payment; interest-only payments don’t decrease the principal; and when balloon payments kick in and you can’t refinance as easily as promised, you are essentially screwed. She made a point of telling us she herself had one of these exotic little financial monstrosities birthed on Wall Street. Most likely she lost her home with the rest of the fools who bought into this Ponzi scheme. It should not come as a surprise that default rates skyrocketed, going from 5 percent in 2004, to 8 percent in 2006, to 16 percent in 2008, when finally the music stopped and many homeowners ended up without a chair to sit in.
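The salesperson’s pitch, and the downside she never mentioned, can be worked through with the standard amortization formula. The figures below (a $300,000 loan, a 4 percent interest-only teaser, a reset to 8 percent) are hypothetical assumptions chosen only to illustrate the mechanism, not the terms of any actual loan discussed above:

```python
# Hedged illustration of why an ARM teaser payment understates the true cost.
# All loan figures here are hypothetical assumptions for the sketch.

def monthly_payment(principal, annual_rate, years):
    """Standard amortizing payment: P*r / (1 - (1+r)^-n), r = monthly rate."""
    r, n = annual_rate / 12, years * 12
    return principal * r / (1 - (1 + r) ** -n)

P = 300_000
fixed = monthly_payment(P, 0.06, 30)   # 30-year fixed at 6%
teaser = P * 0.04 / 12                 # interest-only ARM teaser at 4%:
                                       # no principal is ever paid down
# After the reset, the rate jumps to 8% and the untouched principal must
# now amortize over the remaining 27 years.
reset = monthly_payment(P, 0.08, 27)

print(f"fixed:  ${fixed:,.0f}/mo")
print(f"teaser: ${teaser:,.0f}/mo (apparent saving ${fixed - teaser:,.0f})")
print(f"reset:  ${reset:,.0f}/mo")
```

Under these assumptions the teaser does look hundreds of dollars cheaper per month than the fixed loan, but because the interest-only period leaves the principal untouched, the post-reset payment jumps well above both, which is the trap the sales charts never showed.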

Behind Mathematical Notation

[I]t is a common trope of theoretical modeling exercises to begin by citing a “stylized fact” about some aspect of the economy and to present the exercise as developing a possible explanation. Theoretical modeling exercises may vary in how much they foreground the questions about the target world that ultimately motivate the exercise, but some recourse of this kind is implied whenever these exercises are presented as economically relevant (as opposed to simply mathematical exercises). … This matters because it suggests that even highly abstract theoretical models are ultimately justified on the basis of their correspondence to some features of the target world. (Spiegler 2015, 67)

~ ~ ~

Henry Louis Mencken [1917] once wrote that “[t]here is always an easy solution to every human problem — neat, plausible and wrong.” And neoclassical economics has indeed been wrong. Its main result, so far, has been to demonstrate the futility of trying to build a satisfactory bridge between formalistic-axiomatic deductivist models and real world target systems. Assuming, for example, perfect knowledge, instant market clearing and approximating aggregate behaviour with unrealistically heroic assumptions of representative actors, just will not do. The assumptions made surreptitiously eliminate the very phenomena we want to study: uncertainty, disequilibrium, structural instability and problems of aggregation and coordination between different individuals and groups. (Syll 2016, 56)

The punch line of this is that most of the problems that neoclassical economics is wrestling with issue from its attempts at formalistic modeling per se of social phenomena. Reducing microeconomics to refinements of hyper-rational Bayesian deductivist models is not a viable way forward. It will only sentence to irrelevance the most interesting real world economic problems. And as someone has so wisely remarked, murder is unfortunately the only way to reduce biology to chemistry — reducing macroeconomics to Walrasian general equilibrium microeconomics basically means committing the same crime. (Syll 2016, 56)

— Lars Pålsson Syll. On the use and misuse of theories and models in economics


On December 5, 1871, John Stuart Mill wrote to his friend and disciple John Elliott Cairnes expressing dismay at the work of William Stanley Jevons, one of the pioneers of the new abstract mathematical style in economics. Jevons had “a mania for encumbering questions with useless complications,” Mill wrote, “with a notation implying the existence of greater precision in the data than the questions admit of” (Mill 1972).

At the time of writing, Mill had not yet read Jevons’ recently published Theory of Political Economy, but if he had, he would have found no reason to change his view. Jevons, for his part, was equally critical of Mill’s work — and used remarkably similar language to make his complaint. According to Jevons, it was Mill’s economic doctrines — and those of the then-dominant British Classical School more generally — that were unnecessarily complicated, because they were based on “mazy and preposterous assumptions” about the basic concepts of political economy (Jevons 1965: xliv). (Spiegler 2015, 1)

What Mill and other classical political economists failed to see, Jevons argued, was that despite the apparent complexity of human social activity there was a fundamental simplicity and unity at its core. Standard economic notions such as utility, wealth, value, commodity, labor, land, and capital all reflected a single underlying theme: the basic human tendency to “satisfy our wants to the utmost with the least effort — to procure the greatest amount of what is desirable at the expense of the least that is undesirable — in other words, to maximize pleasure” (Jevons 1965: 37). This tendency manifested itself in human behavior in a manner that was uniform across people, quantitatively (Jevons thought cardinally) measurable, and separable from influences that were more context-dependent, such as morality or culture. Recognizing this, Jevons argued, would allow many of the issues that had troubled classical political economists to be bracketed, enabling the articulation of a precise “mechanics of utility and self-interest” on the model of physical mechanics (Jevons 1965: 21, emphasis original). (Spiegler 2015, 1-2)

According to Jevons, the analogy with physical mechanics ran deep. The “laws and relations” governing utility mechanics had to be “mathematical in nature,” because they “dealt with quantities,” i.e., “things … capable of being greater or less” (Jevons 1965: 3, emphasis original). These laws could also be isolated from potentially disturbing factors, not only conceptually but also empirically. Although the economist could not conduct controlled experiments to effect this isolation directly, Jevons believed that the effects of disturbing factors could be dealt with systematically, even when economists were largely in the dark about their nature and operation. (Spiegler 2015, 2)

Consequently, it seemed to Jevons that scepticism about the possibilities of a precise science of political economy, like that expressed by Mill in his letter to Cairnes, was merely conservatism standing in the way of progress. This sentiment was expressed clearly in the concluding comments to the Theory of Political Economy, in a section titled “The Noxious Influence of Authority,” where Jevons wrote:

I think there is some fear of the too great influence of authoritative writers in Political Economy. I protest against deference for any man, whether John Stuart Mill, or Adam Smith, or Aristotle, being allowed to check inquiry. Our science has become far too much a stagnant one, in which opinions rather than experience and reason are appealed to … Under these circumstances it is a positive service to break the monotonous repetition of current questionable doctrines, even at the risk of new error. (Jevons 1965: 276-7)

Looking back on the disagreement between Mill and Jevons from the perspective of 2015, it would seem that Jevons has been vindicated. Contemporary academic economics is a thoroughly mathematical enterprise, reflecting many features of Jevons’ approach. And one finds few doubts within the professional mainstream as to the aptness of the mathematical analysis of economic behavior. To most contemporary economists, Mill’s views on the methodology of political economy are at best an interesting piece of intellectual history. They are irrelevant to the actual practice of economics. (Spiegler 2015, 2)

Yet Mill’s skepticism toward Jevons’ approach to political economy may be more than a mere historical curiosity. Mill’s position, especially when understood in the context of his broader philosophy of science, poses a fundamental and formidable challenge to those who, like Jevons, would wish to use the power and precision of mathematics to investigate social phenomena. In fact, the issues Mill discerned continue to vex mathematical economics to this day. To see that, however, we need to understand the basis of his misgivings. (Spiegler 2015, 2-3)

As a committed empiricist, Mill held fast to the value of experience. The general principles of science were, in Mill’s eyes, contrivances in its service and subject to its discipline. Although abstractions were necessary to formulate general principles, Mill insisted that one must not make the mistake of taking abstractions to be the object of scientific inquiry, rather than the phenomena they were supposed to represent. If a scientist lost sight of the actual phenomena of interest, the concepts advanced in their service might well become detached from them. It would then become unclear what, if any, epistemic value the principles formulated using those concepts would have. As Mill explained,

If anyone, having possessed himself of the laws of phenomena as recorded in words, whether delivered to him originally by others, or even found out by himself, is content from thenceforth to live among these formulae, to think exclusively of them, and of applying them to cases as they arise, without keeping up his acquaintance with the realities from which these laws were collected — not only will he continually fail in his practical efforts, because he will apply his formulae without duly considering whether, in this case or in that, other laws of nature do not modify or supersede them; but the formulae themselves will progressively lose their meaning to him, and he will cease at last even to be capable of recognizing with certainty whether a case falls within the contemplation of his formulae or not. (Mill 1974: Bk. IV, ch. vi, sec. 6, 711)

The prime example of mechanical subject matter, according to Mill, was the physical universe. In his view, it was appropriate to express (for example) Newton’s principle of universal gravitation in mathematical language because human beings are capable of discerning specific quantities corresponding to “mass,” “force,” and “radius” (or, more generally, “distance”) with sufficient precision that there could be no relevant qualitative differences among observations within each category. From the standpoint of Newtonian mechanics, it would not matter if one set of forces, masses, and distances occurred in France and another in England (or on the Moon or anywhere else in the universe), or if one set of observations were associated with a morally reprehensible purpose and another not. The only relevant difference between observations of the same type was in their magnitude. (Spiegler 2015, 4)

When confident that one was dealing with mechanical subject matter, it was not only appropriate but ideal to articulate general principles in mathematical language. Doing so enabled scientists to take full advantage of its purely quantitative nature. In particular, they could use their observations to derive and test precise empirical laws from those general principles. This, for example, is what Henry Cavendish did when estimating the value of the gravitational constant G in Newton’s principle of universal gravitation, F = Gm1m2/r² (which expresses the force exerted by a body of mass m1 on a body of mass m2, and vice versa, at a distance of r) (Cavendish 1798). That calculation would have been impossible — or rather, its result would have been meaningless — if Cavendish had not been warranted in taking each successive observation of mass (or the distance between the two objects, or degree of displacement of the objects due to gravity) as qualitatively identical to his preceding observations. (Spiegler 2015, 4-5)
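To make the quantitative step explicit, the logic of Cavendish’s measurement can be sketched in modern notation. (The notation and the numerical value of G below are modern conventions, not the source’s; Cavendish himself reported his torsion-balance result as an estimate of the Earth’s mean density, from which G was later inferred.)

```latex
% Newton's principle of universal gravitation:
% the force between bodies of mass m_1 and m_2 separated by distance r
F = \frac{G\, m_1 m_2}{r^2}

% Cavendish could measure F (from the displacement of a torsion balance),
% together with m_1, m_2, and r, and so invert the law to estimate the constant:
G = \frac{F\, r^2}{m_1 m_2} \approx 6.674 \times 10^{-11}\ \mathrm{N\,m^2\,kg^{-2}}
```

The inversion is only meaningful if every observation of force, mass, and distance is qualitatively commensurable with every other, which is precisely the condition Mill reserved for “mechanical” subject matter.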

Mathematical language is thus extremely useful in investigating mechanical subject matter. But, Mill argued, it would be perilous to use it to investigate subject matter that was not mechanical. There were two causes for concern. First, in such cases mathematical principles might simply project an underlying mechanical structure onto the subject matter whether or not the latter was mechanical in nature. That is, mathematical language might generate a purely quantitative conceptual map of the subject matter it purported to outline, with no way of telling whether the outlines on the map corresponded to the subject’s own contours. As a result, scientists would not be able to feel confident that data gathered according to the conceptual map accurately reflected the underlying subject matter. And because of that, it would be inappropriate to interpret any apparently precise empirical laws derived from that data as empirical laws applying to the actual subject matter. (Spiegler 2015, 5)

Second, and still more worryingly, Mill argued that the commitment to mathematical language could actually prevent scientists from detecting when their conceptual map had become untethered from the subject matter under study. As will be recalled, Mill’s prescribed defense against this kind of detachment was ongoing close contact between the scientist and the object of study. But if exploration of the subject matter itself developed only through the lens of mathematical language — which necessarily obscured any qualitative distinctions among the observations being made within each category — then the scientist would become blind to signs of that mismatch arising. As a result, the mismatch might persist indefinitely. Because of this danger, Mill warned that when the scientist was not certain of the mechanical character of the subject matter, the language of any general principles “should be so constructed that there will be the greatest possible obstacles to a merely mechanical use of it” (Mill 1974: Bk. IV, ch. vi, sec. 6, 707). (Spiegler 2015, 5)

The risk that mathematical principles might ascribe mechanical features to non-mechanical subject matter, and thus become untethered from the subject matter they were meant to represent, was precisely what concerned Mill about Jevons’ approach to political economy, and indeed about mathematical social science generally. Human social activity was, for Mill, a paradigmatic example of a non-mechanical subject. It was a realm of almost unfathomable complexity, in two important ways. First, social phenomena were subject to innumerably more causes than physical phenomena. And second, crucially, the operations of those causes were inextricably intertwined. (Spiegler 2015, 6)

Whatever affects, in an appreciable degree, any one element of the social state, affects through it all the other elements. The mode of production of all social phenomena is one great case of Intermixture of Laws. We can never either understand in theory or command in practice the condition of a society in any one respect, without taking into consideration its condition in all other respects. (Mill 1974: Bk. VI, ch. ix, sec. 2, 899)

Thus, although Mill believed it was possible to form reliable general principles (perhaps even mathematical ones) about certain aspects of human nature in isolation, the fact that human beings always and only observe behavior in the welter of society meant it was impossible to discern whether and to what extent those general principles operated empirically. If indeed one knew, as Jevons presumed one would, that the influence of economic factors on human behavior was cleanly separable from the influence of all other factors, and one possessed a reliable method for screening off those influences, then a precise empirical science of political economy might be possible. But for Mill, whether the social world was parsable in this way was an empirical question — and, moreover, a question that could only be addressed through continual immersion in the social world itself — not a simple statement of fact or a self-evidently valid postulate, as Jevons assumed. To take Jevons’ route was to invite a split between model and target that would be undetectable using mathematical methods alone. One could go blithely on with mathematical explorations — gathering data, estimating the precise functional forms and parameters of the principles, and testing them against new data — unaware that in point of fact one had ceased to be exploring the phenomena of interest in any meaningful way. (Spiegler 2015, 6)

Mill’s challenge to Jevons may seem distant from the modern discipline of economics. Yet it finds strong echoes in the debate over the implications for economic methodology of the recent financial crisis. A central question in that debate has been whether the highly abstract mathematical modeling methods that dominated macroeconomics in the years leading up to the crisis — in particular, Dynamic Stochastic General Equilibrium (DSGE) modeling — actively prevented economists from seeing the gathering storm. Critics of DSGE have charged that these models became untethered from the phenomena they were meant to represent in precisely the manner Mill feared. In a 2010 review of DSGE modeling in the Journal of Economic Perspectives, for example, Ricardo Caballero wrote that the practice of DSGE modeling “has become so mesmerized with its own internal logic that it has begun to confuse the precision it has achieved about its own world with the precision that it has about the real one” (Caballero 2010: 85). The primary culprits in that confusion, critics charged, were the extreme simplifying assumptions necessary to ensure the tractability of DSGE models — in particular, (i) the representation of aggregate economic activity as being generated by a small number of representative agents; (ii) the expression of the macroeconomy as a linear (generally log-linear) system; and (iii) the assumption of efficient financial markets. These assumptions rendered the model incapable of taking into account many kinds of complexity that turned out to be crucial factors in the crisis — for example, the perverse incentive structures at play in the financial sector in the late 1990s and 2000s. In effect, the models became mere mathematical exercises — toy models that were not models of the late 1990s‒2000s economy in any meaningful sense. (Spiegler 2015, 6)

Critics have also been concerned with the manner in which the mismatch between DSGE models and the actual economy gave rise to certain analytical blind spots. In a 2009 New York Times Magazine piece cataloguing the failures of economic methodology in the lead-up to the crisis, Paul Krugman argued that DSGE models caused a kind of tunnel vision in which the central causes of the crisis lay outside the realm of consideration. Conceiving of the economy through the lens of the model essentially required the economist to 

[turn] a blind eye to the limitations of human rationality that often lead to bubbles and busts; to the problems of institutions that run amok; to the imperfections of markets — especially financial markets — that can cause the economy’s operating system to undergo sudden, unpredictable crashes; and to the dangers created when regulators don’t believe in regulation. (Krugman 2009) (Spiegler 2015, 7)

Evidence Based Economics

When one has worked one’s entire career within the framework of a powerful paradigm, it is almost impossible to look at that paradigm as anything but the proper, if not the only possible, perspective one can have on (in this case) biology. Yet despite its great accomplishments, molecular biology is far from the “perfect paradigm” most biologists take it to be. This child of reductionist materialism has nearly driven the biology out of biology. Molecular biology’s reductionism is fundamentalist, unwavering, and procrustean. It strips the organism from its environment, shears it of its history (evolution), and shreds it into parts. A sense of the whole, of the whole cell, of the whole multicellular organism, of the biosphere, of the emergent quality of biological organization, all have been lost or sidelined.

(Woese, Carl R. (2005, 101) Evolving Biological Organization. In Microbial Phylogeny and Evolution: Concepts and Controversies (Jan Sapp, ed.). Oxford: Oxford University Press.)

“Whether you can observe a thing or not depends on the theory which you use. It is theory which decides what can be observed” (Albert Einstein speaking to Werner Heisenberg during his 1926 Berlin lecture, quoted in Salam 1990).

Edward Fullbrook (2016, 3) Narrative Fixation in Economics

It is essential to recognise here that the alternative to an explicit philosophy of science is not an absence of philosophy. Rather, it is an implicit and often bad philosophy. And, whatever may be the theoretical or substantive orientations of contemporary economists (whether econometricians, axiomatic-deductive theorists, hermeneuticists, and so on) their practices are all underpinned or informed by (competing) science-oriented philosophies of some sort. Of course, much of this is often tacit or unacknowledged, and it may be in contradiction with other beliefs. But it is precisely because of this that philosophical analysis can go to work. There is always the possibility that explicit methodological investigation of scientific practice, or other social forms, can make a contribution by rendering explicit some knowledge that is already implicit but unrecognised, and perhaps, in the reporting of economists (or whoever), openly contradicted. As Kant argued, it is a function of philosophy to analyse concepts which are already given but confused.

(Lawson 2005, 44, Economics and Reality)

Michael Joffe uses a complementary, comparative approach, examining theory development in the natural sciences from a historical perspective to generate insight into how other fields of science use diverse types of evidence combined with causal hypotheses to generate empirically based causal theories. Using the history of science (e.g., germ theory, plate tectonics, money and banking, the growth of the state), the goal is to learn useful methodologies for theory development. (Joffe 2017, 1, Abstract)

The history of the natural sciences provides exemplars of how to develop causal theories based upon multiple sources of evidence and can be useful as a guide in reforming economics. One element of good scientific practice is a cross-disciplinary, comparative perspective using a bottom-up focus on the actual practice of scientists. (Joffe 2017, 2-3, Introduction)

Empirically informed causal theories are developed over time, incrementally, and have an ontic rather than an epistemic focus. They place an emphasis on the role of evidence of multiple interlocking kinds (qualitative and quantitative, experimental and observational) in a dynamic iterative process in which diverse types of evidence are considered in light of hypotheses and theory production utilizing a full range of styles of reasoning. Where contextually appropriate, they use empirical research including experimental, observational, and historical analysis. For example,

The way that the correct description of the money-generating mechanism was achieved was by the patient documenting of what actually happens in the financial system, describing how banks really behave (Joffe 2017, 8).

Another example of theory development based on systematic empirical work is a two-volume study of the growth of the modern state (Lindert, 2004). This describes the growth of the state qualitatively and quantitatively in each of the major countries that developed rapidly after the industrial revolution, together with an analysis of the causal factors in that country. It then provides an overview of the forces behind state growth, while acknowledging the between-country heterogeneity. Thus, it encompasses description, generalisation and explanation, as well as the limits to generalisation imposed by factors specific to each country. This use of comparative economic history is a good model for developing theory, not least because it ensures that any explanation or suggested causal mechanism corresponds to the spatial and temporal patterns that actually occurred, as well as paying attention to specific factors that may have been present in certain countries. (Joffe 2017, 8, emphasis added)

Theory can become a barrier to causal understanding when it becomes myopic; instead of seeing the world as it is, the scope of what can be examined and seen is determined by the dominant theoretical perspective — its starting point is epistemic (axiomatic), not ontic (Joffe 2017, 9). In such situations, what can be studied and observed becomes restricted by philosophical and/or methodological presuppositions. “Economic analysis should be data-first not theory-first” (Juselius, 2011). Substantive (often obvious) knowledge is frequently ignored, and left out of theory development, in order to preserve the “purity” of the axiomatic-deductive methodology — that is, to maintain an implied universality of stories/models of human behavior, even in the face of evidence showing fundamental dissimilarities between different types of economic systems. The purity of the theory must be maintained so that economic phenomena can be explained in terms of universal human attributes, or other postulated attributes of human behavior, regardless of how unrealistic such postulates are in the real world. An example of such theory-induced blindness can be seen in certain economists’ search for microfoundations akin to those of physics (Joffe 2017, 9):

Thus, there is a danger that bad theory can be protected by the co-existence of substantive knowledge with “theory” that does not incorporate it. An important instance is the idea that any macro concept, such as that of economic growth, requires “micro-foundations”, in the sense of “optimal decision rules of economic agents” (Lucas, 1976). But this would imply that economic phenomena are universal, whereas the evidence shows that there are fundamental dissimilarities between different types of economic system. For example, the difficult task of explaining the observed patterns of growth across time and across countries is made more difficult, possibly intractable, by the insistence on a universal principle of this kind. Modern economic growth was unknown before the industrial revolution, and since that time it has been experienced by some economies but not others—all economists surely know this, even if their grasp of economic history is not profound. But it means that growth cannot be explained in terms of a universal human attribute, or postulated attribute, such as optimisation. The insistence on the need for micro-foundations is held by many economists, but the “news” that growth has had a specific spatial/temporal distribution is not news to them—and therefore it would not be accepted as evidence against the theory. (Joffe 2017, 10, emphasis added)

Addelson similarly notes:

The language of economic theory, like any language, provides a framework for thought: but at the same time it constrains thought to remain within that framework. It focuses our attention; determines the way we conceive of things; and even determines what sort of things can be said…. A language, or conceptual framework is, therefore, at one and the same time both an opportunity and a threat. Its positive side is that (one hopes) it facilitates thought within the language or framework. But its negative side arises from the fact that thought must be within the framework. (Coddington 1972: 14-15) (Addelson, Mark. Equilibrium Versus Understanding [Towards the Restoration of Economics as Social Theory]. London: Routledge; 1995; p. 12)

The “conventional starting point” for neoliberal and even some heterodox economics is a top-down axiomatic-deductive methodology. Joffe (2017) proposes an evidence-based, bottom-up approach in which theories are generated from evidence rather than from a story or parable about universal human behavior and/or hypothetical stylized behavior (i.e., the axiomatic-deductive methodology). Neither just-so storytelling nor axioms of universal human behavior starts with observations of actually occurring human behavior. Such abstractions are derived from axioms, not from observed human behavior, and therefore have limited scope and applicability, since they do not take into account the actual historical context of time and place.

A different approach is to study human behaviour as it is, e.g. truth-telling (Abeler, Nosenzo, & Raymond, 2016) and cooperation and altruism (Rand, Brescoll, Everett, Capraro, & Barcelo, 2016). This has the potential for developing a theory of economic behaviour that is based on the heuristics people actually use, and to link this with an evolutionary account of the causal processes that led to their existence in our brains (Gigerenzer, Hertwig, & Pachur, 2011; Gigerenzer, Todd, & ABC Research Group, 2000). (Joffe 2017, 12)

Realistic theory can be derived from observations. Evidence, whether experimental or observational, is used in generating new theory (Joffe 2017, 12). Evidence is diverse — qualitative and quantitative, historical and experimental — and strands of evidence can be combined to generate new insights and broad explanatory hypotheses in support of empirically informed theory.

This would be a natural way of developing conceptual categories that correspond to natural categories (“carving nature at its joints”) with strong ontic emphasis and focus on causation. A theory in this sense can also be said to be true or false—or perhaps better, that it is able to possess some degree of truth. (Joffe 2017, 12)

~ ~ ~

Joffe aspires to a value-neutral practice of science that elevates evidence over bias, presuppositions, and prior beliefs. It requires discipline and sincerity to put these prejudices and biases aside and let the evidence lead wherever it is heading. Just as evidence is the basis of fairness in any judicial context, evidence is also the basis of hypothesis generation in any scientist’s mind when endeavoring to generate theoretical understanding.

Bad Samaritans

After he had come to power in a military coup in 1961, General Park turned ‘civilian’ and won three successive elections. His electoral victories were propelled by his success in launching the country’s economic ‘miracle’ through his Five Year Plans for Economic Development. But the victories were also ensured by election rigging and political dirty tricks. His third and supposedly final term as president was due to end in 1974, but Park just could not let go. Halfway through his third term, he staged what Latin Americans call an ‘auto-coup’. This involved dissolving the parliament and establishing a rigged electoral system to guarantee him the presidency for life. His excuse was that the country could ill afford the chaos of democracy. It had to defend itself against North Korean communism, the people were told, and accelerate its economic development. His proclaimed goal of raising the country’s per capita income to 1,000 US dollars by 1981 was considered overly ambitious, bordering on delusional. (Chang, Ha-Joon. Bad Samaritans (p. 6). Bloomsbury Publishing. Kindle Edition.)

President Park launched the ambitious Heavy and Chemical Industrialization (HCI) programme in 1973. The first steel mill and the first modern shipyard went into production, and the first locally designed cars (made mostly from imported parts) rolled off the production lines. New firms were set up in electronics, machinery, chemicals and other advanced industries. During this period, the country’s per capita income grew phenomenally by more than five times, in US dollar terms, between 1972 and 1979. Park’s apparently delusional goal of $1,000 per capita income by 1981 was actually achieved four years ahead of schedule. Exports grew even faster, increasing nine times, in US dollar terms, between 1972 and 1979. (Chang, Ha-Joon. Bad Samaritans (p. 7). Bloomsbury Publishing. Kindle Edition.)

The country’s obsession with economic development was fully reflected in our education. We learned that it was our patriotic duty to report anyone seen smoking foreign cigarettes. The country needed to use every bit of the foreign exchange earned from its exports in order to import machines and other inputs to develop better industries. Valuable foreign currencies were really the blood and sweat of our ‘industrial soldiers’ fighting the export war in the country’s factories. Those squandering them on frivolous things, like illegal foreign cigarettes, were ‘traitors’. I don’t believe any of my friends actually went as far as reporting such ‘acts of treason’. But it did feed the gossip mill when kids saw foreign cigarettes in a friend’s house. The friend’s father – it was almost invariably men who smoked – would be darkly commented on as an unpatriotic and therefore immoral, if not exactly criminal, individual. (Chang, Ha-Joon. Bad Samaritans (pp. 7-8). Bloomsbury Publishing. Kindle Edition.)

Spending foreign exchange on anything not essential for industrial development was prohibited or strongly discouraged through import bans, high tariffs and excise taxes (which were called luxury consumption taxes). ‘Luxury’ items included even relatively simple things, like small cars, whisky or cookies. I remember the minor national euphoria when a consignment of Danish cookies was imported under special government permission in the late 1970s. For the same reason, foreign travel was banned unless you had explicit government permission to do business or study abroad. As a result, despite having quite a few relatives living in the US, I had never been outside Korea until I travelled to Cambridge at the age of 23 to start as a graduate student there in 1986. (Chang, Ha-Joon. Bad Samaritans (p. 8). Bloomsbury Publishing. Kindle Edition.)

(….) Korea’s economic ‘miracle’ was not, of course, without its dark sides. Many girls from poor families in the countryside were forced to find a job as soon as they left primary school at the age of 12 – to ‘get rid of an extra mouth’ and to earn money so that at least one brother could receive higher education. Many ended up as housemaids in urban middle-class families, working for room and board and, if they were lucky, a tiny amount of pocket money. The other girls, and the less fortunate boys, were exploited in factories where conditions were reminiscent of 19th-century ‘dark satanic mills’ or today’s sweatshops in China. In the textile and garment industries, which were the main export industries, workers often worked 12 hours or more in very hazardous and unhealthy conditions for low pay. Some factories refused to serve soup in the canteen, lest the workers should require an extra toilet break that might wipe out their wafer-thin profit margins. Conditions were better in the newly emerging heavy industries – cars, steel, chemicals, machinery and so on – but, overall, Korean workers, with their average 53–4 hour working week, put in longer hours than just about anyone else in the world at the time. (Chang, Ha-Joon. Bad Samaritans (pp. 9-10). Bloomsbury Publishing. Kindle Edition.)

Urban slums emerged. Because they were usually up in the low mountains that comprise a great deal of the Korean landscape, they were nicknamed ‘Moon Neighbourhoods’, after a popular TV sitcom series of the 1970s. Families of five or six would be squashed into a tiny room and hundreds of people would share one toilet and a single standpipe for running water. Many of these slums would ultimately be cleared forcefully by the police and the residents dumped in far-flung neighbourhoods, with even worse sanitation and poorer road access, to make way for new apartment blocks for the ever-growing middle class. If the poor could not get out of the new slums fast enough (though getting out of the slums was at least possible, given the rapid growth of the economy and the creation of new jobs), the urban sprawl would catch up with them and see them rounded up once again and dumped in an even more remote place. Some people ended up scavenging in the city’s main rubbish dump, Nanji Island. Few people outside Korea were aware that the beautiful public parks surrounding the impressive Seoul Football Stadium they saw during the 2002 World Cup were built literally on top of the old rubbish dump on the island (which nowadays has an ultra-modern eco-friendly methane-burning power station, which taps into the organic material dumped there). (Chang, Ha-Joon. Bad Samaritans (p. 10). Bloomsbury Publishing. Kindle Edition.)

Mathematics as Ornament

One of the founders of neoclassical economics, William Stanley Jevons, thought that economics should be a mathematical science, and this is why, even today, most neoclassical economists use a large amount of mathematics in their work. The father of rational expectations theory, Robert Lucas, claimed, in a lecture at Trinity University in 2001, that: “Economic theory is mathematical analysis. Everything else is just talk and pictures”. Motivated, among other things, by these positions, British philosopher of science Donald Gillies wrote an interesting article on the comparison of the use of mathematics in physics and in neoclassical economics. (Sylos Labini, Francesco. Science and the Economic Crisis (Kindle Locations 1788-1793). Springer International Publishing. Kindle Edition.)

Gillies first recalled that physicists have learned to consider each theory critically, within the precise limits dictated by the assumptions used and by the experiments available. Since the times of Galileo and Newton, physicists have therefore learned not to confuse what is happening in the model with what is happening in reality. Physical models are compared with observations to test whether they are able to provide precise explanations: an example of this type is the precession of the perihelion of Mercury, which we discussed in the previous chapter. Alternatively, theoretical models can provide successful predictions: for example, in 1887 Hertz generated the electromagnetic waves postulated by Maxwell in 1873. The question is therefore: can one argue that the use of mathematics in neoclassical economics serves similar purposes? If not, this usage is reduced to a mere rhetorical exercise, which employs an ostentatiously refined tool to calculate irrelevant quantities with precision. Gillies’s conclusion is that, while in physics mathematics was used to obtain precise explanations and successful predictions, one cannot draw the same conclusion about the use of mathematics in neoclassical economics over the last half century. This analysis reinforces the conclusion about the pseudo-scientific nature of neoclassical economics that we reached previously, given the systematic failure of the predictions of neoclassical economists. (Sylos Labini, Francesco. Science and the Economic Crisis (Kindle Locations 1794-1804). Springer International Publishing. Kindle Edition.)

To show this, Gillies examined the best-known works by a selection of the most famous neoclassical economists (Paul A. Samuelson, Kenneth J. Arrow, Gerard Debreu and Edward C. Prescott) in the period from 1945 to the present. Samuelson’s most famous work is one of the classics of mathematical economics, “Foundations of Economic Analysis”. Gillies notes that Samuelson, in his book of over 400 pages full of mathematical formulas, does not derive a single result that can be compared with observed data. There is not even a mention of any empirical data in Samuelson’s book! (Sylos Labini, Francesco. Science and the Economic Crisis (Kindle Locations 1804-1808). Springer International Publishing. Kindle Edition.)

As for the seminal work of Kenneth Arrow and Gerard Debreu, published in 1954 and previously discussed, Gillies highlights that the general equilibrium models considered by the authors rest on such simplistic assumptions about reality that they cannot be compared with observed data. In fact, like Samuelson, they do not derive any result that can be compared with empirical data, which are indeed absent from their work. (Sylos Labini, Francesco. Science and the Economic Crisis (Kindle Locations 1809-1812). Springer International Publishing. Kindle Edition.)

Finally, Gillies considers the article by Edward C. Prescott, “The Equity Premium: A Puzzle”, written in collaboration with Rajnish Mehra. In this article, the authors try to compare the Arrow-Debreu general equilibrium model with data obtained from a real economy, namely the US economy in the period 1889–1978. In this case, there is no agreement between the theoretical results and the empirical data. In conclusion, neoclassical economics, unlike physics, has achieved neither precise explanations nor successful predictions through the use of mathematics. This is the main difference between neoclassical economics and physics. (Sylos Labini, Francesco. Science and the Economic Crisis (Kindle Locations 1812-1817). Springer International Publishing. Kindle Edition.)

Change is Coming

For more than a decade now, dissatisfaction with the state of economics as a discipline has been growing within its ranks. Much of it has been driven by students and young people who are increasingly aware of the many limitations of what they are being taught at universities across the world, and much more willing to challenge existing dogmas and power structures.

This book is the outcome of a collective effort by such young people, to identify more precisely the source of their unhappiness with the current state of economics and, even more importantly, to highlight how this state of affairs can be changed.

It highlights a wide range of problems within the profession including a lack of diversity and inclusion; harmful hierarchies between countries; a dominant paradigm that fails to address structural inequalities, whitewashes histories of oppression, and undermines democracy and development; and incentive structures that punish economists who seek to venture beyond this paradigm. By presenting these concerns in clear-eyed and courageous ways, it also provides much hope for the future of economics.

We know that much of this dominant paradigm in economics is simply wrong and is being continuously exposed as being wrong: from being over-optimistic about how financial markets work and whether they are or can be ‘efficient’ without regulation, to misplaced arguments in favour of fiscal austerity or the deregulation of labour markets and wages. Critical relationships between humans and nature that form the basis of most material production are dismissed as ‘externalities’. These are only some of the ways in which mainstream economic thinking is either irrelevant or downright misleading in understanding contemporary economic processes and useless or counterproductive in addressing humanity’s most important challenges.

One reason is that much of the mainstream discipline has been in the service of power, effectively the power of the wealthy, at national and international levels. By ‘assuming away’ critical concerns, theoretical results and problematic empirical analyses effectively reinforce existing power structures and imbalances.

Deeper systemic issues – the exploitation of labour by capital and the unsustainable exploitation of nature by forms of economic activity, labour market segmentation by social categories that allows for differential exploitation of different types of workers, the appropriation of value, the abuse of market power and rent-seeking behaviour by large capital, the use of political power to push economic interests, including those of cronies, and the distributive impact of fiscal and monetary policies – are all swept aside, covered up and rarely brought out as the focus of analysis.

This is associated with strict power hierarchies within the discipline as well, which suppress the emergence and spread of alternative theories, explanations and analysis. Economic models that do not challenge existing power structures are promoted and valorised by gatekeepers in the senior ranks of the profession. Alternative theories and analyses are ignored, marginalised, rarely published in the ‘top’ journals, and obliterated from textbooks and other teaching materials. (Ambler, Lucy; Earle, Joe; Scott, Nicola. Reclaiming economics for future generations (Manchester Capitalism) (pp. 16-17). Manchester University Press. Kindle Edition.)

The disincentives for young economists to stray from the straight and narrow path are huge: academic jobs and other placements as economists are dependent on publications, which are ‘ranked’ according to the supposed quality of the journal they are in, in a system that demotes articles from alternative perspectives; promotions and further success in the profession depend on these markers. (Ambler, Lucy; Earle, Joe; Scott, Nicola. Reclaiming economics for future generations (Manchester Capitalism) (pp. 17-18). Manchester University Press. Kindle Edition.)

This combines with the other pervasive forms of social discrimination by gender, racialised identity and location. A macho ethos permeates the mainstream discipline, with women routinely facing the consequences. Along with widespread patriarchy, the adverse impact of relational power affects other socially marginalised categories, according to class, racial and ethnic identities, and language. The impact of location is enormous, with the mainstream discipline completely dominated by the North Atlantic in terms of prestige, influence, and the ability to determine the content and direction of what is globally accepted. The enormous knowledge, insights and contributions to economic analysis made by economists located in the Global South in Asia, the Middle East, Africa, Latin America and the Caribbean are largely ignored. (Ambler, Lucy; Earle, Joe; Scott, Nicola. Reclaiming economics for future generations (Manchester Capitalism) (p. 18). Manchester University Press. Kindle Edition.)

Then there is disciplinary arrogance, expressed in insufficient attention to history and a reluctance to engage seriously with other social sciences and humanities, which has greatly impoverished economics. Arrogance is also evident in the tendency of economists to play God, to engage in social engineering, couched in technocratic terms which are incomprehensible to the majority of people who are told that particular economic strategies are the only possible choice, in an attitude that collapses into the unethical. (Ambler, Lucy; Earle, Joe; Scott, Nicola. Reclaiming economics for future generations (Manchester Capitalism) (p. 18). Manchester University Press. Kindle Edition.)

Fortunately, there is growing pushback against these tendencies, globally and within the current bastions of economics in the North Atlantic. This book is very much part of that response: challenging the rigidities and power structures within the mainstream discipline, and calling for a more varied, sophisticated, nuanced and relevant understanding of economies. This is, of course, greatly welcome; it is also hugely necessary and urgent, if economics is to reclaim its position as a relevant social science that had origins in both moral philosophy and statecraft. (Ambler, Lucy; Earle, Joe; Scott, Nicola. Reclaiming economics for future generations (Manchester Capitalism) (pp. 18-19). Manchester University Press. Kindle Edition.)

Jayati Ghosh
Professor of Economics,
University of Massachusetts Amherst, USA;
formerly Professor of Economics,
Jawaharlal Nehru University, New Delhi, India

Bismarck on Rice

Late-Victorian economic doctrine answered the need for an intellectual response to the workers’ challenge, to trade unions, to socialism, to the land reform movement, and to Social Democracy. Liberal economists upheld the existing property order and its inequalities. In Western Europe, North America and Australasia, Social Democracy eventually prevailed over fascism and communism, established welfare states, safeguarded the structures of capitalism, and dominated policy during the first three post-war decades. It sustained economic growth and distributed it more equally. To do this, it had to challenge the assumptions of neoclassical economics, and sometimes to reject them. (Offer, Avner; Söderberg, Gabriel. The Nobel Factor (p. 6). Princeton University Press. Kindle Edition.)

In contrast to the competitive free-for-all of orthodox economics, Social Democratic parties in post-war Europe (and in the English-speaking countries) defined a cluster of collective aspirations:

•  Collective insurance against life-cycle periods of dependency, regulated and administered by government and paid for through progressive taxation.
•  Good-quality affordable housing, by means of rent control, new construction, mortgage subsidies, and public or collective ownership.
•  Secondary and higher education, land use planning, scientific research, culture, sports, roads and railways.
•  A mixed economy with extensive public services, some nationalized firms, but leaving private ownership to manage production and distribution.
•  A special concern for disadvantaged groups.13

The United States also went along with a good deal of this programme, and if it failed to provide universal healthcare entitlement, it did provide one for the old and the indigent.

(Offer, Avner; Söderberg, Gabriel. The Nobel Factor (pp. 6-7). Princeton University Press. Kindle Edition.)

Japan: Bismarck on Rice

DURING A DEBATE AMONG THE PRESIDENTIAL CANDIDATES in the spring of 2008, former New York mayor Rudy Giuliani offered a picture of health care in foreign countries: “These countries that say they provide universal coverage—they pay a price for it, you know,” Giuliani explained. “They do it by rationing care, by long waiting lines, and by limiting, or I should say by eliminating, a patient’s choice.” Judging from that, it seems safe to say that Rudy Giuliani has never visited Dr. Nakamichi Noriaki at the Orthopedic Surgery Department of Keio Daigaku Hospital in Tokyo. (Reid, T. R.. The Healing of America (p. 82). Penguin Publishing Group. Kindle Edition.)

In a society that is acutely conscious of hierarchy and rank, Dr. Nakamichi is generally recognized as one of the top orthopedic surgeons in all Japan; his clinic at Keio is perhaps the most respected place in the country for the repair of stiff, aching shoulders like mine. I was first told about him one Thursday morning in Tokyo when I was complaining, as usual, about my shoulder. I called his office to schedule an appointment—and was told to come in that same afternoon. After the familiar poking, patting, massage, and manipulation, Dr. Nakamichi suggested an assortment of different treatments that might work for me; in fact, it was the widest variety of care any doctor had proposed. The treatment available in Japan ranges from acupuncture to injections to manipulation to the total shoulder arthroplasty that my doctor back home had recommended. All the options, he told me, are covered by Japanese health insurance. When I asked how long I would have to wait if I chose the full-scale shoulder-replacement surgery, the doctor checked his computer. “Tomorrow would be a little difficult,” he said. “But next week would probably work.” (Reid, T. R.. The Healing of America (pp. 82-83). Penguin Publishing Group. Kindle Edition.)

In other words: no waiting, no gatekeeper, no rationing, and a broad array of patient choice. Prices are low; as we’ll see, the Japanese system has a rigid cost-control mechanism that favors the patient, at the expense of doctors and hospitals. My out-of-pocket cost for an office visit with the prestigious Dr. Nakamichi in his prestigious clinic came to ¥2,060, or $19 (the doctor charged $64, and insurance pays 70 percent of the bill in Japan). (Reid, T. R.. The Healing of America (p. 83). Penguin Publishing Group. Kindle Edition.)
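
The arithmetic behind that out-of-pocket figure is easy to check. A minimal sketch follows; the $64 charge and the 70 percent insurance share come from the passage, while the implied yen/dollar exchange rate is a back-of-envelope figure derived from its numbers, not taken from the book:

```python
# Japanese insurance pays 70% of the bill; the patient pays the remaining 30%.
doctor_charge_usd = 64       # total fee quoted in the passage
patient_share = 0.30         # patient's coinsurance share

out_of_pocket_usd = doctor_charge_usd * patient_share
print(f"Patient pays ${out_of_pocket_usd:.2f}")   # ~$19, matching the passage

# The passage quotes the same copay as ¥2,060, which implies the exchange
# rate at which the book's dollar figures were converted:
out_of_pocket_jpy = 2060
implied_rate = out_of_pocket_jpy / out_of_pocket_usd
print(f"Implied exchange rate: ~{implied_rate:.0f} yen per dollar")
```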

(….) It’s worth noting that this happens in a largely private-sector system; Japan relies on private doctors and hospitals, with the bills paid by insurance plans. In fact, Japanese doctors are the most capitalist, and most competitive, that I’ve seen anywhere in the world. (Reid, T. R.. The Healing of America (p. 83). Penguin Publishing Group. Kindle Edition.)

(….) Since medical care is so readily available, so easy to get, and so cheap, you might think that the Japanese use an awful lot of medical care. And you’d be right. The Japanese are the world’s most prodigious consumers of health care.1 The average Japanese visits a doctor about 14.5 times per year—three times as often as the U.S. average, and twice as often as any nation in Europe. If you can’t get to the doctor, no problem: Nearly all general practitioners in Japan make house calls, either daily or weekly. (Reid, T. R.. The Healing of America (p. 84). Penguin Publishing Group. Kindle Edition.)

(….) “Japan’s macro health indices, such as healthy life expectancy and infant mortality, are the best, or among the best, in the world,” says Professor Ikegami Naoki, the country’s best-known health care economist. “Now, that’s not all the result of health care. Japan has lower rates of violent crime than most countries, less illicit drug use, fewer traffic accidents, lower rates of HIV infection, less obesity. In terms of keeping people alive and healthy, those factors obviously help. But you also have to give some credit to the health care system for providing universal coverage and egalitarian access without long waiting lists, and we need to credit the doctors and the medical schools for providing a high quality of treatment.” (Reid, T. R.. The Healing of America (p. 85). Penguin Publishing Group. Kindle Edition.)

The Japanese system, in short, provides care to every resident of Japan, for minimal fees, with no waiting lists—and excellent results. This is a good deal for the people of Japan, and they take advantage of it, flocking to clinics and hospitals. To an American, it seems natural that this formula—heavy demand by an aging population, with almost no rationing of care—would add up to a huge national medical bill. But when it comes to costs, Japan has turned the predictable formula upside down. Despite universal coverage and prodigious consumption, Japan spends a lot less for health care than most of the developed nations; with costs running at about 8 percent of GDP, it spends about half as much as the United States. (Reid, T. R.. The Healing of America (p. 85). Penguin Publishing Group. Kindle Edition.)

(….) As we’ll see shortly, not everybody in Japan is happy with the system and its strict cost controls, because the system squeezes cost by sharply limiting the income of medical providers—doctors, nurses, hospitals, labs, drug makers. But if your goal is to provide quality care for everybody at a reasonable cost (which is not a bad goal for any health system), then the Japanese model could be a good one to follow. (Reid, T. R.. The Healing of America (p. 86). Penguin Publishing Group. Kindle Edition.)

Dr Pangloss’s Economism

For a little over a century, a mere blink of the eye in human history, western and westernized leaders, politicians, policymakers, and the public have operated on the belief that there can be a scientific discipline of economics, a field of study separate from moral philosophy and the natural sciences. Never mind that economics coevolved with a political discourse driven by power. Economics seemingly explains how society should be organized and how people should live. The modern economic world arose around ideas generated by economists, and this world has been supported by corresponding public economistic beliefs that I refer to as “economism”.

(Fullbrook, Edward; Morgan, Jamie. Post-Neoliberal Economics (p. 97). World Economics Association Books. Kindle Edition.)

It is proved that things cannot be other than they are, for since everything was made for a purpose, it follows that everything is made for the best purpose.

—Pangloss, in Voltaire’s Candide, 1759


This invocation of basic economics lessons to explain all social phenomena is economism.* It rests on the premise that people, companies, and markets behave according to the abstract, two-dimensional illustrations of an Economics 101 textbook, even though the assumptions behind those diagrams virtually never hold true in the real world. Economism is an interpretive lens through which people make sense of reality. Like any such framework, it also implies a certain set of value judgments and policy choices. For example, if a simple supply-and-demand model shows that taxes reduce employment, then it follows that high tax rates are bad and should be lowered. Because it claims the authority of “economics,” economism can be a powerful rhetorical tool. And while superficial economic arguments can serve multiple purposes, in today’s world they most often justify the existing social order—and the inequality that it generates—while explaining the futility of any attempt to change it. (Kwak, James. Economism (pp. 6-7). Knopf Doubleday Publishing Group. Kindle Edition.)
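
The Econ 101 logic that economism leans on can be made concrete with a toy model. In the hypothetical linear market below (all numbers are invented for illustration), levying a per-unit tax on sellers shifts the effective supply curve and reduces the equilibrium quantity traded, which is exactly the kind of result that gets read as "taxes reduce employment":

```python
def equilibrium(a, b, c, d, tax=0.0):
    """Competitive equilibrium for linear demand Qd = a - b*p and
    linear supply Qs = c + d*(p - tax), with the tax levied on sellers."""
    price = (a - c + d * tax) / (b + d)   # set Qd = Qs and solve for p
    quantity = a - b * price
    return price, quantity

# Hypothetical numbers: Qd = 100 - 2p, Qs = -20 + 4(p - tax)
p0, q0 = equilibrium(100, 2, -20, 4, tax=0.0)
p1, q1 = equilibrium(100, 2, -20, 4, tax=6.0)
print(f"No tax: p={p0:.2f}, q={q0:.2f}")   # p=20.00, q=60.00
print(f"Tax=6:  p={p1:.2f}, q={q1:.2f}")   # p=24.00, q=52.00 — quantity falls
```

The code encodes the textbook model, not the world: the prediction that quantity (or employment) must fall follows from the model's assumptions, which is precisely the point the passage goes on to question.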

For every well-intentioned proposal to help ordinary working people, economism provides an answer. Raise the minimum wage so the working poor take home more money? That’s a nice idea, but that’s not how the world works. According to Jude Wanniski, one of the pillars of The Wall Street Journal’s editorial page in the 1970s, “Every increase in the minimum wage induces a decline in real output and a decline in employment.” Wanniski was an adviser to Ronald Reagan, who echoed, “The minimum wage has caused more misery and unemployment than anything since the Great Depression.” Raise taxes on the rich to pay for services for everyone else? Good try, but, Gregory Mankiw (author of one of the world’s most popular economics textbooks) explains, “as [high-income taxpayers] face higher tax rates, their services will be in shorter supply.” Or, in the words of the 2012 vice presidential candidate Paul Ryan, “if you want faster economic growth, more upward mobility, and faster job creation, lower tax rates across the board is the key.”16 The examples go on and on. The problems of financial markets, health care, education, and many other fields can all be reduced to economic first principles that dictate simple solutions. (Kwak, James. Economism (pp. 7-8). Knopf Doubleday Publishing Group. Kindle Edition.)

These claims are made so often in the media and by politicians that they appear to be a natural feature of the landscape. But they all come from somewhere. They are based on a lesson that economics students learn in their first semester: the model of a competitive market driven by supply and demand. In this model, the supply and demand for any product determine its price; prices create incentives for individuals and businesses; and those incentives ensure that consumers get what they want, companies are as efficient as possible, and resources are allocated optimally across the economy. As the pathbreaking economist Paul Samuelson wrote in 1948, this basic lesson is “all that some of our leading citizens remember, 30 years later, of their college course in economics.”17 (Samuelson was well aware of the power of introductory courses: “I don’t care who writes a nation’s laws—or crafts its advanced treatises,” he once said, “if I can write its economics textbooks.”18) (Kwak, James. Economism (p. 8). Knopf Doubleday Publishing Group. Kindle Edition.)

This elegant model, however, rests on a set of highly unrealistic assumptions. The definition of a competitive market requires that all suppliers offer the same product—there are no differences in features, quality, or anything else—and that each company is so small that its behavior has no effect on overall supply. If this assumption does not hold—such as in the market for cell phone service, or air travel, or automobiles, or books, or almost anything—then supply and demand do not necessarily produce the optimal price, and the allocation of resources may be distorted.19 The argument that a minimum wage increases unemployment assumes that employees are currently being paid the entire value of their work; otherwise, employers would be willing to pay slightly higher wages in order to keep them. Again, this premise is unlikely to be true in the real world of fast-food restaurants or hotels, where workers have little bargaining power and companies are therefore able to claim most of the value that their employees create. (Kwak, James. Economism (pp. 8-9). Knopf Doubleday Publishing Group. Kindle Edition.)
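
The bargaining-power point can be put in numbers (all figures here are hypothetical, for illustration only): if workers are paid less than the value they create, an employer can absorb a minimum-wage increase without cutting jobs, whereas the competitive model, which assumes the wage already equals that value, predicts layoffs from any binding wage floor:

```python
value_per_hour = 15.00   # hypothetical value an employee creates per hour
current_wage = 10.00     # hypothetical wage where workers lack bargaining power
minimum_wage = 12.00     # hypothetical new wage floor

def keeps_job(wage_floor, worker_value):
    # An employer retains the worker as long as the mandated wage
    # does not exceed the value the worker creates.
    return wage_floor <= worker_value

# Workers underpaid relative to their value: the raise is absorbed.
print(keeps_job(minimum_wage, value_per_hour))   # True

# The competitive model instead assumes current_wage == value_per_hour,
# so the same wage floor would exceed the worker's value and cost the job.
print(keeps_job(minimum_wage, current_wage))     # False
```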

Economism ignores these uncooperative facts and assumes the necessary assumptions, reducing all real-world questions to simple models and answering them in the same terms. In this sense, economism is like an ideology. Communism explained industrial society as the product of class struggle, with the inevitable outcome of proletarian revolution. Nationalism, the other great European ideology of the nineteenth century, saw rivalry between groups of people with a common background as the motor of history. Its lesson was that each nation should achieve political unity to promote its interests in the world. (Kwak, James. Economism (p. 9). Knopf Doubleday Publishing Group. Kindle Edition.)

“Economism” is a somewhat obscure academic term, generally used to criticize someone for overvaluing economics—by overestimating the importance of material conditions, focusing exclusively on economic metrics, applying economic methodologies when they are inappropriate, or accepting economic theory too readily.14 In this book, I use “economism” in a more specific sense, as the belief that a few isolated Economics 101 lessons accurately describe the real world. The economist Noah Smith calls this phenomenon “101ism.”15 (Kwak, James. Economism (p. 17). Knopf Doubleday Publishing Group. Kindle Edition.)