Continued from part 1.
The story in its popular outline is well known.
Facing persistent stagflation, the new chair of the Federal Reserve, Paul Volcker, set out to cure the American economy with a course of shock therapy, rapidly hiking interest rates. From 1979 to 1983 the effective federal funds rate scarcely dipped below 10 percent and at one point nearly touched 20 percent. Rising from the ashes of Volcker’s manufactured recession, Ronald Reagan’s program of tax cuts, deregulation, and increased military spending powered the American economy through the decade, culminating in the capitulation of its exhausted Cold War rival as the country rode triumphantly into the 1990s and the end of history.
Except, of course, that history did not end. And for all the general story gets right, it glosses over much. For example, the late 1980s and early 1990s also featured multiple recessions and crises that usually go unmentioned. Further, the solutions arrived at to maintain the dollar’s hegemony in the post–Bretton Woods period contained the ingredients of the eventual global financial crisis, which, when it came in 2007–08, triggered a global recession and political and geopolitical upheavals that have yet to be decisively settled. For all their significance, it may seem strange to look to the seemingly esoteric history of money and banking in the United States for insight into why events transpired as they did. But once it had been accepted in the 1970s that currency values would henceforth be determined by market forces, the logic of the proposition generated its own slew of corollaries, liberalized capital controls and domestic banking among them.
This was happening, furthermore, at a time of broader social, political, and economic change. And while these dynamics often interact in ways that make it difficult if not impossible to discern the extent to which any or all are involved in driving a given event, in this case several things can be said with certainty about the emergence of a broad consensus behind the program of tax reforms, tax cuts, deregulation, cuts to welfare, and the standardization of global trade that came to define the period 1976–2007.
First, by the 1980s inefficiency at the level of the firm had for almost a decade been diagnosed as a critical problem in the face of onrushing imports, and a new school of antitrust thought centered on the University of Chicago was increasingly successful in arguing that the overactive antitrust enforcement of the 1960s needed to be relaxed so that internal economies of scale could be allowed to develop. With the focus of antitrust enforcement increasingly on consumer prices, a series of merger waves took place in the 1980s, 1990s, and 2000s.
Next, the early stages of this process coincided with the need to deregulate the US banking sector following the monetary instability and inflation in the years following the post–Bretton Woods breakdown. We will see how a series of bills in the 1980s and 1990s gradually tore down all the ossified government regulations that had so recently imperiled the industry, and how these translated into changes in the wider economy. In terms of understanding the general context of the period, however, what needs noting here is that the increased ceding of power over the value of currencies to international credit markets necessitated the freeing up and empowering of the US banking industry. The US banking sector became key to upholding global dollarization, with Wall Street acting to vacuum up any surplus dollars abroad, thereby allowing Washington to maintain its twin fiscal and trade deficits.
Lastly, though Jimmy Carter’s initial experiments in deregulation and tax cuts were targeted, they were sweepingly implemented by his successors on both sides of the aisle and the Atlantic. Margaret Thatcher was surely not joking when she replied that her greatest accomplishment was the emergence of the much-moderated New Labour, under Tony Blair, in the mid-1990s. In the United States, too, the success of these policies enabled Bill Clinton and the Southern-led Democratic Leadership Council to seize hold of the party as it struggled to move beyond the identity politics of the late 1960s and ’70s. Indeed, so powerful was their example, and so complete their embrace by the transatlantic community, that these policies became the basis of International Monetary Fund (IMF) and World Bank prescriptions: the Washington Consensus.
Indeed, out of the initial chaos of the 1970s and early 1980s emerged the so-called Great Moderation, a period of controlled inflation, low unemployment, and steady economic growth—all based on these policies. As before, this relatively tidy narrative elides the heated debates of the time, protests from both the right and left, the piecemeal nature of the project, as well as the tenuous foundations upon which the apparent economic success of the period depended.
It is appropriate, if coincidental, then, that the beginning of Jimmy Carter’s presidency marks the halfway point of the period 1945–2007 that bookends our study, for though his initial thrusts in the direction of the free market were pragmatic rather than programmatic or ideological, they laid important groundwork. Elected in 1976, Carter would, before his single term was out, deregulate the airline, trucking, and rail industries and enact corporate tax reforms. He also oversaw the first of the three major deregulatory banking measures of the period.
The Depository Institutions Deregulation and Monetary Control Act of 1980 helped stabilize the situation of banks at the same time as it enhanced the Fed’s ability to dictate monetary policy. First, the act phased out the interest rate ceilings that had seen banks losing billions in deposits to unregulated institutions like money market mutual funds; second, the law mandated that all deposit-taking institutions follow Fed rules, thus widening the ranks of those required to hold dollar securities as reserves.
While Reagan’s presidency saw no legislation regarding money and banking per se, the 1985 Plaza Accord, by engineering a depreciation of the dollar against the yen, eased the US balance of trade with Japan, though it also set the conditions for the later crash that stagnated Japan’s economy for over a decade. Reagan likewise picked up where Carter had left off in reimagining the macro- and microeconomic landscape. Together with the rise of the modern multinational corporation and its intricate supply chains, US policy tilted toward openness at the Cold War’s end, and in trade and fiscal policy George H.W. Bush and Clinton followed suit. From lower taxes and smaller welfare states to fewer regulations and their standardization via free trade agreements, whatever one wishes to call the program, the animation of the world economy by newly freed global capital flows made for a tumultuous ride to the end of the century and beyond.
With the 1994 Interstate Banking and Branching Efficiency Act, US banks quickly grew into large networks that soon came to include overseas offices. By the end of the century increasing liberalization and digitization had precipitated a doubling of cross-border global capital flows. Wall Street’s relationship with corporate and political America deepened over this time as well: junk bonds and overnight money markets made possible a much leaner corporate existence than before, and Washington became ever more dependent on the US financial sector. We have already seen how Wall Street financial engineering became central to attracting surplus dollars from abroad, allowing Washington to maintain its twin fiscal and trade deficits; but this, along with accommodative Fed policy, also allowed for the proliferation of the consumer credit needed to offset the stagnant and declining wages paid to American workers since the mid-1970s, when the transition from factories to finance, documented by Judith Stein, began.
With the Latin American, Russian, and East Asian currency crises of the decade driving ever more investors to US equities and securities as a port of choice in an otherwise stormy international investment scene, it was with this increasing demand in mind that the Financial Services Modernization Act was passed in 1999. With the removal of this last remnant of the New Deal–era banking regulations, institutions of all kinds were soon trading masses of opaque securities and derivatives in secondary markets. The digitization of finance increasingly meant such trading was done by computer algorithms at lightning speed, and the growing centrality of finance to American corporate business meant companies as far afield as AIG and General Motors became involved. Over the same period the Federal Reserve held interest rates low, creating the cheap-money conditions for asset bubbles in tech and housing to form. Moral hazard also entered the equation. Apart from the implicit government guarantee of Fannie Mae’s and Freddie Mac’s mortgage books, the Fed and Treasury had proven willing to rescue any beleaguered institution, facilitating multiple takeovers and rescues. It was that same overactivity which drove Fed chair Alan Greenspan’s decisions to cut rates multiple times during the 1990s, despite clear signs of an overheated economy, in response to what he perceived as potentially threatening exogenous events, such as the alarm surrounding the Y2K scare.
Therefore, while there were certainly bad actors in the runup to the 2007–08 financial crisis, from captured regulators and compromised ratings agencies to unscrupulous lenders and mortgage originators who sold impossible-to-pay variable-rate mortgages to uninformed buyers only to package them up and pass them off in exchange for commissions, at bottom the conditions that made all this possible were created by the policies and precedents of the administrations leading up to the crisis. It is inconceivable that the major actors involved would ever have behaved so recklessly had they not believed there was a net, and in the end the decision they had counted on came through: the banks were not allowed to fail.
“Too big to fail” was how advocates of the Troubled Asset Relief Program (TARP) and the Emergency Economic Stabilization Act of 2008 sold this naked overturning of the most basic capitalist orthodoxy. Apart from whatever role campaign finance or the well-being of their own portfolios played in the votes members of Congress cast, within the intellectual establishment these initiatives reflected the belief among policy makers at the Fed and Treasury, led at that point by Ben Bernanke and Timothy Geithner, that the Great Depression had dragged on so long for lack of government intervention. Ignoring that Fed policy played a key role in creating the conditions of the 1920s that led to the crash, in the 1929 crash itself, and in the later double-dip recession of 1937, they posited that the failure of the Fed and Treasury to combine and step in early to save the US banking system was the primary reason the Depression was as prolonged and severe as it was.
Whatever one makes of such counterfactual assertions, one must appreciate the extent to which this determination to avoid the apparent mistakes of their predecessors motivated thinking during the late George W. Bush and Obama years. The resolve to preserve something of a functional financial system would see the Fed wildly exceed its mandate. With the US economy inextricable from the wider world, and the dollar facilitating much of the trade and commerce circulating through it, the already dubious program of quantitative easing was augmented by an unbelievable backstopping of foreign banks by the Fed.
This, of course, all took place out of public view, as did assurances from the Chinese that they would not dump their enormous dollar holdings, despite a Russian invitation to do so. Thus, just as it had survived calls for its replacement in the early 1970s, the dollar did so again. Though it is still only muddling through, there is no serious alternative on the immediate horizon, the renminbi included. Indeed, apart from the increasing radicality of central banking norms, there has been no discernible shift in the domestic or international monetary order in the years since 2007–08. Nor, for all the pomp of its unveiling, did the Dodd-Frank Act do much to change banking in the United States.
In short, amid the social, political, and economic turmoil of the postcrisis years, brought about in no small part by the changes in money and banking in the United States since 1971, it is telling of where real power lies that it is precisely these institutions and arrangements which have survived intact—at least so far.
Whether the entire interdependent global system will be able to bear the continued accumulation of public and private debt, particularly in the face of the growing need to tighten monetary policy, remains to be seen.