“The Classical Theory has been accustomed to rest the supposedly self-adjusting character of the economic system on an assumed fluidity of money-wages; and, when there is rigidity, to lay on this rigidity the blame of maladjustment” – Keynes
“[A]rticles in the Keynesian tradition suggest that unemployment arises from nonrational expectations or wage and price rigidities” – New Keynesian Economics: Coordination Failures and Real Rigidities, ed. Mankiw and Romer
Intro
As we start Book 5, we’re getting close to the end! The approach to model-building has already been laid out, and Keynes has also put forward his accounts of how employment, interest rates and money are determined within the economy. The rest of the book is either mopping up objections and precursors, or figuring out what this account of the economy might entail politically or philosophically.
Chapter 19 is a rarity, in that it more or less explains why "New Keynesian" models sit entirely outside of Keynes' work. Because it is so rare to get something so perfectly tailored to future obscure arguments in the original source material, I'm going to let this month's Keynes chapter be a bit of a complaint against a particular family of models that are, unjustifiably, trading on Keynes' name.
New Keynesians and their canonical models frequently come under fire from more heterodox thinkers, but that fire has proven fairly easy to ignore. I don’t want to do a screed here, because those screeds often fail to get at what is actually unique about the New Keynesian approach, as well as the institutional factors that make it durable. Things are rarely as simple as “person X believed the wrong thing at time T, and now they will forever keep things Bad,” and economics is no exception. The thing is, to supplant the NK body of knowledge – something that the effective management of a modern macroeconomy almost certainly requires – we have to first know all the different functions it fulfills for folks even tangentially related to it.
The argument of chapter 19 is extremely simple: making wages and prices perfectly flexible would not, in an economy well-described by Keynes' theory, generally prevent recessions or increase employment. This is particularly embarrassing for what is today called "New Keynesian" economics, given that the specific premise underlying their approach to theory construction, the one they see as linking their work to Keynes', is: "Keynes discovered that involuntary unemployment can result from insufficiently flexible prices and wages." So how did this…happen?
What Was NK Econ?
One of the things that makes it difficult to get into Keynes is the fact that there are about a dozen incompatible schools of economic thought which all claim to be “Keynesian” in some way. Even worse is the fact that “Keynesian” appears to have unique definitions within sections of the left-wing AND right-wing literature that either don’t have to do with economics or don’t directly map onto any of the other contingents called Keynesians.
Pedagogically, this is an absolute disaster. Hopefully the chart below helps a bit, but in making it, I started to realize just what an absurd task it is to try to figure out where all Keynes’ work went and was taken up, so apologies for its unfinished character.
Hopefully that helps; feel free to sound off in the comments about anything wrong with it.
Today, we're mostly going to be talking about the New Keynesians, who are an especially slippery bunch from a sociology of knowledge perspective. The term is a bit like "neoliberalism": it gained wide currency in the 1990s, became something people noticed in order to criticize starting in the 2000s, was blamed for the poor response to the Great Recession in the 2010s, and now nobody seems sure what it is, or whether it ever captured a group of people who made sense as a thought movement rather than a sociocultural clique.
The fact that the same characters keep recurring throughout the legacy of things called “Keynesianism” is not particularly helpful. Paul Samuelson, Kenneth Arrow, Robert Solow and other “American Keynesians” continued publishing papers well into the 1990s and 2000s. Solow still publishes op-eds occasionally, if I remember right.
It leads to an easy genealogical explanation that runs, in essence: Keynes wrote the General Theory in the 1930s; the "neoclassical synthesis" reintegrated marginalist models of the labor market and pricing behavior in the 1950s and 60s; the "New Classical" approach predominated in the 70s and 80s despite rearguard actions by people like James Tobin and Alan Blinder; in the 90s, "New Keynesians" attempted a second neoclassical synthesis, trying to reconcile the policy goals that the original synthesis had found in the General Theory with the theory-structure of the "New Classical" approach; and today, we are all assumed to still be using their models. I think this is probably right genealogically, and it explains well how little of Keynes is present in the New Keynesian literature, but it's also unsatisfying.
Part of the reason they are so hard to identify is that it is often hard to identify the content of "The New Keynesian Model." Most economics textbooks present a version of the New Keynesian model as essentially "the" economic model, as opposed to a specifically New Keynesian one. It is essentially the familiar midcentury Walrasian general equilibrium model: you have markets for goods and markets for factors; bargaining sets wages and interest rates set the cost of capital; Pareto Efficiency is achieved because prices are informed by the cost of production and the structure of consumer preferences; competition is broadly perfect, though monopolistic competition can be accommodated easily in an ad hoc way. In an attempt to answer the Lucas critique, these models are most often formulated so that aggregate behavior is the simple sum of individual rational/maximizing/optimizing behaviors.
When these models are scaled up for the purposes of prediction or modelling, they are usually implemented under an approach given the name “Dynamic Stochastic General Equilibrium,” which has become something of a slur within heterodox economics communities. These really aren’t so different from many other modelling approaches. They’re all just equations after all.
In the words of Kocherlakota:
“Dynamic refers to the forward-looking behavior of households and firms. Stochastic refers to the inclusion of shocks. General refers to the inclusion of the entire economy. Finally, equilibrium refers to the inclusion of explicit constraints and objectives for the households and firms.”
All of this is pretty unobjectionable! That is, until you get to the ways in which all of these things are included and modeled.
If we fight through chapter 3 of Jordi Galí's Monetary Policy, Inflation, and the Business Cycle: An Introduction to the New Keynesian Framework, we find that the benchmark New Keynesian model formalizes these intuitions in around thirty equations. In them, we find three major groups of actors: households, firms and the government. The households are treated as a single unit, and seek to maximize their lifetime utility subject to a budget constraint given by the prices of goods and the number of hours they choose to work. Firms all produce different goods, but use the same techniques of production and the same capital goods to do it. They maximize profits by setting price as a markup over marginal cost, but only get to change prices every now and again (the famous Calvo pricing). The government sets the interest rate based on a rule that responds to inflation and the output gap, and the actual level of that interest rate is solved simultaneously with the output gap and measured inflation to pin down all three variables. The function one uses to model the determination of the interest rate is usually where the model closes, and so it matters more for the model as a whole than any other policy variable, if any others are included at all.
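Collapsed all the way down, those thirty-odd equations are usually summarized by a three-equation log-linearized core (the notation below follows Galí's general presentation; details vary from paper to paper):

$$\tilde{y}_t = \mathbb{E}_t[\tilde{y}_{t+1}] - \tfrac{1}{\sigma}\left(i_t - \mathbb{E}_t[\pi_{t+1}] - r^n_t\right) \quad \text{(dynamic IS curve)}$$

$$\pi_t = \beta\,\mathbb{E}_t[\pi_{t+1}] + \kappa\,\tilde{y}_t \quad \text{(New Keynesian Phillips curve)}$$

$$i_t = \rho + \phi_\pi \pi_t + \phi_y \tilde{y}_t + v_t \quad \text{(interest-rate rule)}$$

Here $\tilde{y}_t$ is the output gap, $\pi_t$ inflation, $i_t$ the policy rate, $r^n_t$ the natural rate, and $v_t$ a monetary policy shock; $\kappa$ bundles the Calvo pricing friction together with household preferences.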
Everything is very endogenous, so everything is very sensitive to parameter selection. This allows models to be very precise, but also very fragile: fragile to differences in parameter values, and fragile to the causal structure of the parameters and functions being different in real life than it is in the model. You can get very exact values out of a recursive structure like this, but errors also propagate very quickly.
Once that structure is built, you can just shock different variables or parameters to see how the system of equations responds, and based on that response, you make assertions about how changing some policy variable will change economic outcomes in real life.
In the main, this kind of work looks like this:
I leave it to the reader to determine the relevance of work like this for policy, especially when applied to approaches other than exhortation.
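For the curious, the mechanics fit in a few lines. Below is a minimal sketch in Python of the monetary-policy-shock experiment in the three-equation core above, using the closed-form solution the method of undetermined coefficients gives when the shock decays geometrically. The parameter values are illustrative textbook-style numbers, and the function name is mine, not anything standard:

```python
# A minimal sketch of the "shock it and see how the system responds" exercise.
# Solves the three-equation core for a monetary policy shock v_t that decays
# at rate rho_v, via the method of undetermined coefficients (guess that the
# output gap and inflation are proportional to v_t and solve for the loadings).
import numpy as np

def irf(beta=0.99, sigma=1.0, kappa=0.13, phi_pi=1.5, phi_y=0.125,
        rho_v=0.5, horizon=12, shock=0.25):
    """Impulse responses to a one-time monetary policy shock."""
    Lam = 1.0 / ((1 - beta * rho_v) * (sigma * (1 - rho_v) + phi_y)
                 + kappa * (phi_pi - rho_v))
    v = shock * rho_v ** np.arange(horizon)   # shock path, decaying at rho_v
    y_gap = -(1 - beta * rho_v) * Lam * v     # output gap falls on impact
    pi = -kappa * Lam * v                     # inflation falls with it
    i = phi_pi * pi + phi_y * y_gap + v       # policy rate, in deviations
    return y_gap, pi, i

y_gap, pi, i = irf()
print("output gap:", np.round(y_gap[:4], 4))
print("inflation: ", np.round(pi[:4], 4))

# The fragility complained about above is easy to exhibit: nudge one deep
# parameter and the entire response path rescales.
y_gap2, _, _ = irf(kappa=0.20)
print("output gap, kappa=0.20:", np.round(y_gap2[:4], 4))
```

Every impulse-response chart in a DSGE paper is, roughly speaking, a descendant of this loop, just with more equations and a numerical solver in place of the closed form.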
When versions of this model are put together for academic papers, the general format is to take the benchmark model above, and figure out one particular friction or stylized fact you would like to add in. You run some tests for stability, and then add a handful of equations to cover something like “wage share has been decreasing relative to profit share” or “covered interest parity has broken down” or “where are all the workers.” Once that’s done, you calibrate the parameter values in such a way that the model output gives you backtests that look basically like past values of the indicators you have constructed the system around.
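That last calibration step is worth making concrete. Here is a stylized sketch, reusing the closed-form model from the sketch above and a purely hypothetical "observed" inflation path. Real exercises match many more moments and series, but the logic is the same: turn the dials until the backtest looks like the data.

```python
# A stylized sketch of the calibration step: choose the parameter value that
# makes the model's inflation response look like an "observed" one.
# The observed path here is invented for illustration.
import numpy as np
from scipy.optimize import minimize_scalar

observed_pi = np.array([-0.030, -0.015, -0.0075, -0.00375])  # hypothetical data

def model_pi(kappa, beta=0.99, sigma=1.0, phi_pi=1.5, phi_y=0.125,
             rho_v=0.5, horizon=4, shock=0.25):
    """Model-implied inflation response, as in the sketch above."""
    Lam = 1.0 / ((1 - beta * rho_v) * (sigma * (1 - rho_v) + phi_y)
                 + kappa * (phi_pi - rho_v))
    v = shock * rho_v ** np.arange(horizon)
    return -kappa * Lam * v

def loss(kappa):
    return np.sum((model_pi(kappa) - observed_pi) ** 2)

fit = minimize_scalar(loss, bounds=(0.01, 1.0), method="bounded")
print("calibrated kappa:", round(fit.x, 4))
```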
Now, to my knowledge, there is no grand project to figure out how all of these individual restrictions relate to each other, so it is not exactly clear how all of these benchmark-plus-one-stylized-fact models add together. I'm not sure the parameters would remain stable if you tried to take, say, the ten or twenty most-cited models and combine their additional stylized facts. They might! But figuring out how things translate from the academic literature to something like the Penn Wharton Budget Model or FRB/US is a bit beyond our scope here.
The biggest problem here is that many of the dynamics underlying the benchmark model have become completely stale. It is not immediately obvious that interest rates are a big dial that one can turn in different ways in order to achieve determinate ends at a microeconomic level. Similarly, it is not immediately obvious that there is a clear enough relationship between inflation and unemployment to justify feeding all reasoning about the two through a Phillips Curve-type structure.
Worst of all, these are mathematical models of the behavior of statistical aggregates that are extremely far removed from the conditions that produced them, or which cause changes in them. The guarantee that movements in one aggregate lead to movements in another usually takes the form of an assumption that the relevant agents are looking to optimize or maximize. How that assumption is kept free from overdetermination by aspects of the econometric apparatus is not exactly clear. Overall, we have grown up enough, and our informational and computational environments have grown up enough, that it is time to start thinking bigger than NK models.
The pedagogy that has grown up around this approach to model-building has, I think, been one of the worst impacts of NK theorizing. Teaching students how to construct the particular mathematical undergirding of these forecasts is not really a good strategy for giving working economists facility with the wide range of economic activity we now have reliable data for. There is very little method: no structural parts of the theory are really up for grabs, so figuring out how to build theory-structures isn't central to the approach. Your best bet is usually to look through the news for something that's going on and then figure out how to express it within the New Keynesian mathematical formalisms. This is, for obvious reasons, a uniquely bad approach when there are substantial and specific shifts in the world that need to be analyzed.
Climate change is probably the simplest place to see this in action. From an economic perspective, the core problem in climate change is that we have some capital goods and some consumption goods whose production and use create carbon emissions beyond those consistent with the maintenance of the climate to which we have adapted our capital and consumption goods. The solution then is to stop using certain capital goods, and start using other, particular, capital and consumption goods instead.
The distinctively New Keynesian approach to model-building can be seen in Bill Nordhaus’ DICE model. Since all capital goods are homogeneous, and all consumption goods are rolled into a homogeneous output, it is not clear where one could intervene to change particular capital and consumption goods. Instead, emissions are posited as an “externality” to the market system, which must be measured and priced. If they are, then rational/optimizing/maximizing behavior will ensure that businesses eventually shift to less carbon-intensive methods of production automagically. This is tantamount to an admission that New Keynesian economics has nothing to offer to the project of figuring out how actually to reorganize the economy in a way that avoids or ameliorates climate change. The most that can be done is change some prices.
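To see how little leverage that leaves, here is a minimal sketch of the price-only lever, with a DICE-flavored abatement cost curve. The functional form and the numbers are stylized assumptions of mine, not Nordhaus' actual calibration:

```python
# A minimal sketch of "the most that can be done is change some prices":
# a representative firm with homogeneous output picks an abatement share
# mu in [0, 1] to minimize abatement costs plus carbon-tax payments on
# residual emissions. Cost curve is DICE-flavored (theta1 * mu**theta2);
# all numbers are stylized, not Nordhaus' calibration.

def optimal_abatement(tau, emis_intensity=0.35, theta1=0.07, theta2=2.6):
    """Abatement share chosen at carbon price tau, per unit of output.
    First-order condition: theta1*theta2*mu**(theta2-1) = tau*emis_intensity."""
    mu = (tau * emis_intensity / (theta1 * theta2)) ** (1.0 / (theta2 - 1.0))
    return min(mu, 1.0)

for tau in [0.0, 0.05, 0.10, 0.25, 0.50]:
    print(f"carbon price {tau:.2f} -> abatement share {optimal_abatement(tau):.2f}")

# Note what is NOT here: which capital goods get retired, which get built,
# and who does the retooling. The only handle the model offers is tau.
```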
But New Keynesian theorizing is more than just a pretext for proposing and calibrating DSGE models. It has a distinct worldview that feels driven by its intellectual context and methods, but which is meliorative in a distinctly uninspiring way. It works well enough when there aren’t major problems, but is hardly a battle cry people can take with them. I’ll try to give a sense of the gestalt here:
Markets are basically efficient at getting the most goods and services to the most people who value them most, but sometimes the real world throws sand in the gears. That sand can take many forms: "nominal rigidities" (like Calvo/menu cost models), informational asymmetries (like Akerlof's Lemons or the dastardly workers demanding Efficiency Wages), unpriced externalities (whether Coasean Bargains, Market Failure, or Climate Change), or something new that's just now being reasoned about (I don't know, since Trombley is here now, let's say Alchian's "ship the good apples out" model). This sand in the gears prevents the market from being truly efficient, and makes it so that unemployment or other maladies crop up from time to time. The main way to fix problems is by coming up with a better specification of the Fed's reaction function.
The political flavor of the belief system this worldview encourages is also extremely distasteful to Americans’ sensibilities: because we have not yet perfected the market, the government has to step in and do regulations and welfare. It’s not hard to see how “well, fuck it, let’s just use that effort on Perfecting the Market instead” was a far more appealing answer to most workaday Americans in the 1990s and 2000s. It would seem to produce the same outcomes for less hassle. It’s not that it’s a good answer, it’s just that the NK position pretty much concedes the substance of the New Classical position, while simply deploring that the world had to be made insufficiently perfect for markets.
Some Sociological Thoughts on New Keynesian Economics
To be clear, I don’t think anyone involved in this is a bad person or a bad thinker on their own terms. I think there is a sociological problem with the institutional structures that have cropped up in the context of trying to figure out how to study the economy.
At the end of the day, I find it’s easiest to think of the NK position as a defensive intellectual posture mostly taken up by the folks responsible for figuring out how to run the Federal Reserve during and after Reaganism. The administrative approach to model design and paper output makes the most sense in this view: publication is a reasonably straightforward path to advancement, and so nobody’s paper should call for any radical revisions.
The problem is, this administrative structure introduces a kind of model drift. If the goal is to produce advances that are identifiable, but not large enough so as to make other practitioners have to revise their approach, it is unsurprising that folks would lose sight of the fact that they are working in a determinate tradition that arose for determinate reasons at a determinate point. A quick skim through nearly any Economics textbook will find a version of the New Keynesian model presented as simply, The Economy After Some Sensible Abstractions.
Now, to be fair, they were also operating in a discipline where a decades-long red scare had weeded out a lot of the more ambitious thinkers, so it is not really their fault that they arrived at this system. The problem is, outside of the conditions that led to the development of NK econ, it’s not clear why this would be a good way to study the economy. It’s not a problem that they came up with these ideas—more ideas are never a problem. The problem is in continuing to institutionally enforce the ideas once they’ve outlived even their strategic usefulness internal to the community of people who study the economy (to say nothing of their accuracy or policy adequacy).
It also seemed to work pretty well for a while, at least by metrics internal to the relevant theory or institutions. NK Econ brought us the "Greenspan Put" and the "Great Moderation," as well as some beautifully intricate proposals for modelling reaction functions the Fed could use if it wanted. Sadly, the theory wound up having a hard time with an environment where fiscal spending was needed, and became markedly less useful after 2008.
The problem is, there wasn't (and still isn't) anything more coherent to replace it as the hegemonic "Standard Economics," much less anything better. The main alternatives today are crazed goldbug-crypto-Austrian Economics types and Marxos with legitimate gripes but little ability to think at the level of scale required to engage a modern economy. Real Business Cycle work isn't more coherent than any of this, and the models used in finance are extraordinarily narrow, despite being more frequently accurate. I think the broad post-Keynesian program could prove a useful way to integrate most contemporary best practices, but the pedagogical equipment simply isn't there to run it at scale.
Usually arguments against this kind of status quo receive the answer “it takes a model to beat a model.” This is necessary, but not sufficient. It takes institutions to displace institutions, and it takes alternative sources of legitimacy to displace existing sources of legitimacy. The NK edifice took decades of dedicated work by thousands of people to arrive in the shape it’s in now. I don’t have a comparable administrative-intellectual edifice to slot in place.
The epistemic situation now, with dramatically more people paying attention to the economy, is a bit like a civil war with so many factions that they have run out of colors to differentiate themselves. Everyone is picking up bits of different theories and using them to whack one another over the head. It's not quite every person for themselves, but the battlespace doesn't make much more sense than if it were. Everybody's taking wild swings, no one's quite sure at whom, and legitimizing institutions are few and far between.
With all of this laid out, the issue here should be clear to longtime readers of this substack: none of this has anything to do with Keynes or the General Theory at all. What Keynes would have to say about it may well be worse than anything I’ve said above.