Why I could be a much better economist than I am


Most economists would not admit in public that they are anything less than virtuosos at what they do. But I am not "most economists". So I'll admit it: in many ways I am not that great of an economist. The part that gives me trouble is not the math - there's nothing in most econ papers that I didn't do as a sophomore in undergrad. The part that gives me trouble is the intuition.

Applied math disciplines, like physics and economics, are about 30% math skill and 70% intuition. History bears this out. Einstein, generally considered the greatest physicist of the 20th century, was no great mathematician; he needed help from Poincare, Hilbert, Minkowski, and other pure math people to work out some of the trickier aspects of relativity. But where Poincare, for example, failed to integrate the math of special relativity (which he invented) into the rest of physics theory, Einstein's unmatched intuition allowed him to do this.

Physics was always my best academic subject by far, because I had that knack for intuition. Friends of mine who were absolutely brilliant at math (including one who published math papers in high school and now teaches at Harvard) would try taking physics classes with me, and I would end up helping them, because for some reason I could "think like a physicist" in a way that allowed me to circumvent the need for tedious calculations. For example, what's the force on a charged particle near a charged sheet? If you start from the math, you integrate the force from all elements of the charged sheet; it takes a few minutes of your time. But if you understand intuitively how electric fields work, you can use something called Gauss' Law to find the answer instantly; no time-wasting integral needed. If you can switch rapidly back and forth between this "symmetry" approach and the standard "furious calculation" approach as needed, you'll do well on physics tests and in theory research too.
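For the record, here's what the two approaches look like side by side for the simplest version of that problem (an infinite sheet with uniform surface charge density σ). This is standard textbook material, written out only to show how much work the symmetry argument saves:

```latex
% Brute force: integrate the field from concentric rings of the sheet,
% for a test charge q at distance z above it:
E_z = \int_0^\infty \frac{\sigma}{4\pi\varepsilon_0}\,
      \frac{z\,(2\pi r)\,dr}{(r^2+z^2)^{3/2}}
    = \frac{\sigma}{2\varepsilon_0}.
% Gauss' Law: a pillbox straddling the sheet has flux 2EA out of its
% two faces and encloses charge \sigma A, so immediately
2EA = \frac{\sigma A}{\varepsilon_0}
\quad\Rightarrow\quad
E = \frac{\sigma}{2\varepsilon_0},
\qquad F = qE = \frac{q\sigma}{2\varepsilon_0}.
```

The integral takes a few minutes; the pillbox argument is one line.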

Anyway, to make a long story short, this approach never worked as well for me in economics. There are some things I seem to have good intuition for, like Bellman equations and search models. There are some things I didn't have any intuition for when I started, but got more with practice (game theory). And then there are some things that have stubbornly resisted my attempts to understand them instinctively. Much of macroeconomics falls under this heading. Reading most macro papers, I can follow the math, but I have no idea why the modeler chose to do things the way they did, and I wouldn't have been able to invent the same model if someone had told me to model the same phenomenon. This is true not only for original, breakthrough stuff, but for pedestrian models that are small tweaks on existing models.

I've wondered why this is, and I've concluded that it's mostly because there are few bedrock principles I can go back to. Example: I was sitting around with an economist friend and spinning a story about why financial market failures might make people stop working during recessions. He furrowed his brow and asked me "But how can that make people work more during booms?" To which I had no answer. It hadn't occurred to me, until that moment, that increased labor supply (or increased capital usage) was necessary for a boom. That's a simple concept, and it was built into all the models I had studied and worked through in class, but the fact of it had eluded me!

So of course I answered "Well, maybe there are no such things as 'booms', maybe what we see as 'booms' are just a normal-functioning economy, and any downward deviation in employment or growth is just a market failure." And of course, it's possible to build models like that, and maybe it's even true! But still, without understanding and accepting the idea that labor supply exceeds its sustainable long-term level during a "boom," it's hard to understand most of the popular business cycle models on a deep, intuitive level, because most of them attempt to explain the "phenomenon" of fluctuations of output and unemployment around a trend.

Another example is labor market clearing. I never really managed to pound into my head the notion that labor markets clear (because hey, unemployment exists!). And of course there are models in which labor markets don't clear, but these models are generally built by starting with a model in which labor markets clear, and then adding a friction (e.g. sticky wages) to stop them from clearing. I would have had a hard time making such models. And, perhaps of more immediate importance, I sometimes forgot the labor market clearing condition on tests! I'd be sitting there thinking "How do I close this system of equations?" for 5 or 10 minutes before I slapped my head and thought "Duh, labor market clearing!" This is in contrast to physics, where I'd whip out Gauss' Law pretty much instantly. This is the kind of dumbness that mathematical sophistication can't prevent; you can't solve a system of equations until you know which equations to write down!
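For what it's worth, the condition I kept forgetting is a single line. In a stripped-down one-period model (generic utility function U and production function F, purely for illustration), the system is closed by:

```latex
% Household optimality: marginal rate of substitution equals the wage
\frac{U_\ell(C,\,1-N^s)}{U_c(C,\,1-N^s)} = w
% Firm optimality: wage equals the marginal product of labor
w = F_N(K,\,N^d)
% Labor market clearing -- the equation that closes the system:
N^s = N^d
```

Taking C and K as given, that's three equations in the three unknowns N^s, N^d, and w; forget the last one and the system won't close.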

(Side note: Another problem I have when making macro models is that I can't really decide which stylized facts I want to try to explain, mainly because I don't really believe in time-series econometrics, and hence I don't believe in most stylized facts. This is a somewhat different issue, though.)

Now, anyway, what I could do would be to say "Actually, it's not my fault, it's the discipline's fault. The economics Overmind has pounded a bunch of notions into people's heads that aren't really true; the only reason good macroeconomists can make their models so easily is that they share the same bunch of wrong assumptions." That would fit with the irreverent, pugnacious tone of this blog. But it isn't really true. A good economist - especially a macroeconomist - can master a form of doublethink. "I know this assumption isn't true," (s)he thinks, "but to make this model I need to act like I believe it's true." This is a trick I haven't really mastered, and it probably has more to do with my lack of an undergrad economics education than with the wholesale brainwashing of the entire econ profession.

In physics, you don't have to use doublethink, because the laws and principles you use really are true; even if you neglect friction, for example, you've probably got a good description of the motion of a hockey puck or a satellite. Econ is more about building toy worlds that don't exist - can't exist - anywhere, just to clarify your thinking. This is a mental skill that is probably best learned at age 18, and I was 26 when I started learning econ.

So, for example, I wouldn't easily be able to write anything like Larry Summers' critique of RBC models. Summers has the intuition to understand quickly, but at a very deep level, how these toy economies work and what makes them tick. I don't. And so my critique of RBC would be something more along the lines of "What are you even talking about?!" Which makes for a cathartic blog rant rather than a thoughtful academic rebuttal.

Anyway, fortunately for me, I discovered (late in the game) that I have a much better intuition for finance theory (and, increasingly, for game theory!) than for macro. You don't always have to use doublethink. The CAPM might be an excellent description of risk under certain conditions. For worlds where only a certain kind of frequent, routine change occurs, Black-Scholes is probably a great model of option value.  And the decision problems of single agents trying to maximize simple utility functions in complex environments are a lot easier for my brain to work with. So I think I made the right move, in terms of transitioning toward a field where the mental techniques I learned in my undergrad physics classes have a better chance of succeeding.

But I still wish I was better at doing the other kind of economics - the doublethink, the useful oversimplification, the internally consistent fantasy storytelling. Well, I guess everyone needs goals to work toward...

Creeds, screeds, mockeries, Thackerays...

In my last post, I snarked that "I also think, incidentally, that maybe [libertarians] should reconsider the perfection and rightness of their ideology."

To which one of my commenters responded with an off-the-cuff masterpiece of sheer lulzy brilliance. From commenter JohnR:
Oh, c'mon now; no religion ever wants to be the first one to do that. What we'll see first is a schism, where the Galtians decide that the Roarkians are virulent heretics deserving of the harshest punishment ("She's a witch!") and they both agree that Ron Paul is an apostate if not a vile blasphemer. Soon the fun will start and there will be sects, sub-sects, deviant sects, consensual sects, chemically-enhanced sects, incense-free sects, creeds, screeds, mockeries, Thackerays, Inquisitions, Crusades, Reformations, Restorations, Abominations and Free Love. Only then will the One True and Holy Church of Rand be fully established as the single pure essence of Libertarianism, with the ruling that the other 7,843 "Libertarian" churches are merely cults to be stamped out with an iron boot.
Wow. All I can say is, when this happens, I hope my house is well-stocked with popcorn, because it's going to be fun to watch.

Giant Thursday Roundup (4/26/2012)



Work is getting more demanding again, as I teach myself how to teach finance classes. Thursday Roundup will have to take a break in the month of May. But in the meantime...

1. Simon Wren-Lewis has a great piece about modern macro methods and publication bias, and by "great" I naturally mean "agrees with stuff I've said in the past"...check out this money quote he pulls from John Muellbauer:

While DSGE models are useful research tools for developing analytical insights, the highly simplified assumptions needed to obtain tractable general equilibrium solutions often undermine their usefulness. As we have seen, the data violate key assumptions made in these models, and the match to institutional realities, at both micro and macro levels, is often very poor.
I feel like I've been saying this for quite some time...good to see I'm not alone...

2. Go read this great piece in The Atlantic on why we need more high-skilled immigration, and we need it NOW! If you still are afraid that high-skilled immigrants will TAKE YER JERBS, do a big and immediate rethink! High-skilled immigrants will CREATE YER JERB.

3. JW Mason argues against Roger Farmer's assertion (on this blog) that disequilibrium dynamics should be ignored.

4. Tyler Cowen has more on Baby Boomer retirement and the labor force participation rate. Still doesn't seem to explain the bulk of the current unemployment, but could make a difference as the economy recovers.

5. John Cochrane explains how a run on a money market fund works. A money market fund is not really "money"!

6. Mark Thoma and Tim Taylor point out something incredibly important that few people realize: U.S. government purchases have been going down, down, down. What has been going up are transfer payments.

7. Matt Yglesias attempts a rebuttal of my grandadvisor Greg Mankiw on the notion that rich people move to flee high taxes.

8. Paul Krugman argues that no, printing money doesn't distribute money away from the middle class.

9. Krugman also referees an Yglesias/Avent debate on the Zero Lower Bound, and concludes that the ZLB is not just psychological.

10. The Economist magazine has a neato series on the rise of 3D printing and what it means for manufacturing.

11. Tyler Cowen on Japan's cursed financial equilibrium. Someday it will end in a Japanese default. That day is likely to come within the decade. I may be the only person on the planet who thinks this will be a good day for the Japanese economy...

12. Frances Woolley does a great (and lengthy) Econ 101 explanation of why tax cuts are unlikely to increase government revenues. I expect every Republican and supply-sider out there to read it immediately, be convinced, and change their policy stance on this important issue.

13. Calculated Risk shows us the updated state of the housing bubble, in pictures. Noah summary: the bubble has entirely deflated, but "undershoot" may still push prices a bit lower before they bottom.

14. Joseph Stiglitz, Nobel Prize-winning economist and former physics major, blames the economics discipline for the global financial crisis.

15. Brad DeLong: What the world needs is for the "strong dollar" policy - in other words, the dollar's reserve currency status - to end, and end now.

16. Guess which sector has been responsible for the bulk of the job losses during this unusually weak recovery? Construction? No. Finance? Ha. It's government, yo. Paul Krugman has more. This of course reflects on the (lack of) wisdom of austerity. But it also makes me wonder if the real wages of government workers should be made more flexible...

17. Mike the Mad Biologist unleashes Philosophy of Science against Zombie Milton Friedman. Black box models are not enough!

18. Paul Krugman discusses how slowly "internal devaluation" takes effect. This is something I've argued with JW Mason about in the past. I still maintain that internal devaluation is better than nothing, especially when other countries peg their exchange rates to yours. But I of course agree that other countries not pegging their exchange rates to yours is the best solution, if you can make it happen.

19. Julian Sanchez argues that libertarians shouldn't use meritocracy as an argument for their ideology, and I agree. I also think, incidentally, that maybe they should reconsider the perfection and rightness of their ideology...

Neither Real, nor Business, nor Cycles


It has often been said of the Holy Roman Empire that it was "neither Holy, nor Roman, nor an Empire." However, that joke has gotten a bit stale since Voltaire wrote it in the 1700s, so I think it's time for a new one. Real Business Cycle models, it turns out, are neither Real, nor about Business, nor about Cycles.

They are, however, the macro models that annoy me far more than any other (and I'm not alone). I'll explain the joke in increasing order of the things that annoy me.

First, "Cycles". The "business cycles" in RBC models are not periodic, like cycles in physics. But they are also not "cycles" in the sense that a bust must follow a boom. Booms and busts are just random shocks. The "business cycle" that we think we see, according to these models, is simply a statistical illusion. (Actually, RBC shares this property with New Keynesian and Old Keynesian models alike. Very few people dare to write down a model in which knowing you're in a boom today allows you to predict a bust tomorrow!)

Next, "Business". Businesses are called "firms" in economic models. But if you look at the firms in an RBC model, you will see that they bear very little resemblance to real-life firms. For one thing, they make no profits; their revenues equal their costs. For another thing, they produce only one good. (Also, like firms in many economic models, they are all identical, they live forever, they make all their decisions to serve the interests of households, and they make all decisions perfectly. Etc. etc.) In other words, they display very few of the characteristics that real businesses display. This means that the "business cycle" in an RBC model is not really the result of any interesting characteristics of businesses; everything is due to the individual decisions of consumers and workers, and to the outside force of technological progress.

Finally, "Real". This is the one that really gets me. "Real" refers to the fact that the shocks in RBC models are "real" as opposed to "nominal" shocks (I've actually never liked this terminology, since it seems to subtly imply that money is neutral, which it isn't). But one would have to be a fool not to see the subtext in the use of the term - it implies that business-cycle theories based on demand shocks are not, in fact, real; that recessions and booms are obviously caused by supply shocks. If RBC is "real", then RBC's competitors - Keynesian models and the like - must be fantasy business cycle models.

However, it turns out that RBC and reality are not exactly drinking buddies. I hereby outsource the beatdown of the substance of RBC models to one of the greatest beatdown specialists in the history of economics: the formidable Larry Summers. In a 1986 essay only slightly less devastating than his legendary dismissal of the Winklevoss twins, Summers identified three main reasons why RBC models are not, in fact, real:

1. RBC models use parameter values that are almost certainly wrong,

2. RBC models make predictions about prices that are completely, utterly wrong, and

3. The "technology shocks" that RBC models assume drive the business cycle have never been found.

I encourage everyone to go read the whole thing. Pure and utter pulpification! Actually, this essay was assigned to me on the first day of my intro macro course, but at the time I wasn't able to appreciate it.

So Real Business Cycle models are neither Real, nor about Business, nor about Cycles. Are they models? Well, sadly, yes they are...of a sort. You actually can put today's data into an RBC model and get a prediction about future data. But see, here's the thing: that prediction will be entirely driven by the most ad-hoc, hard-to-swallow part of the model!

Basically, here's how an RBC model works. You take every factor of production you can measure - capital, labor, inventories, etc. - and you factor it out, and then you're left with the part of production you can't explain, which is called the "residual". You then label this residual "technology", and you assume that it moves according to some sort of simple stochastic process - for example, an AR(1). The rest of the model is just a description of the ways in which the rest of the economy responds to that AR(1) technology "process". In RBC models, this response is usually as simple and uninteresting as possible; pretty much everything is driven by the uber-simplistic movement of "technology".

In other words, if I want to make a forecast using an RBC model, that forecast will be based on an assumption about tomorrow's level of "technology" - i.e. the part of the model that doesn't come from data we can directly measure - and that level, in turn, will be "predicted" by nothing more than the simplest stochastic equation imaginable! As you might therefore expect, RBC models do not do so well at forecasting the future (though, to be fair, a few disagree with that assessment).

(Note that this makes RBC a glaring example of what I call "Label-the-Residual Economics", in which the economist assumes that the part of the world that we can't measure is the Mysterious Force that drives everything, but that we can accurately predict the future behavior of this Mysterious Force.)
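The residual-labeling procedure is mechanical enough to sketch in a few lines of code. Everything below is simulated and illustrative: the capital share of 0.33 and the AR(1) persistence are assumptions of the toy exercise, not estimates from real data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up data: detrended log capital and log labor for 200 periods.
T = 200
k = rng.normal(0, 0.05, T)   # log capital
n = rng.normal(0, 0.03, T)   # log labor
alpha = 0.33                 # assumed capital share (Cobb-Douglas)

# The "technology" that secretly drives everything in this toy world:
z_true = np.zeros(T)
for t in range(1, T):
    z_true[t] = 0.95 * z_true[t - 1] + rng.normal(0, 0.01)
y = z_true + alpha * k + (1 - alpha) * n   # log output

# Step 1: factor out the measured inputs; the leftover is the residual,
# which RBC practice labels "technology".
z_hat = y - alpha * k - (1 - alpha) * n

# Step 2: fit the simplest stochastic process imaginable, an AR(1),
# by regressing the residual on its own lag (no intercept).
rho = np.sum(z_hat[1:] * z_hat[:-1]) / np.sum(z_hat[:-1] ** 2)

# Step 3: the model's "forecast" of tomorrow is just rho times today.
z_forecast = rho * z_hat[-1]
print(f"estimated persistence rho = {rho:.3f}")
print(f"forecast of tomorrow's 'technology': {z_forecast:.4f}")
```

Note that the "forecast" in the last step uses nothing but the fitted persistence parameter; all the economic structure of an RBC model enters only through how the rest of the economy is assumed to respond to this process.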

Now, some work has been done on improving RBC models since their inception - instead of technology, for example, some modelers try to tie business cycles to news about technology. But most of the macro profession has moved on to other types of DSGE models, especially new Keynesian models. And yet the RBC paradigm persists, especially at certain universities. Why? In a recent blog post, Simon Wren-Lewis lays out the case that it's all about politics:

In RBC models, all changes in unemployment are voluntary. If unemployment is rising, it is because more workers are choosing leisure rather than work. As a result, high unemployment in a recession is not a problem at all...workers choose to work less and enjoy more free time... 
If anyone is reading this who is not familiar with macroeconomics, you might guess that this rather counterintuitive theory is some very marginal and long forgotten macroeconomic idea. 
You would be very wrong. RBC models were dominant in the 1980s, and many macroeconomists still model business cycles this way. I have even seen textbooks where the only account of the business cycle is a basic RBC model... 
One explanation [for RBC's popularity] is ideological. The commonsense view of the business cycle, and the need to in some sense smooth this cycle, is that it involves a market failure that requires the intervention of a state institution in some form. If your ideological view is to deny market failure where possible, and therefore minimise a role for the state, then it is natural enough (although hardly scientific) to ignore inconvenient facts. For the record I think those on the left are as capable of ignoring inconvenient facts: however there is not a left wing equivalent of RBC theory which plays a central role in mainstream macroeconomics.
Whether Wren-Lewis is right about this or not, I think the continued semi-popularity of RBC models definitely shifts the field of macro toward more politically conservative policy recommendations. It does this by shifting the "Overton Window". Without a strong RBC presence, macro might be primarily a debate between New Keynesians and Old Keynesians, or New Keynesians and complexity theorists, or New Keynesians and people who think we just don't know enough to really model the business cycle yet. In other words, it might be a debate between people who think that the economy can be managed effectively by central bank monetary policy, and people who think deeper government interventions are warranted. Instead, for the past two decades, academic macro has been primarily a debate between New Keynesians and RBC people - a debate between minimal-interventionists and those who oppose any sort of government intervention at all. In the "freshwater-saltwater" debate, supporters of things like fiscal stimulus were left high and dry.

At any rate, this bit about politics is a digression. The central point of this post is that Real Business Cycle models, whether politically motivated or not, are massively oversold as descriptions of the recessions and booms that we observe and live through. People should know that they contain neither Reality, nor Business, nor Cycles.

How long did it take the Holy Roman Empire to finally give up the ghost? Depressingly, it was more than half a century after Voltaire made his little joke...

Venture capital is sucking (your money)


Many of America's tech startups are funded by venture capital. But why on earth would any venture capitalist invest in a risky, unproven new company with a risky, unproven new technology? Answer: High risk goes hand in hand with high returns. Most of the new companies will fail, but the ones who succeed will be huge, more than making up for all the failures. 

Well, that's the theory anyway. If venture capital is taking all that risk and not making stellar returns, then something is severely broken.  

Friday's finance seminar here at UMich was by Steven Kaplan of the University of Chicago's Booth School of Business, who presented a paper he's writing with Robert Harris and Tim Jenkinson entitled "Private Equity: What Do We Know?". In this context, the term "private equity" refers both to buyout firms (i.e. what we normally call "private equity") and venture capital firms. The paper is all about measuring the returns that these industries have earned in recent decades.

Most of the talk focused on the buyout industry; only in the last few minutes did Kaplan actually get to talk about VC. But when he did, what I saw made my eyes bug out! Here is the key picture from the paper:

The different lines represent not different VC firms, but different data sources - each line is the average across all VC firms studied. On the x-axis we have "vintage" year, which is the year a firm started investing. On the y-axis we have the Private Market Equivalent ratio, which is a measure of how well funds did relative to the S&P 500 (a PME of greater than 1 means that a fund beat the S&P). Thus, if a line is at PME=2.5 at 1995, it means that, on average, VC firms that started investing in 1995 made 2.5 times the return of U.S. public stocks in general.
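For concreteness, here's roughly how a PME of the Kaplan-Schoar variety is computed. This is my reading of the measure, with entirely made-up cash flows; the paper's own implementation may differ in details:

```python
# Kaplan-Schoar Public Market Equivalent: value each fund cash flow as if
# it had been invested in the S&P 500 instead, then take the ratio.
# All numbers here are invented for illustration.

def pme(contributions, distributions, index_levels):
    """PME > 1 means the fund beat the index.

    contributions[t], distributions[t]: fund cash flows in period t.
    index_levels[t]: index level in period t (used for discounting).
    """
    final = index_levels[-1]
    # Grow each cash flow forward at the realized index return.
    fv_contrib = sum(c * final / index_levels[t]
                     for t, c in enumerate(contributions))
    fv_distrib = sum(d * final / index_levels[t]
                     for t, d in enumerate(distributions))
    return fv_distrib / fv_contrib

# Toy example: invest 100, the index doubles, the fund returns 180.
contributions = [100, 0, 0]
distributions = [0, 0, 180]
index_levels = [1000, 1500, 2000]
print(pme(contributions, distributions, index_levels))  # 0.9: trailed the index
```

In the toy example the fund earned 80% while the index earned 100%, so its PME comes out below 1, i.e. you'd have done better in the index.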

So what happened was that before the dot-com bubble burst in 2000, VCs did amazingly well. In the decade since, they've done slightly worse than the S&P 500 - in other words, they've done so poorly that you'd have been  better off buying Ye Olde Vanguard 500. And when you factor in risk, the comparison isn't even close; venture capital has a beta of well over 1, meaning that VCs are exposed to more aggregate risk than an index fund.

In other words, since the end of the dot-com bubble, venture capital has proven to be a sucker's bet. 

Now, you can respond by saying "OK, sure, but that's only one decade. What we really care about are longer-term returns." And maybe that's right. Maybe VC returns will eventually bounce back, justifying this long fallow period. BUT, whether poor VC performance in the 2000s was structural or random is something we won't be able to know for a long long time - not until we've gathered a statistically large sample of VC performance. By that time, you and I will probably be retired or dead. 

In the meantime, all we can do is guess. And dang it, but that graph sure looks like a structural break to me.  Something looks like it broke the VC business model after the dot-com crash. Maybe the new tech bubble (Facebook, etc.) will pump those returns back up, but there have been some big IPOs and some big acquisitions in Tech Bubble 2.0, and VC returns haven't really bounced back yet, so I'd be cautious. What's more, I'm starting to read about a slump in venture funding...

Could this be the (temporary) end of the VC industry? If so, is it a harbinger of technological stagnation, or simply the passing of a financial fad?

(Oh, one more thing. Kaplan et al. find that buyout firms, in contrast to VCs, have consistently beaten the market over the last three decades or so. Maybe that has something to do with that enormous tax break they get for buying up firms, loading the firms up with debt, and then paying themselves a dividend while leaving the firm to die...)

Update: An anonymous commenter suggests that a few VC firms manage to consistently beat the market. It turns out she's right (at least as of 2005). This paper, also by Steve Kaplan, shows that VC firm performance is persistent; those firms that make good returns in one period are likely to make good returns in the next period. VC firm performance also appears to be highly skewed - a few firms are making most of the money. Together, these two facts suggest that there are a few really skilled VCs out there who invest successfully year after year, and a large number of truly abysmal VC firms that drag the total return way down. This raises the question of who is throwing money at all these awful VC firms when there are proven winners out there! One possibility is that the "winner" firms limit the amount of capital they are willing to accept; they know that there is a limited number of good projects, and that if they get too big they won't be able to keep returns high. This means that if an investor wants to invest in VC, she may simply not be able to give her money to one of the "winner" firms. This explanation would be bad news for proponents of market efficiency, but would be consistent with the picture of overall stagnation in the VC-funded part of the tech industry, since the driver of low returns would still be the limited number of good new tech ventures.

Update 2: Peter Thiel explains some reasons why a lot of VC funds fail.

Update 3: More reasons the VC model is broken.

Update 4: Here's a Felix Salmon blog post saying essentially the same thing as this post, except with many more charts, graphs, and explanations.

Joel Kotkin lives in a Reaganite fantasy adventureland


I am in a bad mood, and like many bloggers, when I am in a bad mood I go looking for stupid things that people have said on the Internet, in order that I may slap these stupid things down like errant hockey pucks. Sometimes this gets me into trouble, when I mistakenly think something non-stupid is, in fact, stupid. Fortunately I have discovered a fool-proof solution to this problem - a bottomless gold mine of stupid, an immortal cyborg goose that forever pops out eggs of pure burnished 24-karat stupid. This, of course, is the Wall Street Journal editorial page. Thank you, Rupert Murdoch, for providing me with late-night cathartic therapy.

Today's golden egg of stupid comes courtesy of writer Allysia Finley, who to her credit has an awesome name, but to her opposite-of-credit (detriment? discredit?) has apparently attained her current job by mastering the cringe-inducing pastiche of late-70s-vintage conservative buzzwords, dog-whistles, and sheer poppycock that represents the overhead cost WSJ writers must pay in order to convince their readers that they are In The Tribe. Her article is actually an interview with Joel Kotkin, a rotund mustachioed man who seems to be quite a Respected Scholar in conservative circles, despite the fact that his official job seems to the untrained observer to be nothing more glamorous than "fellow at Chapman University and the Legatum Institute, a London-based think tank". According to Kotkin's Wikipedia page, he is famous for extolling the suburbs and denouncing rail transit. No wonder they love him.

In the interview with Allysia Finley (gosh I love that name, I am totally naming my son, er, daughter "Allysia"), Kotkin unleashes a long diatribe against the state of California. He first calls attention to what appears to be a massive exodus from the Golden State:
Nearly four million more people have left the Golden State in the last two decades than have come from other states. This is a sharp reversal from the 1980s...
Wow, California must be well on its way to being a ghost town. Oh wait, no. Because during those past two decades, immigration more than made up the difference. California's population has risen from 29.8 million to 37.3 million.

Now here, early on, comes the part where Kotkin says something smart. Stupidity is far more annoying when it is mixed with a dash of smart. Here he goes:
Part of California's dysfunction, he says, stems from state and local government restrictions on development. These policies have artificially limited housing supply and put a premium on real estate in coastal regions.
Yes! And doesn't this explain the internal migration away from California? As immigrants have poured in from Asia, Mexico, and elsewhere, California residents have been priced out of the real estate market.

So shouldn't we...um...lift those restrictions on development? NO, sayeth Kotkin:
And things will only get worse in the coming years as Democratic Gov. Jerry Brown and his green cadre implement their "smart growth" plans to cram the proletariat into high-density housing. "What I find reprehensible beyond belief is that the people pushing [high-density housing] themselves live in single-family homes and often drive very fancy cars, but want everyone else to live like my grandmother did in Brownsville in Brooklyn in the 1920s," Mr. Kotkin declares.
So people are moving away because housing is too expensive, but increasing housing density is a no-go because it's undignified? "Government development restrictions" are bad, but high-density housing - which is currently prevented by government development restrictions, as Matt Yglesias will tell you - is going to "cram the proletariat" into some kind of communist dachas?!

What alternative does Mr. Kotkin suggest? Building huge expanses of far-flung exurbs that let everyone have a McMansion and drive an hour and a half each way to work every day? Oh wait, they have tried that, it's called the Inland Empire. It's in California (Hello! Anybody home??). And it is being abandoned, as it proves to be economically non-viable. Across the country, in fact, Americans are fleeing the exurbs in record numbers and heading for higher-density urban areas.

So much for Joel Kotkin's proletarian sprawltopia.

Next, the author injects a little dash of concern-trolling:
Mr. Kotkin describes himself as an old-fashioned Truman Democrat. In fact, he voted for Mr. Brown[.]
Ah yes, so a laundry list of standard Republican talking points is really just hard-nosed centrist wisdom, because it comes from a "Truman Democrat" and alleged Jerry Brown voter! Riiiiiiight. Uh-huh.

After concern-trolling, Kotkin takes aim at California's push for green energy:

Notwithstanding all of the subsidies the state lavishes on renewables, green jobs only make up about 2% of California's private-sector work force—no more than they do in Texas. 
Of course, there are plenty of jobs to be had in energy, just not the type the new California regime wants. An estimated 25 billion barrels of oil are sitting untapped in the vast Monterey and Bakersfield shale deposits. "You see the great tragedy of California is that we have all this oil and gas, we won't use it," Mr. Kotkin says.
Why would shale jobs be a much bigger deal than solar and wind jobs? In fact, both are highly capital-intensive industries. But much of the research for renewable technology is done in California, while much of the research for fossil fuel technology is done in Texas. Kotkin's dream that "drill here, drill now" could save the California economy is nothing more than the exact same fantasy that has become the Republican Party's hallucinogen of choice since 2008.

(In fact, California would do a lot more business from legalized marijuana. Unsurprisingly, Kotkin fails to mention this.)

Next on the list of off-the-shelf Republican talking points is - unsurprisingly - taxes!
Meanwhile, taxes are harming the private economy. According to the Tax Foundation, California has the 48th-worst business tax climate. Its income tax is steeply progressive. Millionaires pay a top rate of 10.3%, the third-highest in the country. But middle-class workers—those who earn more than $48,000—pay a top rate of 9.3%, which is higher than what millionaires pay in 47 states.
Wow, Cali is blasting down the Road to Serfdom in a Tesla Roadster. Only...no. Because Kotkin fails to mention the reason state income taxes are that high. You see, since 1978 (when normal people might have actually used the word "proletarian" in casual conversation; I don't know, I wasn't born yet), California has had something called Proposition 13, which caps property taxes at very low levels. Property taxes (which are one of the most efficient forms of taxation) go to pay for things like schools and local government, so in the absence of property taxes, California has had to raise other taxes quite a lot. In fact, Cali's taxes are too low, not too high. They're just the wrong kind of taxes.

But that doesn't fit Joel Kotkin's Reagan-throwback narrative...

Next up, we have some pure unadulterated old-fashioned bullshit:
"[I]f you're a guy working for a Silicon Valley company and you're married and you're thinking about having your first kid, and your family makes 250-k a year, you can't buy a closet in the Bay Area," Mr. Kotkin says.
The median home price in the Bay Area is indeed high: about $358,000. For an income of $250,000, that gives a price-to-income ratio of about 1.4. This is lower than the average price-to-income ratio for housing in any major U.S. city, and far lower than the national average. In other words, while housing in the Bay Area is not cheap, if you have $250k you will easily be able to buy a house. I have friends and family in the Bay who make less than this and own very nice houses in Silicon Valley and North Berkeley. So Kotkin is just spewing forth falseness.
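The arithmetic here is simple enough to check directly (the $358,000 median price and $250,000 income are the figures above; the "3x income" affordability rule of thumb is my own assumption, not something from the article):

```python
# Back-of-the-envelope check on the Bay Area affordability claim.
median_price = 358_000   # median Bay Area home price (figure from the text)
income = 250_000         # Kotkin's hypothetical household income

ratio = median_price / income
print(f"price-to-income ratio: {ratio:.2f}")  # ~1.43

# A common lender rule of thumb (my assumption, not from the text) is that
# a household can afford a home costing roughly 3x its annual income.
affordable = median_price <= 3 * income
print("affordable under the 3x rule:", affordable)
```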

And it doesn't stop! This little jaw-dropper may be the most awe-inspiring of all:

As a result, California is turning into a two-and-a-half-class society. On top are the "entrenched incumbents" who inherited their wealth or came to California early and made their money... 
It's "a very scary political dynamic," he says. "One day somebody's going to put on the ballot, let's take every penny over $100,000 a year, and you'll get it through because there's no real restraint."
So California's decadent rich are intent upon having the state government take every penny of their riches, thus rendering themselves instantly non-rich? California is about to become a communist plutocracy?

Does Joel Kotkin even, you know, think before the words bounce trippingly forth from his tongue?

Now, to round out the whole spectacle, we get the obligatory hagiographies of Reagan, Texas, conservatism, the South, the other Red States, and fossil fuel industries:

California used to be more like Texas—a jobs magnet. What happened? For one, says the demographer, Californians are now voting more based on social issues and less on fiscal ones than they did when Ronald Reagan was governor 40 years ago... 
Mr. Kotkin lists four "growth corridors": the Gulf Coast, the Great Plains, the Intermountain West, and the Southeast. All of these regions have lower costs of living, lower taxes, relatively relaxed regulatory environments, and critical natural resources such as oil and natural gas.
But really, I'm a Truman Democrat! Durr hurr hurr!

OK, that is the end of the article, and man am I glad, because I am getting low blood pressure from lying here so long. But I had to do it, because there was just so much nonsense in this article. It was nothing more than a dog-whistle symphony packed chock-a-block with contradictory claims, bad numbers, and Republican talking points.

In other words, it was great cathartic therapy. Rest easy, depressed people of the world...someone has said something stupid on the Internet. And you never need to look very far to find it.

Down with particle physics, up with Big Energy Research!


Nobel Prize winning physicist Steven Weinberg has a big article in the New York Review of Books lamenting the crisis in Big Science. He focuses on two areas that are in danger of being deprived of funding: 1) particle accelerators, and 2) telescopes.

Weinberg is a particle physicist, one of the heroes who developed the Standard Model. Thus it is not surprising that most of his article concentrates on particle physics experiments. Unfortunately, I think that appeals for governments to pour more money into particle accelerators are A) doomed to fall on deaf ears, and B) not really very convincing in the first place. Let me explain why.

First of all, the Standard Model of particle physics is good. Really good. In fact, we've never conducted an experiment where it makes an incorrect prediction at any level of precision!! In that sense, it is one of the most successful theories ever. Now, the Model may or may not fail at ultra-high energies (such as those that could be produced inside a black hole or a multibillion-dollar particle accelerator), or at galactic distances. But these are not environments that will ever matter for human beings on Earth.

As Weinberg points out, the Standard Model is incomplete. It doesn't include gravity. But we have another theory, general relativity, whose track record is just as good, to describe gravity. Unifying these theories would increase our understanding of the nature of the Universe, but it's not clear whether it would improve our ability to predict our immediate surroundings.

In other words, new particle accelerators may be able to answer interesting questions, but they are unlikely to produce much of technological value.

In fact, this has proven true for the last several generations of particle accelerators. We've discovered a zoo of new particles, and these discoveries have improved our theories greatly. But none of these new particles has been something we can exploit for technological applications. In the early 20th century, new fundamental physics led rapidly to applications like nuclear bombs, semiconductors, lasers, and GPS. But to my knowledge, nobody is even trying to make a device that exploits the properties of B-mesons or neutrino mass.

To this, add another problem, which Weinberg discusses: We actually have no idea if the "next generation" of particle accelerators would find anything useful. In the past, we always had new theories that predicted stuff we should expect to see if bigger accelerators were built (for example, the Large Hadron Collider was built to search for the predicted Higgs Boson). As of now, new physics theories have made no new concrete predictions about what should come out of bigger and more expensive accelerators. If we build those accelerators, it will be purely for speculative, exploratory purposes - to see what might be out there.

So basically, if physicists focus on asking for billions of dollars for new particle accelerators (the Large Hadron Collider cost $9B, and the next one would certainly cost a lot more), they are almost certain to be disappointed.

And guess what? It is hard for me to label this a tragedy. Yes, I think it is very important to push the boundaries of our understanding of fundamental physics. But our society is facing huge, immediate problems - most pressingly, the imminent end of the fossil fuel era.

The increasing cost of fossil fuels is an absolutely huge problem. It is nipping at the heels of our civilization like a T-Rex in the rearview mirror. At its most apocalyptic, the fossil fuel crunch threatens to yank back most of the gains our species has made in the last three centuries. Even a more reasonable assessment puts us in danger of shrinking economies, transportation breakdowns, declining living standards, and technological stagnation. And as for global warming, the only way we are going to halt climate change is by inventing clean energy sources so cheap that we simply leave coal and shale oil and tar sands sitting in the ground.

The energy crunch is a problem that Big Science is uniquely equipped to fight. Burning stuff is easy. Converting sunlight into electricity and transporting it hundreds of miles, then storing the energy in an electric car, is hard. Designing genetically engineered organisms to suck CO2 out of the atmosphere and combine it with sunlight to produce oil is even harder. Creating controlled nuclear fusion is the hardest of all.

But if we are going to replace fossil fuels, we are going to have to do one or more of these hard things. There is just no other option. It's Big Science or bust. Our nation needs to be spending many, many billions of dollars - tens of billions each year, at the very least - on Big Energy research to create better solar power, better biofuels, and better nuclear power.

This may mean temporarily halting the progress of big-budget particle physics. Sure, the money for Big Energy Research could (and should!) be taken from other, even less useful government spending, such as the V-22 Osprey, oil company subsidies, or "health care". In fact, putting sufficient money into Big Energy Research is almost certainly going to require shifting money from all of these sources. But to the extent that the public is reluctant to shift unlimited government dollars into Big Science, there will have to be tradeoffs.

And more important than the money tradeoff may be the talent tradeoff. Currently, thousands of our best physicists are being shunted into careers in experimental particle physics, spending their lives working at CERN or Fermilab. These are our very best physics brains, and they are a very scarce commodity. In my opinion, we need these people to be working on solar power, biofuels, and nuclear power. Applied physics is not as intellectually thrilling or as nerd-glamorous as fundamental physics, but we can ill afford to pay our super-nerds to indulge their philosophical whimsy at a time like this.

So I am suggesting, not an abandonment of Big Particle Physics, but a pause. If and when energy stops getting more expensive and resumes its march toward abundance, our species will have the breathing room to look for answers to questions like how to combine gravity with the Standard Model. Until then, we are in crisis mode, and all of our Big Science resources - and more - should be going into fighting the T-Rex.

(Update: Some people have taken issue with me putting "health care" in quotes and saying we need to cut it. Actually I think we should nationalize healthcare, drive costs down, and implement a system where doctors are incentivized to improve health outcomes, not increase the number and cost of procedures...hence the scare quotes.)

Thursday Roundup (4/19/2012)


Ladies, gentlemen, and extraterrestrial beings, your Thursday Roundup:

1. Mark Thoma attends the INET conference in Berlin, and offers his thoughts. Predictably, the conference was just as awesome as I knew it would be, making me all the more bitter that I was too poor to go. (shakes fist and bellows at the sky: "SOROSSSSSSSSSSSSSS!!!")

2. John Taylor discusses a paper analyzing the impact of fiscal stimulus using several different New Keynesian DSGE models. The paper finds that multipliers are substantial if stimulus is A) composed mainly of government expenditures, and B) temporary rather than permanent. In other words, if stimulus is exactly what Keynesians say it ought to be! Good to see Taylor coming out of the closet, as it were...

3. Unfortunately, exponential economic growth can't continue forever, and here's the physics to explain why. I've actually been aware of this for a long time. As my understanding of the general numbers goes, we've got a century or a century and a half of 20th-century-level growth left before we start to cook the planet with waste heat; if we slow our rate of growth to half of what we enjoyed in the 20th, we'll have two or three more centuries. If we expand into the solar system, that number will be higher, but one thing is certain: we will not grow at 20th-century rates for the next 1,000 years. (Of course, this is moot, since in a few decades we'll all be robots, right?)
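The waste-heat constraint can be sketched with Stefan-Boltzmann arithmetic. All the specific numbers below (17 TW of current world energy use, 2.3%/yr growth, a ~1 K warming threshold) are my own illustrative assumptions, not figures from the linked post:

```python
import math

# Sketch: how long until waste heat from exponentially growing energy use
# warms the Earth by ~1 K, ignoring greenhouse effects entirely?
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/m^2/K^4
T_EFF = 255.0            # Earth's effective radiating temperature, K
EARTH_AREA = 5.1e14      # Earth's surface area, m^2
POWER_NOW = 17e12        # assumed current world energy use, W (~17 TW)
GROWTH = 0.023           # assumed annual growth rate of energy use

flux_now = POWER_NOW / EARTH_AREA       # current waste-heat flux, W/m^2
flux_for_1K = 4 * SIGMA * T_EFF**3      # linearized extra flux for ~1 K warming

# Solve flux_now * (1 + GROWTH)^t = flux_for_1K for t
years = math.log(flux_for_1K / flux_now) / math.log(1 + GROWTH)
print(f"waste heat alone forces ~1 K of warming in about {years:.0f} years")
```

Under these assumptions the answer comes out on the order of two centuries, which is the same ballpark as the numbers in the text.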

4. Foreign Policy has an extremely good (and long) interview with Ed Luce about American decline. Two key facts: 1) America's share of world output is NOT holding steady, but has declined steadily. 2) America's military spending will not dwarf that of China for very much longer. Thus, two of the key talking points of the anti-declinist school of punditry are just factually false.

5. Mark Thoma and some NY Fed economists report that if you feed the right info into a DSGE model, it can forecast the economy as well as the judgment of professional forecasters. However, Robert Waldmann argues that really you're just rigging the model to give the right answers.

6. Alex Tabarrok has a good post on vocational education. We may finally be getting high-quality technical vocational education in the U.S.! Take my advice, policymakers: make vocational school a good place to meet people of the opposite sex, and all the kids will want to go.

7. Ryan Avent reports that China is probably not overinvesting to the degree that many believe. I am not surprised; China has labor and needs capital, so it should be doing exactly what it's doing, i.e. building capital at a furious rate. However, note that with a trade deficit instead of a surplus, China can build up its capital even faster!

8. However, China has other fish to fry, as this article on Chinese military corruption reveals. Note the relevance of this for my discussion with Scott Sumner about culture...personal-level pragmatism (rational rent-seeking) is causing the quality of the Chinese military's institutions to decrease. It's just hard to know how culture is going to affect institutions...

9. Steve Randy Waldman continues to call for an alliance between Keynesians, market monetarists, and New Keynesians. His post, unfortunately, is probably way too smart, erudite, and sensible to receive much attention.

10. Steve Williamson discusses the idea that Ben Bernanke is just a "wimp" who refuses to reflate the economy. In the process, Steve describes what the Fed actually can do, policy-wise, which is definitely worth a read.

11. John Cochrane discusses why asset prices and volatility seem to move in opposite directions.

12. Alex Tabarrok reports on how debtors' prison is making a comeback. Just another way that the U.S. is moving back toward the ideals of free markets and personal responsibility that gave us such unbridled prosperity in the 18th century.

13. The American Enterprise Institute appears to have jumped onto the New Urbanism bandwagon! File under "Things I never thought I'd see in my lifetime."

14. Mike Konczal offers some arguments for why better regulation, not tighter monetary policy, is the proper way to rein in the excesses of the financial sector.

15. Simon Wren-Lewis does not like my criticism of the Wieland model-aggregation approach. I may respond more if and when I figure out what Wren-Lewis is talking about. This has not yet happened.

Equilibria, unique and not-so-unique (guest post by Roger Farmer)


Roger Farmer had some very interesting things to say regarding my last post on equilibrium analysis, but unfortunately Blogger somehow imposes a maximum comment length! And really, it's too interesting to languish down in the comments section, so now it's a guest post! Without further ado:

***


As a proponent of models with multiple equilibria, let me say a few words about your very interesting post on disequilibrium. I am a big fan of Bob Lucas' insistence on restricting ourselves to equilibrium models. Why?

The idea of disequilibrium is borrowed from the physical sciences where it has meaning in the context of, for example, Newtonian mechanics. A ball rolling down an inclined plane is an example of a physical system in disequilibrium. When it reaches the bottom of the plane, friction ensures that the ball will come to rest. That is an equilibrium. But it is not what we mean by an equilibrium in economics.

An economic equilibrium, in the sense of Nash, is a situation where a group of decision makers takes a sequence of actions that is best (in a well-defined sense) on the assumption that every other decision maker in the group is acting in a similar fashion. In the context of a competitive economy with a large number of players, Nash equilibrium collapses to the notion of perfect competition. The genius of the rational expectations revolution, largely engineered by Bob Lucas, was to apply that concept to macroeconomics by successfully persuading the profession to base our economic models on Chapter 7 of Debreu's Theory of Value, as opposed to the hybrid models of Samuelson's neoclassical synthesis. In Debreu's vision, a commodity is indexed by geographical location, by date, and by the state of nature. Once one applies Debreu's vision of general equilibrium theory to macroeconomics, disequilibrium becomes a misleading and irrelevant distraction.

The use of equilibrium theory in economics has received a bad name for two reasons.

First, many equilibrium environments are ones where the two welfare theorems of competitive equilibrium theory are true, or at least approximately true. That makes it difficult to think of them as realistic models of a depression, or of a financial collapse, since the welfare of agents in a model of this kind will be close to the best that can be achieved by a social planner. An outcome that is best, in this sense, does not seem, to me, to be a good description of the Great Depression or of the aftermath of the 2008 financial crisis.

Second, those macroeconomic models that have been studied most intensively, classical and new-Keynesian models, are ones where there is a unique equilibrium. Equilibrium, in this sense, is a mapping from a narrowly defined set of fundamentals to an outcome, where an outcome is an observed temporal sequence of unemployment rates, prices, interest rates etc. Models with a unique equilibrium do not leave room for non-fundamental variables to influence outcomes. It is not possible, for example, for movements in markets to be driven by sentiment; what George Soros has called "the mood of the market". It is for that reason that conventional theory seeks to explain large asset price swings as disequilibrium phenomena.

Multiple equilibrium models do not share these shortcomings (see, for example, this). And they do not need to appeal to disequilibrium explanations to account for phenomena that are anomalous when one adopts the unique-equilibrium perspective. But although multiple equilibrium models have advantages in these respects, they lead to a new set of questions. It is easy enough to write down an economic model where more than one outcome is possible. But how would a rational economic agent behave if placed into an environment that was the real-world analog of the economist's model?

The answer, I believe, is that a model with multiple equilibria is an incomplete model. It must be closed by adding an equation that explains the behavior of an agent when placed in an indeterminate environment. In my own work I have argued that this equation is a new fundamental that I call a belief function. It represents the end result of a process of learning; either from past observations of economic data or from copying the behavior of one's peers.

The belief function is a mapping from past observable variables to expectations of all future variables that are relevant to a decision maker. It is this function that guides behavior when a rational expectations model is indeterminate. The belief function provides a way for social psychology to influence economic outcomes, and, in my view, it should be accorded the same methodological status as that of preferences, technology and endowments in a classical or new-Keynesian model.

Some recent authors have argued that rational expectations must be rejected and replaced by a rule that describes how agents use the past to forecast the future. That approach has similarities to the use of a belief function to determine outcomes, and when added to a multiple equilibrium model of the kind I favor, it will play the same role as the belief function. The important difference of multiple equilibrium models, from the conventional approach to equilibrium theory, is that the belief function can coexist with the assumption of rational expectations. Agents using a rule of this kind will not find that their predictions are refuted by observation. It is the belief function itself that selects an equilibrium.

I work with models of multiple equilibrium that have incomplete labor markets (see, for example, this); as a consequence of this incompleteness, these models have multiple steady state equilibria. Incomplete labor market models fit the data better than their classical or new-Keynesian counterparts (see http://rogerfarmer.com/newweb/pdffiles/farmer_phelps_volume_revision.pdf). And because they do not imply that all unemployment is socially optimal, they are able to account for the mass human misery caused by persistent unemployment that we observe in periods like the Great Depression or the 2008 financial crisis.

Like their classical or new-Keynesian counterparts, incomplete labor market models explain data as a unique mapping from fundamentals to outcomes. But fundamentals include more than just technology, preferences and endowments; they also include a role for market sentiment. Outcomes in these models can sometimes be very, very bad.

This brings me to your excellent post on the use of the disequilibrium assumption in economics. If by disequilibrium, I am permitted to mean that the economy may deviate for a long time, perhaps permanently, from a social optimum; then I have no trouble with championing the cause. But that would be an abuse of the term 'disequilibrium'. If one takes the more normal use of disequilibrium to mean agents trading at non-Walrasian prices, as in the models of Benassy and Dreze from the 1970s, then I do not think we should revisit that agenda. Just as in classical and new-Keynesian models where there is a unique equilibrium, the concept of disequilibrium in multiple equilibrium models is an irrelevant distraction that has been imported, inappropriately, from the physical sciences where equilibrium means something very different from its modern usage in economics.

Roger Farmer

***

(back to Noah)


Lots of very deep stuff here, so I'll just add a few quick responses/comments:

1. Roger's post addresses, in a much more precise and well-informed way, possibilities (2) and (3) from my last post. But it rejects possibility (1), which involves non-Walrasian prices.

2. I really like the idea of a "belief function". This is basically chucking Rational Expectations - not because people are irrational, but because the notion, as Lucas thought of it, of people's beliefs coinciding with the equilibrium outcome of an economic model doesn't make a lot of sense in a world of constantly shifting multiple equilibria.
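One concrete way to think about a belief function is as an adaptive learning rule that maps past data to expectations. Here's a minimal constant-gain learning sketch; the functional form and the 0.1 gain are my own illustrative choices, not Farmer's actual specification:

```python
# Constant-gain learning: beliefs about a variable are updated a fixed
# fraction of the way toward each new observation. This is one simple
# stand-in for a "belief function" mapping past data to expectations;
# Farmer's actual specification is richer.
def update_belief(belief, observation, gain=0.1):
    return belief + gain * (observation - belief)

# In a stable regime the rule converges to the regime's mean...
belief = 0.0
for _ in range(100):
    belief = update_belief(belief, 5.0)
print(f"belief after a long stable regime: {belief:.4f}")  # -> close to 5

# ...but right after a regime shift, beliefs chase the new regime with a
# lag, so expectations depend on history, not just current fundamentals.
for _ in range(5):
    belief = update_belief(belief, 2.0)
print(f"belief shortly after a regime shift: {belief:.4f}")  # between 2 and 5
```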

3. I strongly suspect that I am one of the people importing the idea of "disequilibrium in multiple equilibrium models" from the physical sciences. I am not yet convinced that it is an inappropriate analogy, or that non-Walrasian prices are a fruitless line of inquiry, but I will have to read a lot more to understand properly...

Anyway, much thanks again to Roger for the guest post. This is not the most attention-grabbing media-friendly stuff, but it is very deep and important to the future of both how economics is done and how the discipline is perceived by outsiders.

DSGE vs. Weather Forecasting


You often hear defenders of the DSGE modeling framework say something like this: "You can't attack DSGE. It's just a framework. You can use it to model anything." But this isn't true. If it were true, we wouldn't have the label "DSGE", we'd just call it "modeling".

The "E" in DSGE stands for "Equilibrium." DSGE models define an equilibrium in which markets clear. The economy is then assumed to move toward that equilibrium; the motion is called "transition dynamics", and is usually left unspecified. 

Some economists have argued that transition dynamics are not that important. For example, in "Econometric Policy Evaluation: A Critique" (1976), Robert Lucas writes:
[O]ne hears talk of a "disequilibrium dynamics" which will somehow...[go] beyond the sterility of dp/dt = k(p - p_e)...[but this] will fail...
In other words, prices will move smoothly and steadily toward the equilibrium, so we should just focus on finding the equilibrium.

But this is not necessarily true. Why would "disequilibrium dynamics" be important? I can think of several reasons.

Reason 1: The equilibrium may shift at about the same speed that convergence happens. If the economy is trying to hit a moving target, chaos will result. See this paper by Andrew Lo for a semi-technical explanation of why this is true. (In math-speak, this happens when the rate of convergence, k, is of the same order as the parameters governing the time-scale of the shock process.)
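A toy simulation makes Reason 1 concrete. The adjustment equation is Lucas's dp/dt = k(p_e - p); the sinusoidally moving equilibrium and the specific k values are my own illustrative assumptions:

```python
import math

# Price adjusts toward equilibrium p_e at rate k, but p_e itself moves.
# If adjustment is fast relative to the equilibrium's motion, p tracks
# p_e closely; if the two timescales are comparable, p chases the target
# and never closes the gap.
def avg_gap(k, omega, steps=20_000, dt=0.001):
    p, total = 0.0, 0.0
    for i in range(steps):
        p_e = math.sin(omega * i * dt)   # assumed moving equilibrium
        p += k * (p_e - p) * dt          # Euler step of dp/dt = k(p_e - p)
        total += abs(p_e - p)
    return total / steps

fast = avg_gap(k=100.0, omega=1.0)   # adjustment much faster than the target
slow = avg_gap(k=1.0, omega=1.0)     # adjustment on the target's timescale
print(f"avg gap, fast adjustment:      {fast:.3f}")
print(f"avg gap, comparable timescales: {slow:.3f}")
```

When k is large the average gap is tiny and "just focus on the equilibrium" is a fine approximation; when k is comparable to the target's frequency, the observed price is persistently far from the equilibrium the model solves for.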

Reason 2: The equilibrium may not be stable. See this blog post by Jonathan Schlefer (whose book I just ordered off of Amazon):
In 1960 Herbert Scarf of Yale showed that [even the most ideal kind of] economy can cycle unstably. The picture steadily darkened. Seminal papers in the 1970s, one authored by [General Equilibrium inventor] Debreu, eliminated "any last forlorn hope," as the MIT theorist Franklin Fisher says, of proving that markets would move an economy toward equilibrium. Frank Hahn, a prominent Cambridge University theorist, sums up the matter: "We have no good reason to suppose that there are forces which lead the economy to equilibrium."
In other words, the smooth convergence equation that Lucas wrote down may simply not be true.

Reason 3 (the biggie): There may be multiple equilibria. You rarely see famous and influential DSGE papers with multiple equilibria, and when you do see them, there are usually only two equilibria. But I know of absolutely no reason why the real economy should have a unique equilibrium. And I know of absolutely no reason why the number of equilibria in the economy should be small! But there seems to be a huge publication bias in favor of a smaller number of equilibria (Roger Farmer's efforts notwithstanding). This annoys me.
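For what it's worth, even a one-line dynamical system can have multiple stable equilibria, with the outcome pinned down entirely by initial conditions. The tanh map below is a generic toy example of mine, not a model from the macro literature:

```python
import math

# x_{t+1} = tanh(beta * x_t) with beta > 1 has two stable fixed points,
# one positive and one negative, plus an unstable one at zero. Which
# equilibrium the system settles into depends only on where it starts;
# the "fundamentals" (beta) are identical in both runs.
def long_run(x0, beta=2.0, n=200):
    x = x0
    for _ in range(n):
        x = math.tanh(beta * x)
    return x

optimistic = long_run(0.1)    # small positive initial condition
pessimistic = long_run(-0.1)  # small negative initial condition
print(f"optimistic start  -> {optimistic:.4f}")
print(f"pessimistic start -> {pessimistic:.4f}")
```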

So in a Robert Lucas DSGE world, we have one slow-changing equilibrium toward which the economy smoothly and rapidly converges. But the real world may be one of multiple, rapidly-shifting equilibria toward which the economy does not move smoothly.

Now, you could still formally model that sort of world with a DSGE model, but your results would be useless gibberish. You'd have classical chaos.

What do you do when you have classical chaos? One thing you can do is to use a weather-forecasting approach. Basically, you identify the economy's microfoundations, and then you use big powerful computers to make very short-term predictions using those microfoundations. Here's a Bloomberg article in which theoretical physicist Mark Buchanan suggests exactly this kind of approach:
No mathematician can “solve” the complex equations for air in the atmosphere...It’s natural to wonder if a similar mechanism might be driving the financial crises and business cycles that typify the economic “weather” we’ve experienced over the centuries. Unfortunately, today’s equilibrium theories refuse to entertain the possibility... 
While an appreciation of instability propelled atmospheric science forward, economics was binding itself into a rigid framework...when studies in the 1970s found that [economic] equilibrium is generally unstable - and so should tend to fall apart...[economic] theorists for the most part simply ignored this inconvenient fact and went on as before. Most still do. 
American economist Milton Friedman set the tone. “The study of the stability of general equilibrium is unimportant” because “it is obvious that the economy is stable,” he was quoted as saying. Friedman was notoriously mischievous and slippery in his argument, so I’m not sure he really believed what he said. But most economists today act as if they do. 
This is too bad, because a focus on the origins of instability just might help financial economics achieve a conceptual liberation akin to that which atmospheric scientists achieved in the 1950s. Economists might come to accept that equilibrium doesn’t describe everything, or even very much, and that natural elements of instability and turbulence drive the outcomes that matter most.
This is an alternative to DSGE. Go directly from microfoundations to some variety of agent-based modeling. Ignore the equilibrium and focus entirely on the micro-level "transition" dynamics. Incidentally, I think this kind of approach could have other advantages - for example, because you wouldn't have to solve for a simple equilibrium, it might be easier to add large numbers of frictions to the model. 
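To give a flavor of what iterating micro-level rules (rather than solving for an equilibrium) looks like, here is a bare-bones agent-based market sketch in the fundamentalist/chartist spirit of Brock and Hommes. The agent types, parameter values, and initial conditions are all my own assumptions, not a model from the article:

```python
# Two agent types set demand each period: fundamentalists push the price
# toward a fundamental value, chartists extrapolate the recent trend.
# There is no equilibrium to solve for -- we just iterate the micro rules
# forward, the way a weather model steps the atmosphere forward.
def simulate(chartist_share, steps=200, fundamental=100.0,
             reaction=0.8, extrapolation=2.0):
    prices = [100.5, 101.0]  # assumed initial conditions
    for _ in range(steps):
        p, p_prev = prices[-1], prices[-2]
        demand = ((1 - chartist_share) * (fundamental - p)
                  + chartist_share * extrapolation * (p - p_prev))
        prices.append(p + reaction * demand)
    return prices

calm = simulate(chartist_share=0.0)   # fundamentalists only: converges
wild = simulate(chartist_share=0.9)   # chartist-dominated: explosive cycles
peak = max(abs(x - 100) for x in wild[-40:])
print(f"fundamentalists only, final price:   {calm[-1]:.2f}")
print(f"chartist-dominated, peak deviation:  {peak:.3g}")
```

With only fundamentalists the price settles at the fundamental value; tilt the population toward trend-followers and the same micro rules generate growing boom-bust oscillations that no equilibrium condition would reveal.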

But the main reason to contemplate using a weather-forecasting approach is that the real economy may just be too chaotic for DSGE to be very useful in modeling it.

Greg Mankiw's touching faith in Tiebout Equilibrium


Greg Mankiw has an op-ed extolling the benefits of competition between governments (at the local, state, or national level). The basic argument is that competition reduces the ability of governments to redistribute wealth, since whoever is being taxed to pay for the redistribution can just move to somewhere with lower taxes.

Allow me to gripe for a moment. This article seems to make the same assumption that American conservatives and libertarians generally make about government: that government spending is 100% redistribution and 0% public goods provision. Pure transfers, with no contribution to economic efficiency. This assumption is built into conservative-favored macro models and conservative policy discourse alike.

We do not just have governments in order to rob Peter to pay Paul. We have governments because there are things they can provide that the private sector is either unable or unwilling to provide effectively - courts, police, schools, roads, other infrastructure, etc. Conservatives focus so much on redistribution that they tend to ignore this fact, but if you think about it, you'll realize public goods are why we have government in the first place.

So will competing local governments produce the optimal amount of public goods? In general, no. There is a theory about this in public economics, called Tiebout Equilibrium (the model was actually worked out by a guy named Bewley). In general, a bunch of competing local governments will not provide the efficient amount of local public goods - you need a federal government for that. Also, of course, you need a federal government for public goods that aren't localized, like defense, research, interstate highways, and the FBI.

But here's a question that might be of interest to Dr. Mankiw: Does government competition generally lead to bigger government or smaller government? Greg provides the example of international competition to lower corporate tax rates; I agree with this example and I think it's a good thing, but I don't think this is the norm.

Generally, local governments will compete to attract investment. They do this by lowering business taxes and reducing regulation, sure. But they also do this by increasing spending on things like infrastructure and education, in order to provide businesses with the local resources they require. They also offer businesses direct subsidies. Somebody has to pay for those things. Guess who's going to foot the bill? That's right, local taxpayers.

(Aside: Yes, some of the money local governments spend to attract investment is wasted, especially the direct subsidies; see the Tiebout result above. A lot is not wasted, though. Roads and schools are necessary.)

So I think that, in his zeal to attack redistribution, Greg may be overlooking two things here. The first is that the federal government contributes to economic efficiency, not just redistribution. The second is that government competition for business investment will often make government bigger, not smaller. Conservatives should think about both of these points.

(Update: For you econo-nerds, note that there are two questions here: 1) the relative efficiencies of different systems of local-government competition, and 2) the distribution of rents under the various systems. These both depend on a lot of things in general, and there's a huge literature on this. I'm just sketching out a couple ways in which "more competition" might not achieve results to Mankiw's liking...)

Thursday Roundup (4/12/2012)


Way too much good stuff in this week's Thursday Roundup! So much good stuff that there may not be room for my traditional zany commentary...

1. Matt Yglesias thinks that Obama's new focus on manufacturing is total bunk. Mark Thoma, who had long held a similar position, is starting to be convinced otherwise. Note that Matt's piece is also interesting because it contains the assertion that all private R&D spending is waste!

2. If you had any doubts that monetary policy is hopelessly politicized, read all about Luigi Zingales. Very depressing. This, children, is why NGDP targeting will not happen. And this is why Fed promises to keep interest rates low for a long time are not as believable - and hence not as effective - as they should be. See also: Narayana Kocherlakota.

3. Ezra Klein digs around and discovers that a large portion - maybe more than half! - of people who have dropped out of the labor force since the recession began are simply Baby Boomers who have retired early. These workers are not coming back to the labor force, no matter what policy steps we take. Thus, using the employment-to-population ratio as the "true" measure of unemployment - as many often do - might not be the best metric to measure the recovery. (Update: Commenter Absalon notes that the graph used in the Paul Krugman post I linked to actually only goes up to age 54, meaning that it doesn't include many early retirees. So my last sentence doesn't apply! But the early retirement phenomenon still means that this recession may have caused a permanent shift in GDP, simply because of the demographic structure of the Baby Boom.)

4. JW Mason takes on the idea that low interest rates caused the financial crisis. I think he makes some errors, but he definitely makes some good points.

5. Simon Wren-Lewis has an incredibly nerdy post about why financial journalism is Teh Suck. Don't let the nerdiness daunt you. Financial journalism is indeed Teh Suck. "Stocks fell today on worries about blah-blah-blah." No they didn't. Stocks fell today, and you don't really know why.

6. Tyler Cowen reads a 2002 paper by Chang-Tai Hsieh and concludes that Singapore had much higher TFP growth during its high-growth period than people generally believe. I had read this paper long ago, and was convinced that it was right. However, Tyler then links to this recent John Fernald paper that does the calculation even more carefully and concludes that no, Singapore's TFP growth really was low. Now I'm kind of convinced Fernald is right. Conclusion: I am easily convinced of things.

7. Bryan Caplan thinks that good-and-evil "-ism" labels are a good thing, because they help people organize their thoughts and arguments. Will Wilkinson retorts that no, labels force people into accepting package-deal ideologies. Although Caplan makes a good point, I'm going to have to award the victory to Wilkinson here. Reason: Name some common words that end in "-ism". See my point?

8. Via David Andolfatto, a piece on institutional constraints on bank lending. It ain't just about interest rates!

9. Dwight Garner of the New York Times takes issue with Tyler Cowen's assertion that "technology and business are a big part of what makes the world gentle and fun." Of course, Tyler is right - technology and business make societies rich, and rich people are less violent and have more fun. But this is not something that most people have thought about a lot, and so to these people the line probably sounds a bit...um...out to lunch. Just another case of economists refusing to calibrate their speech patterns to the preconceptions of the general public...

10. Robert Waldmann lays the capstone on the Great Microfoundations Blogwar. No, I don't know what the heck a capstone is, or why it is laid last. But I've only ever missed one question on the verbal section of a standardized test, so nyaah.

11. Barkley Rosser pulpifies Robert Samuelson. This is not difficult; Robert Samuelson is one of the worst economics writers in existence. He is also one of the most highly paid. (However, this is not surprising, from a physics standpoint...if Knowledge = Power, and Time = Money, and Power = Work/Time, then Money ~ 1/Knowledge...)

12. Brad DeLong points out that American political craziness hasn't increased, but the collapse of the Dixiecrats means that all the craziness is now concentrated in one party. I have one small problem with this thesis: the "craziness" of which we speak is actually just lingering allegiance to the Old Confederacy.

13. Apparently, having been conquered by Arabs way-back-when is bad for your economy now. Huh. I wonder about having been conquered by Mongols. They were by far the most liberal regime of the medieval world. But it seems like their former territories - China, Russia, Iran - are pretty autocratic.

The "let's lump a bunch of models together" approach


Volker Wieland and his coauthors have taken a first step toward addressing the problem of model uncertainty in macroeconomics. Basically, they have taken a large number of macro models and thrown them in a database, allowing you to compare across models or just to get the average of the models' predictions.

Now, of course, it is easy to point out a bunch of flaws in this approach. How will the set of models be chosen? How different are the models from each other? How would the inclusion of an utterly spurious model be detected and prevented? Is there publication bias at work? How can the database's users know what it means to give the models various weights? And so on. But in general, I think that this database is a good thing to do. Anything that makes model uncertainty explicit is good, and I think Wieland et al. have executed this well.
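In spirit, the database's "average of the models' predictions" is just a weighted ensemble. A minimal sketch, with invented model names and forecasts (the question of how to justify the weights is exactly the open question raised above):

```python
# Hypothetical forecasts of next-quarter output growth from several models.
forecasts = {"model_A": 2.1, "model_B": 1.4, "model_C": 2.8}

# User-chosen weights; nothing in the math tells you what they should be.
weights = {"model_A": 0.5, "model_B": 0.3, "model_C": 0.2}
assert abs(sum(weights.values()) - 1.0) < 1e-9

# The consensus forecast is just the weighted average across models.
consensus = sum(forecasts[m] * weights[m] for m in forecasts)
print(round(consensus, 3))
```

A spurious model slipped into the set simply shifts the consensus in proportion to its weight, which is why the model-selection question matters so much.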

However, I would like to point out a specific error that I think Wieland et al. may be committing here. In fact, Robert Waldmann has already pointed it out, in reference to the modeling approach used by the Bank of England. (Robert Waldmann does not write with a layperson audience in mind, so let me paraphrase instead of quote.) Basically, the Bank of England has what they call a "core" model, which is a microfounded DSGE model that contains only parameters that are assumed to be policy-invariant - in other words, a model Bob Lucas would approve of as "structural". Then it lumps that "core" model in with a bunch of other non-structural, ad-hoc type models to make a "hybrid" model. Then it uses the "hybrid" model for policy analysis.

Waldmann's point is this: Why bother with the "core" model in the first place? As soon as you lump it in with stuff that isn't "structural," it becomes pointless. If you're going to have a non-structural model, just use a non-structural model and don't bother with the DSGE thing.

The same problem applies to the database constructed by Wieland et al. It appears to contain a mix of microfounded, potentially "structural" DSGE models and non-structural ad-hoc models. But these two types of models don't mix.

As I always say, there are two things we might want from macro models: 1. forecasting, and 2. policy analysis. If all we want to do is forecast, we don't need to worry about whether our models are policy-invariant. In fact, making them policy-invariant will probably make them worse at forecasting, since it will severely shrink the number of variables in the model. So if forecasting is what we want, we don't need the structural models, and might as well toss them out, since they will probably just add noise to the consensus forecast.

But if policy analysis is what we want, then we need to answer the Lucas Critique. In this case, we don't want the ad-hoc models. We want only the policy-invariant models. The ad-hoc models should be tossed out. (Actually, in practice, it's hard or impossible to know if a parameter is really structural. But let's put this issue aside for now.)

So the full Wieland database is not the right mix of models for policy analysis, and it is probably not the right mix of models for forecasting either. To really use this database as it ought to be used, one should be able to easily toggle "only structural models".
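The toggle I have in mind is trivial to implement once each model in the catalogue carries a structural flag; the hard part, as noted above, is deciding what deserves the flag. All names and numbers below are invented for illustration:

```python
# Invented catalogue entries: each model tagged by whether its parameters
# are (claimed to be) policy-invariant, i.e. "structural".
catalogue = [
    {"name": "nk_dsge", "structural": True, "forecast": 1.8},
    {"name": "var_adhoc", "structural": False, "forecast": 2.4},
    {"name": "rbc_dsge", "structural": True, "forecast": 1.2},
]

def select(models, structural_only):
    """Policy analysis should use only structural models; forecasting can use all."""
    if structural_only:
        return [m for m in models if m["structural"]]
    return list(models)

policy_set = select(catalogue, structural_only=True)
forecast_set = select(catalogue, structural_only=False)
print([m["name"] for m in policy_set])  # ['nk_dsge', 'rbc_dsge']
```

The two calls correspond to the two use cases: the first answers the Lucas Critique (in principle), the second maximizes forecasting information.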

Update: Waldmann has his own critique of the Wieland approach, which is different from (and partly contradicts!) mine.

Weak defenses of the Lucas/Prescott Program


In the 1970s, Robert Lucas perceived that there was a big problem in macroeconomics. Models that didn't allow for human beings to adjust their behavior couldn't be used for policy, because if you tried to use them, people would alter their behavior until the models no longer worked. This is known as the "Lucas Critique". The solution, Lucas said, was to explicitly model the behavior of human beings, and to only use macro models that took this behavior into account. This is called the "microfoundations" approach. Readers of this blog will know that I am a fan of the general idea.

But in economics, as in many sciences, simply tearing down existing theories doesn't satisfy people. You have to replace the old with the new, or people will just go on using the old. The first research program that came along and tried to answer the Lucas Critique was the "Real Business Cycle" program. This program, spearheaded by researchers such as Ed Prescott, made use of a new modeling approach called "DSGE". It also incorporated Robert Lucas' "Rational Expectations Hypothesis". Lucas didn't invent all of this stuff, but since A) it was invented in response to his Critique, B) he invented some of it, and C) he seemed to sign off on the parts he didn't invent, I feel justified in calling this new research program the "Lucas/Prescott Program" (as in the title of this post).

Anyway, as I've mentioned before, DSGE took over the macro field, and Rational Expectations became nearly as common. Now, in the aftermath of the crisis, both of these are coming under increasing skepticism, from some economists, but especially from the general public. In a recent blog post, Justin Fox expresses this skepticism and mentions some defenses of the Lucas Program (including one by Lucas):
Nobelist Robert Lucas says we're never going to have "a set of models that forecasts sudden falls in the value of financial assets." There's a certain logic to that. Lucas again: "If an economist had a formula that could reliably forecast crises a week in advance, say, then that formula would become part of generally available information and prices would fall a week earlier." But I don't think it's logically impossible to be able to judge when asset markets are at greater risk of trouble than normal. And I wouldn't be too certain that the various classes of macroeconomic models (DSGE models among them) that have evolved from Lucas's rational expectations work in the 1970s are the best possible tools for making such judgments. You could describe what today's macroeconomists are doing as the forward march of science: they're revising and adding to their theories in light of new evidence. But sometimes scientists hit dead ends. Ever heard of phlogiston? 
I'm not saying the DSGE theorists should give up — their work may turn out to be of great value. I'm just saying that policy makers and others who want to understand the short-term movements of the economy should keep their options open. And those who educate macroeconomists should be more open to different methods as well...My view of the Great Recession was very much shaped by an October 2008 phone call with Eichengreen: "I doubt that we'll be able to avoid double-digit unemployment," he told me. "But I'm still confident we can avoid 24% unemployment like in 1933." The U.S. unemployment rate hit 10% for one month in 2009, but didn't go past it. It was a very good forecast, expressed at just about the right level of precision, based mostly on historical experience and off-the-cuff judgment, without a DSGE model in sight. There's more than one valid way to do macroeconomics.
The first issue here is something I've addressed before, so I'll be brief. First off, a "financial crisis" is not necessarily the same thing as a fall in asset prices. If you knew 6 months ahead of time that the financial system was going to break down on some specific day, it's probably true that asset prices would fall as soon as the knowledge was obtained. But it would still be better to have that model, in order to prepare for the consequences of the financial system's collapse.

Second of all, models can make policy-contingent predictions without violating the weak form of the Efficient Markets Hypothesis. If you tell a policymaker, "The financial system will crash 6 months from now, UNLESS you take such-and-such an action," only the policymaker knows what he or she will do. It's private, not public, information. Again, it's better to have the model.

Third of all, market efficiency may not hold. See Abreu & Brunnermeier (2003) for an example of a market in which everyone knows that a crash is coming before the crash comes, but can't make money off of that knowledge. Lucas doesn't know that this kind of thing is impossible; hence, he is making unsupported assertions.

Now onto the second critique. Fox's story about Barry Eichengreen's successful predictions regarding the recent recession is just one data point, it's true. Eichengreen might have just gotten lucky. But the larger issue is that DSGE models have so far proven themselves to be essentially useless at forecasting the macroeconomy, relative to the judgment-based forecasts of people like Eichengreen.

Now, there are two reasons why we might value a macroeconomic model. One is forecasting ability. The other is policy advice. If existing DSGE models are crappy at the former, might they not be useful for the latter? In fact, they might be, and their supporters insist that they are. But with little consensus in the macro profession on how to choose which model applies to the economy at which time, we find ourselves at any given time with a dizzying array of contradictory models instead of one model that we can trust. 

Hence, Fox is exactly right when he says that DSGE models "may turn out to be of great value." As far as we can tell, this has not yet happened. The Lucas/Prescott Program has not yet panned out as hoped, and there is no good logical reason why other programs couldn't do better. Sometimes scientists are forced to live through long periods of doubt, where old certitudes have crumbled but new ones have not yet arrived to take their place. We appear to be living through such a time.


Note: Let no one interpret this post as an attack on Lucas as an economist. I like Lucas! The Lucas Critique was spot-on. I agree with Lucas' push for microfoundations. And I think the Lucas Islands Model is neato.