Firefox adoption just keeps going down


Browser usage data from Q4 2008 to Q4 2011, for visitors of this blog.

I was somewhat surprised (as many people were) to learn, earlier this month, that in terms of market share, Google's Chrome browser has recently surpassed Firefox in overall adoption. I shouldn't have been surprised at all: If I had taken the time to look carefully at my own blog's analytics, I would've seen this very result nine months ago.

Readers of my blog tend to be developers, techies, and early adopters, and so Internet Explorer usage has never been high for people who visit this blog, whereas Firefox usage has always been high (68% in Q4 of 2008, for example). Trends like Chrome overtaking Firefox tend to show up early in my analytics; my readers are trendsetters. I completely failed to see the Firefox death spiral coming, however.

I decided, finally, to sit down and sift through my analytics to find out exactly how browser usage has varied over time for visitors to this blog. In the graph above, I've plotted browser statistics quarter-by-quarter for the 13 quarters going back to Q4 2008 (the earliest date for which analytics were available). Note that data for the most recent quarter run only to mid-December (obviously). For the total data set going back to Q4 2008, we're talking slightly more than half a million total visits.

The vertical scale tops out at 100 percent. Firefox (in blue) starts at 68 percent and ends, most recently, at 32 percent. Chrome starts at 8 percent and finishes at an impressive 38 percent. The curves crossed, for visitors to this blog, around nine months ago (in early 2011).

What can I say? I don't think these sorts of trends bode well for Firefox. In fact I think the future is looking pretty bad for Firefox (for a variety of reasons), and there's probably not time to reinvent the product at this point.

In the meantime, if you work for a software (or other) company that currently supports Firefox but not Chrome, you've got your priorities backwards! Get to work supporting Chrome, ASAP, lest the market leave you behind.

6 Tips for Beginning Canvas Programmers

Lately I've spent some time programming against the <canvas> API. Predictably, I encountered all the common beginner's mistakes, and had to work through them. Along the way, I learned a number of useful things about canvas programming, some basic, some not-so-basic. Here's a quick summary:

1. To avoid security errors, always serve your HTML (and scripts) from the same server as any images you're going to be working with. (Corollary: Don't "serve" your HTML and images from the local filesystem. That's a sure way to get security errors.) Install a local instance of Apache web server (or some other web server) and serve content to your browser from localhost, if need be.

2. If you're modifying pixels using context.getImageData( ), use putImageData( ) to draw back to the image, and be sure to supply all 3 arguments to putImageData( )! Here is a common pattern:

function doSomething() {

    // width and height must be defined for the loops below:
    var w = imageObj.width;
    var h = imageObj.height;
    var canvasData = context.getImageData(0, 0, w, h);

    for (var x = 0; x < w; x++) {
        for (var y = 0; y < h; y++) {
            var idx = (x + y * w) * 4;
            var r = canvasData.data[idx + 0];
            var g = canvasData.data[idx + 1];
            var b = canvasData.data[idx + 2];

            // do something to r, g, b here

            canvasData.data[idx + 0] = r;
            canvasData.data[idx + 1] = g;
            canvasData.data[idx + 2] = b;
        }
    }

    // draw it back out to the screen:
    context.putImageData(canvasData, 0, 0);
}

Notice the three arguments to putImageData(). The final two args are the x and y position at which to draw the image. If you forget those two args, expect errors.


3. You can draw offscreen by simply creating a canvas element programmatically. Like this:
imageObj = new Image();
imageObj.src = "http://localhost:4502/content/lena.png";

// Call this only after imageObj has loaded (e.g., from its onload
// handler), or the width and height will be zero:
function getOffscreenContext(imageObj) {
    var offscreenCanvas = document.createElement("canvas");
    offscreenCanvas.width = imageObj.width;
    offscreenCanvas.height = imageObj.height;
    var offscreenContext = offscreenCanvas.getContext("2d");
    offscreenContext.drawImage(imageObj, 0, 0); // keep a copy of the image offscreen
    return offscreenContext;
}

If you use this function (or one like it), you can keep an offscreen copy of your image around, which can be extremely handy.

4. You can save programmatically created/modified images offline. The trick is to slurp the canvas into a data URL and then open or display that URL in a new frame or window where you can right-click it to get the usual image-save options from the browser. Something like this:

myImage = canvas.toDataURL("image/png"); 
window.open( myImage ); // opens in new window as a PNG

This serializes the image as a (big, huge) data URL, then opens the image in a new window. The new window contains a PNG image, plain and simple.

5. Any time you assign a value to canvas.width or canvas.height, you will wipe the canvas clean! This is both weird and handy. Just doing canvas.width = canvas.width will instantly erase the canvas.


6. When all else fails, consult the HTML 5 Canvas Cheatsheet.

A hideous anti-immigrant attack

Dhammika Dharmapala is a law professor at the University of Illinois. My friend David Agrawal at UMich (one of our strongest job candidates this year) says that Professor Dharmapala is "the reason I'm an economist."

So I'm sad and disgusted to report that Professor Dharmapala was slashed in the throat yesterday, in what is pretty clearly a hate crime. Fortunately, and somewhat miraculously, Professor Dharmapala will live, and is making a faster-than-expected recovery. But that does nothing to diminish the awfulness of the attempted murder.

Here are the details:

Joshua Scaggs, 23...has been charged with attempted murder and two counts of aggravated battery, alleging he slashed the throat of Anurudha Udeni Dhammika Dharmapala, 41, of Champaign at the Illinois Terminal on Wednesday morning...
A male witness told police the men were both seated in the waiting area when one man suddenly jumped up and shouted that this was his country and attacked Dharmapala.
The attacker, later identified as Scaggs, then grabbed Dharmapala around the neck and appeared to be choking him. He then forced the victim to the floor. 
The witness intervened by pulling the attacker off Dharmapala. The witness then noticed that the attacker was holding a utility knife and the victim was bleeding. 
Ziegler said Dharmapala was waiting to take a train to Chicago. He had no information on why Scaggs may have been there. Police recovered the box cutter believed used to injure Dharmapala. He said Scaggs had another folding knife in his pocket.
So this guy Scaggs went out with a box-cutter and a folding knife, obviously intending to attack someone. He sees a random non-white guy at a train station, jumps up, screams "I want my country back," and cuts the guy's throat. Bizarrely, the crime is not being prosecuted as a hate crime. If that's not a hate crime, what is?!

Hate crime or no, Scaggs will certainly rot in jail, as he deserves. But I hope I'm not alone in thinking that this kind of attack is a very bad sign for America in general. For two reasons.

First of all, as I've written before, I believe that racial animosity is wreaking havoc on this country's political process. Tribal animosity makes people paranoid that any government policy represents an attack on their group by another group. And so you hear people blaming "those lazy [insert nonwhite race here]" for the financial crisis and the recession, which reduces our ability to fight the recession with government policy. And you hear people saying that government spending is racial redistribution...so our roads and bridges and research institutions crumble and decay.

But taking a longer view, I believe that immigration - particularly from Asia - is key to our nation's economic success. In a globalized world where companies choose their locations based on access to large domestic markets, having a dense population will be important. Also, high immigrant fertility is the only reason why our country is managing to avoid the demographic disaster looming over East Asia and Europe. Finally, immigrants are a tremendous source of entrepreneurship.

So the idea that non-white immigrants are "taking America away" from whites is by far the most pernicious force in America today. This idea is slowly but steadily making itself an unwelcome fixture in our public discourse.

Now the reason I am saying this is not to make political hay, or to lay blame for this attack on anyone but the perpetrator. It is simply to point out that the murderous hate crimes of a few isolated psychos are not simply isolated and independent random events. They are warning signs of larger forces of hate lurking within our society, corrosively eating away at the foundations of our national polity.

Anyway, best wishes to Professor Dharmapala for a speedy recovery.

How Google is quietly killing Firefox



After a certain period of time spent programming, you develop a kind of sixth sense about what programs are doing. Surprisingly often, this sixth sense turns out to be right. I don't know if what I'm about to say is right. I do know that my sixth sense is telling me something.

I've been a Firefox user for years, and I still like Firefox, the way I still like my 1998 Jeep Grand Cherokee (with the cast-iron six-cylinder engine) even though it's not the latest-and-greatest model. Lately, though, Firefox has been freezing and/or quitting unexpectedly with greater and greater frequency, even though my browsing habits haven't changed.

What's changed over the past couple of years? The Web. Web content has become more and more dynamic, more AJAX-driven, more JavaScript-intensive.

JavaScript is a great language, but like a lot of languages these days it relies on programmers being careful about how they manage runtime objects. It's surprisingly easy, if you manipulate the DOM a lot, to generate code that leaks memory. See this nice writeup for more info (also see https://developer.mozilla.org/en/Debugging_memory_leaks).
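To make the pattern concrete, here's a hypothetical sketch of the kind of leak I mean. The names are made up, and the array of numbers stands in for a big DOM fragment or AJAX payload; the point is the shape of the bug, not the specifics:

```javascript
// A made-up "live widget" that re-renders on every AJAX update. Each render
// registers a fresh handler that closes over that update's payload, and no
// handler is ever unregistered; every old payload therefore stays reachable,
// and memory grows without bound.
var handlers = [];

function renderUpdate(update) {
    var payload = new Array(10000).fill(update); // stands in for a big object
    handlers.push(function () { return payload.length; }); // retains payload
}

for (var i = 0; i < 200; i++) {
    renderUpdate(i); // 200 "updates": 200 payloads, all still alive
}

// The fix is symmetrical: remove or overwrite the old handler (e.g.,
// handlers.length = 0, or removeEventListener in real DOM code) before
// installing a new one.
console.log(handlers.length + " retained closures"); // "200 retained closures"
```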

My contention is that most AJAX code leaks memory like a sieve, and this is why more and more users are seeing their browsers (not just Firefox, but IE, Safari, and Chrome as well) freeze up and die in normal operation these days. The various browsers differ in how they manage memory and how they do garbage collection. Chrome, in particular, has undergone significant changes recently in how it handles garbage collection. This is no accident. It's in response to the greater challenges imposed on all browsers today by dynamic web pages.

When I leave AJAX-intensive web pages open all day in Firefox, I eventually find that Firefox is using 1.3 gigabytes or more of RAM. Usually, by the time it reaches 1.4 gigabytes, the browser freezes (goes white) and then either dies outright, or unfreezes again after 30 or 40 seconds. If it manages to unfreeze, usually about 60 megabytes of RAM have been freed up (according to Task Manager). But then memory usage marches upwards again and it freezes (goes white) again, within seconds. The freeze/unfreeze cycle continues until Firefox unceremoniously crashes.

My programmer's sixth sense tells me that when memory usage exceeds a certain level, Firefox lacks sufficient headroom to carry out a proper garbage collection cycle. Partway through the cycle, it runs out of memory and initiates another GC cycle. This repeats until the program is in what one might call a GC panic. Uncommanded program termination is the inevitable result.

Firefox is often roundly criticized for its tendency to "leak memory," but I would caution that it is not really the core program that is leaking memory. It's really the AJAX code running in JavaScript-intensive pages that's causing the memory leakage.

So ironically, Firefox's reputation is suffering not because of anything Mozilla's programmers are doing, but because of web developers who are using jQuery and tons of other popular libraries indiscriminately, without regard to memory leakage. (Be sure to have a look at this blog post about jQuery's role in memory leakage.)

I haven't done careful testing, but I can tell you (from daily experience) that if I leave Gmail open in Firefox all night, Firefox will run out of memory by morning. If I leave Facebook and Twitter open as well, I can count on running out of memory in just a few hours.

Now here's where it starts to get really troubling.

Mozilla's greatest revenue source today (accounting for more than 80 percent of annual income) is Google. Mozilla is deeply dependent on Google for operating revenue. And yet, it is in direct competition with Google for browser market-share. Recently, the Firefox and Chrome adoption curves crossed, with Firefox now lagging behind Chrome for the first time.

There's a huge conflict of interest here. If you buy the theory that most people who abandon Firefox do so because it crashes (runs out of memory) unpredictably, it stands to reason that all Google has to do to pick up market share in the browser world is publish AJAX-intensive web pages (Google Search, Gmail, Google Maps, etc.) of a kind that Firefox's garbage-collection algorithms choke on — and in the meantime, improve Chrome's own GC algorithms to better handle just those sorts of pages.

That's exactly what seems to be going on. Or at least that's what my gut says.

And my gut is sometimes right.

Hoover Institution recommends Hooverite policies


Via John Taylor, today's "dog bites man" story:
Why has the recovery been so slow? What can we do about it? Alan Greenspan, George Shultz, Ed Prescott, Steve Davis, Nick Bloom, John Cochrane, Bob Hall, Lee Ohanian, John Cogan and I recently met at the Hoover Institution at Stanford to present papers and discuss the issue with other economists and policy makers including Myron Scholes, Michael Boskin, Ron McKinnon and many others...In sum there was considerable agreement that (1) policy uncertainty was a major problem in the slow recovery, (2) short run stimulus packages were not the answer going forward, and (3) policy reforms that would normally be considered helpful in the long run would actually be very helpful right now in the short run.
Wow, shocking. The recession is Obama's fault for being a crypto-socialist, stimulus doesn't work, and the rich should get tax cuts. Who would have ever guessed that this team of mavericks would reach such a startling conclusion?

But I kid. Actually, the story of the Hoover conference is a little more interesting. It seems to have been pretty evenly split between people who simply re-asserted the standard conservative line, and people who supported either Keynesian solutions or an end to Republican obstructionism, but whose conclusions were spun in the writeup to fit the conference's (or Taylor's) preferred conservative policy line. So let's look at the specifics of what was said, as reported by Taylor.

First, George Shultz:
George Shultz led off by arguing that diagnosing the problem and thus finding a solution was extraordinarily important now, not only for the future of the United States but also for its leadership around world.
Mmm, you don't say...
Tax reform, entitlement reform, monetary reform, and K-12 education reform were at the top of [Shultz's] pro-growth policy list.
Because if we haven't diagnosed the disease, we might as well recommend that the patient drink lots of fluid and get plenty of exercise. That seems to be the idea here. Actually, I am pretty cool with that, since I like it when people admit how much we don't really know. It shouldn't be interpreted as blaming "uncertainty" or calling for austerity, though.

On to Alan Greenspan:
Alan Greenspan presented empirical evidence that policy uncertainty caused by government activism was a major problem holding back growth, and that the first priority should be to start reducing the deficit immediately; investment is being crowded out now.
Wow, empirical evidence that policy uncertainty is holding back growth? Show, us, please! Sadly, Greenspan's evidence appears to be proprietary, and only available to people who pay Greenspan Associates for the privilege of hearing that Obama's crypto-socialism is crippling the economy. Whereas the rest of us poor folks are forced to post all our evidence to the contrary online, for free. 

Nick Bloom, by contrast, actually does have some evidence. With Scott Baker and Steven Davis, he constructs a measure of policy uncertainty that matches historical events like 9/11, and then shows that this measure has spiked recently as well. That is well done! Of course, it's not certain which way the causality runs; large economic crises necessitate large policy responses, and there will probably be uncertainty as to what those responses will be. But anyway, Bloom appears to have produced by far the best available study on the role of uncertainty, and he should be applauded for this. Note: John Taylor fails to mention that, according to Bloom's measurements, the main sources of uncertainty in the current recession have been A) Republican brinksmanship over the debt ceiling, B) Europe, and C) efforts to sue Obama's health care bill out of existence...

Anyway, onward! Next up we have Ed Prescott:
Ed Prescott had the most dramatic policy proposal which he argued would cause a major boom and restore strong growth. He would simultaneously reform the tax code and entitlement programs by slashing marginal tax rates which would increase employment and productivity.
This line actually made me laugh out loud. A "dramatic policy proposal"...cut tax rates for the rich! Ed Prescott, you maverick, you.

But now I come to a presenter with a very different paradigm...Robert Hall, whom a professor in my department once called the "greatest macroeconomist working today":
Bob Hall argued that fiscal policy was not working, and focused on alleviating the zero lower bound constraint on monetary policy. 
This phrasing makes Hall sound like an opponent of fiscal policy. But actually, the exact opposite is true! Hall is one of the most eminent "Keynesians" in the field, a big proponent of government expenditure as a way to get out of recessions. If you don't believe me, read this paper he wrote on fiscal stimulus. In fact, I was pretty surprised to see his name on the Hoover conference list, given this fact.

So why is Hall now saying that "fiscal policy 'was' not working"? Well, what he almost certainly means is that most of Obama's ARRA stimulus came not in the form of government purchases of things like infrastructure, but as tax credits and grants to the states, both of which were promptly saved rather than spent. This is a point that has been made by, among others, Paul Krugman and John Taylor.

So what Hall actually said at the Hoover conference was almost certainly "Congress should borrow money and buy more infrastructure, but since it appears unwilling to do so, the Fed should print money and buy financial assets." In other words, pretty much the standard Keynesian line (Update: via Paul Krugman, I find out that Hall's position is that Obama's stimulus did help make the recession less severe, and that the stimulus should have been larger). As for John Taylor himself, the paper he presented at the Hoover conference was the one I discussed here, which said pretty much the same thing that Hall said - stimulus spending should have been more about spending on infrastructure, and less about handing people blocks of cash that they promptly stuck under their mattresses.

Which is a good point, but not really an argument for austerity.

Finally, there was Lee Ohanian:
Lee Ohanian showed that unemployment remained high in part because of restrictions on foreclosure proceedings which increased search unemployment by allowing people to stay in their homes for longer periods of time.
That's kind of interesting, actually.

Anyway, let's sum up. What we have here appears to be a conference to which the Hoover Institution invited A) prominent conservatives (Greenspan, Prescott, Cochrane, and Ohanian), and B) people who happened to be sitting next door at Stanford (Bloom, Hall, and Taylor), with an eye to reiterating and affirming standard conservative policy prescriptions: austerity, tax cuts for the rich, etc. What they got wasn't quite that, but it was close enough that the dissenting voices could be spun to sound as if they agreed with the party line. Not sure if it was someone at Hoover or just Taylor himself doing the spinning. But either way, the conference shows that even in relatively conservative circles, substantial deviations from the party line can't help but pop up. Put enough smart people in the room, and at least a couple smart things will probably end up getting said.

Update: Paul Krugman thinks John Taylor heavily spun the conference results. Taylor begs to differ. In particular, Taylor says that things were discussed at the conference that were not contained in prior research by the presenters. That is, of course, usually the case at conferences; I am looking forward to seeing the discussion published.

I do have one quibble with Taylor, btw. In his new post, he writes:
Krugman claims that my summary mischaracterized the presentation of my Stanford colleague Bob Hall...As part of his presentation Bob said that now and going forward we should assume “no chance of conventional fiscal expansion; rather, possible cutbacks motivated by excessive federal debt.” That is why Bob focused his paper at the conference on monetary policy and the problem of the zero lower bound, and that was what all the discussion of his paper was about, rather than on his earlier work on the multiplier[.]
But in his original summary, Taylor wrote: "Bob Hall argued that fiscal policy was not working." (emphasis mine on both quotes)

"Not working" and "not politically feasible" are two very, very different things.

Update 2: Brad DeLong notices the same discrepancy between Taylor's posts.

Update 3: Menzie Chinn elaborates on how Nick Bloom's research supports the hypothesis that it is Republican fiscal brinkmanship, not Obama administration regulatory policy, that is causing uncertainty. Well worth a read.

What to teach intro economics students


Greg Mankiw has a good response to the walkout of his introductory econ course, which basically agrees with what I wrote. Peter Dorman, however, is not satisfied:
[T]here is a central narrative at the introductory level that has hardly changed in at least a generation, perhaps longer.  It presents a system of perfectly competitive markets composed of rational, unconnected agents as the benchmark, from which specific deviations, like externalities, behavioral anomalies, sticky prices, etc., are considered one at a time.  Most of the interesting and important work in economics is about these deviations.  If you added up all of this innovative research, you would have a composite picture that is exciting, relevant—and light years away from the introductory narrative. 
A huge gap has opened up between the introductory course and the work professional economists are actually doing.  Each departure from the narrative is considered one at a time, even though research has chipped away at all of them...Thus the introductory course still looks like a distillation of the research frontier, even though, if you put all the research results together, you would have something quite different.
First of all, I agree with Dorman completely regarding the research frontier. The whole notion of thinking of each interesting feature of the economy as a "friction," and then of considering only one or two "frictions" at a time, has been very detrimental. For one thing, it makes it hard to develop a useful model of the economy, since the actual economy contains many, many "frictions" (so many that the "frictions" together are usually more important than the "frictionless" dynamics that supposedly "underlie" them). Also, the "one friction at a time" approach makes it very difficult to generate any alternatives to the classical "core theory" of Walrasian general equilibrium. In fact, my main reason for disliking the DSGE modeling framework is that it is so unwieldy that it makes it prohibitively hard to introduce more than two "frictions."

But when it comes to intro economics - or, at least, intro macro - I don't see this dynamic in operation as much as Dorman does. Of course, I only know the classes I've taught (I've never taken an undergrad econ course). But I taught right out of Mankiw's book, and there was nothing particularly unusual about the curriculum. 

Most of what I taught was not based on a system of perfectly competitive markets. For business cycle theory, we taught the AD-AS model (beloved of monetarists like Scott Sumner), the supply and demand for money, the New Keynesian Phillips curve, and the old Keynesian Cross. RBC theories were never even mentioned in the lectures (although I gave students a brief overview of them in discussion section, much to their annoyance since RBC was not on the test!). In fact, there was nothing but frictions. The theories were so different that students often asked me "How do we know which of these models to use?" (provoking a laugh from me, since that is such an excellent and often-ignored question). The focus was always on market failures, and how governments should use policy to correct them.

In sum, intro macro really doesn't look like a distillation of the research frontier. Which is a good thing.

Actually, my beef with intro macro is different. I think there is a big chunk of the research frontier that intro courses completely ignore. I'm talking, of course, about empirics. The courses I taught had nothing to say about how we know if and when a theory is right. (Note: anyone who read that sentence and immediately started typing "But ALL theories are wrong!", please think very carefully about the concept of a "domain of validity" before you start spamming the comments. Thank you.)

In natural science courses, much of the focus is on empirics. It is critical to know when a theory is a good approximation of reality and when it is not, and looking at evidence is the only way to know this. So in high school physics, you roll a ball down a ramp and measure its position at different times to see how gravity works. In chemistry you dump some silver nitrate into some potassium chloride, and you watch the silver chloride precipitate out of the solution. In biology you look at cells under a microscope.

But in introductory macroeconomics this is not done. I never once gave students a data set and said "Here, regress Y on X". There was no reason I couldn't have. OLS is easy to do, and easy to explain (simple least squares is very intuitive and can be shown with a picture). Sure, intro students aren't going to be coding DSGE models in Matlab or converging MLE routines for structural models. But doing some simple empirics would give students a feel for how economists test their theories. It would give them a hands-on feel for data. And it would allow lecturers to explain why certain statistical techniques can lead to false certainty (i.e. the Lucas Critique).
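For instance, fitting a line by least squares takes only a dozen lines of code. Here's a minimal sketch (the data points are invented purely for illustration):

```javascript
// Minimal ordinary least squares with one regressor: fit y = a + b*x
// by minimizing the sum of squared residuals. The data are made up.
function ols(x, y) {
    var n = x.length;
    var mx = x.reduce(function (s, v) { return s + v; }, 0) / n;
    var my = y.reduce(function (s, v) { return s + v; }, 0) / n;
    var sxy = 0, sxx = 0;
    for (var i = 0; i < n; i++) {
        sxy += (x[i] - mx) * (y[i] - my);
        sxx += (x[i] - mx) * (x[i] - mx);
    }
    var b = sxy / sxx;   // slope
    var a = my - b * mx; // intercept
    return { intercept: a, slope: b };
}

var fit = ols([1, 2, 3, 4], [3.1, 4.9, 7.2, 8.8]);
// fit.slope is 1.94, fit.intercept is 1.15: the "true" line y = 1 + 2x
// plus a little noise is recovered reasonably well.
```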

As it stands, students in introductory economics courses walk away feeling that econ theories are "received wisdom" - that a theory is just something that a smart guy dreamed up, and then concluded was right because it sort of seemed plausible (which, sad to say, describes some econ theories all too well). And - fortunately for American society - we have somewhat of an aversion to received wisdom. We call it by the name of "bullshit." And rightly so! As Feynman said, "Science is a belief in the ignorance of experts."

So I say, if you want to give introductory economics students a better picture for what the science is really good for, teach them the part that links theory to reality. 

Harrison & Kreps 1978: The power of irrational expectations


In the past few weeks I've had discussions with several different people about why financial markets are different from normal markets. I've come to realize that there is a very deep and fundamental fact about financial markets that almost nobody in the lay public - and a good chunk of people in the finance industry itself - understands. And that fact just happens to have implications that also shake the foundations of modern macroeconomics.

So it's time for another edition of Papers You Should Know. Today's paper is "Speculative Behavior in a Stock Market With Heterogeneous Expectations" by J. Michael Harrison and David M. Kreps (Quarterly Journal of Economics, 1978).

Before we talk about Harrison & Kreps, we need to understand why financial markets are so weird. In a nutshell, it's this: In normal markets, people who know exactly what they are trading will often still be willing to trade. In financial markets, people who know exactly what they are trading will almost never be willing to trade. For a quick explanation of why this is true, I turn to Brad DeLong:
In a standard economic transaction, it is no mystery where the value to both sides comes from. When I buy a double espresso from Café Nefeli for $2.25, the coffee is more valuable to me than $2.25 is...The sources of the gains from trade are obvious. 
But in finance neither side is getting useful commodities. Instead, both sides are trading away claims to a pile of money and getting claims to a different pile of money in return. So how is it that me selling this pile of cash I have to you for that pile of cash that you currently own can be a good idea for both of us? Doesn't one of the piles have to be bigger? And isn't the person who trades the bigger for the smaller pile losing?
Think about that for a second. If I offer to sell you a share of stock for $10, why would you buy it? Well, you might buy it because you want to diversify or otherwise adjust your investment portfolio (to give you, say, more risk or longer maturity). But - let's be frank - chances are you'd buy it because you think that sometime in the future it'll be worth more than $10. Right? So now ask yourself: If I (the seller) also thought it would be worth more than $10 in the future, why the heck would I sell it to you for $10???

The answer is: I wouldn't. If someone offers to sell you a stock for $10 and tells you it's going to go up up up, in general don't buy it. Because if the guy selling you the stock really believed his own rosy prediction, he'd keep the stock for himself. The only time you should buy the stock is when you have good reason to believe that you are smarter or better-informed than the guy who's selling - in other words, if you think he's a sucker. Of course, he's only selling to you because he thinks you're a sucker.

So what we see is that while normal markets consist of people making trades because they have different preferences, financial markets mainly consist of a bunch of people with the same preferences all trying to sucker each other. This fact is called the "No-Trade Theorem," and economists have known about it for a long time. 

So in 1978, J. Michael Harrison and David M. Kreps decided to write down a model of a financial market in which everyone was trying to sucker everyone else. As far as I know, this was the first model of its kind. Even though few people believe that the model describes exactly how the real world works, it has become enormously influential, because it gives an idea of what the world would have to be like in order for us to observe the enormous volumes of financial trading that we actually see happening.

Briefly, the model works like this: Different people have different beliefs about the fundamental value of an asset (i.e., how much money the asset will pay them). One person thinks it'll pay A if the economy is good and B if the economy is bad. The other person thinks it'll pay C if the economy is good and D if the economy is bad. They know that they have different beliefs, they know what other people's beliefs are, and they "agree to disagree." So they are willing to trade; each one thinks the other one is a sucker.

So what price do they trade at? Well, you might think that the most bullish investor (i.e. the person who thinks it's worth the most) would just set the price. But actually, the price ends up being even higher than the most bullish person thinks it's worth! The reason is that the asset has resale value. You can buy it, collect some money from it, and then when the economy changes, you can sell it at a profit to someone who is more bullish than you are. So the price ends up being the sum of the asset's fundamental value and its resale value. Ta-da: Speculation!
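That logic can be made concrete with a small numerical sketch. The numbers below are illustrative (a standard two-state textbook parameterization, not from the original paper): a dividend of 0 in the bad state and 1 in the good state, a discount factor of 0.75, and two belief types with different subjective transition probabilities. Each type's "buy and hold forever" valuation solves p = β·P·(d + p); the market price with resale solves the same equation except that, in each state, the continuation value goes to whichever type is currently most optimistic.

```python
# Illustrative Harrison-Kreps-style sketch with made-up numbers.
# Two states: 0 = "bad economy" (dividend 0), 1 = "good economy" (dividend 1).
d = [0.0, 1.0]
beta = 0.75  # discount factor

# Each investor type has its own subjective transition probabilities,
# and the types "agree to disagree" about them.
P_a = [[0.50, 0.50],
       [2/3,  1/3 ]]
P_b = [[2/3,  1/3 ],
       [0.25, 0.75]]

def hold_forever_price(P):
    """Fundamental value for one belief type: fixed point of p = beta * P @ (d + p)."""
    p = [0.0, 0.0]
    for _ in range(2000):  # contraction mapping, so iteration converges
        p = [beta * sum(P[s][t] * (d[t] + p[t]) for t in range(2))
             for s in range(2)]
    return p

def resale_price():
    """Price when the asset can always be resold to whoever is currently
    most optimistic: p(s) = beta * max over types of E[d + p | s]."""
    p = [0.0, 0.0]
    for _ in range(2000):
        p = [beta * max(sum(P[s][t] * (d[t] + p[t]) for t in range(2))
                        for P in (P_a, P_b))
             for s in range(2)]
    return p

p_a = hold_forever_price(P_a)   # type a's valuation, by state
p_b = hold_forever_price(P_b)   # type b's valuation, by state
p_star = resale_price()         # market price with resale

print("type-a valuation:", p_a)
print("type-b valuation:", p_b)
print("market price:    ", p_star)
```

Running this, the market price comes out strictly above *both* types' buy-and-hold valuations in *both* states: the gap is exactly the resale value, i.e., the option to unload the asset on whoever is more optimistic after the state switches.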

Now, like I said, nobody really thinks this is exactly how things work. For one thing, people should learn over time - as the asset pays out money, people should update their beliefs. But Harrison & Kreps is not about describing the world; it's about exploring ideas. And the basic idea they explored has become one of the most powerful in all of financial economics. Since 1978, a number of authors have taken this basic notion of investor overconfidence and tried either to make it more realistic or to find it in the data. A few prime examples include Scheinkman & Xiong (2003), who put the overconfidence idea into a modern asset pricing model; Barber & Odean (2001), who find evidence that individual investors are overconfident and trade too much; and a number of "heterogeneous prior" models that allow people to learn as they go. All of these models owe something to the pioneering work of Harrison & Kreps.

But anyway, all that is preamble. This is actually a post about macroeconomics.

You see, the No-Trade Theorem says that financial markets shouldn't have a lot of trading. But we see a LOT of trading in these markets. And Harrison & Kreps showed that those trades are best explained by irrational expectations.

So what does that say about macro? Since the late 70s, nearly all of the models used by macroeconomists have been "rational expectations" models. "Rational expectations" is the idea that people don't make systematic mistakes when predicting the future. If you think that sounds a bit silly, you're not alone, but I kid you not when I say that rational expectations absolutely dominates modern macro.

But if expectations aren't rational in financial markets, why should they be rational in the economy as a whole? The answer is that they shouldn't. This is why Thomas Sargent, who won the Nobel Prize this year and who helped develop the theory of rational expectations, calls himself a "Harrison-Kreps Keynesian." Keynes, though he is usually associated with the idea of fiscal stimulus, was a professional stock speculator, and he clearly perceived the irrationality of the markets in which he participated; Sargent is merely recognizing that financial market irrationality, which was formalized by Harrison and Kreps, is a huge hint that rational expectations is not going to get the job done in macro either.

Pioneering macroeconomists like Sargent have spent a long time hacking through the wilderness of non-rational-expectations models. It is a daunting task, since there are infinitely many ways in which people could be irrational. Rational expectations lends itself to pure logical deduction - you can kind of just sit there and figure out how people should act. But to figure out how expectations really form, you need to get your hands dirty with things like lab experiments and careful empirical work.

But we really have no choice, if we want to understand the economy as it actually exists. As David Glasner says, "expectations are fundamental"; we can't afford to treat the process of human belief formation as an afterthought. That is the insight of Harrison and Kreps, and macro as a whole needs to take it to heart.

Update: Also see this article from today's NYT on Tom Sargent and Chris Sims (this year's other Nobelist). Both believe that modeling irrationality, as it exists in the real world, is the way to go.

Update 2: I just realized that a better title for this post would have been "Why is this market different from all other markets?" Ah, the "stairway wit"...