“The Signal and the Noise” by Nate Silver
Subtitled “Why so many predictions fail – but some don’t”, Silver’s book is probably one of the only good “pop” statistics books out there. Silver has an engaging style that keeps even the informed reader alert, and brings philosophically profound concepts – like Bayesian reasoning – to the layman. I put “pop” in scare quotes because I want to deter immediate comparisons to Freakonomics or Blink. In the sense that the book is written by a veritable master of a field – rather than by engaging writers dabbling in curious, mind-bending topics – I’m more inclined to compare Silver with Thomas Schelling in Micromotives and Macrobehavior.
As this book has been thoroughly reviewed, I want to frame my response in the context of remarks from Kaiser Fung and Cathy O’Neil, both via a post by Andrew Gelman (who, at least as far back as December of last year, had not read the book).
In short, the book is an investigative journey through the fallibility of human prediction, from economics and earthquakes to the environment. The thesis is at its core hopeful, drawing a silver lining to human error in the form of humility, doubt, and above all Thomas Bayes. But Silver never sells any such heuristic as a panacea for chaos and uncertainty, and is himself very measured in his promotion of a metasolution. I would say he follows his own rules in his 450-page forecast for forecasting.
Since I’m overall very positive about the book, let me start with the cons. Fung (and many others) note that Silver does a wonderful job bringing attention to the Bayesian worldview. They go on to suggest he might have oversold the concept; I see it another way. He knows that most statisticians would be furious if he started with the tautological identity P(A|B) = P(A)P(B|A)/P(B), which in its simplicity desalinates Thomas Bayes’ philosophical leap into sterile mathematics for the lay reader. But he takes the abstraction a little too far. While we see the extended formula – through a woman updating her prior that her husband is cheating – there’s next to no explanation of why it holds (we never see the simple form, or hear about the Law of Total Probability). In fact, I’m not sure that removing every reference to Bayes would take much from his thesis, as the idea of “updating a prior” is not, per se, contingent on probability.
For a book that purports to (and for the most part does) tell us why the ghost of Thomas Bayes rules the world, the dearth of precise explanation of the mechanics is damaging. Indeed, the mechanics are crucial to understanding why holding (near) absolute priors makes further revision against evidence (near) impossible – which is the bane of every failed forecaster.
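To make those mechanics concrete, here is a minimal sketch of the update in Python. The numbers are illustrative stand-ins of my own (a small prior that a spouse is cheating, a high likelihood of the incriminating evidence if so, a low likelihood otherwise), not necessarily the book’s exact figures; the point is simply how the Law of Total Probability supplies the denominator, and why a (near) absolute prior barely moves no matter the evidence.

```python
# Bayes' rule, with the Law of Total Probability in the denominator.
# All numbers below are illustrative, not Silver's.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence)."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# A modest prior moves substantially on strong evidence...
print(bayes_update(prior=0.04, p_evidence_if_true=0.5, p_evidence_if_false=0.05))   # ~0.29

# ...while a (near) absolute prior barely budges on the same evidence.
print(bayes_update(prior=1e-6, p_evidence_if_true=0.5, p_evidence_if_false=0.05))   # ~1e-5
```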
Another minor quibble – which is in all honesty dominated by Silver’s clarity and style – is that the book doesn’t always feel very “together”. With the exception of a few stray remarks, each chapter can be read independently of the others, as each tells the story of forecasting in a different discipline. The story behind Bayes is then carried by Silver explicitly reminding us of its power rather than by a natural flow throughout. Again, this is minor and included mostly so that you form a prior in favor of my impartiality!
Now to O’Neil’s review. The analytical counterpart of missing the signal for the noise is losing the forest for the trees (or twigs), and I think that accounts for a large part of it. Don’t get me wrong – I respect her and love her blog – but I just can’t understand how O’Neil concludes that “Nate Silver confuses cause and effect [and] ends up defending corruption”:
The ratings agencies, which famously put AAA ratings on terrible loans, and spoke among themselves as being willing to rate things that were structured by cows, did not accidentally have bad underlying models. The bankers packaging and selling these deals, which amongst themselves they called sacks of shit, did not blithely believe in their safety because of those ratings.
Rather, the entire industry crucially depended on the false models. Indeed they changed the data to conform with the models, which is to say it was an intentional combination of using flawed models and using irrelevant historical data […]
In baseball, a team can’t create bad or misleading data to game the models of other teams in order to get an edge. But in the financial markets, parties to a model can and do.
This just doesn’t make sense to me, because Nate Silver a) shows that the models were so ridiculously stupid that a child could find their flaws and b) accepts that Wall Street financiers are smart. The only conclusion is that the only reason to keep playing the fool with these models was to “keep the music playing”, so to speak. No sane reader could finish the first chapter without feeling disgust towards the rot in the financial system. Nor is Silver happy about the bank bailouts that let AIG get away with murder and more.
The purpose of chapter one wasn’t even to “excuse” finance in any meaningful way, just to explain the fallibility of human models and, yes, exuberance. Silver is explaining why the models suck. O’Neil is getting angry that people were using models that suck. It’s not as if his “expertise” in finance is – in writing a layman’s book – any less than his intuition for earthquakes or the environment, so O’Neil’s dismissal of his knowledge seems unfair. Especially when divorced from the point of his book, which has nothing to do with finance to begin with. Reading the review, one might not be so sure.
Anyway, Silver’s basic point seems to be the criminality of using bad models and having confidence in their success – so, almost tautologically, he hates the way Wall Street worked. I get the feeling O’Neil wanted him to bring his emotions, and other largely irrelevant topics, into the discussion, but that misses the point of the book entirely.
And I’m not sure how other readers feel – this is up to interpretation, and I fully believe that O’Neil got this sense – but for what it’s worth I don’t think this is right:
I’m not criticizing Silver for not understanding the financial system. Indeed one of the most crucial problems with the current system is its complexity, and as I’ve said before, most people inside finance don’t really understand it. But at the very least he should know that he is not an authority and should not act like one.
Personally, I never got the sense that Silver was an “authority” in the field, nor did he ever claim to be. Again, a lot of this is up to subjective reading, but I can’t see how someone reaches this conclusion concretely, especially when the chapter is merely an introduction to the way models can be used in the real world. Much of his explanation of the failures even derives from a professor at the University of Chicago who teaches a course on the financial crisis – a veritable expert. I can’t help but feel that O’Neil and others felt distaste from the start because Silver introduces Larry Summers without a string of qualifying epithets.
Of all the subjects discussed in the book, economics was the most familiar to me. (Which is not saying much for a kid right out of high school, to be fair.) And perhaps not surprisingly, I had the most trouble with his discussion of economic forecasting, particularly predicting recession: Chapter Six – “How to Drown in Three Feet of Water”. He mentions several times that professional forecasters failed to flag recession as a serious possibility even after the United States was officially in a downturn, and considers this mostly a failure of overconfidence or bad modeling, much in tune with the rest of the book.
But I think it’s a disservice not to consider the epistemological impossibility of forecasting a recession, quite contrary to the thrust of this passage:
In September 2011, ECRI predicted a near certainty of a “double dip” recession. “There’s nothing that policy makers can do to head it off,” it advised. “If you think this is a bad economy, you haven’t seen anything yet.” In interviews, the managing director of the firm, Lakshman Achuthan, suggested the recession would begin almost immediately if it hadn’t started already. The firm described the reasons for its prediction in this way:
“ECRI’s recession call isn’t based on just one or two leading indexes, but on dozens of specialized leading indexes, including the U.S. Long Leading index…. to be followed by downturns in the Weekly Leading Index and other shorter-leading indexes. In fact the most reliable forward looking indicators are now collectively behaving as they did on the cusp of full-blown recessions.”

There’s plenty of jargon, but what is lacking in this description is any actual economic substance. Theirs was a story about data – as though data itself caused recessions – and not a story about the economy.
Silver gets one thing absolutely right: this ECRI firm seems to be staffed by knaves, fools and, worse, a stupidly overconfident chief. That said, Silver’s dismissal of the emphasized idea – that data itself could cause recessions – shocked me, because that is precisely why a recession is impossible to predict.
Now, Silver agrees that the best forecaster is the one who gets it right. Rationally, the goal of any good forecaster is to be trusted and to serve as an important source of information for clients. As far as economic predictions go, the client is of course the free market. Here’s the problem: say I’m a trusted forecaster and I publish a report stating that the American economy will shrink by 4% next quarter. If people trust me, my report will have caused a recession. Why? Because businesses across the country will note that consumer demand will crash in three months, halt expansions, and disinvest from the economy into safer instruments like US Treasuries. The fall in investment will precipitate a contraction of nominal spending and hence aggregate demand. Therefore, the market cannot believe that we are on the cusp of recession without actually being in one at that point.
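To make that chain of reasoning concrete, here’s a toy sketch – entirely my own construction, with made-up numbers, not anything from the book – in which realized growth depends on the growth firms expect, and firms’ expectations depend on how much weight they give the published forecast. An ignored forecaster changes nothing; a trusted one drags the outcome toward their own prediction.

```python
# Toy model: a trusted recession forecast helps produce the contraction it predicts.
# All parameters are illustrative assumptions, not estimates.

def realized_growth(baseline, forecast, trust, sensitivity=0.8):
    """Growth that materializes once firms act on what they expect.

    `trust` in [0, 1] is the weight the market places on the published
    forecast; `sensitivity` is how strongly investment (and hence nominal
    spending) responds to expectations."""
    expected = trust * forecast + (1 - trust) * baseline
    return baseline + sensitivity * (expected - baseline)

print(realized_growth(baseline=0.02, forecast=-0.04, trust=0.0))  # ~ 0.02: nobody listens, nothing changes
print(realized_growth(baseline=0.02, forecast=-0.04, trust=1.0))  # ~-0.028: the trusted call drags the economy down with it
```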
Of course, when an idiotic firm makes such a prediction no one will bat an eye, because no one trusts that firm – but that isn’t Silver’s point. I think readers lose a gem of an example here, because rarely is a prediction so intellectually contradictory as a recession forecast. Now, Silver does talk about self-fulfilling prophecies in another context entirely, noting that fear of an epidemic might prompt precautionary measures which undo the very thing feared.
But that’s not an epistemological flaw. We can’t possibly know that a recession is coming without actually being in one. Single individuals can, but ipso facto they can’t be trusted by the market as a whole. It’s not self-fulfilling at all; it’s like asking “are we there yet” after you’re already at the hotel.
The rest of the book is a smooth ride. Silver consistently packs it with anecdotes from interesting interviews, a slew of useful data, and a trove of fascinating references. I suspect many readers would have been thrilled to learn about Knightian uncertainty in the context of climate change, and I was somewhat surprised it was absent, but Silver does a lovely job of coordinating various viewpoints with all the important data there is.
Unfortunately, there is only so much room for books on one topic. That’s why reviewers like O’Neil think Silver – for reasons right or wrong – has done a disservice to his audience. I judge a book not by how good it is, but by how good it could have been. On a careful read, I don’t think Silver misses that gold standard by much. I consider myself fairly knowledgeable about these topics, but I still learned a ton. Most importantly, this book will (hopefully) inspire a new generation of toy forecasters and model tinkerers to approach the world with a probabilistic mindset and to relish uncertainty. Because it’s a fun book, and it doesn’t sell out on substance to get there. Even if Silver isn’t an expert on finance, he has a unique window into that world vis-à-vis the general public, for whom this book is intended.
Oh, and the most important takeaway is that KPMG should have fired Nate Silver a long time ago.
Five stars, and I’d be hard pressed to update my prior.