Wizards of Oz

"Life is fraughtless ... when you're thoughtless."

9.4.08

REVIEW: Taleb's "Black Swan"

After resting comfortably in my "anti-library" for many weeks, I recently plucked The Black Swan by Nassim Nicholas Taleb from my dusty nightstand. Since I was embarking on cross-continental flights (albeit with kids), I was looking forward to punctuating the drink-and-peanut monotony of Southwest Airlines (an airline woefully unequipped for flights longer than 90 minutes) with Taleb's insights.

Since my days as a civilian employee of the U.S. Navy, where I evolved from an aspiring systems engineer to a "Science Advisor" to a manager leading the "Red Team" at U.S. Joint Forces Command J9, I have been fascinated with the prospect of "adversarial surprise". Like most analytical efforts under the loose employ of the Pentagon (which has roughly one government civilian employee [tail] for every two active duty soldiers/sailors/airmen/Marines [tooth]), this was a cottage industry.

Taleb's insights echo many of our observations in the Joint Experimentation program, particularly regarding the hubris of intellectualism. His skepticism of inductive logic, his emphasis on the importance of context in perceiving information, and his lionization of Doktor Prof. Sir Karl Raimund Popper (whom I had the pleasure of driving from leland stanfurd junior u. to Cal some 20 years ago in my Nissan Sentra) as well as Henri Poincaré are worthy of note.

However, his self-referential anecdotes are reminiscent of a Tolstoy novel, and his clear disdain for planning (née prediction) creates a scotoma that pulls him into the same abyss of solipsism that consumed David Hume.

The depth of his criticisms can be summarized quite succinctly as:
Don't use quantitative methods for qualitative questions.

Nature is benign, so we can ascribe a comfortable level of determinism to our observations. New data, often obtained through technological innovation, requires modification of obsolete theories (e.g., the Ptolemaic model of the universe giving way to the Copernican; Newton's Laws of Motion to Einstein's Special Relativity). Key to our understanding (though Taleb would probably insist we understand nothing) is the selection of appropriate parameters -- and not getting too enamored with our own theories, especially if they involve any vestige of "free will".

Fallible? You betcha! Yes, we are inclined to fool ourselves. Yes, we try to cram too many variables into our formulae in some vain hope that we'll "get it right". And yes, our institutions -- particularly financial ones -- tend to reward the wrong kinds of behavior (q.v. Prof. Clay Christensen's The Innovator's Dilemma, in which Clay digs into corporate failures vice successes, finding that Wall Street rewards bad behavior). But Taleb's diatribe against the folly of "epistemic arrogance" creates a confirmation bias of its own, and only casually addresses the issue of scale when considering complex topics.

I understand that I am straying far from the "anchor" of many blogfriends (John Robb, Art Hutchinson, General of the Hordes Subadei, ARHerring, zenpundit, Chet Richards) who have offered glowing praise for The Black Swan. Perhaps it's my naïveté (or perhaps that I'm a product of the California public school system), but I honestly don't see our civilization marching toward "Extremistan". Quite the opposite: While our awareness of remote events has increased, and our networks have grown exponentially, I believe that the diffuse topology of our networks actually dampens the impact of an extreme event.

Consider the "Butterfly Effect". Do you really think a butterfly flapping its wings in Jakarta is going to eventually cause a hurricane in New York City? Or do you think the minor perturbation is absorbed locally without cascading into some kind of resonance? Yes, there are examples that illustrate the dire consequences of unplanned resonance. Taleb (who waffles at the end of his book as half hyperskeptic, half intransigently certain) abandons the Gaussian bell curve, yet -- with only a single mention of Albert-László Barabási -- firmly embraces Power Law scale invariance as normative.

Despite Taleb's too-casual treatment of scale, I think he would agree with George E.P. Box's statement (c. 1987) that "...[A]ll models are wrong, but some are useful." Abandoning our dogmatic devotion to certainty is essential in any creative, innovative enterprise -- and can reveal hidden opportunities, and hidden abilities.

This requires that we reexamine how we define "success". In my adopted hometown of Oak Ridge, Tennessee, the best operators of the Calutrons (the electromagnets that separated uranium isotopes at Y-12 for the LITTLE BOY bomb during the Manhattan Project) were not the scientists from Berkeley who designed them, but seamstresses with no scientific training. And how many Americans would consider Tommy Franks or Norman Schwarzkopf the most successful U.S. commanders in the Mid-East? What about Tony Zinni (who didn't win a major theater war, but may have demonstrated even greater skill by avoiding one)?

While many of us point to 9/11 as a "Black Swan", I can say unequivocally that it had a far less dramatic effect on my life than Continental Flight 196 on March 6th, 1993. Could I have predicted when or how I would meet the woman who would become the mother of my children? Of course not.... But was I open to the possibility, and adaptive enough (when jabbed in the ribs by Helen from Purchasing to move up one row on that flight) to take advantage of this blessing?

That may be the best value of Taleb's Black Swan: to jar us out of our collective comfort zones, to remind us how ignorant we truly are, and to encourage us to "Be Prepared!" Good advice, regardless of whether you live in Mediocristan or Extremistan.

____
Update: Überblogger Zenpundit has graciously linked this review -- and will have his own review posted this weekend. (Thx Zen!)


4 Comments:

At 10/4/08 13:41 , Anonymous Anonymous said...

I don't think you are at all off the mark here. Anchors aweigh!

The greatest thing a book like this does for you is to make you think. That's why I found The Black Swan so valuable.

 
At 11/4/08 21:51 , Blogger Robert D. Brown III said...

I wrote the review of The Black Swan posted at Chet Richards' blog. I sincerely appreciate your perspective on the book here as well. But then, as now, I do believe The Black Swan is a valuable book, not so much for the discussion of power law scale invariance (which I fully recognize verges on hyperbole), but because of the extensive discussion of confirmation bias.

Failure to consider uncertainty explicitly or appropriately is another malpractice I frequently observe. If people do go so far as to consider the effects of uncertainty, they are often too narrow in their assessments, or they rely on symmetric characterizations and assumptions from which they default back to dealing with averages as proxies for uncertainties. As Taleb correctly points out, this is like assuming one can walk safely across a river that averages four feet in depth.
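
To put numbers on the river analogy (these depth figures are entirely made up, just for illustration), a quick sketch in Python:

    # Hypothetical depth soundings (feet) across a river channel.
    depths = [1.0, 2.0, 3.0, 14.0, 3.0, 3.0, 2.0]

    mean_depth = sum(depths) / len(depths)   # exactly 4.0 ft -- sounds wadeable
    max_depth = max(depths)                  # 14.0 ft -- the channel that drowns you

    print(f"average depth: {mean_depth:.1f} ft, deepest point: {max_depth:.1f} ft")

The average answers a question nobody is actually asking; the distribution answers the one that matters.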

In my work as a decision analyst, I am a frequent witness to the folly of "epistemic arrogance". Regardless of the several weaknesses in Taleb's book, I think his remonstrances against these shortcomings made it worth reading for his intended audience. But in retrospect, I should have dealt with some of the weaknesses in the book, too.

Let me also point out that Taleb does not throw out the Gaussian distribution entirely. He explicitly states that the Gaussian has a number of appropriate applications. His point (and I admit that it gets screechy at times) is that it is not always the best distribution to use when considering uncertainty. One just cannot assume Normality (both the mathematical variety and the more garden variety of assuming that tomorrow will be like yesterday). But I did not get the message that power law distributions should be normative. His message, as I read it, is that one should learn to discern which kind of distribution a given problem actually calls for.

By the way, I absolutely agree with you that increasing complexity more likely exerts a dampening effect on shocks rather than exposing fragility. The literature on brain trauma offers a great example: people can frequently experience damage to one locus of the brain, only to have its function taken over by some other area. I can't imagine anything much more complex than the human brain.

And finally, I agree that the allusion to the butterfly effect has become, in many cases, overworked to the point of being silly. Whenever someone brings it up, I usually resort to a question similar to yours.

 
At 14/4/08 12:36 , Blogger deichmans said...

ARH: Thanks - anchors aweigh indeed! :-)

Robert: Your comment is much appreciated, especially with the added commentary on your own original post. I wholeheartedly agree with your remarks on "uncertainty" -- as physics undergrads we had "error analysis" hammered into our heads by our professors. To this day I still frame my assessments in terms of "error bars" and "plus or minus" (with frequent looks of befuddlement from my colleagues).
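
For what it's worth, the "plus or minus" habit amounts to something like the following -- a minimal sketch assuming independent errors combined in quadrature, the standard first-order treatment from those undergrad error-analysis drills (the measurement values here are hypothetical):

    from math import sqrt

    def combine_in_quadrature(*sigmas):
        # 1-sigma uncertainty of a sum (or difference) of independent measurements
        return sqrt(sum(s ** 2 for s in sigmas))

    a, sigma_a = 10.0, 0.3   # first measured quantity and its error bar
    b, sigma_b = 4.2, 0.4    # second measured quantity and its error bar

    print(f"a + b = {a + b:.1f} +/- {combine_in_quadrature(sigma_a, sigma_b):.1f}")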

 
At 14/4/08 21:25 , Blogger Robert D. Brown III said...

Good decision analysis is, in part, measurement error propagation analysis.

 
