
Posts Tagged ‘black swan’

From a recent essay by NN Taleb:

Then we will see an economic life closer to our biological environment: smaller companies, richer ecology, no leverage. A world in which entrepreneurs, not bankers, take the risks and companies are born and die every day without making the news.

My question is… do we actually need to establish these rules, or is the “market” already enforcing them?  Media is a good example.  The media companies are getting smaller and more diverse, and they carry very little leverage compared with how the industry worked just ten years ago.

Read Full Post »

http://www.nytimes.com/2008/10/28/opinion/28brooks.html?_r=1&scp=2&sq=&st=nyt&oref=slogin

Free registration to read if not registered…

As postulated in this NYT.com article, there are four steps to every decision…

  1. you perceive a situation
  2. you think of possible courses of action
  3. you calculate which course is in your best interest
  4. you take the action
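Taken literally, the four steps amount to a trivially small program. A sketch (the function names, the umbrella scenario and the scoring rule are invented for illustration, not anything from the article):

```python
# A toy rendering of the four-step decision model, taken literally.
# Everything here (names, scenario, utility) is illustrative.

def decide(situation, possible_actions, utility):
    """Step 1: perceive; step 2: enumerate; step 3: calculate; step 4: act."""
    perceived = situation                      # step 1: assume perfect perception
    options = possible_actions(perceived)      # step 2: possible courses of action
    best = max(options, key=lambda a: utility(perceived, a))  # step 3: self-interest calculus
    return best                                # step 4: take the action

# Example: deciding whether to carry an umbrella.
choice = decide(
    situation={"rain_chance": 0.7},
    possible_actions=lambda s: ["umbrella", "no umbrella"],
    utility=lambda s, a: s["rain_chance"] if a == "umbrella" else 1 - s["rain_chance"],
)
print(choice)  # "umbrella"
```

Note that the model simply assumes step one is free and perfect, which is exactly where the trouble starts.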

&^+%$!!)*?<#!

If only it were that simple.

Over the past few centuries, public policy pundits, talking heads and some academicians have presumed that step three was the most important. Social science disciplines are premised on that presumption as well; despite the ink spilled propagating altruism at every opportunity, people calculate and behave in their own self-interest.

Greenspan’s remarks quoted in the above article made that clear, for his reign and for the country. His comments aside, none of the steps above is worth much without the others.

Most of the processing takes place without literal awareness. We behave, and when pressed for why, we generate a story that fits the situation and puts us in a virtuous light. We don’t really perceive all that well. Thus, the step that seems simplest is the most complex. Looking at and perceiving the world is an active process of symbolic meaning-making that shapes and biases the rest of the decision-making chain.

Psychologists have been exploring our biases for four decades with the work of Amos Tversky and Daniel Kahneman, and also with work by people like Richard Thaler, Robert Shiller, John Bargh and Dan Ariely. Now Brooks would have it that it is time for the economists to contribute. Gasp!

The desperation of the day may mean a new wave of behavioral economists and others next in line to bring pop psychology to the realm of public policy. These are the same pundits who used their antiquated assumptions to provide plausible explanations for why so many others are wrong about risk behaviors and the implications of globalization.

Take Nassim Nicholas Taleb, for instance. In his books “Fooled by Randomness” and “The Black Swan” he explains it all in a manner just as simplistic as the four rules above. As an astute colleague bluntly pointed out, we are asking the guy who coined the notion of “black swans” to predict black swans. The irony is laughable. What gives a black swan example its value is that it is not obvious [read: predictable]. If Taleb really saw it coming, as stated in the above article, that precludes it from being an example of a “black swan” phenomenon. Irony for sure.

When Taleb gets on the philosophical diving board to spring into evolutionary causation, decreeing that humanoid brains evolved to fit a less complex world, I found myself gagging instead of gasping. His examples of the perceptual biases that distort our thinking are themselves century-old prejudices.

1. Our tendency to see data that confirm our prejudices more vividly than data that contradict them

a. We recognize information through its relation to existing cues in our repertoire. We don’t see what we haven’t been reinforced to see; it is no more self-deception than it is self-enlightenment when what we see turns out to be correct. In those circumstances the correctness is based not on enlightenment but on relationships that were there all along, just not focused on, recognized or reinforced by the environment.

b. That environment is the same one where superstition, myth, magic, mind and phenomenalism are considered valuable parts of our “humanity” and, knock on wood, we sometimes guess right despite the reasons behind the guess.

2. Our tendency to overvalue recent events when anticipating future possibilities

a. The last six months are more like the next six months than the last 1,000 years are like the next six months.

3. Our tendency to spin concurring facts into a single causal narrative

a. If for no other reason, this site is a mainstay of the defeat of monocausality, which haunts our culture, bolsters our superstitions and keeps us surprised at regular intervals.

4. Our tendency to applaud our own supposed skill in circumstances when we’ve actually benefited from dumb luck.

a. We benefit from historical uniqueness and an education more than smattered with scientific skepticism, as opposed to boorish cynicism that ignores our strengths and panders to the voodoo in the caves.

b. See 1.-b above.

Errors of perception are everywhere when experimental analysis is NOT involved. Clearly, getting to our moon and beyond was due to experimental analysis, NOT the interpretation of pundits’ perceptions.

Without experimental analysis we’ll continually fail to perceive “what’s going on out there.” The relationships between a zillion things and another zillion things are too complex. While a four-point decision tree helps us walk across the street in a small town, it is not the way to figure out how to navigate the rules of this year’s tax code or interpret the Patriot Act I or II on any given Sunday. Who knew, and who still knows, which small events are linked to big disasters? Who knew that the mechanical Newtonian links were there, as well as the selected consequences of a billion factors coming together to [pick one] (cause – contribute to – accompany) a social-political-economic unraveling? Experimental analysis was not involved. Interpretation of biases was.

Faulty perceptions are not the only reason or application for an experimental analysis. Relationships are complex, not caused by single small or enormous events, as you have been trained to think. We don’t have much training to recognize or understand what our own self-interests are in anything but localized strings of spatial-temporal events. Brooks’ toying with trusting government to become engaged in the process is folly. Just how much “help” can a country endure? What’s worse, it is lazy. Separating government and business is impossible, but collusion is asking for our own demise handed to us as a coupon toward irrelevance. While we regularly make poor decisions, the government is insensitive to making the correct ones, or those needing to be made, in a timely fashion.

If you doubt that, don’t look in the rear-view mirror as some would suggest. Follow the consequences of a potential decision and determine for yourself whether you or an agent of an ideology is better suited to care for what is in your best interests. Government information feedback mechanisms are limited, broadly myopic and mechanical, not timely. The very thing that took politicians away from the citizenry has let ideology numb them, contributing to an end of pragmatism. This bias, to be sure, is no better or worse than any other bias. They all can be replaced with an experimental analysis from science rather than the pop-pap solutions we are offered.

As we’ve seen from crashes before the latest one, this set of economic biases just keeps on giving. It keeps on giving us the problems that government is content to continue to administer to: mindfulness, equality of everything not equal, brinkmanship over leadership and, above all, saying what works to get re-elected. As stated, this meltdown is a cultural event reminding us that we are perceptive beings, seeing things that aren’t there and not perceiving things that are there. (See previous blogs.)


Read Full Post »

Great piece from Black Swan author, Nassim Nicholas Taleb, on Edge.org.

Read it.

What Is Fundamentally Different About Real Life

My anger with “empirical” claims in risk management does not come from research. It comes from spending twenty tense (but entertaining) years taking risky decisions in the real world managing portfolios of complex derivatives, with payoffs that depend on higher order statistical properties —and you quickly realize that a certain class of relationships that “look good” in research papers almost never replicate in real life (in spite of the papers making some claims with a “p” close to infallible). But that is not the main problem with research.

For us the world is vastly simpler in some sense than the academy, vastly more complicated in another. So the central lesson from decision-making (as opposed to working with data on a computer or bickering about logical constructions) is the following: it is the exposure (or payoff) that creates the complexity —and the opportunities and dangers— not so much the knowledge ( i.e., statistical distribution, model representation, etc.). In some situations, you can be extremely wrong and be fine, in others you can be slightly wrong and explode. If you are leveraged, errors blow you up; if you are not, you can enjoy life.

So knowledge (i.e., if some statement is “true” or “false”) matters little, very little in many situations. In the real world, there are very few situations where what you do and your belief if some statement is true or false naively map into each other. Some decisions require vastly more caution than others—or highly more drastic confidence intervals. For instance you do not “need evidence” that the water is poisonous to not drink from it. You do not need “evidence” that a gun is loaded to avoid playing Russian roulette, or evidence that a thief is on the lookout to lock your door. You need evidence of safety—not evidence of lack of safety— a central asymmetry that affects us with rare events. This asymmetry in skepticism makes it easy to draw a map of danger spots.
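Taleb’s point that leverage, not knowledge, determines whether an error is survivable can be made concrete with a toy calculation (the numbers and the balance-sheet setup are invented for illustration):

```python
# Toy illustration (invented numbers): the same -20% shock to asset values
# merely dents an unleveraged book but wipes out a 5x-leveraged one.

def equity_after_shock(capital, leverage, asset_return):
    """Equity remaining after a shock: assets = capital * leverage move by
    asset_return, but the debt (assets - capital) is still owed in full."""
    assets = capital * leverage
    debt = assets - capital
    return assets * (1 + asset_return) - debt

unlevered = equity_after_shock(100, 1, -0.20)  # 80.0: a loss you survive
levered = equity_after_shock(100, 5, -0.20)    # 0.0: the same error blows you up
print(unlevered, levered)
```

The error (the -20% surprise) is identical in both cases; only the exposure differs, which is exactly the asymmetry the excerpt describes.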

Read Full Post »