The key to better understanding the world is to build a latticework of mental models.
— Shane Parrish, The Great Mental Models, Vol. 1
How good are we at considering information or opinions that don’t agree with what we think?
I recently attended a talk by a world-renowned speaker. For context, the talk was in Lima, Peru, about how Peruvian industry can become more competitive. This speaker has published eight books and has spent the last five years researching his next one.
There was a panel of five industry leaders to comment and ask questions after the talk: three businessmen, a journalist, and a former secretary of the Treasury (Minister of Economy, as it’s called in Peru).
The speaker’s first point was that one of the reasons countries in Latin America don’t catch up with first-world economies is the trap of complacency. That is, thinking that while we are not among the top economies in the world, we are not that bad either. And thus, we take no action toward becoming better. We feel no urge to become more competitive. Meanwhile, the top economies in the world, the speaker explained, never think they are good enough. They are ruthlessly seeking to become more competitive and efficient.
He also elaborated on the idea that we are in the presence of worldwide trends that are inescapable, and that they will be here faster than we think. For example, automation is coming, he said, and is, in part, already here. After five years of research, and hearing what experts around the world think on the subject, his conclusion is that this trend is going to affect not only first-world economies but also mid-sized economies like Peru’s. We have to choose now where we want to be in the coming years.
What surprised me was the reaction of the panelists. Most of them said something like this: “Very interesting, Mr. X, but this doesn’t exactly apply to us. We don’t think this will affect Peru so severely.” (And thus, we, as industry leaders, will do nothing about it.) The people on the panel were not dumb. They know the industry well. Each of them is successful in their field. How is it that they unconsciously fell into the very trap of complacency that the expert they had hired for the talk had just explained?
Daniel Kahneman, in his book Thinking, Fast and Slow, explains that we tend to judge situations based on the information in front of us, honoring only the inside view: what we know, what we feel comfortable with. What you see is all there is, says Kahneman. At the same time, we unconsciously neglect to take the outside view into account as evidence, even though it could provide a baseline of reality for our judgments. “Information is routinely discarded when it is incompatible with one’s personal impressions of a case. In the competition with the inside view, the outside view doesn’t stand a chance.” (p. 249)
We should develop some kind of mental vaccine against this cognitive bias.
The Black Swan: The Impact of the Highly Improbable, by Nassim N. Taleb.
Chris Anderson, editor-in-chief of Wired magazine, commenting on this book, writes that “Our brains are wired for narrative, not statistical uncertainty. And so we tell ourselves simple stories to explain complex things we don’t–and, most importantly, can’t–know. The truth is that we have no idea why stock markets go up or down on any given day, and whatever reason we give is sure to be grossly simplified, if not flat out wrong.”
A Black Swan, as defined by the author, “is [first] an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. Second, it carries an extreme impact. Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.”
Taleb’s book is highly opinionated. His writing style makes some of his arguments hard to follow, and some people find the self-references throughout the book arrogant and pompous. Nonetheless, that doesn’t invalidate Taleb’s arguments. I think this book is a must-read for anybody interested in improving their thinking processes and weighing information before making decisions.
Find The Black Swan: The Impact of the Highly Improbable on Amazon.
Shane Parrish on making others’ opinions our own and skipping the thinking:
It’s easy to take others’ opinions and make them our own. This isn’t hard. (…) We read but often we don’t digest. Reading involves effort; the more you put in the more you get out.
The same applies to conversations. We are so busy thinking that we understand the other person that we start thinking about what we want to say before they’ve even made their point. We’re not listening.
When it comes to taking the opinions of others and making them our own, we skip the thinking. We don’t do the required work.