The HIGHLY IMPROBABLE. It hits us, it deeply changes our lives, we don't understand it properly. And we actually IGNORE it.

in #psychology, 8 years ago (edited)




 

We call these improbable events Black Swans because, for a long time, we thought black swans didn't even exist, or were vanishingly rare. It's strange, even paradoxical: we accumulate huge amounts of information and are capable of finding meaning in this vast ocean of knowledge, and yet we remain ignorant. We have the talent to pick out relevant information from the flood of stimuli, to create the scientific method, to philosophize about the nature of being, and to invent complex mathematical models. And yet it seems we are not so good at it.

What stops us?

Our beliefs, biases, misconceptions.

We’re inclined to be narrow-minded in our beliefs about the world. Once we have an idea about how the world functions, we tend to cling to it. But because human knowledge is constantly growing and evolving, this dogmatic approach makes no sense.

Just two hundred years ago, for example, doctors and scientists were supremely confident in their knowledge of medicine, yet today that confidence seems ludicrous: just imagine going to your doctor complaining of a common cold and being given a prescription for snakes and leeches! Being dogmatic about our beliefs makes us blind to concepts that fall outside the paradigms we’ve already accepted as true. How, for example, is it possible to understand medicine if you’re not aware that germs exist? You might come up with a sensible explanation for illness, but it will be flawed by a lack of crucial information.

This kind of dogmatic thinking can result in huge surprises. We’re sometimes surprised by events not because they’re random, but because our outlook is too narrow. Such surprises are known as “Black Swans,” and they can prompt us to fundamentally reconsider our worldview.

Before anyone had ever seen a black swan, people assumed that all swans were white. Because of this, every depiction and mental image of a swan was white; whiteness seemed an essential part of “swanness.” So when the first black swan was discovered, it fundamentally transformed people's understanding of what a swan could be. As you’ll see, Black Swans can be as trivial as learning that not all swans are white, or as life-changing as losing everything in a stock market crash.

 

How important are these improbable events for our lives? Are they a major threat?


 

Some will be hugely affected by them, others hardly at all. The power of their effect is largely determined by your access to relevant information: the more information you have, the less likely you are to be hit by a Black Swan; and the more ignorant you are, the more you are at risk.

This can be seen in the following scenario:

Imagine making a bet on your favorite horse, Rocket. Because of Rocket’s build, her track record, the skill of the jockey, and the poor competition, you believe that Rocket is the safest bet and gamble everything you own on the horse winning. Now imagine your surprise when the starting pistol is fired and Rocket not only doesn’t leave the gates but opts instead to simply lie down on the track. This would be a Black Swan event. Given the information you’d gathered, Rocket winning was a safe bet, yet you lost everything the instant the race began. But this event will not be a tragedy for everyone. Rocket’s owner, for example, made a fortune by betting against his own horse. Unlike you, he had additional information: he knew that Rocket was going to go on strike to protest animal cruelty. Just that small amount of information saved him from suffering a Black Swan event.

The impact of Black Swans can also differ widely in scale. Rather than affecting only individuals, a Black Swan can sometimes strike an entire society.

When this happens, a Black Swan can transform how the world works, impacting many areas of society, such as philosophy, theology and physics. For example, when Copernicus proposed that the Earth is not the center of the universe, the consequences were immense, as his discovery challenged both the authority of the ruling Catholic Church and the historical authority of the Bible itself. In the end, this particular Black Swan helped establish a new beginning for all of European society.

 

A). The first issue: We are "doomed" to be IGNORANT (at some level).


We are very easily fooled by even the most basic of logical fallacies.


 

Although humans seem to be the most intelligent animals on the planet, there’s still a long way to go before we’ll have outgrown all of our bad habits. One such habit is creating narratives based on what we know of the past. While we tend to believe that the past is a good indication of the future, this is often a fallacy. It leaves us prone to mistakes because there are simply too many unknown factors which could go against our narratives.

For example, imagine you’re a turkey living on a farm. Over the years the farmer has fed you, let you roam freely, and provided a place to live. Using the past as your guide, there is no reason to expect that tomorrow should be any different. Alas, tomorrow is Thanksgiving, and you are decapitated, filled with spices, thrown in an oven, and devoured by those who had housed and fed you. As this example shows, believing that we can base predictions about the future on knowledge of the past is a fallacy with potentially dire consequences.

A similar fallacy is confirmation bias: we often search for evidence only for those beliefs we’ve formed already, even to the extent that we ignore any evidence that contradicts them. When we encounter information that goes against what we already believe, we’re unlikely to accept it and even less likely to investigate further. If we do investigate, we’ll probably look for sources that undermine this information.

For example, if you strongly believe that “climate change” is a conspiracy but then happen to see a documentary called “The Undeniable Evidence for a Changing Climate,” it’s likely that you’ll be upset. If, after this, you did a web search for information about climate change, it’s more probable that the search terms you use would be “climate change hoax” and not “evidence for and against climate change.” While both of these fallacies are anti-scientific, it turns out that we can’t do much to avoid such bad reasoning: it’s simply in our nature.

 

B). The second issue: Our brain doesn't help us.


The way that our brains categorize information makes accurate predictions extremely difficult.


 

During our evolution, the human brain developed certain ways of categorizing information. While these were great for surviving in the wild, where we needed to learn and adapt quickly to dangerous surroundings, they are terrible in today’s complex environments. For instance, one way we incorrectly categorize information is the so-called narrative fallacy, where we create linear narratives to describe our current situation. This happens because of the massive amount of information we’re faced with every day: to make sense of it all, our brains select only the information they consider important.

For example, while you probably remember what you ate for breakfast this morning, it’s doubtful you remember the color of everyone’s shoes on the subway. To give meaning to these unconnected bits of information, we turn them into a coherent narrative. When you reflect on your own life, for instance, you probably select only certain events as meaningful and order them into a story that explains how you became who you are: perhaps you love music because your mom used to sing Beatles songs to you every night.

However, creating such narratives is a poor way to gain any meaningful understanding of the world. This is because the process works only by looking back on the past, and doesn’t take into account the near-infinite explanations that are possible for any one event. The fact is that tiny, apparently insignificant events can have unpredictable, major consequences.

Imagine, for example, that a butterfly flapping its wings in India causes a hurricane one month later in New York City. If we could catalogue each stage of cause and effect as it occurred, we’d be able to see a clear causal relationship between the events. But since we only see the outcome – in this case, the hurricane – all we can do is guess at which of the countless simultaneously occurring events actually influenced it.
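To illustrate why such chains are visible only in hindsight, here is a tiny Python sketch (my own illustration, not from the original post) using the logistic map as a stand-in for any chaotic system: two starting points that differ by one part in a billion eventually end up in completely different places, so working backwards from the outcome to the "tiny cause" is hopeless.

```python
# Sensitive dependence on initial conditions in the logistic map
# x_next = r * x * (1 - x), run in the chaotic regime (r = 4).
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)  # the world without the butterfly
b = logistic_trajectory(0.400000001)  # a one-in-a-billion perturbation

for step in (0, 10, 25, 50):
    print(f"step {step:2d}: |difference| = {abs(a[step] - b[step]):.9f}")
```

The two trajectories track each other closely for a while and then decouple entirely, which is exactly what makes after-the-fact attribution of causes so unreliable.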

 

C). The third issue: We don't always see the big picture.


We don’t easily distinguish between scalable and non-scalable information.


 

We humans have developed many methods and models for categorizing information and making sense of the world.

Unfortunately, however, we’re not very good at distinguishing between the different types of information – most crucially between “scalable” and “non-scalable” information. But the difference between those types is fundamental.

Non-scalable information – such as body weight and height – has a definite, statistical upper and lower limit. Body weight is non-scalable because there are physical limitations on how much a person can weigh: while it is possible for someone to weigh 1000 lbs, it is physically impossible for anyone’s weight to reach 10,000 lbs. Because the properties of such non-scalable information are clearly limited, it’s possible for us to make meaningful predictions about averages.

On the other hand, non-physical or fundamentally abstract things, like the distribution of wealth or album sales, are scalable.

For example, if you sell your album in digital form through iTunes, there’s no limit to how many sales you might make, because distribution isn't limited by the number of physical copies you can manufacture. And because the transactions take place online, there is no shortage of physical currency to stop you from selling a trillion albums.

This difference between scalable and non-scalable information is crucial if you want an accurate picture of the world, and applying the rules that work for non-scalable information to scalable data will only lead to mistakes.

For example, say you want to measure the wealth of the population of England. The simplest way is to work out per capita wealth: add up the total wealth and divide it by the number of citizens. However, wealth is scalable: a tiny percentage of the population can own an incredibly large share of it. By merely computing the per capita figure, you end up with a picture of the distribution that probably doesn't reflect the actual situation of England's citizens.
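To make that concrete, here is a small Python sketch (my own illustration with made-up numbers, not from the original post) comparing a non-scalable quantity, height, with a simulated scalable one, wealth drawn from a heavy-tailed Pareto distribution. For height, the mean and the median nearly coincide, so the average is informative; for the simulated wealth, the mean sits far above the median and the top 1% holds a large share of the total, so the per capita figure misleads.

```python
import random

random.seed(42)
N = 100_000

# Non-scalable: adult heights in cm, roughly normal around 170 cm.
heights = [random.gauss(170, 10) for _ in range(N)]

# Scalable: simulated wealth from a heavy-tailed Pareto distribution.
wealth = [random.paretovariate(1.2) * 10_000 for _ in range(N)]

def mean(xs):
    return sum(xs) / len(xs)

def median(xs):
    s = sorted(xs)
    return s[len(s) // 2]

def top_1pct_share(xs):
    s = sorted(xs, reverse=True)
    return sum(s[: len(s) // 100]) / sum(s)

print(f"height: mean={mean(heights):.1f}  median={median(heights):.1f}")
print(f"wealth: mean={mean(wealth):,.0f}  median={median(wealth):,.0f}")
print(f"share of total wealth held by the top 1%: {top_1pct_share(wealth):.0%}")
```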

 

D). The fourth issue: We are still ARROGANT.


We are far too confident in what we believe we know.


 

We all like to keep ourselves safe from harm, and one of the ways we do this is by assessing and managing risk. This is why we buy things like accident insurance and try not to “put all our eggs in one basket.” Most of us try our best to measure risks as accurately as possible, to make sure we don’t miss out on opportunities while also ensuring we don’t do something we may later regret. To achieve this, we have to identify the possible risks and then estimate the probability that they will materialize.

For example, imagine you are in the market for insurance. You want a policy that will protect you against the worst-case scenario but won't be a waste of money. In this case, you’d have to weigh the threat of disease or accident against the consequences of those events actually happening, and then make an informed decision.

Unfortunately, we are far too confident that we know all the possible risks we need to protect ourselves against. This is called the ludic fallacy: we tend to handle risk as we would a game, with a set of rules and probabilities we can determine before we play. Yet treating risk like a game is itself risky business.

For example, casinos want to make as much money as possible, which is why they have elaborate security systems and ban players who win too much, too often. But this approach rests on the ludic fallacy. The major threats to a casino may not be lucky gamblers or thieves but, say, a kidnapper who takes the owner’s child hostage, or an employee who fails to file the casino’s earnings reports with the IRS. The casino’s greatest threats might be completely unpredictable. As this shows, no matter how hard we try, we’ll never be able to accurately calculate every risk.

(A1). First solution: Being aware of our PERSONAL knowledge limitation.


Taking an inventory of what you don’t know will help you to assess risks better.


 

We’ve all heard the phrase “knowledge is power.” Sometimes, however, we're constrained by what we know, and at those times recognizing what we don't know is far more advantageous.

Indeed, by focusing only on what you know, you limit your perception of all the possible outcomes of a given event, and create fertile ground for the occurrence of Black Swan events.

For example, say you want to purchase stocks in a company, but your knowledge of stock statistics is limited to the period 1920-28 – one year before the greatest stock market crash in US history. In that case, you’d observe a few small dips and peaks, but in general you’d notice that the trend is upward. So, thinking that this trend must continue, you spend your life savings on stocks. The next day, however, the market crashes and you lose everything you have. If you’d studied the market a little more, you would have observed the numerous booms and busts throughout history. By focusing only on what we know, we open ourselves to great and unmeasured risks. On the other hand, if you can at least establish what it is that you don’t know, then you’d be able to greatly reduce your risk.

Good poker players understand this principle well, as it’s crucial to their success at the game. While they know the rules of the game and the probability that their opponents hold better cards, they are also aware that there is relevant information they don’t know – such as their opponents' strategy and how much their opponents can stand to lose. Their awareness of these unknowns contributes to a strategy that doesn’t focus solely on their own cards, enabling a far more informed assessment of the risk.

(A2). Second solution: Being aware of our HUMAN knowledge limitation.


Having a good understanding of our limitations as human beings can help us to make better choices.


 

Perhaps the best defense against falling into the cognitive traps we’ve seen is a good understanding of the tools that we use to make predictions, and their limitations.

While knowing our own limitations certainly won’t save us from every blunder we’ll ever make, it can at least help us to reduce our bad decision-making.

For instance, if you’re aware that you are subject to cognitive bias, like everyone else, then it’s much easier to recognize when you’re only looking for information that confirms what you already believe to be true. Likewise, if you know that we humans like to organize everything into neat, causal narratives, and that this kind of approach simplifies the complexity of the world, then you’ll be more likely to search for further information to gain a better view of the “whole picture.”

Just this small amount of critical self-analysis can help you gain a competitive advantage over others in your field.
It’s certainly preferable to be aware of your shortcomings. For example, if you know that there will always be unforeseeable risks in pursuing any opportunity, despite how promising that opportunity seems, you’ll probably be less inclined to invest heavily in it.

While we cannot triumph over randomness or our limited capacity for understanding the vast complexity of our world, we can at least mitigate the damage inflicted by our ignorance.



(...) Final points:


Even though we’re constantly making predictions about the future, we’re actually terrible at it. We put far too much confidence in our knowledge and underestimate our ignorance. Our over-reliance on methods that seem to make sense, our basic inability to understand and define randomness, and even our biology, all contribute to poor decision making, and sometimes to “Black Swans” – events we believe to be impossible but which end up redefining our understanding of the world.

Actionable advice:


Be suspicious of “because.”
Although it is in our nature to look for linear, causal relationships between events in order to make sense of this complex world, the reality is that we are pitiful at both predicting the future and establishing causes for the present.

Rather than feeding our desire to see events in clear-cut cause and effect, it’s better to instead consider a number of possibilities without being married to any single one.

Know what you don’t know.
If you want to make meaningful predictions about the future – and if you are buying insurance, making investments, attending college, changing jobs, conducting research, or just being a human, you certainly do – then it’s simply not enough to take the “knowns” into consideration. That leaves you with only a partial understanding of the risks involved in your prediction.

Instead, you should also be consciously aware of what you don’t know, so that you don’t unnecessarily limit the information that you are working with.

[ source 1 source 2 ]


It was certainly a long read for me, not being a native English speaker. I believe you could have shortened it a bit by making the essential points more visible. At its core, I think it can be summarized as human intuition and its dangers, and making sure one knows one's weaknesses and the danger that overconfidence brings. The keyword, as always, is knowledge, but also its limitations.
On the other side, an opposite argument could be made for following your intuition, since everyone learns best from their own mistakes, and that is the path we all follow the most. Only the smartest of us learn from the mistakes of others.
Please take this as my limited contribution to your work, and keep giving us more food for the mind.
Thank you.

:)) I agree, of course.

  • It's long; I can add a TL;DR. I'll try to be more succinct in the future. I personally appreciate people who can follow the KISS principle. I'll try to improve, of course.
  • IMHO, intuition is good as a last resort, or, if you use it as your primary decision tool, you have to shift your expectation window: be prepared to embrace chaos. Why am I saying this? Because we know only a little of our own mind, and relying on its unconscious part (intuition) is kind of dangerous if it isn't understood well enough.
  • Yes, we are all "victims" of the Dunning-Kruger effect, and yes, the learning process is continuous. Always.

Sometimes I don't want more new lessons from life. I just want to be a plant/animal. I'm joking (49%). :))

Thank you for your patience and feedback. It wasn't easy, I know. :)

Oh, how wonderful it would be to have no mind or thought process. I would be like a postage stamp, traveling all over the world but staying in one place all the time. xD

Meditation does that. As you know, it doesn't annihilate the mind (impossible anyway), but it puts you in the position of an external observer of your own tornado of thoughts. And that's a welcome quasi-shutdown of the mind. :)

The comparison with the postage stamp is in fact a riddle: What is the thing that goes around the world but stays in the same place all the time? Answer: a postage stamp. Idk why I included it, but thank you for sharing more knowledge.
I used to meditate a lot when I was a kid, but these days I can't see shit when I close my eyes, though I certainly try something from time to time. Steemit so far has been a good push for me toward spirituality and mind expansion, because of people like you.
Speaking of which, just look what jumped up in my feed:
https://steemit.com/life/@aboundlessworld/a-beginners-guide-to-meditation