Summary of The Black Swan, by Nassim Taleb

Short summary reading time: 3 min

Long summary reading time: 30 min

Book reading time: 10h26

Score: 8/10

Book published in: 2007

Main Idea

There are two types of randomness in life: one where extreme random events don’t happen, and one where they do. These extreme events are called Black Swans.

Prediction and risk experts fail to think about Black Swans because they take prediction techniques from disciplines where Black Swans don't exist and apply them to disciplines where they do.


The Black Swan is a book written by the mathematician and philosopher Nassim Taleb.

It is the second book in Taleb's series, the Incerto.

The book deals with Black Swans, these random unpredictable and high-impact events. Taleb explains in which systems they appear, why we don’t see them, how we should think about them, why we have been failing to deal with them, and finally, how to deal with them – as much as one can do.

The book came out in April 2007, when the first signs of the 2008 financial crisis were appearing. The narrowly avoided collapse of the Western financial system catapulted Taleb to the pinnacle of Anglo-Saxon intellectualism. He was received like a rock star at the 2009 World Economic Forum in Davos.

The Black Swan ended up selling at least 3 million copies and ironically became…a Black Swan.

Not going to lie, it’s a tough book. I read it at a pace of 7000 words per hour, or 116 words per minute, roughly half of my normal reading-while-summarizing pace. I had to re-read many of the sentences.

If you have never taken a statistics or finance class, google the terms you run across.

Most of the time, Nassim Taleb succeeds at conveying his message, yet some of the things he wrote were unclear – or I lacked the intellect to understand them.

The book is, in essence, really good: it teaches something no one else is teaching, and it does so in a way no one else does. But I am a bit disappointed – I was expecting more out of it, which does not make any sense!

Since a Black Swan cannot be predicted, it can hardly be talked about before it happens, an illogicality Taleb himself speaks about in the book.

The Black Swan is a fun book though! Taleb tells you at the beginning that he had fun writing it, and you can definitely sense it.

If you are really into Nassim Taleb's writing style and ideas, and if you have a lot of time on your hands, read the book. It will be great for your general knowledge.

But if you just want to grasp the idea of the Black Swan, then my extensive summary will suffice.


Get the book here.

Short Summary

Situations in life are subject to two types of randomness: one that happens in “Mediocristan”, and one located in “Extremistan”.

Randomness in Mediocristan is concerned with situations where the overwhelming majority of units in a sample consistently gravitate towards the mean. No single unit will ever be able to influence the average.

Eg: the money you make as a massage therapist. Some therapists will make €10/hour in lower-income countries, and the highest-paid massage therapists in the world will likely make up to €500/hour. But they will never make €1 000 000/hour.

If you line up 100 random massage therapists, one therapist, however highly or poorly paid, will never be able to influence the average made by the 99 others.

Earning money as a massage therapist is in Mediocristan.

If you take wealth, it's a different story. If you take 100 random people and measure their average wealth, the average will definitely be perturbed if you add Elon Musk to the sample. One unit managed to disturb the average.

Wealth therefore belongs to Extremistan.

Situations in Mediocristan won't know any Black Swans. Situations in Extremistan will. The point is to always know whether you are in an Extremistan situation or a Mediocristan one.

People, due to internal biases, aren't trained to tell the difference between Mediocristan and Extremistan.

As a result, they get surprised when a Black Swan appears. A Black Swan is a random event of high impact and low probability. It mainly happens in Extremistan.

We cannot predict Black Swans for several reasons.

First, we predict our future based on our past. That prevents us from “seeing coming” events that never happened in the past. Eg: 9/11.

Second, we tend to look for evidence of things we already know instead of looking for where we could be wrong, for the unknown.

Third, we rationalize all events by embedding them into a narrative despite the fact that these events are random and have nothing to do with one another.

Fourth, we behave as if the Black Swan did not exist because our brain is hooked on routine and regularity.

Fifth, we ignore the silent evidence of phenomena, taking into account only what happened and leaving out what didn't happen, or what we didn't see happen.

Sixth, we tend to “tunnel” our knowledge and judgment by focusing on the detail, instead of focusing on the broader picture.

If you want to live your life well, look for asymmetric returns.

Go 100% into bets that have huge upside and no downside (eg: a job in a startup where you get equity), and be extremely careful and conservative with the rest, in order to mitigate the risks when the inevitable Black Swan hits.

Table of Contents


Part I: Umberto Eco’s Antilibrary, or How We Seek Validation

Part II: We Just Can’t Predict

Part III: Those Gray Swans of Extremistan

Part IV: The End


Summary of The Black Swan Written by Nassim Taleb


For millennia, people thought all swans were white. Until a black swan was spotted in Australia.

One bird was enough to disprove a belief held for centuries.

Following this logic, Black Swans are events that share three properties.

  1. Rarity: no one expects it because the past could not have predicted it.
  2. Extreme impact: It carries an extreme impact.
  3. Retrospective predictability: People say it could have been predicted after it happened.

The world evolved because of Black Swans. The agricultural revolution was a Black Swan, the Internet was a Black Swan, and WWI was a Black Swan too.

Black Swans are dangerous because they embody what you don’t know. Their impact is great because nobody expected them.

Consider the tsunami of 2004. If it had been expected, systems would have been put in place to prevent it.

In a way, a Black Swan is an event that happened but was not supposed to – that is, we hadn't anticipated it.

This generally applies to businesses. Businesses that thrive are businesses that weren’t supposed to exist, businesses nobody would have bet on.

This teaches us two things:

-> we are incapable of predicting the future, as we cannot predict Black Swans.
-> we are unaware of this inability, as we keep on trying to predict anyway.

Experts who forecast have no idea what they are doing.

It also means that we have more to gain from focusing on what we don’t know, than on what we know.

In some domains, like science and the stock market, the payoff to exposing yourself to Black Swans can be huge. In others, like the military, it can be deadly.

Another sign of our inability to deal with Black Swans is that we keep on learning the specific, not the general. Eg: After WWI, the French built the Maginot Line to prevent Germany from invading again. The Germans simply went around it.

-> we don’t learn that we don’t learn.

History often remembers heroes. But it never remembers silent heroes. Had someone passed a law forcing airlines to lock their cockpit doors on September 10, 2001, 9/11 would never have happened, and no one would have known that this legislator was, in fact, a hero.

Everyone knows prevention is better than treatment, but no one rewards prevention.

We glorify those who fix rather than those who prevent.

While we commonly label "uncertain" something whose outcome we don't know (what time will Grandma arrive tonight?), the author reserves uncertainty for what we don't know that we don't know (in this case, not even knowing that Grandma is coming).

Platonicity is the author's term for our tendency to focus on pure, well-defined forms instead of messier, less tractable reality. It is what makes us think that we understand the world – while in fact, we don't.

The main claim of the book is the following: the world is dominated by the unknown and highly improbable, while we only focus on the routine and on what we know.

-> one should begin with the study of extreme events to understand the world.

The second claim is that in spite of our increasing knowledge, the world will become increasingly difficult to predict.

Part I: Umberto Eco’s Antilibrary, or How We Seek Validation

Books you read are less valuable than unread books because what you don’t know has more value than what you know.

The more you read, the more unread books you will have. Let's call a collection of unread books an antilibrary.

The antilibrary is the periphery of what you have read. As what you know grows, so does the circle of what you don't know around it.

We tend to focus on what we know instead of doing the opposite. Eg: It’d be much more interesting to write what you didn’t study on your CV than what you did.

Let’s therefore name people that study the unknown antischolars.

Chapter 1: The Apprenticeship of an Empirical Skeptic

History and the Triplet of Opacity

History is a black box. It tells you what events happened, but not how they happened.

We have three problems when we look at history, called the Triplet of Opacity.

  1. The illusion of understanding: people think they understand, but they don't.
  2. The retrospective distortion: assessing matters after they happened.
  3. The curse of learning: the overvaluation of factual information.

Let’s first speak about the illusion of understanding.

When the Lebanese civil war started, everyone thought it’d last a few days, maybe weeks. It lasted 17 years. It’s the same thing for every war.

When studying similar events, we realize that history is a collection of sudden unpredictable events that each changed the course of history.

History doesn’t crawl. It jumps.

Dear Diary: On History Running Backward

Let's now speak about the retrospective distortion. The author learned a lot by reading journalist Shirer's Berlin Diary. The book is a daily account of what was happening in Germany from 1934 to 1941.

The journalist writes about the events as they are unfolding, in the present, not after they happened. As a result, the book gave an idea of how people felt at the time. And nobody thought a war was about to break out.

Education in a Taxicab

Let’s now speak about the curse of learning.

Information isn’t always valuable. The Lebanese elite had a lot of information about the war with which they thought they could predict what would happen – they couldn’t.

Meanwhile, the common people like cab drivers had access to the same information (details about the war) but were aware of their incompetence in predicting the outcome of the war. As a result, they didn’t try.


Clustering means grouping people by criteria. It's not ideal, as it artificially reduces complexity.

200 years ago, Muslims were protectors of Jews, and both hated Christians. Today, it’s the opposite.

Things always seem to go well together, until they split.

This is the Black Swan generator. Clustering prevents you from seeing the true complexity out of which Black Swans appear.

-> no one knows what is happening, including tech execs at big companies.

Chapter 2: Yevgenia’s Black Swan

The author tells the story of the author Yevgenia Krasnova. Yevgenia wrote a book nobody would publish, so she published it on the Internet and was contacted by a small publishing house. The book grew in popularity and Yevgenia is now an established author.

Her success is a Black Swan.

NB: Yevgenia Krasnova is a fictional character invented by Taleb. She does not exist.

Chapter 3: The Speculator and the Prostitute

This is the most important chapter of the book.

There are two types of randomness:

  1. Randomness that welcomes Black Swans, called Extremistan, where things scale. Eg: wealth, wars, software, book sales, the stock market, etc. A trader has a scalable job: buying one share or one million shares is the exact same work. Movie acting is also scalable.
  2. Randomness that doesn’t welcome Black Swans (most of the time), called Mediocristan, where things don’t scale: dentistry, massage therapy, etc. These professionals work by the hour and can’t physically do more than a certain amount.

Eg: The story of Yevgenia happens in Extremistan.

Beware the Scalable

Scalable jobs enable you to make much more money, but you should avoid them.


Because scalable jobs only work if you are successful.

There are two types of jobs: non-scalable, driven by the mediocre; and scalable, where you have Giants and Dwarves.

The Advent of Scalability

When a job scales, it means that the best will get a disproportionate part of the pie.

Before recordings existed, opera singers worked in Mediocristan. If you wanted to listen to opera, you had to buy a ticket. When recordings were invented, everyone could listen to the best opera singer's recordings.

Average singers lost their jobs.

Travel Inside Mediocristan

In Mediocristan, where jobs don’t scale, it’s the opposite.

An extreme event is unlikely to change anything if the sample is big enough.

Eg: the average weight of 1000 random people won’t be any different if the heaviest person in the world is part of them or not. Weight doesn’t scale.

Weight doesn’t scale.

The rule in Mediocristan is this: when your sample is large, no single instance will significantly change the aggregate or the total.

The Strange Country of Extremistan

Now, get the average net worth of 1000 random people. Then take one person out and replace him with the richest person on the planet.

Does the average change?

Yes, it does – by a lot!

Wealth definitely scales.

Same thing with book sales, number of movies sold, academic references, etc.

The rule in Extremistan is this: inequalities are such that one single observation can disproportionately impact the aggregate or the total.

Almost all social matters belong to Extremistan. Black Swans happen there too.
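A quick sketch in Python makes the two rules concrete. The distributions and numbers below are my own illustrative choices, not Taleb's:

```python
import random
import statistics

random.seed(42)

# Mediocristan: body weight in kg, roughly bell-curve distributed.
weights = [random.gauss(75, 15) for _ in range(1000)]

# Extremistan: net worth in euros, drawn from a heavy-tailed (Pareto) distribution.
wealth = [random.paretovariate(1.2) * 10_000 for _ in range(1000)]

def shift_after_outlier(sample, outlier):
    """Replace one unit with an extreme outlier and compare the averages."""
    before = statistics.mean(sample)
    after = statistics.mean(sample[:-1] + [outlier])
    return before, after

# Swap in the heaviest human ever recorded (~600 kg, illustrative).
w_before, w_after = shift_after_outlier(weights, 600)
# Swap in a fortune of 200 billion euros (illustrative).
n_before, n_after = shift_after_outlier(wealth, 200_000_000_000)

print(f"weight: {w_before:.1f} -> {w_after:.1f} kg")    # barely moves
print(f"wealth: {n_before:,.0f} -> {n_after:,.0f} EUR") # jumps by orders of magnitude
```

One extreme weight shifts the average by about half a kilo; one extreme fortune multiplies the average wealth by orders of magnitude. That is the whole Mediocristan/Extremistan distinction in two lines of output.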

Extremistan and Knowledge

In Mediocristan, an unusual variation from the mean won’t disturb the equilibrium. As a result, life is a peaceful and boring routine. You can trust the data, and the more data you have, the more you can predict the future.

In Extremistan, it’s not the case. A single unit can completely skew the data, which is why you can’t trust it. Knowledge derived from the data accumulates very slowly.

Wild and Mild

Things you find in Mediocristan: height, weight, calorie consumption, car accidents, mortality rates.

Things you find in Extremistan: wealth, income, book sales per author, sizes of planets, sizes of companies, stock ownership, commodity prices, inflation rates, economic data.

The Extremistan list is much longer than that of Mediocristan.

The Tyranny of the Accident

In Mediocristan, the collective, the obvious, and the predicted rule. You will never lose a lot of weight in one day. You need a collective effect of days to do it.

In Extremistan, it is the individual, the unseen, and the unpredicted that rule. You can make a lot of money in a single trade in the stock market.

Extremistan does not automatically mean you’ll face Black Swans, and Mediocristan does not automatically mean you won’t.

Don’t platonify it.

Chapter 4: One Thousand and One Days, or How not to Be a Sucker

The Black Swan is expressed through the Problem of Induction: how can we logically make general conclusions out of specific instances?

Eg: a turkey is fed every day, therefore, the turkey believes it will be fed every day for the rest of its life…until an unexpected event happens at Christmas.

It works for one thousand days, until…it doesn’t anymore, and we find out that what we knew was false, irrelevant, or misleading.

The turkey was fed for 1000 days and eaten on the 1001st day. That day was a Black Swan.
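The turkey's reasoning can be sketched as a naive forecaster that predicts tomorrow from the most common past observation. This is a toy model of induction, not anything from the book:

```python
from collections import Counter

def predict_tomorrow(history):
    """Naive inductive forecast: tomorrow will look like the most common past day."""
    return Counter(history).most_common(1)[0][0]

history = ["fed"] * 1000           # one thousand confirming observations
print(predict_tomorrow(history))   # "fed" -- and confidence only grew with the data

history.append("eaten")            # day 1001: the Black Swan
print(predict_tomorrow(history))   # still "fed": the model never saw it coming
```

Note that the forecast was never "wronger" than on the day before the event: more confirming data made the turkey more confident, not safer.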

A Black Swan Is Relative to Knowledge

Christmas is a turkey’s Black Swan. But it’s not for the butcher.

-> a Black Swan is a matter of perspective.

You can eliminate it with science, or by keeping an open mind – by “expecting” it.

Some Black Swans are sudden (a crash), some take years in their making (the ubiquity of computers in our lives.)

However, they should be perceived on a relative timescale.

Eg: 9/11 took a few hours, but changed the world forever.

Positive Black Swans are often slow, while negative Black Swans are often sudden. That’s because destroying goes quicker than building.

Sextus Empiricus was one of the first to talk about the Black Swan. He was born in 160, in Alexandria.

I Don’t Want to be a Turkey

Knowing about Black Swans should not scare you away from taking risks. It should simply encourage you to consider what you had not considered before, so the Black Swan becomes somewhat “preventable”.

The following are consequences of our blindness to Black Swans:

  1. The error of confirmation: we make rules based on what we see and apply these rules to what we don’t see.
  2. The narrative fallacy: we tell stories that help us make sense of events, but that don’t explain anything.
  3. We behave as if the Black Swan did not exist.
  4. The distortion of silent evidence: what we see isn’t necessarily all that there is.
  5. We “tunnel” by focusing on a restrictive list of Black Swans.

Each of these ideas will be the object of each of the next five chapters.

Chapter 5: Confirmation Shmonfirmation!

People mix up absence of evidence and evidence of absence.

Absence of evidence is saying that there is no evidence that something happened. Eg: I traded the stock market today and there was no crash. There is no evidence that a Black Swan happened.

Evidence of absence is saying that we have proof that a Black Swan will never happen because we are dealing with matters in Mediocristan. Eg: we will never see a human being taller than 3 meters.

The reason people confuse these is that we cannot transfer knowledge and experience we use in a certain context to another.

In a psychological experiment, many statisticians failed at statistics questions phrased differently!


When we believe something, we have a tendency to look for proof of it in our past – disregarding proof of the opposite.

This is the confirmation bias.

However, this is a fallacy: in the case of positive statements, a series of corroborative facts does not make the statement true!

Eg: the people I met today were nice, therefore, everyone is nice on the planet.

This works though, for negative statements.

Eg: I saw a black swan today, therefore, not all swans are white.

-> we get closer to the truth with negative instances than with positive ones!

This asymmetry is practical.

-> we don’t have to be 100% skeptical, just 50% skeptical (we have to be skeptical of positive instances).

In life, you only need to be interested in negative instances.

Observing 1 million white swans is a lot of data yet not sufficient to prove all swans are white.

-> data is great, but not in all cases!

One black swan though, is sufficient to say that not all swans are white. And then you can be 100% sure of this statement.
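The asymmetry can be sketched in a few lines: no number of confirming observations can verify the statement, but a single counterexample falsifies it for good. A toy illustration, assuming nothing beyond the swan example itself:

```python
def verdict(observations):
    """Asymmetric test of the claim 'all swans are white'."""
    if any(color != "white" for color in observations):
        return "disproved"       # one counterexample settles it, permanently
    return "not wrong yet"       # no amount of white swans can ever settle it

print(verdict(["white"] * 1_000_000))              # not wrong yet
print(verdict(["white"] * 1_000_000 + ["black"]))  # disproved
```

The two possible return values deliberately mirror Popper's two states for a statement: wrong, or not wrong yet.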

Karl Popper studied decision-making in situations where you don’t have all the information.

Popper’s idea is based on the open society, where doubt is constant. The society is built on “what is not true” rather than “what is true”.

For Popper, a statement can be of two natures:

  1. Wrong.
  2. Not wrong yet.

Science advances by saying what X isn’t instead of what X is (according to him, we’ll never know what X is).

The problem is that humans have a bias toward the opposite: the confirmation bias. We only seek evidence of what we think is true.

The best chess players (and investors like George Soros) look instead for evidence that they may be wrong.

Back to Mediocristan

The savanna in which we grew up was much less dominated by Black Swans than the modern world, hence our inability to deal with them.

Chapter 6: The Narrative Fallacy

On the Causes of my Rejection of Causes

The narrative fallacy depicts our inability to observe a succession of facts without making up a story about how they’re all related.

We do so because stories help us make sense of things and remember them better.

Splitting Brains

Theorizing about events is what people do naturally. Not theorizing is unnatural.

When you inhibit or enhance different parts of the brain, people tend to theorize more or less. That means that there is a physical cause as to why we theorize. It is biological.

The other reason is that knowledge is costly to store and retrieve for the brain. As a result, we look for a pattern in information and subsequently store the pattern. It’s the brain’s way of compressing information.

And out of a need to simplify, we leave the Black Swans out.

Contrary to what is often thought, memory isn’t static. It’s dynamic and changes every time you revisit it.

Since narratives help us see the past as more predictable and less random than it was, we use our memory to decrease the cost of mistakes: “If I had done this…or that…it would not have happened.”

Don’t fight this tendency. Embrace it. But instead of trying to make the event less likely to happen, make it bound to happen.

The narrative fallacy is what happens in journalism. Journalists link stories and events that have nothing to do with each other simply to make them more digestible for the audience.

In doing so, they make the world seem less random than it actually is.

The Sensational And The Black Swan

The way narratives are phrased influences how we estimate the odds.

In general, we overestimate the likelihood of some Black Swans, while we underestimate it for others.


Because there are two types of Black Swans:

  • The narrated Black Swans, those you hear about in the mainstream narrative. Eg: winning the lottery.
  • Those no one talks about since they escape models.

As a result, the first kind is overestimated, while the second is underestimated.

The Pull of the Sensational

We will always be more attracted to the story than to the statistics.

When an Italian toddler fell into a well, the worldwide media were on alert.

That story reached Lebanon, then in the middle of a civil war where people were dying in the streets.

“One death is a tragedy; a million is a statistic.” – Stalin.

The Shortcut

Daniel Kahneman and Amos Tversky found out that our cognition works with two systems:

System 1: it helps us make quick decisions using mental shortcuts and biases called heuristics.

System 2: the actual thinking, the one that is painful to do.

Most of our mistakes come from using System 1 since we are often not even aware we are using it.

System 1 mainly works with emotions.

How to Avert the Narrative Fallacy

Our misunderstanding of the Black Swan is mainly due to our using System 1.

Chapter 7: Living In The Antechamber Of Hope

It’s hard to work in a profession in Extremistan, as any success will be a Black Swan. It is hard because the brain is hooked on regular results, and worries when there are none.

This is why researchers, writers, and artists lead painful lives compared to dentists and massage therapists.

Where the Relevant Is the Sensational

In a primitive environment, the relevant is sensational: getting food, building a house, etc.

In the modern world, however, we still direct our attention towards the sensational – which isn’t always relevant.


Linear relationships are clear.

Nonlinear relationships aren’t.

You are happy to drink water at the beginning, but don’t want to drink liters of it.

Non-linear relationships are everywhere. Linear relationships are rare.

Process Over Results

We favor the sensational and visible. It’s difficult to keep on doing something that seems to deliver nothing.

Those that say they enjoy the process over everything else do not tell the entire truth. Sure, you enjoy writing…but having some readers wouldn’t be bad, would it?

Those that take the risk to work in Extremistan don’t even make that much money. Venture capitalists make more than entrepreneurs, science does better than scientists, and publishers do better than authors.

Human Nature, Happiness, and Lumpy Rewards

Making 100k per year for ten years > making 1 million in one year after 9 years of nothing.

Your happiness does not depend on the intensity of a happy moment but on the number of happy moments.

Unfortunately, modern life (especially in Extremistan) does not really deliver that.

Similarly, it’s better to have all of your negative feelings get out at once, than a bit every day.

The Antechamber of Hope

The Black Swan was presented as something we do not expect that happens.

But for people working in Extremistan, it might just as well be the event they are waiting for that does not happen. Eg: a book’s success. If you are a writer, you are expecting a Black Swan.

This highlights how there are two types of people: those that fall victim to a Black Swan, and those that prepare their whole life for it.

In essence, people either bet that the Black Swan will happen, or that it never will.

If you engage in such work, find people that are in the same position as you are so it’s not too hard.

Chapter 8: Giacomo Casanova’s Unfailing Luck: The Problem Of Silent Evidence

Black Swans are hidden in silent evidence.

The Story Of The Drowned Worshippers

If a sailor on a sinking ship prays and is saved, he may conclude that praying will save people from drowning.

The silent evidence is all the people who prayed…yet drowned.

Silent evidence pervades and changes history, as it is never taken into account.

History, as it is understood, is a succession of events that have been seen. History misses all of the silent events – and is incomplete as a result because of the way we gather evidence.

It’s a bias, and it’s constantly distorting what we are observing.

It often appears with successful people in Extremistan (artists, traders, etc.): “if you work as hard as I did, you can make it”.

What about the people that worked harder…and yet failed?

The Cemetery of Letters

It is said that Phoenicians did not write any books – because we never found any. But they actually wrote on a type of perishable papyrus, hence their writings disappeared.

Silent evidence is important when we look at how the winners became winners. Often, it’s not that they were superior, or harder working. What they say isn’t sufficient; we also need to hear from the people who worked just as hard but failed.

How to Become a Millionaire in Ten Steps

Biographies of millionaires highlight how they became rich – working hard, being courageous, etc.

But we don’t know about all of the people that were as courageous and hard-working and yet that did not make it.

The difference between the winners and losers? Luck.

The silent evidence can be applied to many more domains.

In politics, you will see the benefits of a law, but never the downsides.

Eg: social laws will protect those who have jobs, but will make finding a job harder for those who don’t have one.


A doctor has no incentive to prescribe a drug that will save more people than it will kill: if you’re saved, you don’t say thank you, but if you die, your family will go to court.

The Teflon-style Protection Of Giacomo Casanova

Giacomo Casanova believed all his life that he was immune to problems as every time he struggled, life would somehow save him.

The reason why there was only one Casanova is that everyone else failed (or died) where he succeeded. We won’t hear from all of the Casanovas that lost, but there were so many of them that, statistically, one Casanova had to make it to the end.


This can be applied to ourselves.

  1. It’s not because we were lucky to get this far that we will keep being lucky. At some point, almost everyone runs out of luck.
  2. Evolutionary fitness does not take into account how we could have been. Evolution is not driven to optimize, but is a series of random events.

I Am A Black Swan: The Anthropic Bias

The religious justification for life is that the chances of life developing in the universe were so low that it could not have happened out of luck.

But are they so low? If we take all of the galaxies and all of the stars, aren’t we bound to find at least one place where life could happen?

This can be applied to lottery winners, actors, and old people: the fact that they lived up to then was likely due to luck as well.

Never look at the odds of succeeding from the winner’s point of view. Always look at them from the number of people that tried to win in the first place. The more people play, the more chances there are to have a winner.
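The arithmetic behind that last point: with n independent players each having win probability p, the chance that *someone* wins is 1 − (1 − p)^n, which climbs towards certainty as n grows. A sketch with illustrative numbers (p is a rough lottery-jackpot order of magnitude, not a real lottery's odds):

```python
# Probability that at least one of n independent players wins,
# when each has per-ticket win probability p (illustrative value).
p = 1 / 14_000_000

for n in (1, 1_000_000, 50_000_000):
    at_least_one = 1 - (1 - p) ** n
    print(f"{n:>11,} players -> P(someone wins) = {at_least_one:.1%}")
```

From the winner's seat the odds look miraculous; from the pool of everyone who played, a winner was close to inevitable.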

The Cosmetic

This realization shows how futile it is to explain success.

School shames students for saying “I don’t know”, but “I don’t know” is the right answer most of the time. Things happen in a random manner, and there is often simply no explanation for them – especially in history.

This should not discourage you from seeking causes for things. Just be careful: not everything has an explanation.

Silent evidence (and seen evidence) plays into how we perceive the Black Swan, making us overestimate it at times (terrorism fears after an attack) or underestimate it (terrorism fears before an attack).

Chapter 9: The Ludic Fallacy, or the Uncertainty of the Nerd

Fat Tony

Fat Tony is that guy that knows everyone in the neighborhood, gets free 1st class seats, and a table in a restaurant that is full. Fat Tony doesn’t do math. His success comes from the fact that he thinks outside of the box.

Dr. John is the opposite of Fat Tony. He is meticulous, always on time, wears a nice tie, and does good work. Dr. John thinks only inside the box.

Dr. John wins every IQ and math test. But Fat Tony wins at life.

The Uncertainty of the Nerd

The author went to a conference in a casino where he coined the term “ludic fallacy”.

The ludic fallacy is the fallacy of games of chance – like gambling.

It works as follows.

A casino is a place without risk and uncertainty, because the rules have been established to eliminate them.

There are no Black Swans in casinos, because in the long-term, the casino always wins. As a result, if there is a place where chance doesn’t intervene…it is in casinos.

While we underestimate luck in real life, we overestimate it at games of chance.

Ironically, the casino where the author was, while having a very sophisticated system to catch cheaters, had most of its losses attributed to four Black Swans.

  1. A tiger attacked its own master during a live show.
  2. An employee tried to dynamite the casino (the plot was foiled before it happened).
  3. An employee never filled in the forms regarding big payouts to gamblers, which translated into a monstrous fine.
  4. The owner’s daughter was kidnapped for ransom.

-> the casino spent all of its risk-management money on its gambling activities, while the main risks came from outside the gambling activity.

Wrapping up Part One

Part 1 was about one problem, and one only.

The cosmetic and the Platonic rise naturally to the surface.

I will quote the author:

This is a simple extension of the problem of knowledge. It is simply that one side of Eco’s library, the one we never see, has the property of being ignored. This is also the problem of silent evidence. It is why we do not see Black Swans: we worry about those that happened, not those that may happen but did not.

We love the tangible, the confirmation, the palpable, the real (…). Most of all we favor the narrated. Alas, we are not manufactured, in our current edition of the human race, to understand abstract matters—we need context. Randomness and uncertainty are abstractions. We respect what has happened, ignoring what could have happened.

If you want to lead a better life, you need to de-narrate. Turn off the television and the news. Avoid using System 1 (fast, intuitive thinking) and rely on System 2 (slow, deliberate thinking).

Seek the difference between the sensational and the empirical.

Avoid “tunneling”.

Finally, avoid focusing when dealing with uncertainty. Keep an open mind.

Part 2: We Just Can’t Predict

Most of the technology we have today was not planned but discovered randomly. These discoveries were Black Swans.

Chapter 10: The Scandal of Prediction

On the Vagueness of Catherine’s Lover Count

Social scientists asked people questions (such as the number of Catherine the Great’s lovers) and found that 45% of the answers people were certain about turned out to be wrong.

Overestimating what we know is called epistemic arrogance. Doing so, we also underestimate uncertainty by compressing the space we should reserve for it.

In general, we are really bad at predicting because the difference between what we know and what we think we know is huge.

Just look at how many people get a divorce. They thought it would work out.

Black Swan Blindness Redux

What we think happens once in a century actually happens once every decade or so.

As a result, there is no difference in effect between guessing and predicting.

Information Is Bad for Knowledge

In general, the more educated people are, the more they overestimate how much they know, and vice versa. The cab driver is, in fact, humble about what he doesn’t know.

Information has another problem: it can be an impediment to knowledge.

The more information you give someone to solve a problem, the more hypotheses they will form, and the less likely they are to find the correct answer, because of the noise.

Why? Our ideas are sticky. Once we produce a theory, we don’t let it go.

Two biases are at play: confirmation bias and belief perseverance.

It’s not easy to differentiate information from noise.

The Expert Problem, or the Tragedy of the Empty Suit

There are two cases:

  1. Arrogance with competence
  2. Arrogance with incompetence

The first case belongs to professions where experts exist: astronomers, pilots, chess masters, etc. Things that don’t move. The theory of flying planes is fixed. That button will do x, and that lever will do y.

The second case belongs to professions where you are more likely to be right than the expert is: stockbrokers, psychologists, court judges, economists, etc. Things that move: professions based on a future that doesn’t resemble the past have no real experts.


Why? Because experts “tunnel”. Tunneling works well in a stable situation like the first one, but not in the second, where Black Swans lurk. It may be due to self-delusion.

The problem with tunneling is that you miss a bunch of stuff.

Events Are Outlandish

Experts can predict the ordinary but not the extreme. This is where they fail at predictions.

When social scientists compared experts’ predictions with reality, they noticed that not only were the experts wrong more often than random people, but they also had more faith in their predictions.

So, why do they keep their job? Because of Black Swans. They invoke the Black Swan as an excuse for not being able to prevent the event.

-> we attribute success to our skills, and failure to external events.

When predicting the future, all you can say is that history will be dominated by a big event. Time will tell which one it will be.

Studies further show that complex mathematical prediction systems gave predictions similar to the simplest ones.

The Beauty of Technology: Excel Spreadsheets

Predictions became worse with Excel spreadsheets, which further “tunnel” the prediction without leaving any room for the unexpected.

Don’t Cross A River if it Is (on Average) Four Feet Deep

Corporate and governmental predictions don’t even include a possible error rate.

Forecasting without an error rate involves three fallacies.

  1. Variability matters: maybe you forecast a price of 50 for next year, but during the year it jumps to 70 before falling to 30, then back to 50.
  2. A forecast is less likely to be correct the further out in time it goes. You can predict next year’s technology, but that of the next 100 years?
  3. The variable being forecasted is often much more random than the forecast accounts for.

The author explains that he has never made a prediction or a forecast, because he knows he can’t.

As we said, Black Swans have three attributes:

  1. Unpredictability
  2. Consequences
  3. Retrospective explainability

Let’s look at the unpredictability.

Chapter 11: How to Look for Bird Poop

How to Look for Bird Poop

Everything invented was invented out of randomness. Someone searched for something and found something else (you look for India and find America).

This is serendipity: discovering something by accident while looking for something else.

What is more striking is that people that found these inventions often did not realize how big they were – nor did their contemporaries.

The church didn’t care much about Galileo at first. Darwin’s paper, when first received, was deemed neither interesting nor revolutionary.

Forecasters fail to predict the changes brought by technology and science, and these changes also arrive more slowly than expected. Thomas Watson, longtime head of IBM, reportedly said the world would not need more than a handful of computers.

Engineers make tools that seemingly do nothing, until they lead to discoveries which lead to other discoveries.

Tools often allow you to do something they weren’t built to do at first. They’re solutions looking for problems.

While a lot of useful inventions were randomly found, a lot of inventions aimed to solve a problem ended up in the cemetery, like the Segway, which was going to “revolutionize” city transport.

How to Predict Your Predictions!

Karl Popper wrote against historicism.

The point is this: Prediction requires knowing about technologies that will be discovered in the future. But that very knowledge would almost automatically allow us to start developing those technologies right away. Ergo, we do not know what we will know.

The Nth Billiard Ball

Henri Poincaré was a French mathematician at the origin of the “butterfly effect” idea. He discovered that the slightest change in initial conditions can completely alter the final result, because of how the elements of a system interact with each other.

This is another reason why it’s impossible to predict the future, as explained by the Austrian economist Hayek.

Hayek said a real forecast should take the whole system into account. You can’t predict the economy by just studying the economy. You need to take society into account.

How not to Be a Nerd

Let’s speak about the difference between Dr. John and Fat Tony. Dr. John will learn a language by studying the grammar rules in a book, while Fat Tony will do so by talking to people.

No one first wrote the grammar of a language and then taught it to people. Language evolved organically; grammar was written down only after the language had developed.

This is why “tunneling” knowledge is a problem. It’s learning knowledge in a non-dynamic and non-flexible way, and it doesn’t fit the messy and ever-changing real world.

Forget about theory and the idea of applying theoretical ideas to practice. Do the opposite. Go from practice, and find a way to theorize about it. What matters is the results.

We may not understand how acupuncture works, but we know that it does. And it’s what matters.

Prediction and Free Will

If you know all possible conditions of a physical system, you can in theory project its behavior in the future. But this is only for inanimate objects.

Once humans, living beings with free will, are involved, we end up with a complete inability to do so.

If someone can predict everything that someone else will do, that person is not as free as they think they are.

-> successful predictions would mean that free will is merely the interactions between elements.

You can’t predict how people will act, except with one trick: assuming rationality. Economics pretends it can predict because it pretends actors are rational.

However, we have seen now that people aren’t rational, which means that the bulk of economics cannot be applied to the real world.

The Grueness of Emerald

Recall the turkey. Observing past data to predict events can lead to two opposite theories. The fact that you are alive now could mean that:

  1. You are immortal.
  2. You are closer to death.

The riddle of induction distinguishes between linear and nonlinear models.

Two models, one linear and one nonlinear, can fit the same past observations yet predict different results.
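A minimal sketch of this riddle (the data points and the two model formulas are invented for illustration): two models that agree on every past observation can still disagree about the future.

```python
# Hypothetical past observations that both models must fit.
observed = [(1, 1), (2, 2), (3, 3)]

def linear(x):
    return x                                  # model A: a straight line

def nonlinear(x):
    return x + (x - 1) * (x - 2) * (x - 3)    # model B: a cubic through the same points

# Both models match every past observation exactly...
assert all(linear(x) == y and nonlinear(x) == y for x, y in observed)

# ...yet they disagree about the next, unobserved point.
print(linear(4), nonlinear(4))  # prints: 4 10
```

Past data alone cannot tell you which model is the right one; only the future does, and by then it is too late.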

That Great Anticipation Machine

So, if we cannot predict, why do we plan? It may be simply because we are humans.

Projecting into the future lets us anticipate danger before it kills us.

Why do we listen to experts? Because of specialization. You go to the doctor when you are sick, to the car mechanic when your car breaks down, etc.

Chapter 12: Epistemocracy, A Dream

Someone with epistemic humility will doubt his own knowledge until exhaustion.

The author calls such a person an epistemocrat, and a place where people do that, epistemocracy.

Montaigne was one of them.

The Past’s Past, and the Past’s Future

There is an asymmetry between the past and the future: the future is too fuzzy for us to imagine.

The first consequence of this asymmetry is that people do not learn that the relationship between the past and the future is the same as the relationship between the past and its own past.


We imagine that the solutions of our problems of today are definitive solutions, without imagining that people in the past also had definitive solutions.

We laugh at people in the past, without realizing that people in the future will laugh at us.

This is called “future blindness”.

Prediction, Misprediction, and Happiness

One consequence of misprediction is the hedonic treadmill. You buy a new car thinking it will change your life, and three months later, nothing has really changed.

And the worst part is that you knew that, but you still made the mistake.

Studies have shown that we overestimate the effect that future events will have on our lives.

It’s not that we mispredict, but that we have a hard time learning from our past mistakes.

Furthermore, we’re not really good at predicting the past either – it may even be harder!

Eg: imagine an ice cube in a hot room. You can predict it will melt into a puddle of water (the forward process). Now imagine you find a puddle of water. Where did it come from? It’s not easy to deduce that it came from an ice cube (the backward process).

In this case, the forward process (used in physics, chemistry, etc) is easier to predict than the backward process (used in history).

Now, let’s introduce the notion of non-linearity. As we have seen, a butterfly can create a hurricane. If you find the hurricane, can you trace it all the way back to the butterfly?

Once Again, Incomplete Information

In theory, randomness is an intrinsic property. In practice, randomness is incomplete information, called opacity.

The world is random because it is extremely complex, and we can’t comprehend this complexity.

What They Call Knowledge

History is great, you can learn from it and gain a lot of insight. But since it is fundamentally incomplete, you can’t derive any theories from it.

Unfortunately, most (if not all) historians are plagued by the narrative fallacy.

Chapter 13: Appelles The Painter, Or What Do You Do If You Cannot Predict?

Advice Is Cheap, Very Cheap

Be human. Accept that being human comes with a volume of biases and reflexes inherent to who we are. This may prevent us from predicting well, but this is who we are. Embrace it, don’t fight it.

Meanwhile, don’t avoid predicting things like the weather for tomorrow’s picnic, etc. Just avoid being dependent on big predictions with Black Swan risks (what the economy will look like in 10 years).

Don’t rank beliefs and predictions by their plausibility, but by how much harm they could cause.

Be Prepared

Just because you cannot predict doesn’t mean you cannot benefit from unpredictability.

Avoid narrow-minded predictions and take randomness into account.

Greek doctors thought they should leave room for luck in their diagnoses: a patient might turn out to be cured by eating some particular food.

-> maximize serendipity!

If you want to find something randomly, you are going to have to try a lot.

Small failures are necessary in life.

Volatility and Risk of Black Swan

People don’t like losing so they do stuff with little volatility.

The problem is that there is a tradeoff between volatility and risk.

Eg: a job at IBM. Your salary is guaranteed every month, until the day you get fired.

As a result, it can be safer to be a consultant whose income fluctuates but who is unlikely to lose all his clients at once.

Likewise, dictatorships that appear stable are more at risk than democracies that appear unstable, like Italy.

Barbell Strategy

If you know that your predictions are wrong because of Black Swans, you should become as hyperconservative and hyperaggressive as you can be.

Put 85-90% of your money in extremely safe investments, and keep 10-15% for extremely risky investments – but where the payout is huge.

Instead of having medium risk, you have no risk on one side, and high risk on the other.
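The allocation can be sketched in a few lines (the function and the example numbers are illustrative, not the author’s; only the 85-90% / 10-15% split comes from the book):

```python
def barbell(total, safe_fraction=0.90):
    """Split a sum barbell-style: a hyperconservative bulk plus a small
    hyperaggressive remainder. The 85-90% / 10-15% range follows the book;
    the function itself is just an illustration."""
    safe = total * safe_fraction   # e.g. treasury bills: almost no downside
    risky = total - safe           # e.g. speculative bets with huge potential payouts
    return safe, risky

safe, risky = barbell(10_000)
# Even if every risky bet goes to zero, the loss is capped at 10% of the total.
max_loss_pct = 100 * risky / (safe + risky)
print(safe, risky, max_loss_pct)  # prints: 9000.0 1000.0 10.0
```

The point of the split is that the worst case is known in advance, while the upside from the risky side remains unbounded.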

Let’s see how you can apply this to life.

  1. Distinguish between undertakings where unpredictability can be positive and those where it is negative. Eg: the movie business or venture capital are sectors where Black Swans are positive. In the military, Black Swans are destructive. When your loss is limited, be as aggressive as you can be.
  2. Don’t look for the precise and the local. Be open-minded, look at the big picture, and give chance a chance. Do not try to predict Black Swans. Prepare for them instead.
  3. Seize any opportunity or anything that looks like one. If someone “big” schedules a meeting with you, go! Work hard at exposing yourself to these opportunities. Maximize serendipity. This is why living in big cities is great – it maximizes these outcomes. Go to parties!
  4. Beware of specific governmental plans. Let them predict, but do not count on their predictions.
  5. Do not waste your time fighting forecasters, stock analysts, economists, and social scientists (except to play pranks on them). There are some people that need to know something before we can tell them.

The Great Asymmetry

All of these have one thing in common: asymmetry. Put yourself in situations where the asymmetry is positive, and as big as possible.

You can’t predict events, but you know if the consequences are positive or negative. Getting positive consequences is all that matters.

Part III: Those Gray Swans of Extremistan

We have four final items to talk about.

  1. The world is moving towards Extremistan from Mediocristan.
  2. The Bell curve is a delusion.
  3. Fractal randomness concerns Black Swans that we can partially prepare for.
  4. The ideas of philosophers that focus on phony uncertainty.

Chapter 14: From Mediocristan To Extremistan, And Back

The Matthew Effect

An initial advantage follows someone through their entire life.

Eg: scientist A reads a paper and quotes three of its sources, more or less at random, in his own work. Scientist B reads A’s paper, sees those sources quoted, and quotes them too, and so on.

These three sources become famous out of luck.

This effect is called cumulative advantage. The more you win, the more you win. Unfortunately, failure is also cumulative.

Zipf’s law: the probability that you use a word depends on how often you have used it in the past. The more you use a word, the more you will use it.

-> The big get bigger and the small stay small.

The same thing happened with English as a worldwide language. Because it seemed to be the first one to learn, more and more people flocked to learn it.
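Cumulative advantage can be illustrated with a toy simulation (all parameters are invented for illustration, this is not a model from the book): each new citation goes to an actor with probability proportional to the citations they already have.

```python
import random

def cumulative_advantage(n_actors=10, n_events=1000, seed=42):
    """Preferential attachment: everyone starts with one citation, and each
    new citation picks an actor with probability proportional to their
    current count. A sketch of the Matthew effect."""
    rng = random.Random(seed)
    counts = [1] * n_actors
    for _ in range(n_events):
        winner = rng.choices(range(n_actors), weights=counts)[0]
        counts[winner] += 1
    return sorted(counts, reverse=True)

# A few actors typically end up with most of the citations; the rest stay small.
print(cumulative_advantage())
```

Even though every actor starts out identical, small early differences in luck compound: the big get bigger and the small stay small.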

Nobody Is Safe In Extremistan

These models are interesting, but they assume that winners remain winners. In reality, a winner may become a loser due to randomness. Nobody is safe.

In capitalist countries, companies rise, then get destroyed (socialist countries protect their monopolies). Cities rise, then get destroyed. Etc.

Why? It’s randomness. Nobody is safe. In a way, randomness is super egalitarian.

While nobody is safe, nobody is completely threatened either.

You can always get exposure to a positive Black Swan and become big too.

The web, for example, while enabling a level of total domination by one actor that wouldn’t be possible otherwise (eg: Google), also enables fragmentation. Hyper-specialized people can find a specific audience and build a small niche.

Eg: Yevgenia Krasnova.

The long tail is made up of many small players, each specialized in their own domain. The mainstream is dominated by a few giant actors.

In a way, the long tail brings welcome diversity.

The long tail is a consequence of Extremistan that makes things less unfair.

Naive Globalization

We are entering a period of disorder, made of stretches of peace punctuated by a few Black Swans.

The 20th century introduced Extremistan warfare: fewer wars, but wars capable of destroying everything.

Globalization, because it connects everything to everything, makes a global collapse possible. It reduces volatility and gives the impression of stability, but what it really does is create the conditions for devastating Black Swans.

The merging of banks makes financial crises less likely, but deadlier.

Chapter 15: The Bell Curve, That Great Intellectual Fraud

The Gaussian And The Mandelbrotian

The bell curve, also called the Gaussian curve or normal distribution, is the familiar symmetric hump centered on the average.

The Increase in the Decrease

The main point of this curve is to show that the majority revolves around the mediocre, the average (it’s the curve of Mediocristan).

The odds of deviation decline faster and faster as you move away from the average.

The key question is this: how fast does it decline?

Let’s take the bell curve for average size in men.

| Centimeters taller than the average (1m67) | Chances it actually happens |
| --- | --- |
| 10 cm | 1/6.3 |
| 20 cm | 1/44 |
| 30 cm | 1/740 |
| 40 cm | 1/32,000 |
| 50 cm | 1/3,500,000 |
| 60 cm | 1/1,000,000,000 |

Look at the difference between 50 cm and 60 cm: a mere 10 cm makes the odds jump from 1 in 3,500,000 to 1 in 1,000,000,000. This is why the bell curve ignores outliers (Black Swans), and why we cannot apply it to Extremistan.
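The collapse of the odds can be reproduced with a quick computation. Assuming, to match the book’s table, a standard deviation of 10 cm (each row is then simply 1, 2, …, 6 standard deviations above the mean):

```python
from math import erfc, sqrt

def tail_odds(cm_above_mean, sd=10.0):
    """Odds (1 in n) of standing at least `cm_above_mean` cm above the average,
    under a Gaussian. The 10 cm standard deviation is an assumption chosen
    to reproduce the book's table."""
    z = cm_above_mean / sd
    p = 0.5 * erfc(z / sqrt(2))   # P(Z > z) for a standard normal
    return 1 / p

for cm in (10, 20, 30, 40, 50, 60):
    print(f"{cm} cm above average: 1 in {tail_odds(cm):,.0f}")
```

Each extra 10 cm multiplies the rarity by an ever-larger factor: that accelerating decline is exactly what makes Gaussian tails negligible, and Black Swans invisible to the model.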

The Mandelbrotian

By comparison, the following table shows a simplified version of chances to be rich in Europe.

| People with a net worth equal to or higher than | Chances it actually happens |
| --- | --- |
| €1 million | 1/62.5 |
| €2 million | 1/250 |
| €4 million | 1/1,000 |
| €8 million | 1/4,000 |
| €16 million | 1/16,000 |
| €32 million | 1/64,000 |
| €320 million | 1/6,400,000 |

The speed of the decline remains constant. Whenever you double the money, you divide the chances by four.


These two tables show the difference between Mediocristan and Extremistan. The second table is scalable. The first one isn’t. Scalable laws are called power laws.
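The wealth table follows a single scalable rule. A sketch (the function name and its calibration to the chapter’s simplified numbers are mine):

```python
def powerlaw_odds(wealth_millions, base_odds=62.5, exponent=2):
    """Odds (1 in n) of a net worth of at least `wealth_millions` million euros
    under the chapter's simplified power law: 1 in 62.5 at 1 million euros,
    with each doubling of wealth quadrupling the rarity (tail exponent 2)."""
    return base_odds * wealth_millions ** exponent

for m in (1, 2, 4, 8, 16, 32, 320):
    print(f"{m} million euros: 1 in {powerlaw_odds(m):,.0f}")
```

Because the decline is the same at every scale, extreme fortunes are rare but never astronomically impossible, unlike extreme heights under the Gaussian.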

This is why the Gaussian curve doesn’t take extremes into account, while the scalable, Mandelbrotian curve does.


The question now is whether you are dealing with a scalable distribution (Extremistan) or a Gaussian one (Mediocristan).

If you take two authors who together sell 1 million books, the distribution is likely to be 993,000 copies for one and 7,000 for the other (Extremistan).

But if you have to guess the heights of two people who together measure 3 meters 60, they are most likely both 1m80 (Mediocristan)!

The reason why people like the Gaussian curve is that it is predictable.

The problem is that they apply the curve to the wrong disciplines.

Chapter 16: The Aesthetics Of Randomness

The author discusses fractals and their inventor, Benoit Mandelbrot. Fractals are geometric figures whose patterns repeat at every scale.

The author uses fractals because the idea of scalability can help us turn Black Swans into Grey Swans. That is, it can help us estimate how big future Black Swans might be.

Recall the Mandelbrotian curve. Things scale in Extremistan.

If things scale, it means that big numbers are possible -> there is no maximum limit.

Eg: The Da Vinci Code sold 60 million copies. We can therefore imagine that a book could sell 200 million copies. Not likely, but not impossible.

The same can be applied to wealth. One person could one day be worth €500 billion.

-> you can make predictions of things you don’t see in the data.

Now, a stone is similar to a mountain. Not the same, but similar. Likewise, the distribution of wealth above €1 billion is not the same as, but similar to, the distribution of wealth in society at large.

Where Is The Gray Swan?

Fractal randomness helps us understand that things scale, hence helps us make many Black Swans “predictable”.

Eg: if you know that the stock market can crash, a crash is no longer a Black Swan. It’s a Grey Swan. You can predict the event, just not when it will happen (or how big it will be).

Chapter 17: Locke’s Madmen, Or Bell Curves In The Wrong Places

The main problem today is that we use methods belonging in Mediocristan and loosely apply them to Extremistan, hence disregarding Black Swans.

Find below two ways to approach randomness.

| Skeptical Empiricism and the a-Platonic School | The Platonic Approach |
| --- | --- |
| Interested in what lies outside the Platonic fold. | Focuses on the inside of the Platonic fold. |
| Respect for those who have the guts to say “I don’t know”. | “You keep criticizing these models. These models are all we have.” |
| Thinks of Black Swans as a dominant source of randomness. | Thinks of ordinary fluctuations as a dominant source of randomness, with jumps as an afterthought. |
| Prefers to be broadly right. | Prefers to be precisely wrong. |
| Minimal theory; considers theorizing a disease to resist. | Everything needs to fit some grand, general socioeconomic model and “the rigor of economic theory”; frowns on the “descriptive”. |
| Develops intuitions from practice; goes from observations to books. | Relies on scientific papers; goes from books to practice. |
| Assumes Extremistan as a starting point. | Assumes Mediocristan as a starting point. |

Chapter 18: The Uncertainty Of The Phony

Philosophers are dangerous because, while they understand the difference between Extremistan and Mediocristan in their writings, they don’t apply these ideas in real life, and invest in an efficient-portfolio-theory-driven pension fund.

The reason is that philosophers study philosophy for the sake of studying philosophy. They start from philosophy and then apply it to real life, instead of doing the opposite.

Philosophers focus on problems that don’t matter (how to conceptualize x or y) instead of focusing on things that do matter, eg: how to avoid being a sucker in the stock market.

Let’s have a look then, at how to act.

Part IV: The End

Chapter 19: Half And Half, Or How To Get Even With The Black Swan

The author sometimes hates Black Swans when these are crises, and loves them when they are random surprises.

He worries less about small failures, and more about big failures; he worries less about advertised risks, and more about the hidden ones.

He worries less about embarrassment than about missed opportunities.

He doesn’t run for trains, as missing a train is only painful if you run after it. The same can be said about refusing to be what others expect you to be.

You have more control over your life if you decide the criterion by yourself.

We often get angry over petty problems because we forget the great scheme of things. Being alive in such an immense universe is already in itself, a remarkable positive Black Swan.


Yevgenia Krasnova took eight years to write a second book. Everyone was waiting for it with great impatience.

When it came out, critics were rather pleased, and everyone talked about it.

Yet, no one bought the book.

Another Black Swan for Yevgenia Krasnova.

Postscript Essay: On Robustness and Fragility, Deeper Philosophical and Empirical Reflections

In the second edition of The Black Swan, the author added a series of essays going further into the fragility of systems.

These ideas were subsequently integrated into the book Antifragile, and hence are not summarized here.


  • Post category: Summaries
  • Post last modified: May 26, 2022