Summary of The Black Swan by Nassim Taleb

  • Post category:Summaries
  • Post last modified:September 18, 2023

Chapter 8: Giacomo Casanova’s Unfailing Luck: The Problem Of Silent Evidence

Black Swans are hidden in silent evidence.

The Story Of The Drowned Worshippers

If a sailor on a sinking ship prays and is saved, he may conclude that praying will save people from drowning.

The silent evidence is all the people who prayed…yet drowned.

Silent evidence pervades and changes history, as it is never taken into account.

History, as it is understood, is a succession of events that have been seen. It misses all of the silent events, and is therefore incomplete, because of the way we gather evidence.

It’s a bias, and it’s constantly distorting what we are observing.

It often appears with successful people in Extremistan (artists, traders, etc): “if you work as hard as I did, you can make it too”.

What about the people that worked harder…and yet failed?

The Cemetery of Letters

It is said that the Phoenicians never wrote books – because we never found any. In fact, they wrote on a type of perishable papyrus, so their writings simply disappeared.

Silent evidence matters when we look at how winners became winners. Often, it’s not that they were superior or harder-working. Their accounts alone aren’t sufficient; we also need to hear from the people who worked just as hard but failed.

How to Become a Millionaire in Ten Steps

Biographies of millionaires highlight how they became rich – working hard, being courageous, etc.

But we don’t know about all of the people that were as courageous and hard-working and yet that did not make it.

The difference between the winners and losers? Luck.

The silent evidence can be applied to many more domains.

In politics, you will see the benefits of a law, but never the downsides.

Eg: social laws protect those who have jobs, but make finding a job harder for those who don’t have one.


A doctor has no incentive to prescribe a drug that will save more people than it kills: if you’re saved, you don’t say thank you, but if you die, your family will go to court.

The Teflon-style Protection Of Giacomo Casanova

Giacomo Casanova believed all his life that he was immune to problems as every time he struggled, life would somehow save him.

The reason why there was only one Casanova is that everyone else failed (or died) where he succeeded. We won’t hear from all of the Casanovas who lost, but so many tried that, statistically, one of them was bound to keep winning until the end.
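To make the point concrete, here is a small simulation (a sketch of the idea, not from the book): give a large crowd of Casanovas a 50/50 chance of surviving each scrape with disaster. After enough rounds, a few unbroken lucky streaks almost always remain, purely by chance.

```python
import random

random.seed(42)  # for reproducibility

def lucky_survivors(n_players: int, n_rounds: int) -> int:
    """Count players who survive every round by pure luck (50/50 per round)."""
    count = 0
    for _ in range(n_players):
        if all(random.random() < 0.5 for _ in range(n_rounds)):
            count += 1
    return count

# 100,000 Casanovas, 10 scrapes with disaster: we expect about
# 100,000 / 2**10 ≈ 98 survivors who look "unfailingly lucky".
print(lucky_survivors(100_000, 10))
```

Each survivor, looking back, sees a flawless streak; the silent evidence is the roughly 99,900 others who did not make it.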

(Illustration: survivorship bias and silent evidence.)

This can be applied to ourselves.

  1. The luck that got us here will not necessarily carry us the rest of the way. At some point, almost everyone runs out of luck.
  2. Evolutionary fitness does not take into account how we could have been. Evolution is not driven to optimize, but is a series of random events.

I Am A Black Swan: The Anthropic Bias

The religious argument is that the chances for life to develop in the universe were so low that it could not have happened by luck.

But are they so low? If we take all of the galaxies and all of the stars, aren’t we bound to find at least one place where life could happen?

This can be applied to lottery winners, actors, and old people: the fact that they lived up to then was likely due to luck as well.

Never look at the odds of succeeding from the winner’s point of view. Always look at them from the number of people that tried to win in the first place. The more people play, the more chances there are to have a winner.

The Cosmetic

This realization shows how futile explaining success is.

School shames students for saying “I don’t know”, but “I don’t know” is the right answer most of the time. Things happen randomly, and often there is simply no explanation for them – especially in history.

This should not discourage you from seeking causes. Just be careful: not everything has an explanation.

Silent evidence (and seen evidence) shapes how we perceive the Black Swan, making us overestimate it at times (terrorism fears after an attack) or underestimate it (terrorism fears before one).

Chapter 9: The Ludic Fallacy, or the Uncertainty of the Nerd

Fat Tony

Fat Tony is the guy who knows everyone in the neighborhood, gets free first-class seats, and finds a table in a full restaurant. Fat Tony doesn’t do math. His success comes from thinking outside the box.

Dr. John is the opposite of Fat Tony. He is meticulous, always on time, wears a nice tie, and does good work. Dr. John thinks only inside the box.

Dr. John wins every IQ and math test. But Fat Tony wins at life.

The Uncertainty of the Nerd

The author went to a conference in a casino where he coined the term “ludic fallacy”.

The ludic fallacy is the fallacy of games of chance – like gambling.

It works as follows.

A casino is a place where there is no risk and uncertainty because rules have been established to avoid them.

There are no Black Swans in casinos, because in the long-term, the casino always wins. As a result, if there is a place where chance doesn’t intervene…it is in casinos.

While we underestimate luck in real life, we overestimate it at games of chance.

Ironically, the casino where the author spoke, while having a very sophisticated system to catch cheaters, had most of its losses come from four Black Swans.

  1. A tiger attacked its own master during a live show.
  2. An employee tried to dynamite the casino (the plot was foiled before it happened).
  3. An employee never filed the forms regarding big payouts to gamblers, which translated into a monstrous fine.
  4. The owner’s daughter was kidnapped and held for ransom.

-> the casino spent all of its risk-management money on its gambling activities, while the main risks came from outside the gambling activity.

Wrapping up Part One

Part 1 was about one problem, and one only.

The cosmetic and the Platonic rise naturally to the surface.

I will quote the author:

This is a simple extension of the problem of knowledge. It is simply that one side of Eco’s library, the one we never see, has the property of being ignored. This is also the problem of silent evidence. It is why we do not see Black Swans: we worry about those that happened, not those that may happen but did not.

We love the tangible, the confirmation, the palpable, the real (…). Most of all we favor the narrated. Alas, we are not manufactured, in our current edition of the human race, to understand abstract matters—we need context. Randomness and uncertainty are abstractions. We respect what has happened, ignoring what could have happened.

If you want to lead a better life, you need to de-narrate. Turn off the television and the news. Avoid relying on System 1; favor System 2.

Seek the difference between the sensational and the empirical.

Avoid “tunneling”.

Finally, avoid focusing when dealing with uncertainty. Keep an open mind.

Part 2: We Just Can’t Predict

Most of the technology we have today was not planned, but discovered by accident. These discoveries were Black Swans.

Chapter 10: The Scandal of Prediction

On the Vagueness of Catherine’s Lover Count

Social scientists found that 45% of people guessed wrong on questions they were certain to answer correctly.

Overestimating what we know is called epistemic arrogance. Doing so, we also underestimate uncertainty by compressing the space we should reserve for it.

In general, we are really bad at predicting because the difference between what we know and what we think we know is huge.

Just look at how many people get a divorce. They thought it would work out.

Black Swan Blindness Redux

What we think happens once in a century actually happens once every decade or so.

As a result, there is no difference in effect between guessing and predicting.

Information Is Bad for Knowledge

In general, the more educated a person is, the more they overestimate how much they know – and vice versa. The cab driver is, in fact, humble about what he doesn’t know.

Information has another problem: it can impede knowledge.

The more information you give someone to solve a problem, the more hypotheses they will form, and the less likely they are to find the correct answer, because of the noise.

Why? Our ideas are sticky. Once we produce a theory, we don’t let it go.

Two biases are at play: confirmation bias and belief perseverance.

(Illustration: noise vs information.)
It’s not easy to differentiate information from noise.

The Expert Problem, or the Tragedy of the Empty Suit

There are two cases:

  1. Arrogance with competence
  2. Arrogance with incompetence

The first case belongs to professions where experts exist: astronomers, pilots, chess masters, etc. Things that don’t move. The theory of flying planes is fixed. That button will do x, and that lever will do y.

The second case belongs to professions where you are more likely to be right than the expert is: stockbrokers, psychologists, court judges, economists, etc. Things that move, professions based on a future that doesn’t resemble the past don’t have experts.


Why? Because experts “tunnel”. Tunneling works well in a stable situation like the first, but not in the second, where Black Swans lurk. It may be due to self-delusion.

(Illustration: the problem with tunneling.)
The problem with tunneling is that you miss a whole bunch of stuff.

Events Are Outlandish

Experts can predict the ordinary but not the extreme. This is where they fail at predictions.

When social scientists compared experts’ predictions with reality, they noticed that the experts were not only wrong more often than random guessers, but also had more faith in their predictions.

So, why do they keep their jobs? Because of Black Swans. They invoke the Black Swan as an excuse for failing to predict the event.

-> we attribute success to our skills, and failure to external events.

When predicting the future, all you can say is that history will be dominated by a big event. Time will tell which one it will be.

Studies further show that complex mathematical prediction systems perform no better than the simplest ones.

The Beauty of Technology: Excel Spreadsheets

Predictions became worse with Excel spreadsheets, which further “tunnel” the forecast without leaving any room for the unexpected.

Don’t Cross A River if it Is (on Average) Four Feet Deep

Corporate and governmental predictions don’t even include a possible error rate.

Forecasting without error rate contains three fallacies.

  1. Variability matters: maybe you forecast a price of 50 for next year, but during the year it jumps to 70 before falling to 30, then back to 50.
  2. A forecast becomes less likely to be correct the further out it goes. You can predict next year’s technology, but that of the next 100 years?
  3. The variable being forecast is often much more random than the forecast accounts for.

The author explains he never made a prediction nor forecasted anything because he knows he can’t.

As we said, Black Swans have three attributes:

  1. Unpredictability
  2. Consequences
  3. Retrospective explainability

Let’s look at the unpredictability.

Chapter 11: How to Look for Bird Poop

How to Look for Bird Poop

Almost everything we invented was found through randomness. Someone searched for one thing and found another (you look for India and find America).

This is serendipity: discovering something by accident while looking for something else.

What is more striking is that people that found these inventions often did not realize how big they were – nor did their contemporaries.

The church didn’t care much about Galileo at first. Darwin’s paper, when first received, wasn’t deemed interesting or revolutionary.

Forecasters don’t predict the changes brought by technology and science – and these changes also arrive more slowly than expected. Thomas Watson, who led IBM, reportedly said the world would not need more than a few computers.

Engineers make tools that seemingly do nothing, until they lead to discoveries which lead to other discoveries.

Tools often allow you to do something they weren’t built to do at first. They’re solutions looking for problems.

While a lot of useful inventions were randomly found, a lot of inventions aimed to solve a problem ended up in the cemetery, like the Segway, which was going to “revolutionize” city transport.

How to Predict Your Predictions!

Karl Popper wrote against historicism.

The point is this: Prediction requires knowing about technologies that will be discovered in the future. But that very knowledge would almost automatically allow us to start developing those technologies right away. Ergo, we do not know what we will know.

The Nth Billiard Ball

Henri Poincaré was a French mathematician at the origin of the “butterfly effect” idea. He discovered that the slightest change in initial conditions can alter the entire outcome, because of how the elements of a system interact.

This is another reason why it’s impossible to predict the future, as explained by the Austrian economist Hayek.

Hayek said a real forecast should take the whole system into account. You can’t predict the economy by just studying the economy. You need to take society into account.

How not to Be a Nerd

Let’s speak about the difference between Dr. John and Fat Tony. Dr. John will learn a language by studying the grammar rules in a book, while Fat Tony will do so by talking to people.

No one first wrote the grammar of a language then taught it to people. Language evolved organically, the grammar was written after it had developed.

This is why “tunneling” knowledge is a problem. It’s learning knowledge in a non-dynamic and non-flexible way, and it doesn’t fit the messy and ever-changing real world.

Forget about theory and the idea of applying theoretical ideas to practice. Do the opposite. Go from practice, and find a way to theorize about it. What matters is the results.

We may not understand how acupuncture works, but we know that it does. And that’s what matters.

Prediction and Free Will

If you know all possible conditions of a physical system, you can in theory project its behavior in the future. But this is only for inanimate objects.

Once humans, living beings with free will, are involved, we end up with a complete inability to do so.

If someone can predict everything that someone else will do, that person is not as free as they think they are.

-> successful predictions would mean that free will is merely the interactions between elements.

You can’t predict how people will act – except with a trick: rationality. Economics pretends it can predict because it assumes actors are rational.

However, we have seen now that people aren’t rational, which means that the bulk of economics cannot be applied to the real world.

The Grueness of Emerald

Recall the turkey. Observing past data to predict events can lead to two opposite theories. The fact that you are alive now could mean that:

  1. You are immortal.
  2. You are closer to death.

The riddle of induction distinguishes between linear and nonlinear models.

(Illustration: linear vs nonlinear models – A is linear, B isn’t.)

The two models will predict different results.

That Great Anticipation Machine

So, if we cannot predict, why do we plan? It may be simply because we are humans.

Projecting into the future enables us to anticipate threats, including those that could kill us.

Why do we listen to experts? Because of specialization. You go to the doctor when you are sick, to the car mechanic when your car breaks down, etc.

Chapter 12: Epistemocracy, A Dream

Someone with epistemic humility will doubt his own knowledge until exhaustion.

The author calls such a person an epistemocrat, and a place where people do that, epistemocracy.

Montaigne was one of them.

The Past’s Past, and the Past’s Future

There is an asymmetry between the past and the future: the future is too fuzzy for us to imagine.

The first consequence of this asymmetry is that we fail to learn that the relationship between the past and the future mirrors the one between the past and its own past.

(Illustration: timeline of different periods.)

We imagine that the solutions to our problems of today are definitive solutions, without imagining that people in the past also had definitive solutions.

We laugh at people in the past, without realizing that people in the future will laugh at us.

This is called “future blindness”.

Prediction, Misprediction, and Happiness

One consequence of misprediction is the hedonic treadmill. You buy a new car thinking it will change your life, and three months later, nothing has really changed.

And the worst part is that you knew that, but you still made the mistake.

Research shows that we overestimate the effect that future events will have on our lives.

It’s not that we mispredict, but that we have a hard time learning from our past mistakes.

Furthermore, we’re not really good at predicting the past either – it may even be harder!

Eg: imagine an ice cube in a hot room. You can predict that it will melt into water (the forward process). Now imagine you find a puddle of water. Where did it come from? It’s not easy to deduce that it came from an ice cube (the backward process).

In this case, the forward process (used in physics, chemistry, etc) is easier to predict than the backward process (used in history).

Now, let’s introduce the notion of non-linearity. As we have seen, a butterfly can create a hurricane. If you find the hurricane, can you come back all the way to the butterfly?
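The butterfly-to-hurricane question can be illustrated with the logistic map, a classic toy model of chaotic dynamics (my example, not the book’s): two starting points differing by one millionth end up on completely different trajectories, so running the film backward from the outcome to the initial nudge is hopeless.

```python
def logistic(x: float, r: float = 4.0) -> float:
    """One step of the logistic map, a simple nonlinear (chaotic) system."""
    return r * x * (1 - x)

a, b = 0.400000, 0.400001   # nearly identical initial conditions
gaps = []
for _ in range(50):
    a, b = logistic(a), logistic(b)
    gaps.append(abs(a - b))

# The one-in-a-million initial gap blows up to the same order of
# magnitude as the values themselves.
print(f"initial gap: 1e-06, largest gap seen: {max(gaps):.3f}")
```

The forward step is trivial to compute; recovering the initial condition from a late state is, in practice, impossible.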

Once Again, Incomplete Information

In theory, randomness is an intrinsic property. In practice, randomness is incomplete information, called opacity.

The world is random because it is extremely complex, and we cannot comprehend this complexity.

What They Call Knowledge

History is great, you can learn from it and gain a lot of insight. But since it is fundamentally incomplete, you can’t derive any theories from it.

Unfortunately, most (if not all) historians are plagued by the narrative fallacy.

Chapter 13: Appelles The Painter, Or What Do You Do If You Cannot Predict?

Advice Is Cheap, Very Cheap

Be human. Accept that being human comes with a volume of biases and reflexes inherent to who we are. This may prevent us from predicting well, but this is who we are. Embrace it, don’t fight it.

Meanwhile, don’t avoid predicting things like the weather for tomorrow’s picnic, etc. Just avoid being dependent on big predictions with Black Swan risks (what the economy will look like in 10 years).

Don’t rank beliefs and predictions by their plausibility, but by how much harm they could cause.

Be Prepared

It’s not because you cannot predict that you cannot benefit from unpredictability.

Avoid narrow-minded predictions and take randomness into account.

Greek doctors thought they should leave room for luck in their diagnoses – a patient might turn out to be cured by eating some particular food.

-> maximize serendipity!

If you want to find something randomly, you are going to have to try a lot.

Small failures are necessary in life.

Volatility and Risk of Black Swan

People don’t like losing so they do stuff with little volatility.

The problem is that there is a tradeoff between volatility and risk.

Eg: a job at IBM. Your salary is guaranteed every month – until you get fired.

As a result, it’s much safer to be a consultant whose income fluctuates but who doesn’t get fired.

Likewise, dictatorships that appear stable are more at risk than democracies that aren’t, like Italy.

Barbell Strategy

If you know that your predictions are wrong because of Black Swans, you should become as hyperconservative and hyperaggressive as you can be.

Put 85-90% of your money in extremely safe investments, and keep 10-15% for extremely risky investments – but where the payout is huge.

Instead of having medium risk, you have no risk on one side, and high risk on the other.
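The barbell can be sketched in a few lines of code. This is an illustration using the summary’s 90/10 split; the 2% safe yield and the 50x payoff multiplier are assumptions for the example, not figures from the book.

```python
def barbell_outcome(capital: float, safe_frac: float,
                    risky_multiplier: float, safe_rate: float = 0.02):
    """Worst-case and best-case sketch of a barbell allocation.

    `safe_frac` of the capital goes to near-riskless assets (assumed to
    yield `safe_rate`); the rest goes to bets that can go to zero or pay
    off `risky_multiplier` times.
    """
    safe = capital * safe_frac * (1 + safe_rate)
    risky = capital * (1 - safe_frac)
    worst = safe                          # every risky bet goes to zero
    best = safe + risky * risky_multiplier
    return worst, best

worst, best = barbell_outcome(100_000, 0.90, 50)
# The maximum loss is capped (~8% here), while the upside is open-ended.
print(f"worst: {worst:,.0f}  best: {best:,.0f}")
```

The design point: the worst case is known in advance and bounded by construction, so a negative Black Swan cannot wipe you out, while a positive one can still pay off massively.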

Let’s see how you can apply this to life.

  1. Make a difference between undertakings where unpredictability can be positive and where it is negative. Eg: the movie business or being a VC are sectors where Black Swans are positive. In the military, Black Swans are destructive. When your loss is limited, be as aggressive as you can be.
  2. Don’t look for the precise and the local. Be open-minded, look at the big picture, and give chance a chance. Do not try to predict Black Swans. Prepare for them instead.
  3. Seize any opportunity or anything that looks like one. If someone “big” schedules a meeting with you, go! Work hard at exposing yourself to these opportunities. Maximize serendipity. This is why living in big cities is great – it maximizes these outcomes. Go to parties!
  4. Beware of specific governmental plans. Let them predict, but do not count on their predictions.
  5. Do not waste your time fighting forecasters, stock analysts, economists, and social scientists (except to play pranks on them). Some things, people need to find out for themselves before you can tell them.

The Great Asymmetry

All of these have one thing in common: asymmetry. Put yourself in situations where the asymmetry is positive, and as big as possible.

You can’t predict events, but you know if the consequences are positive or negative. Getting positive consequences is all that matters.

Part III: Those Gray Swans of Extremistan

We have four final items to talk about.

  1. The world is moving from Mediocristan towards Extremistan.
  2. The Bell curve is a delusion.
  3. Fractal randomness concerns the Black Swans we can mitigate.
  4. The ideas of philosophers that focus on phony uncertainty.

Chapter 14: From Mediocristan To Extremistan, And Back

The Matthew Effect

An initial advantage follows someone through their entire life.

Eg: scientist A reads a paper and quotes three more-or-less random sources from it in his own work. Scientist B reads A’s paper, sees those sources, and quotes them too, and so on.

These three sources become famous out of luck.

This effect is called cumulative advantage. The more you win, the more you win. Unfortunately, failure is also cumulative.

Zipf’s law: you choose words based on how often you have used them in the past. The more you have used a word, the more you will use it.

-> The big get bigger and the small stay small.

The same thing happened with English as a worldwide language: because it seemed to be the obvious first one to learn, more and more people flocked to it.
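Cumulative advantage can be sketched as a Polya-urn style simulation (an illustration under assumed parameters, not from the book): each new citation goes to a source with probability proportional to the citations it already has, so small early leads snowball into dominance.

```python
import random

random.seed(1)  # for reproducibility

def polya_urn(steps: int) -> list[int]:
    """Cumulative advantage: each new citation goes to a source with
    probability proportional to the citations it already has."""
    counts = [1, 1, 1]          # three sources start out exactly equal
    for _ in range(steps):
        r = random.uniform(0, sum(counts))
        cum = 0
        for i, c in enumerate(counts):
            cum += c
            if r <= cum:
                counts[i] += 1
                break
    return counts

# Despite identical starting points, the final shares are typically
# very unequal: luck early on compounds.
print(sorted(polya_urn(10_000)))
```

Rerunning with different seeds gives a different winner each time, which is the point: the eventual "big" source is not the best one, just the luckiest early on.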

Nobody Is Safe In Extremistan

These models are interesting, but they assume that winners remain winners. In reality, a winner may become a loser due to randomness. Nobody is safe.

In capitalist countries, companies rise, then get destroyed (socialist countries protect their monopolies). Cities rise, then get destroyed. Etc.

Why? It’s randomness. Nobody is safe. In a way, randomness is super egalitarian.

While nobody is safe, nobody is completely threatened either.

You can always get exposure to a positive Black Swan and become big too.

The web, for example, while enabling total domination by one actor which wouldn’t be possible otherwise (eg: Google), also enables fragmentation. People hyper-specialized can find a specific audience and build a small niche.

Eg: Yevgenia Krasnova.

The long tail is made up of many small players, each specialized in their own domain. The mainstream is dominated by a few giant actors.

In a way, the long tail brings in welcomed diversity.

The long tail is a consequence of Extremistan that makes things less unfair.

Naive Globalization

We are entering a period of disorder, made of long stretches of peace punctuated by a few Black Swans.

The 20th century introduced Extremistan warfare: few wars, but each capable of destroying everything.

Globalization, because it connects everything to everything, makes a global collapse possible. It reduces volatility and gives an impression of stability, but what it really does is create the conditions for devastating Black Swans.

The merging of banks makes financial crises less likely, but deadlier.

Chapter 15: The Bell Curve, That Great Intellectual Fraud

The Gaussian And The Mandelbrotian

The bell curve, Gaussian curve, or normal distribution, is the following.

(Illustration: the Gaussian curve.)

The Increase in the Decrease

The main point of this curve is to show that the majority revolves around the mediocre, the average (it’s the curve of Mediocristan).

The odds of deviation decline faster and faster as you move away from the average.

The key thing to know is this: how fast does it decline?

Let’s take the bell curve for the height of men.

Centimeters above the average (1m67) | Odds of occurring
10 cm | 1 in 6.3
20 cm | 1 in 44
30 cm | 1 in 740
40 cm | 1 in 32,000
50 cm | 1 in 3,500,000
60 cm | 1 in 1,000,000,000

Look at the difference between 50 cm and 60 cm: a mere 10 cm takes the odds from 1 in 3,500,000 to 1 in 1,000,000,000. This is why the bell curve ignores outliers (Black Swans), and why we cannot apply it to Extremistan.
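The table can be reproduced from the normal distribution itself. A sketch (assuming, from the numbers above, that one standard deviation of height is about 10 cm):

```python
import math

def tail_odds(sigmas: float) -> float:
    """Return N such that the normal-distribution tail probability
    P(X > mean + sigmas * sd) equals 1/N."""
    p = 0.5 * math.erfc(sigmas / math.sqrt(2))
    return 1 / p

# 10 cm above average ≈ 1 standard deviation, 20 cm ≈ 2, and so on.
for k in range(1, 7):
    print(f"{10 * k} cm above average: about 1 in {tail_odds(k):,.0f}")
```

Notice that the odds don’t just shrink; the rate of shrinking itself accelerates, which is exactly why the Gaussian leaves no room for outliers.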

The Mandelbrotian

By comparison, the following table shows a simplified version of the odds of being rich in Europe.

Net worth equal to or higher than | Odds of occurring
€1 million | 1 in 62.5
€2 million | 1 in 250
€4 million | 1 in 1,000
€8 million | 1 in 4,000
€16 million | 1 in 16,000
€32 million | 1 in 64,000
€320 million | 1 in 6,400,000

The speed of the decline remains constant: whenever you double the wealth, you divide the odds by four.

These two tables show the difference between Mediocristan and Extremistan. The second table is scalable. The first one isn’t. Scalable laws are called power laws.
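The wealth table follows a single rule that can be written down directly: a power law with exponent 2, calibrated so that €1 million has odds of 1 in 62.5. This sketch reproduces the summary’s illustrative numbers; it is not real wealth data.

```python
def wealth_odds(wealth_millions: float) -> float:
    """Odds (1 in N) of a net worth of at least `wealth_millions` million
    euros, under P(W >= w) ~ w**-2: doubling the wealth quadruples the
    rarity, at every scale."""
    return 62.5 * wealth_millions ** 2

for w in [1, 2, 4, 8, 16, 32, 320]:
    print(f"€{w} million: 1 in {wealth_odds(w):,.0f}")
```

Because the decay rate never accelerates, arbitrarily large fortunes keep meaningful odds – the signature of Extremistan, in contrast to the Gaussian’s ever-faster collapse.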

This is why the Gaussian curve doesn’t take extremes into account while the scalable, Mandelbrotian curve, does.


The question now is to know whether you are in a scalable (Extremistan) or Gaussian curve (Mediocristan).

If you take two authors that sell 1 million books together, the distribution is likely 993 000 for the first author and 7 000 for the other (Extremistan).

But if you have to guess the heights of two people whose combined height is 3 meters and 60 centimeters, they are both likely to be around 1m80!

The reason why people like the Gaussian curve is that it is predictable.

The problem is that they apply the curve to the wrong disciplines.

Chapter 16: The Aesthetics Of Randomness

The author discusses fractals and their inventor, Benoit Mandelbrot. Fractals are geometric figures whose patterns repeat at every scale.

The author uses fractals because the idea of scalability can help us turn Black Swans into Gray Swans – that is, help us estimate how big future Black Swans could be.

Recall the Mandelbrotian curve. Things scale in Extremistan.

If things scale, it means that big numbers are possible -> there is no maximum limit.

Eg: The Da Vinci Code sold 60 million copies. We can therefore imagine that a book could sell 200 million copies. Not likely, but not impossible.

The same can be applied to wealth. One person could one day be worth €500 billion.

-> you can make predictions of things you don’t see in the data.

Now, a stone is similar to a mountain – not the same, but similar. Likewise, the distribution of wealth above €1 billion is not the same as, but similar to, the distribution of wealth in the rest of society.

Where Is The Gray Swan?

Fractal randomness helps us understand that things scale, hence helps us make many Black Swans “predictable”.

Eg: if you know that the stock market can crash, a crash is no longer a Black Swan. It’s a Gray Swan: you can predict the event, just not when it will happen (or how big it will be).

Chapter 17: Locke’s Madmen, Or Bell Curves In The Wrong Places

The main problem today is that we use methods belonging in Mediocristan and loosely apply them to Extremistan, hence disregarding Black Swans.

Find below two ways to approach randomness.

Skeptical Empiricism (the a-Platonic school) | The Platonic Approach
Interested in what lies outside the Platonic fold. | Focuses on the inside of the Platonic fold.
Respects those who have the guts to say “I don’t know”. | “You keep criticizing these models. These models are all we have.”
Thinks of Black Swans as a dominant source of randomness. | Thinks of ordinary fluctuations as the dominant source of randomness, with jumps as an afterthought.
Prefers to be broadly right. | Prefers to be precisely wrong.
Minimal theory; considers theorizing a disease to resist. | Everything needs to fit some grand, general socioeconomic model and “the rigor of economic theory”; frowns on the “descriptive”.
Develops intuitions from practice, goes from observations to books. | Relies on scientific papers, goes from books to practice.
Assumes Extremistan as a starting point. | Assumes Mediocristan as a starting point.

Chapter 18: The Uncertainty Of The Phony

Philosophers can be dangerous: while they understand the difference between Extremistan and Mediocristan in their writings, they don’t apply these ideas in real life – and invest in an efficient-portfolio-theory-driven pension fund.

The reason is that philosophers study philosophy for its own sake. They start from philosophy and then apply it to real life, instead of the opposite.

Philosophers focus on problems that don’t matter (how to conceptualize x or y) instead of focusing on things that do matter: eg: how to avoid being a sucker in the stock market.

Let’s have a look then, at how to act.

Part IV: The End

Chapter 19: Half And Half, Or How To Get Even With The Black Swan

The author sometimes hates Black Swans when these are crises, and loves them when they are random surprises.

He worries less about small failures, and more about big failures; he worries less about advertised risks, and more about the hidden ones.

He worries less about embarrassment than about missed opportunities.

He doesn’t run for trains: missing a train is only painful if you run after it. The same can be said about being what others expect you to be.

You have more control over your life if you decide the criterion by yourself.

We often get angry over petty problems because we forget the grand scheme of things. Being alive in such an immense universe is already, in itself, a remarkable positive Black Swan.


Yevgenia Krasnova took eight years to write a second book. Everyone was waiting for it with great impatience.

When it came out, critics were rather pleased, and everyone talked about it.

Yet, no one bought the book.

Another Black Swan for Yevgenia Krasnova.

Postscript Essay: On Robustness and Fragility, Deeper Philosophical and Empirical Reflections

In the second edition of the Black Swan, the author wrote a series of essays going further, talking about the fragility of systems.

These ideas were subsequently integrated into the book Antifragile, hence will not be summarized here.
