Short summary: 1 min
Long summary: 38 min
Book reading time: 5h46
Score: 10/10
Book published in: 2017
Takeaway
- The amount of skin people have in the game strongly determines their behaviors.
- Politics today acts as a transferor of risk from one actor to another.
- Politicians reap the rewards when what they do works and transfer the cost to another party when they fail.
- Don’t do anything for which you have no skin in the game.
Table of Contents
- Takeaway
- Table of Contents
- What Skin in the Game Talks About
- Short Summary of Skin in the Game
- Summary of Skin in the Game, by Nassim Nicholas Taleb
- Book 1
- Book 2: A First Look at Agency
- Book 3: That Greatest Asymmetry
- Book 4: Wolves Among Dogs
- Book 5: Being Alive Means Taking Certain Risks
- Chapter 6: The Intellectual Yet Idiot
- Chapter 8: An Expert Called Lindy
- Book 6: Deeper Into Agency
- Book 7: Religion, Belief, and Skin in the Game
- Book 8: Risk and Rationality
What Skin in the Game Talks About
Skin in the Game is a book written by Nassim Taleb. It explores situations in life governed by the principle of “skin in the game”. I learned, for example, that the people who have the most to lose often win in the end because they are the most invested, and that taking risks is usually safer in the long term, depending on the type of risk.
Skin in the Game is the latest installment of the series of books known as the Incerto, which deals with risk and asymmetry in real life.
The Incerto is composed of:
- Fooled by Randomness
- The Black Swan
- The Bed of Procrustes
- Antifragile
- Skin in the Game
Skin in the Game is the best book I have read in 2021.
This book has completely changed the way I think about life, and I am still digesting its lessons as I write these lines.
This is not an easy book. It took me 35 hours to summarize.
It was amazing though!
Visual Glossary
“=” means “equal”
“<” means “is smaller/worse than”
“>” means “is bigger/better than”
“->” means “leads to/implies”
Short Summary of Skin in the Game
Those who have the most skin in the game often have a disproportionate impact on a system because they are exposed to the consequences. Those who aren’t exposed to the consequences don’t suffer from their mistakes; hence, they don’t learn.
Another consequence is that anyone who has no skin in the game can make catastrophic decisions, because they won’t suffer the consequences. This is the case with centralized systems.
Centralized systems like bureaucracies take everyone’s skin out of the game; hence, they don’t move forward…until they collapse.
This shows that someone taking a risk should always bear the consequences (good or bad) in order to make life fair. This is why laws exist. If you do X, you will suffer Y. Laws force people’s skin into the game.
When that’s not the case, when risk < reward, rewards tend to be maximized without taking risk into account, which increases the risk until the system explodes, like the 2008 crisis -> getting people’s skin into the game forces them to do better work.
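This asymmetry can be made concrete with a toy payoff model (my own illustration, not from the book; the function and numbers are hypothetical). An actor who transfers the downside to someone else sees a positive expected payoff on a bet that is clearly value-destroying for whoever keeps the downside:

```python
def expected_payoff(p, gain, loss, keeps_downside):
    """Expected payoff of a bet that wins `gain` with probability `p`
    and loses `loss` otherwise. If the actor transfers the downside
    (keeps_downside=False), the loss costs them nothing."""
    downside = loss if keeps_downside else 0.0
    return p * gain - (1 - p) * downside

# A reckless bet: small chance of a big gain, large chance of a big loss.
with_sig = expected_payoff(0.1, gain=100, loss=50, keeps_downside=True)
without_sig = expected_payoff(0.1, gain=100, loss=50, keeps_downside=False)

print(with_sig)     # -35.0: with skin in the game, the bet is unattractive
print(without_sig)  # 10.0: without it, the same bet looks profitable
```

With SIG, risk and reward are weighed together; without it, only the reward enters the calculation, so risk grows unchecked.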
People feel they have their skin in the game only up to a certain point. Eg: you will feel more responsible in a small company than in a big one -> things don’t scale. Small ≠ big.
Having your skin in the game means that you are exposed to winning or losing. Those who lose disappear. Those who win are still there. Time acts as a filter. This is called the Lindy effect.
The Lindy effect states that the longer something has existed, the longer it is likely to keep existing. This is why, when you are looking for the best practices, you are likely to find them among the oldest ones.
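The Lindy effect can be sketched with a small simulation (my own illustration, not from the book): draw lifetimes from a heavy-tailed distribution and compare the expected remaining life of young and old survivors.

```python
import random

random.seed(42)
# Heavy-tailed (Pareto) lifetimes: survival probability P(T > t) = t^-3 for t >= 1.
lifetimes = [random.paretovariate(3.0) for _ in range(200_000)]

def expected_remaining_life(age):
    """Average remaining lifetime among items still alive at `age`."""
    remaining = [t - age for t in lifetimes if t > age]
    return sum(remaining) / len(remaining)

young = expected_remaining_life(1.5)
old = expected_remaining_life(3.0)
print(young < old)  # True: older survivors are expected to last longer
```

Under this distribution the expected remaining life grows in proportion to the current age, which is exactly the Lindy property: having already survived is evidence of robustness.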
Summary of Skin in the Game, by Nassim Nicholas Taleb
Skin in the Game is an expression that reflects the idea of having stakes in something. Having stakes means that you are taking a certain amount of risk. It means you have something to lose; it means having exposure to the downside.
You can have stakes in a group of people (your family), a project (your company), or a situation (going out with friends).
That risk is exposure to consequences, whether the exposure is positive or negative.
This book examines the effects of skin in the game in life.
Because I do not like the expression “skin in the game”, I will call it SIG, exposure, or stakes from now on.
Book 1
Prologue Part 1: Antaeus Whacked
Knowledge cannot be separated from the real world -> theory cannot be separated from practice.
The real world is learned by having SIG (exposure, stakes) in something.
Most scientific discoveries were first made through practice (SIG) and trial and error (the best scientific method), and were explained theoretically afterward.
-> SIG is necessary. You can’t discover something new in the world by philosophizing in your bedroom.
The absence of SIG can be dangerous. It leads people to play with stuff they shouldn’t be playing with – simply because they won’t suffer the consequences if things go wrong…but someone else will.
Since they don’t suffer the consequences, their mistakes cost them nothing -> they don’t learn.
Transferring risks to someone else = transferring the lessons to someone else.
This explains the wars in Iraq and Afghanistan. Leaders in Washington had no exposure to what was happening in the Middle East because they weren’t there -> they didn’t learn.
Risk Transferors
In the past, leaders were risk-takers.
Not anymore. Most of them are risk transferors. Bureaucracy is one example. It is inherently SIG-free.
In fact, centralized systems are traditionally free from any exposure.
-> need for decentralization.
If decentralization doesn’t happen, the system eventually blows up and self-repairs, provided it survives.
Eg: the 2008 banking crisis happened because the system had no exposure to its own actions.
When it collapsed, the people inside the system didn’t suffer one bit, because the risks were transferred to the taxpayers through government bailouts.
This led people to hate free markets while in fact, they should have hated the government.
Why?
Free markets have exposure built-in.
In a real free economy, banks wouldn’t have been saved -> they would have died -> knowing the government wouldn’t save them, they would not have taken those risks in the first place.
When a free market loses its exposure feature, it’s often because of governments.
This is why people who hate exposure and risk love big government: it protects them.
As we have seen, people don’t learn much from their mistakes, especially in the absence of SIG.
But who does, then?
The system.
Systems learn by removing parts that no longer work, identified after someone made a mistake.
Eg: transportation didn’t get safer because people drive better, but because the system improved after bad drivers made mistakes and caused accidents.
The system’s learning is grounded in filtering -> bad drivers are now dead.
SIG keeps human hubris (excessive pride and self-confidence) in check.