Nassim Taleb’s Skin in the Game is one of the most thought-provoking books I’ve read.
It’s full of insights about our world and daily life that seem unrelated but always end up rooted in the importance of people having “skin in the game”.
Note that “skin in the game” is not just about having a share of the upside. It requires you to have downside risk too.
The most enlightening parts of the book helped me understand my own feelings around risk-taking and entrepreneurship. Feelings I’ve struggled to grasp and articulate until now.
The first main topic is the importance of skin in the game when it comes to acquiring knowledge.
You’ll learn most effectively when you are in contact with reality and have to bear the full consequences of being wrong, i.e. when you are exposed to risk. Conversely, the most ineffective way to learn is abstract reasoning devoid of a real environment and where there are no bad consequences of being wrong.
most things that we believe were “invented” by universities were actually discovered by tinkering and later legitimized by some type of formalization. The knowledge we get by tinkering, via trial and error, experience, and the workings of time, in other words, contact with the earth, is vastly superior to that obtained through reasoning
The same mechanism of transferring risk also impedes learning. More practically, you will never fully convince someone that he is wrong; only reality can.
This fits with my own experience. Nothing has taught me about public markets and investing as much as when my hard-earned savings have taken a hit during a market crash.
Nothing has taught me about coding and product development as much as actually building and launching products.
And nothing has taught me about business and entrepreneurship as much as actually building businesses.
Not only is this kind of learning most effective, it also keeps you grounded.
Skin in the game keeps human hubris in check.
This explains why having a bias for action can be a big advantage. Nike got it right — Just Do It.
If success is a result of learning, you can boost your chances of success in a given time period by boosting your rate of learning. And you can boost your rate of learning by doing less theorizing and talking, and more acting, in the real world, as fast as possible, and with skin in the game — that is, with real risk.
Those who talk should do and only those who do should talk
Takeaway 1: Think less, talk less and do more — faster and with skin in the game.
When you have skin in the game, you’ll be incentivized to do things properly and keep things simple.
Things designed by people without skin in the game tend to grow in complication (before their final collapse). There is absolutely no benefit for someone in such a position to propose something simple: when you are rewarded for perception, not results, you need to show sophistication. Anyone who has submitted a “scholarly” paper to a journal knows that you usually raise the odds of acceptance by making it more complicated than necessary. Further, there are side effects for problems that grow nonlinearly with such branching-out complications.
Personally, I’ve heard stories of software engineers in large tech companies designing unnecessarily complex systems so they can boast about them in their performance review. Nassim pinpoints the root of this problem — they’re rewarded for the short-term perception of their peers and managers rather than the actual long-term results.
This isn’t easy to solve. It’s hard to measure the actual long-term results generated by an employee, and even if you could, employees need to be rewarded consistently, and in the short term. Nassim identifies salespeople, Wall Street traders and hedge fund managers as exceptions — you can measure their actual results and compensate them proportionally. When they’re good at their job, they can be some of the highest earners.
But it’s more than just better aligned incentives. Having skin in the game can actually increase your ability in a given task. Nassim writes:
A confession. When I don’t have skin in the game, I am usually dumb.
When there was risk on the line, suddenly a second brain in me manifested itself, and the probabilities of intricate sequences became suddenly effortless to analyze and map. When there is fire, you will run faster than in any competition. When you ski downhill some movements become effortless. Then I became dumb again when there was no real action.
Takeaway 2: You’ll perform at your peak when you have skin in the game.
The next topic Nassim covers is symmetry and asymmetry in human affairs.
He introduces the concept of “the minority rule”. This is where a small, stubborn minority is responsible for changes that propagate across an entire population.
He illustrates this with the example of a catered event that multiple families are attending. Before the event, the families submit their dietary preferences. If a family has just one vegetarian member, it declares itself vegetarian. The event organiser sees that some families are vegetarian, so orders vegetarian food for everyone to avoid the inconvenience of managing two different food orders. So even though only a tiny minority of attendees are vegetarian, everyone ends up eating vegetarian at the event.
The minority rule works when you have a stubborn minority who insist on choice A instead of B and a majority who doesn’t really care either way, so will just go along with A out of convenience.
The individual family member insists on vegetarian food. The rest of the family doesn’t care either way, so declares themselves vegetarian. At the event, the non-vegetarian families don’t care either way, so the whole event becomes vegetarian out of convenience.
So minority rules get propagated up from the individual to larger and larger groups, until the whole population abides by them out of convenience.
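This cascade can be sketched with a small simulation. The numbers here (100 families of 4, and a 3% chance that any individual is vegetarian) are my own illustrative assumptions, not figures from the book:

```python
import random

random.seed(0)

# Illustrative assumptions: 100 families of 4, and each individual is
# vegetarian with probability 3% (the stubborn minority).
FAMILIES, FAMILY_SIZE, P_VEG = 100, 4, 0.03

vegetarians = 0
veg_families = 0
for _ in range(FAMILIES):
    family = [random.random() < P_VEG for _ in range(FAMILY_SIZE)]
    vegetarians += sum(family)   # stubborn individuals
    veg_families += any(family)  # families that declare vegetarian

# One vegetarian family is enough for the organiser to cater vegetarian.
event_is_vegetarian = veg_families > 0

print(f"{vegetarians / (FAMILIES * FAMILY_SIZE):.0%} of individuals are vegetarian")
print("whole event is vegetarian:", event_is_vegetarian)
```

With 400 attendees and a 3% minority, the chance that no family contains a vegetarian is vanishingly small, so a roughly 3% preference almost always becomes a 100% outcome.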
Nassim labels this process “renormalization” and attributes the entire economic and moral growth of society to it. He argues it’s responsible for the spread of moral values, civil rights, religions and languages.
He notes that you can limit the impact of minority rules by keeping populations small and decentralized:
Another attribute of decentralization, and one that the “intellectuals” opposing an exit of Britain from the European Union (Brexit) don’t get: if one needs, say, a 3 percent threshold in a political unit for the minority rule to take its effect, and on average the stubborn minority represents 3 percent of the population, with variations around the average, then some states will be subject to the rule, but not others. If, on the other hand, we merge all states in one, then the minority rule will prevail all across. This is the reason the U.S.A. works so well.
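Taleb's threshold point can be sketched numerically. The local minority shares below are made up for illustration; they average just over 3%, with variation around that average:

```python
# Made-up local shares of the stubborn minority, averaging just over 3%.
THRESHOLD = 0.03
state_shares = [0.010, 0.020, 0.025, 0.029, 0.031,
                0.036, 0.041, 0.022, 0.052, 0.040]

# Decentralized: each state applies the threshold to its own share,
# so the rule takes effect in some states but not others.
flipped = [share >= THRESHOLD for share in state_shares]
print(sum(flipped), "of", len(state_shares), "states flip")  # 5 of 10

# Merged into one unit: the pooled share decides for everyone at once.
pooled = sum(state_shares) / len(state_shares)
print("merged unit flips everywhere:", pooled >= THRESHOLD)  # True
```

Decentralization turns an all-or-nothing outcome into a patchwork: the same minority that would capture a single merged unit only captures the states where it locally clears the threshold.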
The minority rule explains how human groups are fundamentally different at different sizes:
A saying by the brothers Geoff and Vince Graham summarizes the ludicrousness of scale-free political universalism. I am, at the Fed level, libertarian; at the state level, Republican; at the local level, Democrat; and at the family and friends level, a socialist. If that saying doesn’t convince you of the fatuousness of left vs. right labels, nothing will.
I was recently travelling with a friend, and they suggested we rock-paper-scissor every major allocation decision, like who got the best room in the Airbnb, or who got the seat with the best view at a restaurant. It annoyed me because I didn’t really care about which of us got the best room or view. I would have preferred to just share things with mutual generosity rather than make things overly transactional. They pointed out that I made a big effort to get the best deal when booking the Airbnb or restaurant, so wondered why this desire to get the best deal didn’t extend to when we split things. Of course, the answer is that the dynamics of a friendship are different to the dynamics of a free market.
At the friends and family level, it’s natural to be a socialist. Everyone is happy to share things with mutual generosity because they see themselves as a single unit, where if one person benefits, everyone benefits.
But as a group gets larger, the number of relationships in the group grows quadratically — proportional to the square of the group size. In a group of 4 people, there are 6 relationships. In a group of 20 people, there are 190 relationships.
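The pair counts above follow from the standard combinatorial formula, n choose 2:

```python
# Pairwise relationships in a group of n people: n choose 2 = n*(n-1)/2.
def relationships(n: int) -> int:
    return n * (n - 1) // 2

print(relationships(4))   # 6
print(relationships(20))  # 190
```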
This makes it significantly less likely for everyone to see the group as a single unit. All it takes is a few people to act selfishly to unravel the system — others will realise they’re being taken advantage of, and so will become selfish themselves. This is another instance of renormalization of a minority rule — in this case, self-interestedness.
The solution for this larger group is a system which efficiently allocates resources even if everyone is self-interested. This is what transactional free markets do. So as groups get larger, it becomes near impossible to maintain socialism.
But… there's something heart-warming about socialism, where people group together and act for the common good.
So making groups larger may not always be a good thing.
Heavily populated global hubs like London and Tokyo lack the sense of community enjoyed by many small villages. The biggest cities can feel the most lonely.
Some people may have voted for Brexit because they felt this on a deeper, emotional level — that the relentless push for globalization and economic efficiency was unraveling the social fabric of their community, and making their cherished culture vulnerable to whatever new minority rule happened to be sweeping the world.
Takeaway 3: The size of a human group makes the dynamics fundamentally different.
The minority rule is also responsible for scientific and economic progress. Consider that it only takes one person to discover a new medicine or technology that propagates through the entire population to become the status quo.
Had science operated by majority consensus, we would be still stuck in the Middle Ages
“Never doubt that a small group of thoughtful citizens can change the world. Indeed, it is the only thing that ever has,” wrote Margaret Mead. Revolutions are unarguably driven by an obsessive minority. And the entire growth of society, whether economic or moral, comes from a small number of people.
Personally, I’ve always been attracted to startups. The classic dichotomy between working at a startup vs. a big company is often presented as a choice between large relative impact and large absolute impact. I used to accept this at face value. But in fact, applying the minority rule, your expected incremental impact on the world could be much higher working at a startup than at a big company.
Your incremental impact on the world is the impact you create that your replacement wouldn’t.
In a large company, you are usually optimizing or extending what already exists. At junior levels, this job is fairly well-defined and you are fairly replaceable. It will probably happen with or without you. But at more senior levels, your job becomes less well-defined and you become less replaceable. There’s more opportunity to have a large, incremental impact that wouldn’t have existed without you.
In a startup, you are usually creating something completely new. Without you, things will just not happen, so you are fairly irreplaceable. If you succeed, the effects will propagate across society and so your incremental impact on the world could be huge. But it’s worth noting that you are still somewhat replaceable. If your idea is very good, someone else will probably execute it if you don’t.
Takeaway 4: The minority rule explains the spread of moral values, religion, languages, and scientific and economic progress. The world is changed by a small group of obsessive people.
Nassim then discusses the nature of employment and compares it to slavery. He considers most employees to have significant downside risk — and thus fear — of losing their jobs, which allows their employers to control them.
Relatedly, he contends that:
English “manners” were imposed on the middle class as a way of domesticating them, along with instilling in them the fear of breaking rules and violating social norms.
He contrasts this to the behaviour of figures such as Putin and Trump, who project a visible “I don’t care” attitude. This actually brings them more followers and support because it signals they are authentic and free. With them, voters know exactly what they’re getting.
Conversely, if a politician is too polished, without any visible vulnerabilities, people can sense they are not truly free — they must answer to others and so cannot be trusted to do what they say they will.
Takeaway 5: Over-reliance on employability can take away your freedom.
Takeaway 6: Not caring about appearances signals authenticity and freedom which can engender trust.
Nassim also studies the stability of systems. He considers that systems which operate by skin in the game are self-stabilizing, eliminating the parts that don’t work, whereas systems which lack this essential skin-in-the-game component destabilize and lead to very bad “black swan” events.
For example, the financial crisis was a result of bankers not having skin in the game. Since they knew they wouldn’t have to personally suffer the negative consequences of their bad decisions, they drove the system to a point of instability which led to terrible consequences for everyone else. But even now these bankers are still operating the financial system.
On the other hand, pilots have a lot of skin in the game. If they are bad at their job, their aeroplane will crash and they will likely die along with their passengers. So they are heavily incentivized to make good decisions. Moreover, the system itself is self-stabilizing, as any bad pilots will die and not fly again.
More generally, Nassim talks about the Lindy effect, where non-perishable systems, ideas, books, technologies and institutions actually increase their life expectancy as they age. If they’ve survived through time, having been exposed to harm, they must be robust.
Systems operated by those without skin in the game often have larger “tail risks” — where there’s a higher chance of an extremely bad event happening. These risks can go undetected for a long time, so if the system operator is rewarded for the short-term perception of their peers and managers, like in large bureaucratic organisations, rather than their actual long-term results, then they’re not incentivized to reduce these tail risks. They can get their credit and move on to the next large organisation, before the black swan event happens.
Entrepreneurs are rewarded solely based on the performance of the system they create, not the perception of their peers, and so they cannot ignore these tail risks. Perhaps many Russians feel the same applies to long-tenured autocratic leaders like Putin.
One of the best parts of entrepreneurship for me is that my incentives are completely aligned with the success of the business and have nothing to do with the perception of peers or managers.
Takeaway 7: Systems and ideas increase their life expectancy as they age.
Takeaway 8: Systems operated by those without skin in the game are more prone to black swan events as operators are rewarded for short-term perception rather than long-term results.
Next, Nassim writes about rationality.
a) rationality resides in what you do, not in what you think or in what you “believe” (skin in the game), and b) rationality is about survival.
Related to the Lindy effect, what is “rational” is what works, and thus survives, in the real world.
According to Nassim, superstitious beliefs can be “rational” if they help the belief-holders survive. Such beliefs have stood the test of time — they seem to work — and so there’s nothing irrational about them.
Further, people sometimes don’t really know what they think or what they want. Real preferences and opinions are revealed via actions that carry skin in the game.
What matters, in the end, is what they pay for goods, not what they say they “think” about them, or the various possible reasons they give you or themselves for that. If you think about it, you will see that this is a reformulation of skin in the game.
much of what we call “belief” is some kind of background furniture for the human mind, more metaphorical than real. It may work as therapy.
The most convincing statements are those in which one stands to lose, ones in which one has maximal skin in the game; the most unconvincing ones are those in which one patently (but unknowingly) tries to enhance one’s status without making a tangible contribution
Takeaway 9: The rational is what survives in the real world.
Takeaway 10: Real preferences and opinions are revealed by actions that carry skin in the game.
Since what is rational survives, Nassim suggests that:
making some types of errors is the most rational thing to do, when the errors are of little cost, as they lead to discoveries
Again, trial and error against a real environment is the most effective way to learn.
only evolution knows if the “wrong” thing is really wrong, provided there is skin in the game to allow for selection
Takeaway 11: Rapid trial and error can lead to discoveries, much like evolution.
Nassim highlights the difference between regular risks and risk of ruin. Ruin is something you cannot recover from, so should be treated differently.
I make the case for risk loving … and for taking a lot of risks that don’t have tail risks but offer tail profits.
Tail risks which result in ruin are usually those which can multiply to affect many people like terrorism or a pandemic.
When Covid had just started, some pointed out that far more people had died of natural causes or car accidents than of Covid. What they failed to realise was that the chance of Covid deaths tripling was far higher than the chance of car accident deaths tripling.
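One way to see this is a rough sketch with invented parameters (these are not real epidemiological or road-safety figures): car deaths behave like a sum of many independent small risks, which produces a thin-tailed distribution, while epidemic deaths compound multiplicatively day after day, which fattens the right tail.

```python
import math
import random

random.seed(42)
TRIALS = 20_000

# Thin-tailed risk (like road deaths): a sum of many independent small
# events, approximated by a normal with mean 1000 and sd sqrt(1000).
additive = [random.gauss(1000, math.sqrt(1000)) for _ in range(TRIALS)]

# Multiplicative risk (like an epidemic): 30 days of compounding growth,
# each day's growth factor drawn with random variation around 1.1.
def epidemic_deaths() -> float:
    deaths = 10.0
    for _ in range(30):
        deaths *= math.exp(random.gauss(math.log(1.1), 0.3))
    return deaths

multiplicative = [epidemic_deaths() for _ in range(TRIALS)]

def p_exceeds_triple_median(samples):
    median = sorted(samples)[len(samples) // 2]
    return sum(x > 3 * median for x in samples) / len(samples)

print(p_exceeds_triple_median(additive))        # essentially zero
print(p_exceeds_triple_median(multiplicative))  # a sizeable probability
```

Under these toy assumptions, tripling the typical death toll is practically impossible for the additive process but happens in a substantial fraction of the multiplicative runs, which is exactly why the two risks should not be compared by their current body counts.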
Takeaway 12: Know which kind of risk you’re dealing with and avoid risk of ruin.
Here are all my top takeaways:
- Think less, talk less and do more — faster and with skin in the game.
- You’ll perform at your peak when you have skin in the game.
- The size of a human group makes the dynamics fundamentally different.
- The minority rule explains the spread of moral values, religion, languages, and scientific and economic progress. The world is changed by a small group of obsessive people.
- Over-reliance on employability can take away your freedom.
- Not caring about appearances signals authenticity and freedom which can engender trust.
- Systems and ideas increase their life expectancy as they age.
- Systems operated by those without skin in the game are more prone to black swan events as operators are rewarded for short-term perception rather than long-term results.
- The rational is what survives in the real world.
- Real preferences and opinions are revealed by actions that carry skin in the game.
- Rapid trial and error can lead to discoveries, much like evolution.
- Know which kind of risk you’re dealing with and avoid risk of ruin.
This is just a summary of my top highlights from Skin in the Game. It doesn’t do justice to the book itself which is packed full of many more insights and gems of wisdom. So I highly recommend you read it!