Liars and Outliers: The Big Idea

My big idea is a big question. Every cooperative system contains parasites. How do we ensure that society’s parasites don’t destroy society’s systems?

It’s all about trust, really. Not the intimate trust we have in our close friends and relatives, but the more impersonal trust we have in the various people and systems we interact with in society. I trust airline pilots, hotel clerks, ATMs, restaurant kitchens, and the company that built the computer I’m writing this short essay on. I trust that they have acted and will act in the ways I expect them to. This type of trust is more a matter of consistency or predictability than of intimacy.

Of course, all of these systems contain parasites. Most people are naturally trustworthy, but some are not. There are hotel clerks who will steal your credit card information. There are ATMs that have been hacked by criminals. Some restaurant kitchens serve tainted food. There was even an airline pilot who deliberately crashed his Boeing 767 into the Atlantic Ocean in 1999.

My central metaphor is the Prisoner’s Dilemma, which nicely exposes the tension between group interest and self-interest. And the dilemma even gives us a terminology to use: cooperators act in the group interest, and defectors act in their own selfish interest, to the detriment of the group. Too many defectors, and everyone suffers — often catastrophically.
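The book doesn't reduce the dilemma to numbers, but the standard payoff structure makes the tension concrete. Here is a minimal sketch using the canonical illustrative payoffs (the specific values are my assumption; any payoffs with temptation > reward > punishment > sucker produce the same dilemma):

```python
# Canonical Prisoner's Dilemma payoffs (illustrative values; any
# T > R > P > S works):
#   T = temptation to defect, R = reward for mutual cooperation,
#   P = punishment for mutual defection, S = sucker's payoff.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # R, R
    ("cooperate", "defect"):    (0, 5),  # S, T
    ("defect",    "cooperate"): (5, 0),  # T, S
    ("defect",    "defect"):    (1, 1),  # P, P
}

def payoff(a, b):
    """Return (player A's payoff, player B's payoff) for one round."""
    return PAYOFFS[(a, b)]

# Defecting is individually dominant: whatever the other player does,
# a defector scores higher than a cooperator would have...
assert payoff("defect", "cooperate")[0] > payoff("cooperate", "cooperate")[0]
assert payoff("defect", "defect")[0] > payoff("cooperate", "defect")[0]
# ...yet mutual defection leaves the group worse off than mutual cooperation:
assert sum(payoff("defect", "defect")) < sum(payoff("cooperate", "cooperate"))
```

The assertions capture the whole problem: self-interest pushes each player toward defection, and the group suffers when too many follow that push.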

The Prisoner’s Dilemma is not only useful in describing the problem, but also serves as a way to organize solutions. We humans have developed four basic mechanisms for limiting defectors: what I call societal pressure. We use morals, reputation, laws, and security systems. It’s all coercion, really, although we don’t call it that. I’ll spare you the details; it would require a book to explain. And it did.

This book marks another chapter in my career’s endless series of generalizations. From mathematical security — cryptography — to computer and network security; from there to security technology in general; then to the economics of security and the psychology of security; and now to — I suppose — the sociology of security. The more I try to understand how security works, the more of the world I need to encompass within my model.

When I started out writing this book, I thought I’d be talking a lot about the global financial crisis of 2008. It’s an excellent example of group interest vs. self-interest, and of how a small minority of parasites almost destroyed the planet’s financial system. I even had a great quote by former Federal Reserve Chairman Alan Greenspan, in which he admitted a “flaw” in his worldview. The exchange, which took place when he was being questioned by Congressman Henry Waxman at a 2008 Congressional hearing, once formed the opening paragraphs of my book. I called the defectors “the dishonest minority,” which was my original title.

That unifying example eventually faded into the background, to be replaced by a lot of separate examples. I talk about overfishing, childhood immunizations, paying taxes, voting, stealing, airplane security, gay marriage, and a whole lot of other things. I dumped the phrase “dishonest minority” entirely, partly because I didn’t need it and partly because a few vocal early readers were reading it not as “the small percentage of us that are dishonest” but as “the minority group that is dishonest” — not at all the meaning I was trying to convey.

I didn’t even realize I was talking about trust until most of the way through. It was a couple of early readers who — coincidentally, on the same day — told me my book wasn’t about security, it was about trust. More specifically, it was about how different societal pressures, security included, induce trust. This interplay between cooperators and defectors, trust and security, compliance and coercion, affects everything having to do with people.

In the book, I wander through a dizzying array of academic disciplines: experimental psychology, evolutionary psychology, sociology, economics, behavioral economics, evolutionary biology, neuroscience, game theory, systems dynamics, anthropology, archeology, history, political science, law, philosophy, theology, cognitive science, and computer security. It sometimes felt as if I were blundering through a university, kicking down doors and demanding answers. “You anthropologists: what can you tell me about early human transgressions and punishments?” “Okay neuroscientists, what’s the brain chemistry of cooperation? And you evolutionary psychologists, how can you explain that?” “Hey philosophers, what have you got?” I downloaded thousands — literally — of academic papers. In pre-Internet days I would have had to move into an academic library.

What’s really interesting to me is what this all means for the future. We’ve never been able to eliminate defection. No matter how much societal pressure we bring to bear, we can’t bring the murder rate in society to zero. We’ll never see the end of bad corporate behavior, or embezzlement, or rude people who make cell phone calls in movie theaters. That’s fine, but it starts getting interesting when technology makes each individual defection more dangerous. That is, fishermen will survive even if a few of them defect and overfish — until defectors can deploy driftnets and single-handedly collapse the fishing stock. The occasional terrorist with a machine gun isn’t a problem for society in the overall scheme of things; but a terrorist with a nuclear weapon could be.

Also — and this is the final kicker — not all defectors are bad. If you think about the notions of cooperating and defecting, they’re defined in terms of the societal norm. Cooperators are people who follow the formal or informal rules of society. Defectors are people who, for whatever reason, break the rules. That definition says nothing about the absolute morality of the society or its rules. When society is in the wrong, it’s defectors who are in the vanguard for change. So it was defectors who helped escaped slaves in the antebellum American South. It’s defectors who are agitating to overthrow repressive regimes in the Middle East. And it’s defectors who are fueling the Occupy Wall Street movement. Without defectors, society stagnates.

We simultaneously need more societal pressure to deal with the effects of technology, and less societal pressure to ensure an open, free, and evolving society. This is our big challenge for the coming decade.

This essay originally appeared on John Scalzi’s blog, Whatever.

Posted on March 3rd, 2012, in the Champion Security Blog.