A little knowledge is a dangerous thing. It may be a cliché, but at least in terms of political philosophy, it’s all too true. The problem with knowledge is that it’s hard to tell whether you have enough of it to accomplish your goals, or whether your ignorance is about to be sharply and embarrassingly exposed.
Socrates said that the wise man knows how little he really knows. It’s a profound insight, even thousands of years later, because it pinpoints one of the great failings of mankind: hubris. Economist F.A. Hayek called the presumption of knowledge “the fatal conceit,” and while he no doubt meant his title metaphorically, it turns out to be literally true.
When we look at the failures of communism, socialism, fascism, and any form of top-down government planning, there are two common themes that emerge again and again. The first is power, and its tendency to corrupt (recalling Lord Acton’s famous dictum). The second is the presumption of knowledge by government planners.
Planners are smart; no one denies that some of the worst humanitarian disasters of all time were engineered by very smart people. And because they are smart, planners believe they have the intellectual tools to take something flawed, like society, and reengineer it to make it work better. Who can blame them? If we can put a man on the moon, how difficult can it be to reduce poverty, inequality, and hunger by reconfiguring the puzzle pieces of a nation’s economy? The only problem is, it doesn’t work. It has never worked, and in all likelihood, it will never work.
Why not? Because societies are not closed ecosystems with predictable and controllable conditions. When a physicist builds a model of how gravity affects a spaceship, he does not have to reckon with the thought that gravity might change its mind halfway through the voyage and decide to behave differently than expected. He does not have to worry about incentivizing the rocket. He deals with dead things that can be relied upon to, if not do as they’re told, at least behave as expected.
People are different. People have free will and often act unpredictably. People have desires that they will work to fulfill. People will respond to changing conditions by changing their behavior. Not one person, not even a whole department, can have enough knowledge about how people in an economy will behave to design something that will work as expected. This is what Hayek called “the knowledge problem.”
Given this unpredictability, we might expect to see nothing but chaos around us, and yet we don’t. This is because the same self-interested behaviors that make planning impossible also allow for millions of self-corrections, all happening simultaneously. Why don’t cars on freeways always crash into each other? Because every driver is working to prevent it, not through collusion or following instructions, but through individual self-interest. The same thing happens with economic behavior, whether it be purchasing goods, setting prices, or choosing investments. All of these elements interact with one another to create a harmonious whole, and when something goes wrong, people act to correct it with a swiftness impossible to replicate through planning.
In Episode 4 of The Deadly Isms video series, we discuss how history is littered with the victims of those who thought they, with their superior intellect, could throw a wrench into this finely honed machine and make it run better. The details vary across place and time, but one thing remains constant: they didn’t know how little they really knew.