V1.1.0 - epistemology

Ideologies are slow


If you think through an ideology, you are making yourself dumber than you would be otherwise.

That ideology might be communism, techno-optimism, effective altruism, or anything else; it doesn't matter.

Dealing with mistakes

Any entity that has beliefs makes mistakes. No one, no group, no book, no ideology has a perfect understanding of the world.

How people change their mind

As people, we can change our mind more or less easily. When we notice a mistake, we can quickly think about what the correct answer is. If we find it, we can change our mind towards it. Otherwise, we can at least acknowledge that we were wrong, realise that we might have been badly calibrated, and remember to be more careful in the future around the topic where we made the mistake.
Then, we might need to practice the correct answer to make sure we can apply it in the future. When something is striking enough, we might immediately commit it to memory forever. But usually, if we only think about something once, we forget it.

Regardless, in the grand scheme of things, this process is fast. We can change our mind in a matter of minutes when it is striking, or in a matter of days when it is not.

In the worst case, we cannot change our mind. We can be so attached to our beliefs that we cannot see the mistake, or, even worse, we purposefully ignore it. Or we see the mistake but still cling to our beliefs, justify them, and just feel bad about the whole thing.

In that case, we are stuck, and we cope with the consequences until we can't afford to do so anymore. Or we die, still traumatised fools who couldn't deal with the truth.

How ideologies do not

The above was specific to people. Now, let's consider ideologies.

While people can change their mind, an ideology is a set of beliefs that is supposed to be consistent and coherent. Ideologies do not really have a lever they can pull to change some of their core beliefs.

As such, when an ideology makes a mistake, it is much harder for it to acknowledge the mistake and correct it than it would be for a person.

This is so for a variety of reasons.

Ideologies are very decentralised. There is no appointed "ideology coordinator" who can tell everyone that "[Ideology] was wrong about [Topic]. Its new stance is [New Stance]."

Ideologies live in a hostile environment. If an ideology makes a mistake, its opponents will attack it for it. Ideologies fight each other, and a competitor making a mistake is a great opportunity to gain an edge over it. "Haha, [Opponent Ideology] was wrong about [Topic]! It means that we were right all along and that they were fucking idiots!"

Ideologies are authoritative to their followers. One of the points of following an ideology is to defer some of our beliefs to it. We pick a belief system that is plausible, mostly aligned with our values, and whose adherents we respect. From this, we expect it to be right about most things we care about.

But when an ideology makes a mistake, the very existence of the mistake will be attacked by the ideology's own followers, who may be so attached to it that merely acknowledging the mistake is too painful for them.

People are smarter than ideologies

As a result, people change their mind much more easily and routinely than ideologies do.

Thus, whenever I see someone defend the beliefs of an ideology, as opposed to just defending what is true regardless of ideology, I know that they are making themselves dumber than they would be otherwise.

They will make predictable mistakes, it will be much harder for them to change their mind on any topic where the ideology has core beliefs, and they are just going to be a pain in the ass to deal with overall.


This leads to the equivalent of Planck's Principle in the context of ideologies. Planck put it well:

A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.

The same is true when applied to ideologies.

In practice, we do not collectively become smarter by improving an ideology. We become smarter by trying ideologies, noticing their mistakes, and moving on to newer ones later.

Or more cynically, we become smarter when people in power who are tied down to an ideology die or lose their power.

Should we even bother with ideologies?

If ideologies are so bad, so much worse than just thinking for ourselves, you might wonder why we should even bother with them.

The answer is that they are some of the best tools we have to coordinate our beliefs at scale. At its core, an ideology is a small set of prescriptive beliefs (such as moral values, or recommendations for how to live) that are tied together by a nice narrative.

Having such a package makes it much easier to know what others are thinking and to build common knowledge. Without ideologies, if we wanted to infer someone's values, we'd need to either have long conversations with them or try to derive their values from what we know about them and simulate their reasoning (which is quite error-prone).

So, in general, when we want to act together according to a shared set of philosophies, ideologies are about the best we have. This is why they are so widespread, and this is why I am dissecting them here a little bit. If we are forced to use such a dangerous tool, we should at least try to understand it better so that we hurt ourselves less often.


Some people just give up. They don't want to deal with ideologies. Ideologies feel too bad to them, too adversarial, too conformist, too much of a pain in general.

These people erase themselves from the game of intellectual coordination. They are only able to make an indirect impact to the extent that they can convince others who do play the game.

Some people are in denial. They are deeply part of an ideology, but instead of acknowledging and operationalising the ideology, they constantly perform bait-and-switch, motte-and-bailey, and other wishy-washy tricks to avoid pinning down the beliefs of the ideology.

They will say shit like "No one truly agreeing with [Ideology] would say [Topic]!", even though plenty of such people and authoritative texts say exactly that, and it is in fact a central tenet of the ideology.

Regardless, the game is still being played. They are just making themselves dumber than they would be otherwise. Instead of reasoning about the right trade-offs, they are making themselves ever more attached to the ideology.


To be clear, it is worth experimenting in this space! We ought to come up with something better than ideologies.

For instance, it would be great to have a group that has a constitution stating what its core beliefs are, as well as an explicit process for how to change those core beliefs.

If you are interested in this, let me know!

I tried many times to build groups like this, but they never got off the ground, as I was the only champion dealing with all the shit that arises with groups (organisation, communication, social friction, etc.).

If instead we had a core of 5-10 people willing to put in the work, I believe we could do something cool, possibly even great.

Conclusion

In the meantime, my main recommendation is not to think through ideologies.
You might commit to an ideology for coordination purposes, and I recommend a Disagree and Commit strategy here.

But if you start defending the beliefs of an ideology, saying shit like "But [Ideology] is right about [Topic]", you are making yourself dumber than you would be otherwise. Just think about [Topic] directly.

If you want to talk about which ideology is better for dealing with a specific topic, that is a separate question. The answer will likely not look like defending an ideology on epistemic grounds ("[Ideology] is deeply right about [Topic]!"), but rather like simulating what would happen if two groups of people with different ideologies tried to solve the same problem, and seeing which would fare better. Something like: "If we were to use [Ideology A], we would act according to [Principle X] in this situation. This would have [Implications]. Compare this with [Ideology B], ..."


There is a more general question about the correct balance. If we decide to use ideologies, for how long should we Disagree and Commit? When is an ideology bad enough to be abandoned?

This is a complex question.

But given that I see almost everyone miss the key point of this essay, we should almost surely be more proactive in ditching ideologies and moving on to better ones.


If you are interested in such questions, let me know! I believe there is a lot of room for experimentation and I'd love to see what we can build.