I love NetLogo, an agent-based modeling and simulation program that lets users explore collective behaviors emerging from a set of simple rules that govern individual behavior. Sometimes when I’m bored I build models to try to explain human (or other) behavior, or maybe I’ll start off with a set of nonsensical rules and then just see what happens!
The other day I decided to build a model that simulates cognitive dissonance in religious societies. It’s a rather simple model, but it’s fun to watch.
The model initially consists of two kinds of agents: 1) people who hold a certain religious belief (black), and 2) people who have irrefutable evidence that contradicts those beliefs (blue).
When I begin the simulation the black and blue agents wander around randomly. When they come into contact with one another, the blue agent attempts to convince the black agent that their religious belief is wrong by showing them irrefutable evidence. If the black agent accepts the evidence and changes their beliefs, they turn green and go about their happy informed lives. If the black agent rejects the evidence, they turn red and hold onto their beliefs with greater conviction. Whether an agent accepts or rejects is decided by a user-controlled probability slider.
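The original NetLogo code isn't shown, but the rules above are simple enough to sketch in Python. This is a minimal re-implementation under my own assumptions (a toroidal grid, one random step per tick, conversion when a believer shares a patch with a skeptic); the function and field names are hypothetical, not the author's.

```python
import random

def run_model(n_believers, n_skeptics, dissonance_prob, steps, size=20, seed=None):
    """Sketch of the described model.

    Colors: black = believer, blue = skeptic,
            green = accepted the evidence, red = rejected it (dissonance).
    """
    rng = random.Random(seed)
    agents = (
        [{"color": "black", "pos": (rng.randrange(size), rng.randrange(size))}
         for _ in range(n_believers)] +
        [{"color": "blue", "pos": (rng.randrange(size), rng.randrange(size))}
         for _ in range(n_skeptics)]
    )
    for _ in range(steps):
        # every agent takes one random step, wrapping at the grid edges
        for a in agents:
            x, y = a["pos"]
            dx, dy = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            a["pos"] = ((x + dx) % size, (y + dy) % size)
        # any believer sharing a patch with a skeptic is confronted
        skeptic_patches = {a["pos"] for a in agents if a["color"] == "blue"}
        for a in agents:
            if a["color"] == "black" and a["pos"] in skeptic_patches:
                # with probability dissonance_prob the believer rejects
                # the evidence (red); otherwise they accept it (green)
                a["color"] = "red" if rng.random() < dissonance_prob else "green"
    return agents
```

Counting colors after a run (e.g. with `collections.Counter`) reproduces the kind of outcome plot described below.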
For example, if we set the probability that the believer will experience cognitive dissonance at 0.15 (or 15%), then the outcome of the simulation looks like this (x axis is time):
I’m actually a little surprised that this run came out so perfectly: exactly 15% experienced cognitive dissonance and rejected the irrefutable evidence, while 85% accepted it and incorporated it into their set of beliefs.
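The surprise is warranted: each believer rejects independently with probability 0.15, so the number of rejections is binomially distributed, and an exactly-15% split is just the single most likely outcome, not a sure thing. A quick check (assuming a hypothetical population of 100 believers; the post doesn't state the actual count):

```python
from math import comb

def prob_exact(n, k, p):
    # binomial pmf: probability that exactly k of n believers reject the evidence
    return comb(n, k) * p**k * (1 - p)**(n - k)

# With 100 believers and p = 0.15, an exactly-15-reject run occurs
# only roughly one time in nine, even though 15 is the modal outcome.
print(prob_exact(100, 15, 0.15))
```

So most runs would land near, but not exactly on, the slider value.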
If we set the probability considerably higher (what we might find in less-developed societies that place a lot of emphasis on the afterlife, such as Sub-Saharan African villages), then we get different results.
So why do some people, when confronted with irrefutable evidence that disconfirms their beliefs, believe with even more conviction? It’s because it’s incredibly difficult to admit you were wrong, and incredibly easy to come up with some reason why the evidence is wrong (satanic conspiracy, scientists lying to get more federal funding, etc.). If you can justify your beliefs with even a flimsy argument, then you feel the evidence actually confirms your beliefs, because, in your mind, the evidence itself is wrong.
This doesn’t just apply to religious beliefs: flat-earth beliefs, anti-vax beliefs, climate-change denial, hell, literally anything you believe. I might even experience cognitive dissonance if someone presented me with irrefutable evidence that cognitive dissonance does not exist!