Apophenia

I recently read an LA Review of Books article and have been transfixed by the word ‘apophenia’ ever since. It’s a word which appears to nail why disinformation spreads, mutates and rouses such passionate opinions.

Susceptibility

I have already written about a mechanism in our brains which demands attribution for events observed in the natural world.

To recap: our brains are hardwired to make observations of the natural world. This pattern-detecting software enables us to navigate the world safely; the observations we make are ‘stored’ for a later date. The next time we encounter a phenomenon similar to one we’ve seen before, our pattern-recognition software kicks in to inform our decision-making in the present. It’s this lightning-fast pattern recognition that we call ‘intuition’.
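Schematically, that loop resembles a simple lookup table. Here’s a crude sketch to make the ‘store now, recall later’ cycle concrete; the stimuli and responses are invented examples, not a claim about how brains actually encode anything:

    # Crude sketch of the 'store now, recall later' mechanism described above.
    # The stimuli and responses are invented, illustrative examples.

    memory = {}  # observed stimulus -> stored assumption about its cause

    def observe(stimulus, assumed_cause):
        """'Store' an observation for a later date."""
        memory[stimulus] = assumed_cause

    def intuition(stimulus):
        """Lightning-fast recall: reuse the stored assumption if we have one."""
        return memory.get(stimulus, "unknown: proceed with caution")

    observe("rustle in the grass", "predator")
    print(intuition("rustle in the grass"))   # predator
    print(intuition("ripples on the water"))  # unknown: proceed with caution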

Where this system falls down is with incorrect attribution. We can wrongly attribute an event to a particular ‘cause’ and ‘store’ incorrect assumptions. The irony is that, on the balance of probability, it’s preferable (from a survival perspective) to ‘store and use’ assumptions which are wrong but occasionally save us than to gamble on correcting them. In survival terms, ‘fortune favours the brave’ has it backwards: gambling rarely pays off.
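To make that trade-off concrete, here’s a toy expected-cost calculation for the classic ‘rustle in the grass’ scenario. Every number below is an assumption invented for illustration; the point is only the shape of the asymmetry:

    # Toy expected-cost comparison for the 'rustle in the grass' problem.
    # All probabilities and costs are invented, illustrative values.

    p_predator = 0.01     # how often the rustle really is a predator
    cost_flee = 1         # small cost: energy wasted fleeing from nothing
    cost_eaten = 10_000   # catastrophic cost: ignoring a real predator

    # Strategy A: trust the stored (usually wrong) assumption and always flee.
    expected_cost_flee = cost_flee                # pay the small cost every time

    # Strategy B: gamble that the assumption is wrong and stay put.
    expected_cost_stay = p_predator * cost_eaten

    print(expected_cost_flee)  # 1
    print(expected_cost_stay)  # 100.0, so the gamble is far worse on average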

All it takes is a single incorrect variable in a complex system to lead to grave and unpredictable problems further down the line.

Spreading Information

The compulsion to tell others about observations we’ve made is a sound one. It’s good tribal practice: we wish to share information with those close to us in order to keep them safe. The benefit of such an information exchange is that it is reciprocal. You scratch my back, I’ll scratch yours. Of course, the information exchanged can be true or untrue.

I’m going to use an analogy with computer systems to describe how one incorrect attribution can lead to widespread falsehoods.

Connected groups

Let’s take a networked computer system: a series of computers linked together, each relying on the others to share computational load:

Figure 1. [Stickulator / CC BY-SA (https://creativecommons.org/licenses/by-sa/3.0)]

In Figure 1, imagine the rectangles are people and the percentages measure how close each person is to a threshold of incorrect observations. Everything is currently within limits. These people are sharing observations, ideas and concepts.

What would happen, in such a network, if failure were introduced on one of the nodes? Here’s the beauty of this analogy. If we exceed the incorrect-observation threshold in a single person (creating a FAIL), the entire network becomes susceptible to the ‘incorrect’ observation. The network re-balances itself around the new information and a cascade of failures becomes inevitable:

Figure 2. [Stickulator / CC BY-SA (https://creativecommons.org/licenses/by-sa/3.0)]

Watching Figure 2, you can observe that once a critical mass of ‘wrong observations’, or node failures, is reached, the whole system begins to fail and becomes swamped. Apophenia has taken hold. This is also known as cascade failure. A new system now exists: one which has failed. In the case of our analogy, an established pattern of ‘wrong observations’ now exists in the group.
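For the curious, here’s a minimal sketch of that re-balancing mechanism in Python. It models the group as a ring of ten people who shed their load onto their neighbours when they fail; the ring topology, the threshold and the load-splitting rule are illustrative assumptions of mine, not details taken from the figures:

    # Minimal sketch of the cascade-failure analogy: a ring of ten 'people',
    # each carrying a load of observations. The topology, threshold and
    # load-shedding rule are illustrative assumptions, not from the figures.

    THRESHOLD = 100  # a node FAILs once its load exceeds this

    def simulate_cascade(loads):
        """Shed each failed node's load onto its ring neighbours until stable."""
        failed = set()
        changed = True
        while changed:
            changed = False
            for i, load in enumerate(loads):
                if i not in failed and load > THRESHOLD:
                    failed.add(i)
                    changed = True
                    # Re-balance: split the failed node's load between its
                    # two neighbours, then zero it out.
                    left, right = (i - 1) % len(loads), (i + 1) % len(loads)
                    loads[left] += load / 2
                    loads[right] += load / 2
                    loads[i] = 0
        return failed

    loads = [80.0] * 10   # everyone starts within limits, as in Figure 1
    loads[0] = 120.0      # one person tips over the threshold: the FAIL
    print(sorted(simulate_cascade(loads)))  # [0, 1, ..., 9]: every node fails

Notice that the re-balancing itself is what carries the failure around the ring; no node fails in isolation.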

Apophenia and the self

This analogy can be extended to the views held by an individual. A person doesn’t need to actively share information among a group to arrive at a cascade failure in their own mind. The conspiracy theory that Donald Trump is leading a top-secret fight against a child-cannibal-paedophile ring, operating from the basement of a Washington DC pizza parlour, in which Hillary Clinton is implicated, is a harsh cascade failure: fuelled by false observations and rumour, and galvanised by apophenia doing its thing.

Mutually observable cascade failures

We can feel deep frustration when we observe family, friends and entire communities appearing to suffer from their own internal cascade failures. This is because the longing to share our own observations in the spirit of reciprocal information sharing pushes us to set the record straight. We long to continue the mutual information exchange for communal protection. It’s a frustration borne out of caring for one another.

What makes the antagonism doubly pronounced is that those people who we perceive to be in the throes of a cascade failure view people outside of it as suffering from a cascade failure of their own. Meta, I know.

Conclusion

Disinformation spreads because we are susceptible to seeing patterns where they just don’t exist. It’s not a bug, it’s a feature. There’s no shame in admitting that we are all susceptible to being suckered into multi-level cascade failures. There’s a word for it: apophenia.

Incorrect observations are easily amplified by the sheer number of people we have the capacity to communicate with in the modern world. It frustrates people on both sides of a cascade failure equally, regardless of which group is actually in possession of the truth.

This explanation seemed to ‘click’ for me. Perhaps it will for you too. I often find myself returning to the question of why people accept certain viewpoints or ideas. Whether it’s a religious belief or a crackpot conspiracy theory, it seems to fit that, from a Darwinian perspective, it is our hardwired desire to find patterns in the natural world which leads to apophenia. When this is combined with network effects and the compulsion to share information, incorrect assumptions become systemic and we become fired up to argue passionately for the patterns we’ve discerned. All it takes is one wayward variable to lead to a cascade of misapprehensions.

Written by Thomas

10th October 2020