The following is an adaptation of a speech I gave at a recent conference, regarding the relationship between cognitive biases, internet algorithms, and the increasing polarization of the political spectrum that can be witnessed today.
Francis Fukuyama argued in his 1989 article “The End of History?” that, with the ascendancy of Western liberal democracy as the Cold War drew to a close, humanity was reaching
“the end of history as such: that is, the end point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government”¹
Without arguing in detail against Fukuyama’s entire thesis – though I agreed with one of the conference attendees in describing him as a ‘tragic fool’ – I will instead focus on one key assumption where I think he went wrong, and from which I believe all his other mistakes derived. In my opinion, Fukuyama’s original mistake was thinking that ideas could be defeated even semi-permanently.
Recently, there has been a clear rise in identity politics – the tendency of people of shared ethnicity, religion, sexual orientation, or any other superficially non-ideological feature to group together and advocate causes from their in-group’s perspective. At the same time, old favourites like Marxism and fascism refuse to go away, with thriving groups still advocating them. The reason for this, I believe, is also one of the reasons why Fukuyama’s predictions about the future of ideologies went so wrong: as far as I can tell, ideas can only be defeated by other ideas – not by the collapse of regimes – and never permanently, as long as the idea retains some utility for individuals. For as long as an ideology cannot answer everyone’s every material and spiritual need, other ideas will be there to compensate for this lack. No matter what you think of western liberal democracy, you must surely agree that it cannot fulfill everyone’s every need, material and spiritual.
When thinking of the resilience of ideas, consider how astrology thrives despite having no factual basis. It fulfills people in some valuable way, so it persists. As long as western liberalism cannot appease everyone on every single facet of their material and spiritual lives, other ideologies will go on living, even if their flames may be muted for a time. Western liberal democracy has not proven able to answer everyone’s every need – at least not quickly enough to make other ideologies obsolete.
The role of the internet in this is that it has enabled people to group together and polarize around ideas catering to their own needs faster and more easily than was previously the case. Nowadays, most of the information transferred within Western society goes through the internet and its many algorithms. From social relationships to politics, our information about other people and their ideas gets filtered through the internet.
As such, it matters how internet algorithms pander to our biases.
Big browser and social media companies make revenue based on how long they can keep us browsing on any given page, and so they have employed algorithms designed to give you more of what you consumed before, or of what people with browsing habits similar to yours have looked at before. This is a successful tactic both for making the browsing experience more pleasant for the consumer and for creating revenue for these corporations by keeping you browsing.
Ideologically, it is also a recipe for:
- Regression in tolerance and an increased in-group/out-group dichotomy – i.e. tribalism
- Ideas becoming insulated from criticism
- The distillation of ideologies into their more extreme forms
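To make the mechanism concrete, here is a minimal sketch of similarity-based recommendation – a toy model, not any real platform’s algorithm; the item names and the Jaccard-overlap scoring are invented purely for illustration:

```python
from collections import Counter

def similarity(a, b):
    """Jaccard overlap between two users' browsing histories."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user_history, all_histories, top_n=2):
    """Score items the user has not yet seen by how similar their viewers are.

    The more another user resembles you, the more weight their items get -
    so you are served more of what people like you already browse.
    """
    scores = Counter()
    for other in all_histories:
        weight = similarity(user_history, other)
        for item in other:
            if item not in user_history:
                scores[item] += weight
    return [item for item, _ in scores.most_common(top_n)]

histories = [
    ["cause_A", "cause_B", "forum_X"],
    ["cause_A", "cause_B", "cause_C"],  # a similar user pulls in cause_C
    ["gardening", "cooking"],           # a dissimilar user barely matters
]
print(recommend(["cause_A", "forum_X"], histories))  # ['cause_B', 'cause_C']
```

Note the feedback loop: every recommendation the user accepts makes her history still more similar to her in-group’s, and the next round of recommendations narrower.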
The human mind is a machine that did not evolve to deal with the current intellectual and technological environment. It has not had time to adapt to the information age, and it certainly has no mechanisms to counter internet algorithms. Cognitive biases are a group of systematic reasoning flaws that our minds tend to make based on the shape of our brain, and when combined with ideologies and group identities, they breed cultishness. Like cognitive biases, cultishness is a human phenomenon that emerges when we group together, because it served us well for a period of our evolutionary history.
“Every cause wants to be a cult.”²
What I mean by cultishness in this context is high conformity to one’s in-group, hostility towards out-groups, and the polarization and distillation of the group’s beliefs over time. Conformity, in-group bias, and reluctance to change one’s opinion are all ubiquitous among humans regardless of other factors. Our mind is a machine that enjoys being in a cult. It is not a matter of whether an ideology or the people who support it are innately cultish; they all have that potential.
This tendency of ideological groups to decay into cultishness has been around for a long time, but the internet and its algorithms have begun to work as a catalyst, accelerating the process. The algorithmic nature of social media and search engines has enabled the formation of countless in-groups that are effectively insulated from opposing ideas, unless their members go out of their way to seek out countering voices – which we are unlikely to do, given our innate biases and insecurities.
Think of this: a person has an issue to which she has not found a satisfying answer. She goes to the internet to find answers, and finds a group with a cause that seems to provide one. She is relieved, and begins interacting with the people of this cause, reading more of what they have to say and forming meaningful relationships within the cause. Internet algorithms make sure she gets served more links to associated ideas and causes, and she goes deeper down the rabbit hole, euphoric at having her eyes opened to so many things she had never thought of before.
Eventually she will encounter a ’normie’ who has never even heard of the answer to her original issue, let alone the other associated ideas she has now adopted. To the convert, this normie and everyone else will seem like sheep with their eyes closed, for she has seen the light. At this point, unless she goes out of her way to challenge herself, nobody will manage to make her rethink her stance: everyone else will think she is the lunatic, and discussions between people who underestimate each other’s mental capabilities are not going to convince either side.
Additionally, within these groups the first ones to leave or get ostracized are the moderate people on the margins. When the most sceptical members are inclined to leave or be excommunicated by the group, the average opinion naturally shifts towards the more extreme.
Rinse and repeat, and causes can become quite ’extreme’ in no time.
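The “rinse and repeat” dynamic above can be sketched as a toy simulation (all numbers are invented): if the most moderate member leaves or is pushed out each round, the group’s average opinion drifts toward the extreme even though no individual changes their mind.

```python
def purge_most_moderate(opinions):
    """Drop the least extreme member; everyone else keeps their opinion."""
    return sorted(opinions)[1:]

def average(opinions):
    return sum(opinions) / len(opinions)

# Opinions on a made-up 0-10 scale, where higher means more extreme.
group = [5.0, 5.5, 6.0, 7.0, 8.5, 9.0]
for round_number in range(3):
    print(f"round {round_number}: average opinion {average(group):.2f}")
    group = purge_most_moderate(group)
```

No one in this simulation was radicalized; the average shifted purely because the moderates left first.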
An ideology does not need a deep hidden flaw for its adherents to form a cultish in-group. It is sufficient that the adherents be human. Everything else follows naturally. Decay into cultishness is the default state. Internet algorithms are unwittingly complicit in increasing cultishness by pandering to our biases. However, the internet is just a catalyst to a fundamentally human phenomenon. Even so, because the internet facilitates and accelerates this process, we need more vigilant meta-cognition and advocates for a better way to think about ideological issues.
Ideas cannot be defeated, at least not permanently, because of the shape of our brain. People will keep returning to ideas that appeal to them as long as alternative ideas do not answer all their material and spiritual needs. When they form groups around these ideas, all the mechanisms I have described kick in, amplified by the algorithmic nature of the internet, which filters almost all of the information we receive today.
This is why I think we are where we are right now, and why western liberal democracy is not the sole surviving player on the field.
¹ Fukuyama, Francis. “The End of History?” The National Interest, no. 16 (1989): 3–18.
² Yudkowsky, Eliezer. “Every Cause Wants to Be a Cult.” In Rationality: From AI to Zombies. Berkeley: MIRI, 2015. 458–460.
This blog post and the speech it is based on were heavily influenced by Yudkowsky, Eliezer. Rationality: From AI to Zombies. Berkeley: MIRI, 2015.