Asymmetric power in the information age

Philipp Markolin
105 min read · Nov 10, 2022


How an infodemic is reshaping the world

This investigative article will require some patience. It is a deep exploration of how a (largely) social media-driven epistemic crisis interferes with democratic processes, arguably one of the most important topics to understand in today’s world. Using a complexity science framework, this article will provide systemic insights using available scientific research, well-documented case examples, and expert opinions to map drivers of democratic backsliding all around the world.

Background:

The ubiquity of digital & social media has disrupted how democratic societies function. Journalists, politicians, and citizens often frame the problems of social media in terms of misinformation: how lies spread faster than the truth, and how people seek out information that confirms their beliefs. My personal interest has always been more in how the interactions of humans with algorithmic online systems at scale create emergent meta-phenomena.

Meta-phenomena are difficult to conceptualize, but their effects are often very visible.

I have previously outlined my systematic understanding of how social media dynamics reliably distort our perception of reality and cause us to make bad decisions, collectively:

Information has been commodified into a special product in the attention economy. We have created an online environment where addictive software is directing us toward the most engaging content, often produced and manipulated by the most capable human attention-stealers. Influencers are financially incentivized and algorithmically empowered to provide the most psychologically salient, emotional, or outrageous narratives on any topic for engagement, with zero regard for evidence, accuracy, context, or truth. Online, we find ourselves nudged, pulled, or pushed into echo chambers or isolated bubbles by algorithmic curation and personalized addictive content. We are engulfed by selective and unrepresentative information that is prompting us to commit systematic thinking errors on a societal scale.

The opposite of wisdom of the crowds, so to speak. A crowd folly. It is worth stressing that it is our collective actions, psychological predispositions, and behaviors within these online environments, not just algorithms or choice architectures alone, that create these distortions.

It’s a meta-phenomenon in which we all find ourselves unwitting participants. We are part of a crowdsourced global misinformation project and share responsibility for the current state of affairs.

This does not mean we are powerless. Becoming aware of how cognitive biases, echo chambers, and online grifters lead to widespread distortions in the current information ecosystem is a critical media-literacy tool in the information age, and I am happy to see that many people I talk to have developed some understanding of these mechanisms. These problems have also reached some wider public and political recognition; recent examples include Jonathan Haidt’s popular Atlantic article and a much-needed study from the Joint Research Centre of the European Commission.

There are many sociological, psychological, and cognitive characteristics that are associated with believing and sharing of misinformation. (Bruns H. et al., Publications Office of the European Union, 2022)

I do have hope that we will collectively grow wiser to the folly of our current online systems, though I fear we are still missing the bigger picture.

A meta-phenomenon that directly impacts democratic processes and every issue we should care about. I am talking about the breakdown of our collective ability to make sense of the world (epistemology), sometimes known as an epistemic crisis.

A crisis where we collectively lose the ability to assess what is real or true.

This epistemic crisis leads to a fragmentation of the realities we inhabit; it shapes our worldview and our understanding of ourselves and others. Even worse, the fragmentation of our shared reality has created a vulnerable information environment that many asymmetric actors are trying to entrench permanently, to our detriment.

This epistemic crisis might be difficult to conceptualize, but its effects are easily visible as well. I am talking about the success of conspiratorial political movements, a breakdown in trust in institutions, skepticism toward science, and overall democratic backsliding.

Often, things have to get worse before we start acting, and despite democratic forces in the US midterms and Brazil elections seemingly holding their ground against conspiratorial movements for now, I seriously doubt we have reached the peak of our epistemic crisis yet.

Democratic citizens have to start turning the tide on conspiratorial thinking that has been weaponized by political actors in societies all around the world, from the Philippines to Nigeria, Brazil to Hungary, Italy to the US. No democracy is currently safe.

Example of the far-right German AfD party platforming conspiratorial myths to appeal to voters (source: Konrad Adenauer Stiftung, 2020)

Conspiracy theories are an interesting and popular research topic, but despite all the interesting social and psychological mechanisms leading people into believing false, magical, or conspiratorial worldviews, this article aims to address a different part of the total equation:

The systematic impact of technologically-enforced bad epistemology on democratic societies and processes

We are always talking about multiple interacting causal layers contributing to the problem: social, behavioral, cognitive, technological, and individual.

Using available scientific research, well-documented case examples, and expert opinions, I want to shed light on our epistemic crisis: how the high prevalence of misinformation, algorithmic amplification, and especially conspiratorial thinking is toxic for any democratic society and plays into the hands of the powerful in unexpected and asymmetric ways. When the public loses, somebody else is winning.

I believe we have now reached a point where we have good evidence to suggest that our online information architecture makes us incredibly vulnerable to crowd-sourced distortions and to targeted manipulation by information combatants. This likely poses an existential threat to democracies and sabotages efforts to deal with other cooperative problems like climate change, public health, or preventing war.

In short, this article will attempt to give a comprehensive explanation of how all our current information systems, especially social media platforms, are fundamentally corrupting democracies all around the globe and are causing a backslide into a world all of us had hoped we left behind.

When nothing is true, everything becomes possible.

Sit back, take your time, and make yourself a cup of tea. This is not going to be an easy read.

(free version of the full article will be on medium, and individual chapters on my free substack: chapter 1, chapter 2, chapter 3, chapter 4)

Chapter 1: Democratic backsliding

Democracy as a non-linear complex system

This chapter will explain how to conceptualize democratic processes with the tools of complexity science and build an understanding of the complex systems we are all part of. It will be a useful model to grasp the bigger picture of how a broken info sphere interferes with democracy.

“Democracy is the worst form of government, except for all the others.”

…as the unsourced aphorism famously cited by Winston Churchill goes.

Yet for much, if not all, of modern human history, autocracies have been the norm. Even today, only 29% of all humans live in democracies. Up until about 2011, the number of people living in democracies steadily increased worldwide. In democratic waves after the Second World War, the end of colonization, and the fall of the Soviet Union, many more democratic nations emerged. In fact, some historians predicted that democratization would be unstoppable: in an ever-more connected world, the people would see the fruits of democratic societies, take power, and liberate themselves from their oppressors. More recently, on the back of the social-media-empowered challenges to autocratic rule in Middle Eastern and African nations (the Arab Spring), many hoped that with the power of a democratic information sphere came the power of a democratic society. These were more hopeful times. But with the exception of Tunisia, a resurgence of authoritarianism, religious fundamentalism, and civil war has since left many Arab nations in a terrible state (the Arab Winter). Even in the face of these setbacks, the idea of democracy remains powerful to the politically oppressed: it inspires struggle and a willingness to endure hardship and bloodshed in hopes of a better tomorrow.

Until about 2011, people living in democracies worldwide steadily increased (Boese VA. et al., Democratization, 2022)

Despite this aspirational ideal, many people currently living in democratic societies share a widespread feeling that the “system” is not working as it used to. Although democracies should work for a majority of citizens and their preferences, recent experience calls this fundamental notion into question, at least in some democratic nations.

Many across the globe are dissatisfied with how democracy is working. (PEW research, 2019)

These views are somewhat warranted. Things have been going wrong for some time and democracies are in decline worldwide (Boese VA. et al., Democratization, 2022).

Many smart people have also written about a potential role for social media in that democratic decline. We all love to have our pet peeves and factors to blame. Misinformation. Algorithms. Polarization. Conspiratorial ideation. Globalization. Capitalism. Billionaires. Socialism. Immigration. Foreigners. Politicians. China. Russia. Even the Coddling of kids might be problematic, according to some popular academics (sorry, Prof. Haidt!). The list of blame is endless, and I can acknowledge many grains of truth in some of the alleged culprits.

However, I believe we as a society lack the right conceptual frameworks to talk about democracy, what is going wrong, and why we seemingly feel powerless to effect change.

This perceived powerlessness is likely another cause of our bad epistemology and of our falling into conspiratorial and other magical thinking (Whitson JA. et al., Journal of Experimental Social Psychology, 2015).

A sizable proportion of people believe in sweeping conspiracy theories involving the “establishment”, “the media” or the “elites”, despite there being no coordination or “great reset” plan between individuals who are part of said groups. Anxiety is palpable and public conversation seems stuck, all while anti-democratic movements are winning democratic elections all around the globe.

What is going on?

Can science provide better frameworks, concepts, and language to think and talk about democracies?

A) Democracy as a non-linear complex system

Complex problems usually require complex approaches

In general, we have a cognitive tendency to award too much credibility to simple explanations; to weigh causal relations disproportionately strongly without asking about their relative effect sizes; and to prefer singular reasons and simplifications that are ‘true enough’ over the fuller, messier picture. It is cognitively easy to blame everything on, let’s say, politicians, foreigners, or globalization, when each factor works as just a small cog in the larger wheel of democracy. I hope that deep down most of us know that these singular explanations cannot be the whole story, and that there are multiple factors playing a role, even if we have trouble naming all of them or mentally mapping their influence and contributions, because it is just too complex.

Complex systems in general are counterintuitive to model for our analytical minds, despite most of the things we care about in life being either embedded in them or constituting complex systems by themselves. A cell is a complex system, and so is an organ, our mind, and the human condition. As is an ant colony, social media, the internet, the economy, and the climate.

In these systems there exists no proportionality and no simple causality between the magnitude of responses and the strength of their stimuli: small changes can have striking and unanticipated effects, whereas great stimuli will not always lead to drastic changes in a system’s behavior. — Willy C. et al., European Journal of Trauma, 2003

On top of that, we humans are pretty limited at processing gradual shifts, suck at extrapolating over many dispersed but subtle or indirect phenomena, and are not built to handle non-linearity and feedback loops.

We reduce complexity to heuristics as a way to reach actionable certainty.

Much of science is no different, being done by humans, after all. Many scientific fields thrive by reducing complexity, taking away confounders, and trying to elucidate the minimalist mechanistic essence of causal interactions. Don’t get me wrong, there is nothing wrong with that, this has been a successful and valuable approach to building up much of our current body of knowledge of how the world works. Yet how is a reductionist approach supposed to work in systems that only become functional when a certain complexity is reached?

Studying the complexities and impact of social media is difficult. For example, what can really be said about the effect of Instagram on the mental health of teenage girls? Mental health is a fuzzy, subjective concept. Scientists can track indicators and proxies, like suicide numbers, but these have many biological, psychological, and societal antecedents. However, if, let’s say, suicide numbers doubled in female Instagram users compared to non-Instagram-user cohorts, there would be causal questions to ask. Are female suicides really up because of Instagram? Can Instagram cause something so dramatic? Are suicide-prone teenagers maybe more likely to engage with Instagram? Scientists would like to study these questions, yet big platform companies are doing everything they can to deny academics access to data (but that is another topic). So scientists have to keep scratching at the edges. Is it really a surprise that many studies find inconclusive evidence of social media harm (Kross E. et al., Trends in Cognitive Sciences, 2021)?

How is a reductionist approach supposed to work in systems that only become functional when a certain complexity is reached?

Complexity is a well-known problem in mathematics (Meyers RA, 2012), social sciences (Elliott E. and Kiel DL, University of Michigan Press, 1996), biology (Mazzocchi F., EMBO reports, 2008), medicine (Willy C. et al., Eur J Trauma, 2003), and economics (Gasparatos A., Environmental Impact Assessment Review, 2008).

In psychology, some researchers go so far as to explicitly avoid causal inference because of complexity (Grosz MP. et al., Perspectives on Psychological Science, 2020). In my opinion, this is wrong-headed. Accurate causal inferences are important for science, and non-linear systems are still based on causal interactions, albeit complex ones. We can predict weather likelihoods, despite getting them wrong sometimes because of chaotic dynamics. There is no great workaround: complex systems require scientists to study them as a whole, with all the caveats, limitations, and uncertainties attached.

The causal structure of a climate model, adapted from Knutti R. et al., 2015 to network structure by Prof. Peter Holme

I hope now it becomes a bit more understandable why it has been difficult to scientifically pin down the impact of our fragmented info spheres and broken epistemology on wider society. A pre-registered systematic review of almost 500 studies finds a lot of evidence that digital media can increase political participation, but often is detrimental and can erode democracies (Lorenz-Spreen P. et al., Nature Human Behavior, 2022).

Our results provide grounds for concern. Alongside the positive effects of digital media for democracy, there is clear evidence of serious threats to democracy. Considering the importance of these corrosive and potentially difficult-to-reverse effects for democracy, a better understanding of the diverging effects of digital media in different political contexts (for example, authoritarian vs democratic) is urgently needed. To this end, methodological innovation is required. This includes, for instance, more research using causal inference methodologies, as well as research that examines digital media use across multiple and interdependent measures of political behaviour. — Lorenz-Spreen P. et al., Nature Human Behavior, 2022

I think it is fair to say that science has not solved social media yet, especially when it is unclear what scientific disciplines need to engage. Sociologists? Psychologists? Mathematicians? Algorithm engineers? Data analysts? Neurobiologists? Graph, network, or game theorists? Disinformation researchers? Probably we need all of them, working together, and then some. On top of that, many current studies still use a reductionist framework for investigations, looking at indicators, single metrics, or phenomena, rather than approaching certain questions from a complex system’s perspective (I hope my article here might make the case, feel free to reach out). Two of my biggest worries when it comes to current scientific uncertainties can be summarized in two questions:

Are we looking at the right things to scientifically assess whether social media dynamics cause harm to democracy?

How can we hope to get a scientific answer before it is too late and irrevocable harm has been done?

I don’t know, but I believe it is important enough to try for an answer today.

To do that, we first have to get a workable overview of how to conceptualize democracies and social media as interacting complex systems.

This is not a small task. The most useful frameworks I have found within the scientific literature to think and talk about the societal problems I see come from areas of research as disparate as physics, neuroscience, socioeconomics, and systems biology all the way to control theory, mathematical modeling, epidemiology, statistics, and game theory.

Basically, all social and natural sciences which work toward understanding complexity and investigate the behavior of complex systems. Without going into too much detail (here is an excellent scientific paper for that), the idea is to use scientific concepts, tools, and methods developed by these domains to try to apply them to us.

Complexity science tries to chart the macroscopic behavior (in this case of messy little autonomous humans) and describe systemic trends of the bigger systems we are part of, i.e. fragmented online information spheres, and of course democracy itself.

Fair warning:

Attention is a critical good and I don’t want you to get frustrated.

The next sections of this chapter are going to put a lot of cognitive strain on the reader, so I actually recommend that most readers jump ahead to Chapter 2, where we explain the asymmetric forces on social media, why influencers dominate conversations, and how information combatants exploit vulnerabilities.

For the deep or academically inclined readers, the next sections could be useful to rethink the systems we inhabit. Take your time, or revisit these sections after first jumping ahead to the more important Chapter 2.

B) Tools from complexity science

The vocabulary and principles to conceptualize complex systems

  • System descriptions

When talking about systems, complexity science uses terms like ‘robustness’ (sensitivity to micro-scale perturbation), ‘stability’ (resistance to change over time), ‘adaptability’ (how viable a system remains during transformational change), ‘equilibrium states’ (local optima that reinforce robustness), and so on. For us, it is sufficient to know that researchers look at a set of indicators, whether economic, historic, physical, political, or social (e.g. GDP, unemployment rate, press freedom score, corruption index, suicide rates, graduate degrees), and then try to figure out how these indicators impact the stability or robustness of democracies.

At the risk of oversimplification: complex-systems researchers try to understand which fundamental parameters impact the systemic behavior of democracies, in order to build hypotheses about how to strengthen democratic systems, or to make predictions about which societal forces or trends might indicate or bring about their demise.
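At the risk of piling simplification on simplification, here is a toy sketch of what an indicator-based description boils down to mechanically: normalize a handful of indicators and combine them into a single robustness score. This is entirely my own illustration with made-up indicator names, values, and weights, not any real methodology; actual research has to worry about which indicators matter, how they interact, and whether they are causal at all.

```python
# Toy sketch (made-up numbers, not a real methodology): combine normalized
# indicators into a single 'democratic robustness' score.
INDICATORS = {
    # indicator: (value, weight); values pre-normalized to 0..1,
    # higher = better for democratic robustness (all numbers invented)
    "press_freedom":     (0.8, 0.3),
    "low_corruption":    (0.6, 0.3),
    "economic_equality": (0.4, 0.2),
    "civic_education":   (0.7, 0.2),
}

def robustness(indicators):
    """Weighted average of normalized indicator values."""
    total_weight = sum(w for _, w in indicators.values())
    return sum(v * w for v, w in indicators.values()) / total_weight

score = robustness(INDICATORS)
print(f"toy robustness score: {score:.2f}")  # → toy robustness score: 0.64
```

The single number is, of course, exactly the kind of reductionism the surrounding text warns about; its only purpose here is to make the vocabulary of indicator-based system descriptions tangible.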

An example of such a hypothesis would be that economic inequality leads to institutional capture by elites, thus reducing the institutional adaptability (and with it threatening the long-term stability) of democracies:

Democracy presupposes a basic equality of influence. As economic inequality increases, so too do differences of influence over institutions. Those who have substantial financial resources are better able than those without to influence institutional change. — Wiesner K., European Journal of Physics, 2019

The theory is supported by observational data showing, for example, that policy outcomes in the US (and elsewhere) reflect not the majority view but are better predicted by the preferences of the wealthy.

Indicator-based system descriptions are of course to be taken with care: just because some indicators are associated with certain outcomes does not mean they are causally related, nor is their direction of influence always easy to discern. (Complex systems, remember?!?)

Policy outcomes in the US (and elsewhere) reflect not the majority view but are better predicted by the preferences of the wealthy

Does institutional capture cause economic inequality or is it the other way round? Are there other confounding factors, for example, elite overproduction or age/race/sex demographics, that better explain institutional capture? Or can events like a recession better explain economic inequalities? It is a mess because, in complex systems, all the different indicators are somehow involved in shaping the total and furthermore react together in often chaotic, synergistic, and unpredictable ways. This brings us to the next point:

  • System dynamics
Τα πάντα ρει - everything flows [attributed to ancient Greek philosophers]

Beyond identifying critical indicators, another way the complex-systems view can aid our understanding of democracies is through dynamics: namely feedback loops, tradeoffs, and non-linearity.

In biological systems, for example a cell’s metabolic, transcriptional, or signaling networks, we differentiate between positive (as in amplifying) and negative (as in reducing) feedback loops. These are mechanisms that react to a perturbation of the status quo (a sudden availability of sugar, or a drop in oxygen availability) by causing other cellular processes to run faster, shut down, or perform a different function. Feedback loops perform many important functions, from regulating the stability of dynamical systems to optimizing signal-to-noise ratios for cellular decision-making (Hornung G. et al., PLOS Computational Biology, 2008).

In democratic systems, feedback mechanisms are ubiquitous as well. Gerrymandering in the US is a prominent example of a “positive” (as in amplifying) feedback loop (Wang SSH. et al., PNAS, 2021), where the electoral winner gets to draw voting district lines that will favor them in subsequent elections. “Negative” (as in reducing) feedback loops include the separation of powers into co-equal legislative, judiciary, and executive branches: when one branch exceeds its function in an attempted power grab, the other two rein it in and thereby reduce its power, which is good for the overall stability of democracies as a system. There are many more such feedback loops: freedom of the press is negative feedback on political corruption, grassroots movements are positive feedback for policy change, voters punishing bad behavior or performance from politicians is another negative feedback loop, and so on. Lots of causal interactions make sure the complex system of democracy remains robust to many types of perturbations (e.g. the inevitable individual failings of us error-prone humans who are part of the system).
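To make the loop vocabulary concrete, here is a minimal toy simulation (my own illustration with made-up parameters, not taken from the cited papers): a single stylized “incumbent advantage” variable evolving under an amplifying loop (gerrymandering-style) and a damping loop (voter pushback).

```python
# Toy model (invented parameters): one stylized 'incumbent advantage'
# quantity evolving under two feedback loops.
def simulate(steps, gerrymander_gain=0.0, voter_pushback=0.0, shock=1.0):
    """Return the trajectory of a stylized 'incumbent advantage' score.

    gerrymander_gain: positive feedback -- advantage lets the winner redraw
                      districts, which adds to next-round advantage.
    voter_pushback:   negative feedback -- voters punish overreach in
                      proportion to the current advantage.
    """
    advantage = shock  # initial perturbation of the status quo
    trajectory = [advantage]
    for _ in range(steps):
        advantage += gerrymander_gain * advantage   # amplifying loop
        advantage -= voter_pushback * advantage     # damping loop
        trajectory.append(advantage)
    return trajectory

# Positive feedback alone: a small initial advantage compounds round after round.
amplified = simulate(10, gerrymander_gain=0.2)
# Add a sufficiently strong negative loop: the same perturbation decays away.
damped = simulate(10, gerrymander_gain=0.2, voter_pushback=0.4)

assert amplified[-1] > amplified[0]   # runaway growth
assert damped[-1] < damped[0]         # perturbation is absorbed
```

With only the amplifying loop, the perturbation grows without bound; adding a strong enough damping loop pulls the system back toward its equilibrium, which is exactly the stabilizing role negative feedback plays in the separation of powers.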

Voters punishing bad behavior or performance from politicians can be seen as a negative feedback loop

Tradeoffs are a common theme in complex systems too, either because there are always limited resources to be allocated (and allocating them to one aspect means there is less left for others) or the functional optimization of one feature comes at the cost of vulnerability in another.

Trade-offs are well studied in economics, biology, physics, and math. Examples include how just-in-time manufacturing optimizes for cost reduction and speed but makes the supply chain less robust to disruptions (Jiang B., IMF Economic Review, 2021), for example by an unexpected virus wreaking havoc and causing labor shortages or lockdowns.

Many tradeoff examples come from biology: e.g. the need to boost survival mechanisms in resource-poor environments causes bacteria to lose adaptive capacity in the form of speed and accuracy (Lan G. et al., Nature Physics, 2012).

Feedback and trade-offs often go hand in hand, and both play a role in control theory in biology (Cowan NJ. et al., Integrative & Comparative Biology, 2014) and in many other areas of science and engineering (Del Vecchio D. et al., J. R. Soc. Interface, 2016).

Control theory has arisen from the conceptualization and generalization of design strategies aimed at improving the stability, robustness, and performance of physical systems in a number of applications, including mechanical devices, electrical/power networks, space and air systems, and chemical processes — Del Vecchio D. et al., J. R. Soc. Interface, 2016

  • Non-linearity

Non-linearity in math and science describes system interactions where the change in output is not proportional to the change in input and may appear chaotic, unpredictable, or counterintuitive. Evolving complex systems are almost always non-linear because their reaction to a perturbation usually depends on the specific state they are in at a specific moment in time, and the space of potential states might be infinitely large. This might sound complicated, but think of it as acknowledging that we do not live in a vacuum but in a constantly changing environment that interacts with us; we come with a unique history, and we are not perfect machines. For example, if you give me the input that ‘I suck’ today, I might laugh at you, or get angry and push you away, or do something completely different, depending on my initial state of mind, my experiences, but also our shared history together. Same input, disproportional output. (Disproportional output does not mean random; it might, however, be non-deterministic, probabilistic, or unpredictable.)

Non-linear systems have disproportional output
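The logistic map is the textbook way to see this disproportionality in a few lines of code. The example below is my own standard illustration, not drawn from this article's sources: two starting states that differ by one part in a million quickly end up on completely different trajectories.

```python
# Classic non-linear toy system: the logistic map x -> r*x*(1-x).
# In the chaotic regime (r = 4), trajectories starting almost at the
# same point diverge until they are completely decorrelated.
def max_divergence(x0, y0, r=4.0, steps=100):
    """Iterate two copies of the logistic map and track how far apart they get."""
    x, y = x0, y0
    gap = abs(x - y)
    for _ in range(steps):
        x = r * x * (1.0 - x)   # non-linear update: output not proportional to input
        y = r * y * (1.0 - y)
        gap = max(gap, abs(x - y))
    return gap

tiny_input_change = 1e-6
gap = max_divergence(0.3, 0.3 + tiny_input_change)
assert gap > 0.5   # a microscopic nudge produced a macroscopic difference
```

Same rule, almost the same input, wildly different output: this is why small perturbations in complex systems can have striking effects while large interventions sometimes change nothing.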

Disclaimer: Please consider that the dynamics of non-linear evolving systems are an area of active study in many disciplines, far exceeding my own ability even to outline them properly (here might be an interesting book for the mathematically inclined). In fact, because it is so vast an area of research, I am very uncomfortable even creating the false appearance that I have a good grip on its various research topics. I do not. At most, I have limited systems-biology experience, which gives me some conceptual and working familiarity with complex living systems.

In any case, what I feel comfortable doing is trying to sketch out common themes in systems science in order to have a conceptual framework to talk about democracy as a dynamic, non-linear complex system, because I find it useful and increasingly necessary to understand our world.

Finally, there is one last point we have to cover. Dynamic complex systems perform complex functions.

C) Complex functions in democracy

Understanding emergence and meta-phenomena

Living is an emergent meta-phenomenon. It is the result of implementing a set of complex functions. Staying alive requires navigating the environment, supplying oxygen and nutrients to cells, repairing tissue damage, and propagating the pattern of existence forward without dropping the ball even once. Living is not just one thing; it is a juggling act, as if one could juggle a torch, an ax, and a baby while running a marathon. In short, it is weird and messy and complicated. There are many different shapes of “living”, not least because of the problem of defining what constitutes living and what does not (a definition which is not agreed upon and up for discussion, but I digress).

If we understand humans as the dynamic, non-linear complex systems of 34 trillion cells that we are, “living” requires a specialized set of micro- & macroscopic actions that differs from what a tree or a bacterial colony has to do to stay alive. This trivial observation is still somewhat surprising, since many living things are essentially implemented by and the product of trillions of cells that are mostly made up of similar building blocks (nucleic & amino acids, proteins, lipids, sugars, etc.), and these cells observe similar basic principles of physics, chemistry, and cell biology.

The point is: Largely similar autonomous units (cells) behaving a bit differently can produce vastly different complex functions, collective behaviors, and with it new emergent phenomena.

Although a bumpy analogy, macrosystems like nation states can be understood as somewhat “living” organisms too, and they are certainly shaped by a set of micro- & macroscopic actions as well (nation states are implemented by and are the product of the collective actions of their constituent parts, meaning us largely similar autonomous humans).

We humans collectively perform the complex functions that keep any system of governance ‘alive’.

It does not matter if that nation-state is a democracy or a dictatorship, or something in between, as long as the same system propagates forward in time, it can be considered “alive”.

However, an argument can be made that there are only a few robust ways to uphold the complex functions that keep the respective ‘system of governance’ alive, and failure to do so will lead to instability, dysfunction, and decay; ultimately the ‘death’ of the old system (often finalized with a coup d’état), after which the leftover constituent parts either transition toward a new systemic state (a new regime) or into collapse and chaos (a failed state).

Let’s take a look at (some of) the complex functions any stable system of governance needs to perform:

  • resource allocation between citizens and/or subjects

Every macrosystem, be it a kidney, an ant colony, or a governing body, has to navigate the tradeoffs of resource distribution. Stable systems distribute resources in a way that adds to their robustness or maintains equilibrium, whereas failure to distribute appropriately can lead to system collapse. Economists have long mapped the often counterintuitive relationship between the economic disenfranchisement of the masses, or the threat of new taxation of a rich elite, and political transitions (Acemoglu D. & Robinson JA., The American Economic Review, 2001; Bueno de Mesquita B. et al., MIT Press, 2003; Besley T. & Kudamatsu M., 2008).

Stable autocratic systems make sure that resources flow primarily to secure the loyalty of the power-holding elite in the autocracy, meaning the military, police, select government officials, key bureaucratic functionaries and party power brokers (Gallagher M. & Hanson J, 2009). Resources for public goods (like infrastructure, hospitals) or the wider population (education, social programs) are limited or not considered at all.

Stable democratic systems have to allocate and redistribute resources among power holders to maintain robustness and stability too, but in the democratic case, power lies with the masses of voters and with the interest groups, unions, parties, intelligentsia, religious leaders, and economic elites who influence them.

From a system’s perspective, the complex function of ‘resource allocation’ has at least two implementations that historically seem capable of sustaining stable governments (keeping them in equilibrium), while deviations from these implementations might introduce inherent instability into the respective system.

For example, distributing resources to the masses might be a bad strategy for autocratic rulers, as they might face a coup d’état from an insurgent who promises the actual power holders in autocracies (military generals, bureaucrats, etc) a bigger share of the resources currently ‘wasted’ on the powerless masses. It has been well observed that after a successful military coup d’état, resource distribution towards the military increases. Some studies even suggest it is often the reason why a coup d’état happened in the first place (Leon G., Public Choice, 2014). Let that sink in for a second. From an autocratic system’s perspective, spending on the “people” is considered resource mismanagement that can make the system less stable.

(To be honest, I recommend this excellent and entertaining video based on “the dictator’s handbook”, to get a decent intuition surrounding resource distribution and power… also, it’s fun!)

On the other hand, failing to redistribute and allowing resources to accumulate among only a few elites might be a bad strategy for democracies. Democracy presupposes a basic equality of influence, so any type of societal inequality might subvert democracy in varied ways (we mentioned institutional capture by elites), although a scientific consensus has not been reached on this issue (Savoia, A. et al., World Development, 2010; Acemoglu, D. et al., MIT economics, 2015; Page & Gilens, University of Chicago Press, 2020; Dacombe, R. et al., Representation, 2021).

On top of that, distributing resources too unevenly towards particular interest groups to ensure (“buy”) the voter loyalty of that group challenges the political legitimacy of the distributing body among non-favored voters, and with it the stability of democratic systems.

This brings us to the next point: political legitimacy

  • maintaining legitimacy to direct coordinated action

Political legitimacy is an ongoing area of scientific research, with both normative and quantitative approaches being advanced to measure and analyze the constituent factors of political legitimacy. There are many different lines of thought, from the Lockean notion of ‘consent of the governed’ to Weber’s empirically observable “belief in legitimacy”, to questions of whether definitions of ‘political legitimacy’ even make sense under autocratic rule (Gerschewski, J., Perspectives on Politics, 2018).

However, what can be said is that legitimacy in democratic systems is a multi-factorial issue that cannot simply be measured by asking people if they are satisfied with democracy (Linde & Ekman, European Journal of Political Research, 2003).

Individuals within any larger democratic system will have different political positions, preferences, and problems that they want the governing structure to accommodate before they recognize its legitimacy, and prolonged failure to do so might lead to disengagement from the system and withdrawal of cooperation. However, democracies are usually robust to a subset of citizens not recognizing the political legitimacy of the current power holders.

[…] generalised or diffuse support (e.g., support for regime principles) does not emerge overnight. Rather, it must be built on a record of acknowledged regime performance […] that are not only of an economic nature, say, a matter of economic growth or social reforms.

Crucial for the creation of a reserve of generalised system support is also the regime’s capacity to maintain order, to maintain the rule of law, and to otherwise respect human rights and the democratic rules of the game. — Linde & Ekman, European Journal of Political Research, 2003

In autocracies, political legitimacy is more nebulous, because it is unclear whom it addresses in an unequal society. Is political legitimacy irrelevant when behavior is enforced through violence or coercion? Do claims to legitimacy depend solely on the loyalists, military leaders, and bureaucratic functionaries of autocracies, but not on wider society?

Legitimacy is a relational concept between the ruler and the ruled in which the ruled sees the entitlement claims of the ruler as being justified, and follows them based on a perceived obligation to obey. The legitimating norms must constitute settled expectations in order to be fully effective and must be actively transferred. — Gerschewski, J. Perspectives on Politics, 2018

So while political legitimacy was, is, and will likely remain a hotly debated topic, most agree that maintaining legitimacy is necessary for any government or institution to ensure the cooperation of the majority of its constituent parts.

Another way of looking at the governing people/institutions of a nation-state then is that they are central control hubs that maintain the stability and robustness of a system through the collection of system-relevant information, integrated decision-making, and enforcement of coordinated actions (or instigation of cooperation).

Creating legitimacy in democracies is important because it allocates decision power over the rules of cooperation to the democratically elected control hub (i.e. the government), for example via making laws or by creating financial incentives (i.e. the power of resource distribution). We can already see that complex functions interact with each other, because a nation-state is a complex macrosystem.

  • Regulation and control of cooperation

All control hubs, biological, corporate, social or societal, depend on and use the power of hierarchical structures to enforce coordination and control the behavior of individuals.

Hierarchical structures either instigate, guide, monitor, enforce, or punish behavior as well as disseminate decisions and information between regulatory layers.

That does not mean there are no differences in the shape of hierarchical structures between different systems, nor are hierarchical structures the only way to get emergent behaviors (see for example control hub-free swarm intelligence).

Regulation and control in complex systems is a fascinating field of study, and researchers have identified many shapes of how influence can be exerted by one constituent part onto others of the same system by abstracting them into networks (Boccaletti, S. et al., Physics Reports, 2006).

Any complex system has regulatory or interaction relationships that can be mapped to abstract networks to understand how the system works. Depicted here: Regulatory architectures in different gene networks. Nitin Bhardwaj et al., PNAS, 2010

Autocratic hierarchies, for example within the military, usually follow a linear chain of command and top-down dominance between the layers. Democratic hierarchies show higher connectivity within and between layers, exerting control via multiple intermediaries. Some research suggests that the more complex a system gets (i.e. the more constituent parts to regulate), the more likely it is supported by regulators that collaborate with other regulators to exert control, rather than by strict dominance chains (Bhardwaj et al., PNAS, 2010). This could also be a feature of how the control system arises in the first place:

Democratic hierarchies are built bottom-up through election while autocratic hierarchies are built top-down through domination. Both, however, have power asymmetries between the weaker citizens and the stronger politicians, which are amplified the stronger the hierarchies are. — Toelstede, B. , Rationality and Society, 2020

A control hub’s collection and processing of system-critical information, and its instigation of collective action, are critical for a system’s adaptability, for example in reacting to environmental threats or changes. We can illustrate this with another complex function:

  • system adaptability for self-preservation

Our world is non-linear, the environment is constantly changing, and complex systems need to adapt to new circumstances to remain viable through environmental challenges, or transformational change.

Complex systems sometimes have the means to actively defend themselves from destructive outside influence, be it an immune system that learns to thwart parasites, an ant colony reorganizing itself during disease, social media algorithms getting smarter at filtering bots, or a military increasing its capabilities to defend the sovereignty of the state. Failure to self-defend can lead to critical system collapse.

Most (all?) living systems we know of are adaptive, from bacterial colonies to plants to humans to societies. These systems might even have needed to evolve self-defense to remain viable in environments where (for example) they have to compete with other living adaptive systems for resources... maybe let’s phrase it more accurately:

It would be unrealistic for certain complex systems to have survived in competitive environments if they had not adapted to it by evolving their self-defense capabilities.

Self-defense is so important that software and cybersecurity researchers try to create artificial immune systems to detect and protect against threats to distributed power systems (Alonso FR et al., IEEE, 2015) or even complex IoT ecosystems (Aldhaheri S, MDPI, 2020).

Both democratic and autocratic societies rely on self-defense to be viable.

For democracies, some research suggests that increases in network alliances dramatically reduced the threat of war and regime change (Jackson MO., PNAS, 2015), and that good self-defense capabilities through alliances might even foster democratization (Gibler & Wolford, 2006). A nation with a weak army and no alliances (= incapable of systemic self-defense) will likely not remain viable as a system when it needs to adapt (transformationally change) to the ever new circumstances of a shifting world (Powell et al., Democratization, 2018).

A nation with strong alliances has better chances of emerging whole from rapid transformational change, as the Ukrainian people are currently showing us in their rapid systemic transition to democracy. Indications look promising, although that history has not yet been written.

Alright, that wraps up our primer on complex system science for democracy. The last thing we need is to apply these concepts to a real-world example. How about something “easy and uncontroversial”, like I don’t know, Afghanistan?

D) Understanding democratic failure from a complex systems perspective

Applying complexity science to the real-world

Hundreds of people run alongside a U.S. Air Force C-17 transport plane, some climbing on the plane, as it moves down a runway of the international airport in Kabul, Afghanistan, Monday, Aug. 16, 2021. Thousands of Afghans rushed onto the tarmac at the airport, some so desperate to escape the Taliban capture of their country that they held onto the American military jet as it took off and plunged to their deaths. Image and reporting by AP NEWS

Many people, including decision-makers, were surprised at the sheer speed of the collapse of the democratic Afghan government once the US troops withdrew and the Taliban came marching in. Because complex system behavior is often highly dependent on initial conditions, to understand what happened, we have to start by looking at the recent history first.

Political legitimacy of the various power holders in Afghanistan has long been enforced by military power, most notably since the Soviet-Afghan War, which weaponized the populace over decades and shifted traditional power structures and hierarchies away from clergy, community elders, intelligentsia, and the military in favor of powerful warlords ruling through dominance hierarchies (Kakar, M. Hassan, University of California Press, 1995). The withdrawal and breakdown of the Soviet Union led to an ongoing civil war between rivaling Mujahedeen parties and ultimately a failed state, with shifting militias like the Taliban (propped up by Pakistan) taking control of many regions. From a system’s perspective, Afghanistan was not in any equilibrium, nor was it a single system for that matter, but multiple competing smaller systems, based on military hierarchical structure and autocratic rule, fighting for territorial supremacy in the region.

When the US started occupying the country, despite easily taking control of the capital and its institutions, there was no effective ‘centralized control hub’ in place that they could have taken advantage of to transition towards performing the complex functions of government in a more democratic way. There simply was no state-wide system; it needed to be built. They did take Kabul and made whatever local system existed in Kabul and its surroundings more democratic, but most of the US’ resources and energy still went towards fighting other competing systems (insurgencies, the Taliban, Pakistan), almost identical to what the warlords did before. Fewer resources went toward trying to expand their local democratic system by integrating more of the constituent parts, the Afghan people, and building a state-wide system (i.e. true nation-building).

The Afghan state collapsed because it lacked legitimacy in the eyes of the people. The sources of this legitimacy crisis were multiple and interwoven. First, the 2004 Constitution created a system of governance that provided Afghan citizens with few opportunities to participate in or have any meaningful oversight of their government.

Second, the international coalition was focused on fighting an insurgency and consolidating power — missions distinct from and often at odds with democracy-building.

Third, the intemperate rule of President Ashraf Ghani (2014–21) hastened state collapse. Ghani, who kept a tight, close circle and had only a narrow base of support, micromanaged both the economy and the state, and he discriminated against ethnic minorities. […] his behavior was more authoritarian than democratic.

Finally, it was only with the support of Pakistan that the Taliban could reemerge as a political and military force. — Murtazashvili JB., Journal of Democracy, 2022.

Despite being made a pro-forma ‘democracy’ during the US occupation, the macro-system “Afghanistan” was not successfully changed where it counts: at the underlying constituent parts performing complex functions like distributing power among the supposed power holders, i.e. the governed, and with it creating political legitimacy across the whole country. Despite citizens being given the right to vote, the legitimacy to govern the whole country “democratically” was instead created and enforced through autocratic means, like bribes to loyalists. So once the current power holders fled, it was all too easy to get rid of the democracy and impose a new order.

Creating legitimacy through autocratic means is of course only one example of how the constituent parts of the Afghan system were performing complex functions at odds with democracy. Similar arguments can be advanced about the complex function of resource allocation, where the US-installed Afghan government proved famously corrupt, hoarding resources within a small group of elites and loyalists (as in autocracies) rather than distributing them to the people. Another breaking point was the way coordinated action was instigated: through military power and bribes rather than popular participation and representatives. Lastly, the Afghan government as a system also failed to build sufficient self-defense capabilities to protect itself, lacking any alliances and with the Afghan army famously falling apart or joining the enemy.

Again, while the Afghan situation is undoubtedly complex, it becomes navigable and comprehensible with the right conceptual framework.

Afghanistan’s “democracy” did not suddenly fall with the withdrawal of the US; it was a failed state governed by autocratic mechanisms before the US came, and its autocratic dynamics remained robust despite the ‘perturbation’ of US occupation.

Then again, I will be careful not to assign blame. Changing complex system dynamics, especially when it comes to nation-building, is a difficult topic on which much smarter people than me have broken their teeth. I also withhold judgment on the morality of the supposed endeavor, because my knowledge of the topic is certainly too superficial.

I just wonder: How surprised should we really have been that Afghanistan’s democracy rapidly disintegrated and transitioned towards autocratic rule under the Taliban once the US (and international institutions) withdrew last year? Everything we knew (or could have known) suggested it. Our surprise seems to me more a reflection of our own ignorance than a response to an unpredictable unfolding of history or a black swan event. (Also, we should have no illusions that Afghanistan will be stable under the Taliban; their takeover is likely just as transient and will break down into civil conflict or geographical disintegration when transformational change is required, or when the next powerful perturbation arrives.)

Alright, for our purposes, I think you get the point. Seeing that the world is made up of non-linear dynamic complex systems can be useful, even illuminating: it allows us to conceptualize common themes and find a vocabulary to describe wildly different and emergent macro-phenomena in a useful way, helping us intuitively understand them better without false simplifications.

Conclusion Chapter 1:

Understanding democracies as non-linear complex systems gives researchers and citizens the tools to ask more useful questions and gain insights about the mechanistic needs driving the system we are a part of. Most importantly, it sharpens the framework of possible actions that individuals, groups and institutions can take to effect change.

A society wishing to live in a democratic system has to perform complex functions arising from our collective behavior, and transitioning towards those functions is difficult and often unsuccessful (the Arab Winter comes to mind, or post-Soviet Russia). For the lucky people and societies already living in democracies, upholding them is more a question of maintaining the existing control systems in place, implementing adaptive responses to systemic threats, and managing new environmental challenges. Democracies have faced and mastered challenges before; in fact, in the latter half of the 20th century, they seemed better equipped to do so than any other form of governance.

Yet the democratic backsliding observed by researchers, journalists, and citizens in recent years marks a worrying transition from the self-stabilizing equilibrium of democracy toward a new state of democratic instability and dysfunction. That transition is a reflection of something going wrong, showcased by the failing or corruption of critical complex functions and dynamics, be it the dismantling or abuse of critical feedback loops (e.g. independent journalism, voter manipulation) or the implementation of complex functions at odds with democratic processes (e.g. voter disenfranchisement, unequal resource allocation, weak or no separation of powers). While this is very much a gradual process, the longer it takes us to come back to a democratic equilibrium state, the harder it will be.

We need to identify what pushes our democratic system in the wrong direction, and we need to do so fast.

Chapter 2: How asymmetry favors the powerful

How to exploit the asymmetry of our shared info sphere

This chapter will look at various emergent macro-behaviors resulting from our collective actions online. It will highlight how social media platforms & the attention economy created exploitable vulnerabilities and why an epistemic crisis interferes with multiple complex functions democracies have to perform to remain viable.

Democracy might not survive the newly created asymmetries of power of our information age

But before we get there, let’s have a quick look at how social media really works

A) The ‘rich getting richer’ power law of attention

When the winners take all, democracies lose

There is no middle class in the creator economy

Despite much talk about the ‘democratization’ of sharing information, none of the big social media platforms is democratic (some encouraging developments in Taiwan though). In the background, they are all profit-driven companies with strong hierarchical structures fulfilling the function of any late-stage capitalist company: Shareholder profits above everything else.

But even on the platforms, within the user base, there is no democratic equality. Platform companies have been secretive about the distribution of payouts and other metrics to measure individual user influence on these platforms, but that will not stop us from making some inferences based on proxy measurements we do have access to.

Leaked information about the payouts to creators from Twitch streaming platform. (source)
  • A leak of creator payouts from the video game streaming platform Twitch revealed how thin the air in the attention economy truly is. A mere 10,000 of its 8 million streamers earned more than $10,000 per year. That is ~0.12% of all streamers (not viewers!), and still far from a livable salary. At a livable salary of around $40,000 per year, that number drops to about 2,500 streamers (0.03%). As others have noted, there is no middle class in the creator economy. The winners take all. Once the upper echelons of attention are reached, money starts flowing, following a power law: 81 streamers earned more than $500k, and 25 streamers more than one million dollars per year, with the top earners coming in at close to $5 million per year. And that is just from Twitch, because influential streamers of this size have countless other avenues to make real money. The platforms also rely on and cater to their influencers, paying huge sums to retain them exclusively and throttle competitors.
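The percentages quoted above follow directly from the leaked headline numbers. A minimal back-of-the-envelope sketch (the figures are those reported in the leak; the helper function is illustrative, not any real Twitch API):

```python
# Back-of-the-envelope check of the leaked Twitch payout figures.
# Numbers are the ones quoted in the text; nothing here queries Twitch.

TOTAL_STREAMERS = 8_000_000  # approximate number of streamers on the platform

def share_of_streamers(count: int) -> float:
    """Return what percentage of all streamers a given count represents."""
    return count / TOTAL_STREAMERS * 100

# ~10,000 streamers earned more than $10,000/year
print(f"> $10k/yr: {share_of_streamers(10_000):.3f}% of streamers")  # ~0.125%

# ~2,500 streamers reached a livable ~$40,000/year
print(f"> $40k/yr: {share_of_streamers(2_500):.3f}% of streamers")   # ~0.031%
```

Tiny as these shares are, they only count earners; measured against the viewer base, the fraction holding the megaphone shrinks by further orders of magnitude.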

So the baseline of any online social media platform is this: only a minuscule fraction of platform users dominate, be it in audience size, revenue, exclusive deals, or even just getting their voices heard, whereas the rest is drowned out as background noise. This uneven distribution of reach vs. followers is roughly the same across all platforms. The “getting voices heard” part is especially salient when talking about platforms like Twitter or Facebook, the poor substitutes for a public square where we discuss political and societal issues. A minority of influencers asymmetrically dominates what topics get talked about in wider society, and how they get talked about. Often, these influencers even shape what is acceptable to discuss on these platforms (the Overton window comes to mind). Spreading anti-vax talking points and other harmful medical misinformation during a pandemic, even antisemitism and ‘great replacement theory’ conspiratorial garbage, has become ‘acceptable’ (although often veiled) in public discourse because influencers shamelessly mainstream it for engagement.

Whatever you might want to call this system, aristocratic, capitalistic, or even meritocratic, it certainly is not democratic

Democracy presupposes some basic equality of influence, whereas these systems are designed to produce, facilitate and maintain an inequality of influence. The rich get richer and the winners take all.

In functional capitalist societies, we at least have regulations and taxation to rein in financial power (how weak they are today is another topic), but on social media, there is no “attention tax” that redistributes eyeballs, nor are there strong (or any) regulations about what can and cannot be done to capture attention.

Accurate, contextual, relevant and reliable information in our communication systems is, in the wider sense, a public good, whereas lies, misinformation and nonsense can be seen as pollutants to our shared info sphere.

Even in a (somewhat functional) capitalist system, if you pollute the river to increase your financial income, regulators are gonna slam and punish you because you endangered a public good for personal gain.

However, if you pollute the info sphere by e.g. lying about vaccine safety during a pandemic, odds are nothing is going to happen to you, even (or maybe especially) if you hold the biggest megaphone in the world.

Even worse, the fact that influencers deal in information products means that they inadvertently shape human perceptions of reality. If you pollute a river, people can at least agree that it was bad and that you need to be held accountable. If influencers lie, most of their followers will be convinced they were actually telling the truth because of an array of parasocial, tribal, and psychological phenomena influencing belief formation I really don’t have the space to go into. Take the false consensus effect:

To illustrate, a recent study by Leviston et al (2013) on people’s attitudes about climate change showed that only a small minority of people, between 5% and 7% in their Australian sample, denied that climate change was happening. However, those minority respondents thought that their opinion was shared by between 43% and 49% of the population. The massive discrepancy between actual and conjectured prevalence of an opinion (around 40% in this case) is known as the false consensus effect (Krueger and Zeiger 1993). When people believe that their opinion is widely shared, they are particularly resistant to belief revision (Leviston et al 2013), are less likely to compromise, and more likely to insist that their own views prevail (Miller 1993). The fact that any opinion, no matter how absurd, will be shared by at least some of the more than one billion Facebook users worldwide (Synnott et al 2017), creates an opportunity for the emergence of a false consensus effect around any fringe opinion because the social signal is distorted by global inter-connectivity. — Wiesner K., European Journal of Physics, 2019

No wonder people strongly believe in all kinds of nonsense and are not susceptible to belief revision when their echo chamber sings the same tune as the influencers around whom it formed. I mean, the examples are as endless as they are consequential, from climate change denial to alternative medicines to election stealing fantasies, as an influencer you get to sell whatever information drug you want in service of reaching the top of the attention hierarchy. And once you are there, the winners take all, remember?
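How a distorted social signal produces a false consensus is easy to simulate. The sketch below is a toy model, not a reproduction of the Leviston study: all parameters (population size, homophily strength, feed size) are made-up illustrative assumptions.

```python
import random

random.seed(42)  # deterministic for reproducibility

N = 10_000            # population size (illustrative)
FRINGE_SHARE = 0.05   # 5% actually hold the fringe opinion
HOMOPHILY = 0.9       # chance the feed shows a like-minded account
FEED_SIZE = 100       # accounts each person "sees"

# 1 = holds the fringe opinion, 0 = does not
population = [1] * int(N * FRINGE_SHARE) + [0] * (N - int(N * FRINGE_SHARE))

def perceived_prevalence(my_opinion: int) -> float:
    """Fraction of a homophilous feed that appears to share my opinion."""
    agree = 0
    for _ in range(FEED_SIZE):
        if random.random() < HOMOPHILY:
            contact = my_opinion                 # curated: a like-minded account
        else:
            contact = random.choice(population)  # random draw from the real population
        agree += (contact == my_opinion)
    return agree / FEED_SIZE

# Average perceived agreement among fringe-opinion holders
samples = [perceived_prevalence(1) for _ in range(500)]
avg = sum(samples) / len(samples)
print(f"actual prevalence:    {FRINGE_SHARE:.0%}")
print(f"perceived prevalence: {avg:.0%}")  # far above the true 5%
```

Even though only 5% of this toy population holds the opinion, a feed that is 90% homophilous makes it look like an overwhelming majority agrees, which is exactly the distortion of the social signal the quoted passage describes.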

On social media, there is no “attention tax” that redistributes eyeballs

So all these dated ideas of “social media being a force to democratize the world” were never much more than cynical marketing. Nothing within these systems supports critical democratic processes. On the contrary, most design decisions taken by software engineers in service of their capitalistic platform companies might actively subvert them.

I see two main drivers of the epistemic crisis haunting our info spheres:

  • de-centralized, uncoordinated, or crowd-sourced distortions
  • centralized, coordinated, or targeted manipulations

To truly understand this, we have to look at how asymmetries create vulnerabilities, and how various actors exploit the gameable online environment these platforms have created. Let’s start with crowd-sourced distortions, shall we?

B) Asymmetric forces shaping the info sphere

The unwitting sabotage of democracy for profit

Epistemic paralysis hinders collective democratic action

I have previously written about the psychological, social, and technological factors playing into crowd-sourced distortions of the public info sphere, so let’s quickly recap and sharpen this phenomenon to see its relevance to democratic decline.

  • Any topic that garners a lot of attention is a potential source of income for influencers. The more eyeballs something attracts (or can be made to attract), the more money can be made from it. The easiest way to game the system is to create an information product that is just very addictive, broadly appealing, or shareable.
  • There are entertaining versions, like endless cat videos, and more harmful or toxic versions, like nasty political memes. For influencers, the real issue with these info products is not that some are potentially toxic to society, but that they are very flat and easy to produce, work independently of their creator or platform, and there is huge market competition and zero creator loyalty, so consistent monetization with these products is difficult unless they are very quick to manufacture on a large scale, basically click-bait or memes.
  • A more elaborate tactic is to create an addictive information product that has a unique appeal, or that is custom-made for a specific audience of the attention market. In marketing, we call the former a ‘USP’, a unique selling proposition, and the latter targeting a ‘niche market’. This is where gurus, contrarians, political commentators, culture war spin doctors, and of course information grifters come in. These influencers are ‘experts’ in sensing what a niche audience wants to hear, developing parasocial relationships with their targeted group, and doing everything in their power to create outrageous, polarizing, or addictive meme content for their ‘tribe’.

Once any topic has garnered a lot of attention on social media, almost automatically a combination of contrarian grifters, political actors, and profiteering influencers jump on the bandwagon, supported by microtargeted information algorithms. Their goal: create a polarizing counter-narrative; a wedge to segregate opinions and groupthink into two or more warring factions separated usually along lowest common denominator lines like ‘left/right’, race, religion etc… After all, emotional outrage, fights with the outgroup, and opportunities for identity, virtue or intellectual signaling are tried and proven recipes to create the very engaging content that sucks people in, and that is rewarded by the algorithms.

  • Let me stress again that the content of the information product does not matter. It does not need to be bound by accuracy, reliability, context or facts, or any other metric we would usually care about in information. What matters to influencers is how many people engage with it, and how likely information consumers are to come back for more. This is also what the content-ranking algorithms dictate, the same algorithms that made the platform companies (or rather their shareholders) so immensely wealthy. The algorithms optimize and reward influencers for stealing your time, attention, and data so it can be sold to the highest bidder. I sometimes call it a grift, where attention grifters sell junk information products to their customers for huge personal profits.

The current social media system has financially incentivized grifting on an unprecedented scale. Offline grifting is hard because you’d have to extract money from your acolytes directly and they might sour on your extractive behavior eventually. Online grifting is a different ballgame. You just pretend to be an expert, clown, guru, or victim for a popular cause. Whatever. Your goal is to entertain. You aim to steal your audience’s attention for as long as you can, while platform media companies extract the monetary value for you. Right from the data of your followers. The companies then pay you your share of the profits and deliver ever more fresh unsuspecting customers to you. It was never easier to capture an audience (or be captured by it).

  • That really is all the magic behind the influencer economy, and all other correlates stem from that fact. Sponsorship or other monetization schemes enrich influencers because they steal your attention for personal gain, sometimes through intermediaries or in service of offering eyeballs, your eyeballs, to companies. However, information is not just a product, but a necessary good to make sense of our world, so by controlling information, influencers or companies shape our perception of reality too.
Challenges of the digital world. (Kozyreva A. et al., Psychological Science in the Public Interest, 2020)

Okay, now that we have refreshed our memory, we can look a bit deeper into why influencer and platform incentives might be problematic for a democratic society.

First, there is the obvious asymmetry of narrative power: Simplified, emotionally engaging, or populist narratives and ideas will eat everything else. Reality often comes in shades of gray; facts are boring, and relevant info might just not be entertaining enough to catch eyeballs.

Current social media narrative effects might be described as the great dumbing down of nuance, an exodus of facts, context, and expertise from public discourse.

In such an environment, almost any piece of information, even when entirely accurate, can easily become misinformation when deprived of its original context and surrounding facts, or oversimplified into oblivion. An extreme form of this would be cherry-picking, where information is taken out of its actual context to further an oppositional narrative.

Second is the asymmetry of audience demand. Going by the motto of ‘the customer is always right’, many influencers create content that the audience wants to see. This is a very straightforward business model, and it might help sales of information products enormously, given our own propensity to seek confirmatory information and other psychological biases. However, always giving an attentional audience what it wants (and too much of it), rather than what it needs to hear, certainly does not foster epistemic humility; on the contrary, it nudges us towards very biased, fragmented realities of our own making.

Youtube political influencer Tim Pool garnered over a billion views by constantly espousing popular but mutually contradictory talking points to cater to audience demand. Here is a takedown video explaining his methods (mostly conspiratorial attacks on “mainstream media”) and some in-depth investigative reporting on him from Robert Silverman.

Catering to ever-changing audience demand is one of the biggest challenges for influencers, because attention, by its nature, is a fickle thing. Maintaining market share of attentional audiences requires flexibility, which is why influencers acting in, for example, the socio-political sphere (e.g. culture-war, social, and political commentators) seemingly have to reinvent themselves constantly to keep their income stream.

Just take the sudden transformation of hundreds of Twitter accounts from self-proclaimed viral epidemiologists into foreign policy experts at the start of the Russian invasion of Ukraine.

These commentators have no business commenting on issues they know nothing about, but target audience (and follower) demand drives them to do so anyway. Once influencers have cultivated an attentional audience, they need to keep selling information products to defend their share of the niche. So what many end up doing is feigning insight, either by

  • being convincing through their confident delivery of information, or by fulfilling a figurehead role based on their identity (a PoC saying that actually ‘whites’ are the ones discriminated against, or gay people saying conversion therapy is good,…), or by creating contrarian counter-narratives to differentiate themselves from genuine expert opinion (vaccines actually kill more people than they save, ‘they’ don’t tell you how climate change is actually good for us,…), or by just bringing their unrelated pet talking points to any new topic (if you have ever seen hot takes like ‘Russia is gonna win the war because they are not woke’, you know exactly what I mean).

The point is: They are selling junk information products based on grift, identity, manipulation, or intellectual virtue signaling, instead of offering real informational merit.

Fake but engaging experts shape public perception and action, and that can manifest for example in poor vaccine uptake during a pandemic, or political inaction in tackling climate change.

When people with no relevant expertise rapidly move to offer their opinions on a wide range of topics as soon as these topics become fashionable or newsworthy, and especially when these opinions are contrarian, we should suspect them of intellectual virtue signalling. […] I also suggest it is harmful, because it distracts attention from genuine expertise and gives contrarian opinions an undue prominence in public debate. — Neil Levy, 2022

And this brings us to the overarching, emergent systemic point:

The incredible asymmetry of noise over signal that our interaction with social media creates. We don’t need to invoke the Shannon–Hartley theorem to understand that the amount of information available to us far exceeds our bandwidth to process it.

A bad signal-to-noise ratio makes it difficult for people to find reliable information to form their opinion on any issue. Our current signal-to-noise ratio makes it almost impossible.
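The Shannon–Hartley theorem alluded to above actually makes this intuition quantitative: the capacity of a noisy channel is C = B·log₂(1 + S/N). Treating our attention loosely as a fixed-bandwidth channel (an analogy, not a formal model), usable capacity collapses as the signal-to-noise ratio falls:

```python
import math

def channel_capacity(bandwidth_hz: float, snr: float) -> float:
    """Shannon-Hartley: maximum error-free bit rate over a noisy channel."""
    return bandwidth_hz * math.log2(1.0 + snr)

# Our attentional "bandwidth" is fixed; only the signal-to-noise ratio varies.
bandwidth = 1.0
for snr in (100.0, 1.0, 0.01):
    print(f"SNR={snr}: capacity={channel_capacity(bandwidth, snr):.3f}")
# Capacity falls from ~6.66 to 1.0 to ~0.014 bits as noise overwhelms signal.
```

The point of the toy numbers: drowning a fixed channel in noise does not merely degrade communication linearly, it can reduce throughput by orders of magnitude.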

The overt favoritism of simplified, emotionally engaging narratives and personalized niche content delivered by contrarian shysters, marketeering influencers, or other engagement gurus destroys any navigable level of signal-to-noise ratio on any topic for the wider public.

This is one of the core roots of our epistemic crisis.

The current informational architecture destroys the signal-to-noise ratio of any topic that garners widespread attention.

Think about every aspect of the pandemic, the one topic that by its nature demanded that we all pay attention. What ‘attention-grabbing’ aspect of Covid does not have a polarizing wedge driven through society? Does it even exist? Do masks work? Do lockdowns help? Should we open schools? Can you trust your health institutions? What about Ivermectin instead of vaccines? Are vaccines even safe and effective? Did humans create SARS-CoV-2? (On that last one, I did some work to provide epistemic clarity, and the answer is no.)

If we cannot browse reliable information, if we lack domain expertise, time, or energy to deeply engage with the topic, we have to rely on our trust networks to reach actionable certainty on any topic to navigate modern life. However, relying solely on trust networks, not expert institutions, scientific consensus, or factual reporting to inform one’s opinions is dangerous in a world of fragmented realities. Fake experts, political commentators, and other attention stealers have a far wider reach than actual trustworthy sources, and usually stronger personal appeal or skill to manipulate us into trusting them. (I’d even say: never trust an influencer, always look for what the boring ‘institutions’ have to say)

Even worse, in an ideologically polarized environment, the system-imposed need to outsource opinion formation to trust networks will often result in our dependence on unreliable proxies already in our ideological network, or force citizens to choose the most stomachable lowest-common-denominator tribal ideology they can live with to get their information from.

This leads to absurd social phenomena, for example, the association of ineffective Ivermectin prescriptions with political affiliation, or watching a specific cable news network reducing COVID-19 vaccine compliance.

Now let’s quickly think back to chapter 1: several systemic problems for democracy arise when the complex system we are part of is experiencing noise. Again, the scientific literature is nuanced here; a little bit of noise is normal and can be good overall for the robustness of a system (Tsimring L., Rep. Prog. Phys., 2014; Junge K. et al., Systems Research and Behavioral Science, 2020), whereas large amounts of noise can throw systems out of equilibrium (Tyloo M. et al., Phys. Rev. E, 2019). But what could happen if noise completely drowns out any useful signal?

High noise means that communication breaks down between individual elements of a network that need to act together to fulfill a function.

This can manifest in multiple ways, one obvious example is the inability of feedback loops to exert regulation, another is a faulty allocation of resources within the network, yet another would be the decoupling or separation of elements from the larger system.

A biological cell within our body that loses communication with its environment might stop performing its function within the collective, maybe even turn cancerous and eventually threaten the whole.

In democracies, losing intra-system communication ability is horrible, because democracies are basically built from bottom-up productive interactions of citizens, unions, interest groups, intelligentsia, movements, and political parties. As we have seen in chapter one, democratic networks have a lot of cross-regulation in the intermediary layers, and political legitimacy of representatives gets challenged when collective demands cannot be met with real actions, or when some people are not involved in shaping political actions in the first place.

Another issue arising is, of course, polarization: as the links between different groups and layers become weaker or unworkable because of noise, reaching compromises is rendered impossible. There is a rich popular literature on the negative effects of polarization, so I will not spend any more time on it (for system dynamics of polarization, see e.g. Leonard NE. et al., PNAS, 2021; Levin SA. et al., PNAS, 2021).

Increased polarization is also a direct result of what disinformation researchers call epistemic paralysis: the inability to reach actionable certainty or compromise on any topic of importance because nobody can agree on a basic set of facts or truth. (Poison a river, and people can agree that was bad; poison the info sphere with over 30,000 lies while in office (noise!), and your followers will still believe you are a great truth-teller.) So the outlook is dire.

Anything that grips our attention, like a pandemic, war, or presidential election, will inevitably segregate the public into warring epistemic factions, creating an unnavigable fog of information noise and increasing polarization. The resulting epistemic paralysis sabotages collective democratic action on shared challenges.

Alright, guess it’s time to sum up the meta-phenomena of our crowdsourced misinformation project.

The attention economy and information architecture of our current info systems incentivize the creation of toxic influencers and narratives. Almost exclusively emotional and addictive narratives, together with the contrarian opinions of parasocially elevated influencers, inevitably shape the information landscape in an asymmetric way that is diametrically opposed to the epistemic needs of a democratic society. Furthermore, because each influencer has to deliver specialized information products to their niche market to stay in business, we humans get a wide spectrum of information products to pick and choose from, irrespective of the hidden harms some might cause us (a need for consumer protection regulation comes to mind). We willingly but unwittingly participate in the creation of our various fragmented info spheres, and with it are partially responsible for the noise that leads to epistemic paralysis. This epistemic paralysis collectively sabotages several essential democratic functions, from collective decision-making, to cooperation, to creating political legitimacy.

But it gets way worse: we still have to look at the even darker side of this already pretty dark coin, namely how targeted system manipulations can weaponize epistemic paralysis in our info spheres to serve the strategic aims of information combatants.

And this is what we have to look at next.

C) The rise of information operations and info warfare

Information has strategic utility for information combatants

The attention economy works because holding attention is power, and that power can be exerted to fulfill strategic aims

The centrality of information has changed pretty much everything in the 21st century. Physics, math, even biology, chemistry, and neuroscience have transitioned or are currently transitioning into information sciences. The way we store, share, distribute, and consume information has changed dramatically. Even military thinking about information has changed: information is now treated not just as a tactic, but as a ‘battle space’ to fight in, like the naval or aerial domains. Fundamentally, the capitalist platforms we use are pay-for-play systems developed for advertisers to manipulate us, so money naturally goes a long way to amplify specific products, ideas, ideologies, or points of view on these platforms. But as in real life, money is not the only way to exert power over these platforms, and with them, our info spheres.

While we were talking about information as an addictive product when it comes to our crowd-sourced distortions, disinformation researchers and cyber specialists in the military talk about information as a tool of war when conceptualizing the role, impact, and purpose of targeted manipulations.

What both conceptual approaches have in common is that the content of the information is irrelevant; only the effect matters. Facts, accuracy, context, reliability, and other desirable traits of information are often even counterproductive when it comes to optimizing information products or tools. For information as a product, consumer engagement determines its value. For information as a tool, strategic utility in the information space determines its value. These are not mutually exclusive perspectives, because engaging information products that can be used to manipulate have a higher strategic utility, and strategic use of information within the info sphere can also make it more engaging.

In the battle space of information, information has strategic utility when it can be employed to further the strategic objectives of information combatants. Information combatants include a wide array of actors and entities, from businesses to political campaigns, from troll farms to militaries, from religious movements to governments of nation-states. The strategic objectives are best illustrated through their use of information operations.

Information operations are inauthentic actions of a coordinated group, from state level and businesses down to disinformation contractors, pay-to-engage services, chat rooms and message boards; they may include fake identities, content, messaging or amplification; and they pursue a social, economic, or political purpose.

The aims, strategies, and tactics of information operations (Krasodomski-Jones A. et al., Demos, 2019)

But before we go a bit deeper, we have to first quickly talk about another enabling feature on social media platforms that immensely empowers coordinated online operations and probably contributed most to their rise:

  • Microtargeting and the asymmetry of knowledge

Platform businesses use creators, influencers, and an arsenal of algorithms to bind us and our attention to their platforms. The ‘higher’ purpose is to steal our data to better target us with products, as well as to shape our perception of products. We talked about this already. User or customer data might already be the most valuable asset there is for businesses, big tech or not, and citizens have no access to it, often not even to their own data. This asymmetry of knowledge translates into an asymmetry of power for information combatants over citizens.

To keep others under surveillance while avoiding equal scrutiny oneself is the most important form of authoritarian political power (Balkin 2008; Zuboff 2019). Similarly, to know others while revealing little about oneself is the most important form of commercial power in an attention economy. — Lewandowsky S. & Pomerantsev P., Memory, Mind & Media, 2022

Sitting on the iron throne of the attention economy is, of course, the evil giant Facebook, and the company does its best to use its power to whitewash the abusive, extractive, and anti-democratic practices it employs to get cash for our data. Targeted manipulation sounds sexier when you call it ‘behavioral marketing’. The Facebook PR department is truly worth its money when it comes to cover-ups, denial, and gaslighting about the harms the company has caused… but I get ahead of myself.

Let’s focus on microtargeting, the act of finely segregating users based on different (demographic, social, cognitive, psychological…) metrics, to better allow advertisers and product sellers to find their niche audiences.

The fundamental problem is that Facebook’s core business is to collect highly refined data about its users and convert that data into microtargeted manipulations (advertisements, newsfeed adjustments) aimed at getting its users to want, believe, or do things. […] Describing this business as “advertising” or “behavioral marketing” rather than “microtargeted manipulation” makes it seem less controversial. But even if you think that microtargeted behavioral marketing is fine for parting people with their money, the normative considerations are acutely different in the context of democratic elections. — Benkler Y. et al., Network Propaganda, 2018

Ethically, it is highly questionable to allow companies to use data to build, for example, psychological profiles of their potential customers; it opens up the potential to target and exploit their psychological weaknesses (Matz SC. et al., PNAS, 2017), and yet here we are. For information combatants, microtargeting is basically a game changer, like having a rocket with a target-homing function versus a rocket that can only fly straight and on sight. Guess which one will be more effective in war.
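Mechanically, the microtargeting described above is little more than fine-grained grouping: bucket users by combinations of attributes until each bucket is narrow enough to receive its own tailored message. A minimal sketch, with all field names and categories invented for illustration:

```python
from collections import defaultdict

# Hypothetical user records of the kind an ad platform might hold.
users = [
    {"id": 1, "age_band": "18-24", "region": "midwest", "trait": "anxious"},
    {"id": 2, "age_band": "18-24", "region": "midwest", "trait": "anxious"},
    {"id": 3, "age_band": "55+",   "region": "south",   "trait": "distrustful"},
]

def segment(users, keys):
    """Group users into micro-segments keyed by a tuple of attributes."""
    buckets = defaultdict(list)
    for u in users:
        buckets[tuple(u[k] for k in keys)].append(u["id"])
    return dict(buckets)

segments = segment(users, ["age_band", "region", "trait"])
# {('18-24', 'midwest', 'anxious'): [1, 2], ('55+', 'south', 'distrustful'): [3]}
# Each segment can now be served a message tuned to its inferred vulnerabilities,
# invisibly to everyone outside that segment.
```

The invisibility is the crux: because each tuple of attributes gets its own message, no outside observer ever sees the full set of claims being made.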

Researchers are quite unequivocal about the incompatibility of microtargeted messaging and manipulations with democratic fundamentals.

What must be noted, however, is that the micro-targeting of messages may be at odds with the democratic fundamentals. The foundational idea of a democracy is that it provides a public marketplace of ideas, however imperfect, where competing positions are discussed and decided upon. We suggest that this entails a normative imperative to provide the opportunity that opponents can rebut each other’s arguments. This possibility for engagement and debate is destroyed when messages are disseminated in secret, targeting individuals based on their personal vulnerabilities to persuasion, without their knowledge and without the opponent being able to rebut any of those arguments. — Wiesner K., European Journal of Physics, 2019

Information combatants attacking, sabotaging or subverting democracies thrive in environments where they are invisible, and fragmented info spheres are rendering public coordination to form a strong opposition to their actions impossible.

These platforms do not need to be as bad as they are for the public, and we should not allow them to be any longer

Furthermore, all our social communication platforms, public squares, and even the rules for public engagement or debate are currently provided by private, autocratically ruled, capitalistic companies that care more about shareholder profits than the public good or democracy. These systems do not need to be as bad as they are for the public.

The only reason microtargeting exists is that it is profitable to the platform companies and valuable to various monied interests and other information combatants. Again, there is not really any scientific controversy about how immensely stupid and dangerous this is for any democracy.

That same platform-based, microtargeted manipulation used on voters threatens to undermine the very possibility of a democratic polity. That is true whether it is used by the incumbent government to manipulate its population or by committed outsiders bent on subverting democracy. — Benkler Y. et al., Network Propaganda, 2018

These impacts of social media on public discourse show how democracies can be vulnerable in ways against which institutional structures and historical traditions offer little protection. — Wiesner K., European Journal of Physics, 2019

The way our current public squares are set up certainly exposes a deep systemic vulnerability, and as we will see, anti-democratic actors are not shy to abuse it. In fact, abusing the vulnerabilities of these platforms is currently a booming business in many corners around the world.

“There are many legal ‘advertising’ companies and businesses (and a whole lot more illegal ones) that offer to manipulate public conversation for cash.” Carl Miller, Centre for the Analysis of Social Media. (personal communications)

Today, there is a sprawling cottage industry of pay-for-play ‘advertising’ companies, click farms, bot networks and even venture-funded AI startups that offer services to information combatants.

Under the guise of ‘monitoring misinformation’, AI start-ups like Blackbird or Zignal develop tools that allow information combatants to monitor, react to, and take control of narratives that are damaging to them before those narratives are fully formed. Sounds nice for capitalist businesses protecting their brand. Sounds awesome if you are a totalitarian government in need of detecting and squelching emerging cooperation of an opposing public before it has fully formed.

It is absolutely cynical how these disinformation services are marketed as tools of ‘protection’ against misinformation, or as ways to ‘empower’ well-meaning businesses to detect threats, all to garner investment funding. It is either a charade or strategic ignorance, because the actors who finance this know exactly what they are getting and why. This is why current technological innovations constitute another asymmetry that favors the currently powerful. While technology might not be deterministic per se, in the sense that it can be used for good or bad (here is an AI start-up example for good), it is certainly telling that the majority of these technological innovations happen in private or in secret, outside of public scrutiny or influence. A saying comes to mind:

If you are not at the table, then you are probably on the menu

Currently, the democratic public is being feasted upon, so the alarm bells should be ringing in everybody’s ears. If not, well, I guess now would be a good time to look at a well-documented recent case example from the dark underbelly of information operations.

  • How to influence from the shadows (#IStandWithPutin operation)
With the invasion of Ukraine and before the upcoming UN resolution vote, two hashtag trends started appearing on Twitter: #IStandWithPutin and #IStandWithRussia (Source: CASM technology report)

Russia has long been known to be involved in information operations. Across March 2nd and 3rd, 2022, two pro-invasion hashtags began to trend on Twitter across a number of geographies around the world: #IStandWithPutin and #IStandWithRussia. In the days that followed, research began to be made public suggesting that some of the activity associated with these hashtags was inauthentic, including bots and engagement farming (using pay-to-engage services).

Further investigation by disinformation researchers (see CASM technology report) unearthed the full scope of these inauthentic interactions with the help of deep-learning-based transformer language models. The models find patterns of similar language use, considering vocabulary, topic, thematic framing, etc., basically allowing researchers to cluster and group user accounts based on text-behavioral similarity.
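The idea behind such clustering can be gestured at with a stdlib-only sketch: represent each account's posts as a bag-of-words vector, compute pairwise cosine similarity, and group accounts whose language is suspiciously alike. (The actual analysis used transformer embeddings, which capture far more than word overlap, and more sophisticated clustering; this toy example with invented account names only shows the skeleton of the idea.)

```python
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    """Bag-of-words vector: word -> count."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def cluster(accounts: dict, threshold: float = 0.8) -> list:
    """Greedily group accounts whose post text exceeds a similarity threshold."""
    vecs = {name: vectorize(text) for name, text in accounts.items()}
    clusters = []
    for name, vec in vecs.items():
        for c in clusters:
            if cosine(vec, vecs[c[0]]) >= threshold:
                c.append(name)
                break
        else:
            clusters.append([name])
    return clusters

accounts = {
    "acct_a": "i stand with putin nato expansion is colonialism",
    "acct_b": "i stand with putin nato expansion is colonialism",
    "acct_c": "cat pictures and gardening tips every day",
}
print(cluster(accounts))
# acct_a and acct_b land in one cluster; acct_c stands alone.
```

Accounts that post near-identical text at scale, as coordinated amplification networks do, fall into the same cluster even when their follower graphs look unrelated.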

Many of the accounts studied here were created very recently, have very few followers and post a very small amount of original content themselves, preferring to amplify instead. They all follow a common volumetric pattern: a small uptick on the day of the invasion, a large spike on the day of the UN vote and a sharp decrease in the days thereafter. — CASM technology report, 2022

Notably, they found clusters of inauthentic networks with geographic associations, likely a feature of how to best game the specific platform (Twitter’s hashtag trending feature is based on geography, so activating networks there was most likely aimed at targeting people regionally, not globally).

#IStandWithPutin and #IStandWithRussia disinformation campaign to manipulate Asian and African countries into abstaining from voting against Russia in the UN (Source: CASM technology report)

The inauthentic activity focused on popular themes within the targeted regions that might have persuasive power, for example anti-western sentiment, equating NATO membership expansion with colonialism, or appeals to BRICS solidarity that mingled pro-Zuma/Modi messaging with praise for Putin.

It is difficult to assess the impact of a single information operation that ran only briefly and targeted mostly BRICS and developing countries; it is, however, notable that the targeted countries in Asia and Africa largely decided to abstain from condemning the illegal Russian invasion of Ukraine.

While the Chinese and Indian abstentions might be the most consequential geopolitically, the South African abstention was the most surprising, and drew international criticism. It is also notable that Brazil, where the #IStandWithPutin information operation was not observed, was the only BRICS country (despite strengthening economic ties to Russia and Bolsonaro’s sympathies towards Putin) that did not side with Russia. As so often in this new age of covert actions, we are left to wonder: Is this all just coincidence? What other factors are at play? What is the relative effect size of a single information operation?

Again, we run into the complexity problem when we try to find a reductionist answer to this question. The impact was probably small, but can we exclude that it contributed to the decision-making in those countries when it came to the UN vote? After thinking about systemic impact, I think we can appreciate that these types of information operations certainly create noise and contribute to epistemic paralysis, even without a measurable geopolitical payoff (which they might still have had!).

Sometimes, changes in quantity produce dramatic changes in the quality of a phenomenon; this is the essence of emergence.

One last thing we have to consider is that information operations are cheap, ubiquitous and quite easy to perform on unsuspecting targets. Pair this ease of manipulation with a gameable environment already distorted by crowdsourced misinformation, and we have a recipe for systemic catastrophe.

Conclusion Chapter 2:

Social media platforms are not built to be democratic or serve the public good, in fact, many of their design decisions inevitably cause anti-democratic incentives, behaviors, and phenomena to arise.

There is an ongoing epistemic crisis because democratic societies face a dual threat: Crowd-sourced distortions and targeted manipulations of the shared information sphere, leading to mutually incompatible fragmented realities and the breakdown of some complex functions required to maintain democracy. The most dramatic systemic vulnerability created for democratic societies is a catastrophically low signal-to-noise ratio, which asymmetric actors and information combatants both cause and exploit for personal profit or strategic aims.

The attention economy is powerful because information shapes our perception of reality, and attention is the mechanism to filter information that ultimately ends up reaching us. While we seem to have control and agency over our attention, the architecture of our info spheres rewards content that best steals it, and this content is optimized for engagement and all the bad incentives and distortions coming along with it. We are circling down the drain into fragmented realities that endlessly reinforce themselves.

In this way, a broken signal-to-noise ratio favors engaging narratives or powerful actors who either can buy, capture or demand attention, it entrenches their power and worldview to the detriment of people’s own agency and democratic processes. This is also why microtargeting is so unethical and dangerous. Furthermore, while asymmetric actors and information combatants fight over information supremacy, the public gets polarized and the resulting epistemic paralysis halts progress in solving societal problems.

Democracy might not survive the newly created asymmetries of power of our information age

From a systems perspective, both broken signal-to-noise ratios and epistemic paralysis are threats to the continued existence of democracy; they are a constant and persistent perturbation that corrodes and sabotages the systemic stability and adaptability of democracies.

So I guess what I am saying is that we are finally ready to assess the full impact of the multiple overlapping crises of our information age.

Chapter 3: How an infodemic is reshaping the world

The method behind the madness

Information operations and crowd-sourced distortion interact in complex ways to destroy democratic processes and pose existential threats to human rights, democracy, environmental ecosystems, and in the case of weaponized conspiracy theories, our humanity

Fragmented realities have real-life impact

Without facts, you can’t have truth. Without truth, you can’t have trust. Without trust, we have no shared reality, no democracy, and it becomes impossible to deal with the existential problems of our time. — Maria Ressa, Nobel Peace Prize lecture, 2021

Here, we will look at four case examples of how fragmented realities and a broken epistemology work against the public good and threaten democracy, even the whole world.

Epistemic neutralization of regime-critical journalism

For many, the case of Rappler vs. Duterte was and is eye-opening to how social media dynamics can be weaponized by state actors to undermine journalism. Most people however have either never heard of it, because it is a developing country they don’t care about, or if they are Filipino, believe various conspiracy theories about the news outlet and its independent journalists.

Rappler is a Filipino online news website founded by Maria Ressa, a trained investigative reporter who aimed to bring veteran journalists together with young tech-savvy digital natives to speak truth to power.

Rappler was reporting on the rise of a Filipino mayor with a very radical ‘tough on crime’ rhetoric, a disdain for independent journalists and the press, and a populist-folksy, nationalist appeal against the democratic ‘elites’ in the capital Manila. In short, he was a ruthless man with a polarizing and hateful message, positioning himself as a savior who would take the country back for the poor and forgotten Filipinos.

Image Source: Buzzfeed News, How Duterte used Facebook to Fuel the Philippine Drug War, 2018

In complex systems, small inputs can have disproportionate outputs. Unable to compete with the money required to run a traditional campaign, Duterte opted to go fully online. The relatively cheap social media campaign by Duterte’s team of roughly 500 staff, together with bots, paid influencers, and fanatics, was sufficient to create an emotionally engaging online narrative that poisoned the well of public discourse, seeding fear, doubt, and hate. Amplification dynamics on social media (more than 50% of Filipinos are on Facebook) made sure that Duterte’s extreme movement became the center of attention online, and with it got more mainstream media coverage, more followers, more outrage, and yet more engagement. The winners take all, remember? Frustrations and divisions were sharpened, and reporting narratives and journalists critical of Duterte were served up as public enemies, threatened and intimidated by online mobs stirred up by pro-Duterte fanatics and influencers. As everybody knows, Duterte won.

“Just because you’re a journalist you are not exempted from assassination, if you’re a son of a bitch” (Rodrigo Duterte, president of the Philippines at his inauguration).

This was just the beginning. Independent journalism is a negative feedback loop on power in democracies and is often among the first things to be dismantled by autocratic leaders. Others have written about the chilling effect of online hate and harassment, and how it can be weaponized to suppress free speech or silence dissent. In the Philippines, the social-media radicalization of hateful discourse is nothing short of stochastic terrorism against journalists. The results are chilling. Social media has been intensively and widely used for propaganda. Human rights abuses and extrajudicial killings in Duterte's drug war have become accepted by a large segment of the population as the price for an increased sense of security. Fear, uncertainty, and doubt tactics call for a strongman, after all. Once in power, Duterte of course did not shy away from using the tools of the state to crack down on independent media in the more classical sense as well.

A handful of media outlets have tried to cover Duterte’s authoritarian excesses. In March 2017, the irascible president warned them: “I’m not threatening them but someday their karma will catch up with them.” They included the country’s leading newspaper, the Philippine Daily Inquirer. It was bought up a year later and its journalists were brought to heel. The next target was the country’s leading radio and TV network, ABS-CBN. In July 2020, the ever-compliant congress sealed its fate by refusing to renew its franchise. He is now targeting the last bastion of press freedom — the Rappler news website and its CEO, Maria Ressa. Hounded by lawsuits and prosecution brought by Duterte’s allies, she is facing the possibility of sentences totaling around 100 years in prison.

Thanks to collusion at all levels within the state apparatus, Duterte has an arsenal that he can use to wage “total war” against journalists, an arsenal that includes spurious charges of defamation, tax evasion or violation of capital legislation; rescinding broadcast licences; getting accomplices to buy up media outlets and bring their journalists into line; and using an army of trolls to subject journalists to online harassment. -Reporters without borders

The Philippines is currently ranked 147th out of 180 countries on the press freedom index, and falling.

It is really worth reading the whole 3-part series (part1, part2, part3) of excellent investigative journalism from Rappler. Trying to make sense of what was happening to the Filipino population online is what led to Maria Ressa being awarded the Nobel Peace Prize in 2021. Her unrelenting nature has also likely sealed her fate.

Having lost her appeals in the Filipino justice system, she is currently facing lengthy jail sentences and Rappler has been ordered shut down.

In the end, it can be said that with the help of social media, the bad guy(s) won, and democracy and the public good lost. This is a pattern we will see again and again, so let's jump to a different case study.

Epistemic sabotage to prevent science-based action

It is no secret that CO2-emitting industries have tried everything in their power to manipulate public perception about the scientific reality of anthropogenic climate change. (I really do not want to rehash that whole story, you all know the drill, and if not, please, please read literally anything on the topic and work your way from there)

But eventually, businesses would lose the fight with science over the reality of climate change and a majority of the world today is finally aware of the looming danger if we do not decarbonize.

Well, for CO2-emitting industries, the fight over reality might have been lost, but the real war has just started.

Recently, disinformation researchers have reported a set of new strategic aims that emission industries use to shape the information battle space to their benefit. They can be summarized as deny, deceive, delay.

As the public conversation on climate change evolves, so too does the sophistication and range of arguments used to downplay or discount the need for action […] One strategy has received relatively little attention to date: policy-focused discourses that exploit contemporary discussions on what action should be taken, how fast, who bears responsibility and where costs and benefits should be allocated. […] We call these ‘climate delay’ discourses, since they often lead to deadlock or a sense that there are intractable obstacles to taking action. — Lamb, W., et al., Global Sustainability, 2020

By focusing their information operations against climate action on exploiting contemporary political discussions, information combatants from the emission industries aim to throw sand in the gears of democratic decision-making, halt democratic processes to implement climate action, and manipulate people into not participating in climate solutions… all for the 'noble' goal of keeping their revenue streams running a bit longer. Every extra day of delay brings billions in profits, and if there is a war driving oil and gas prices up? Jackpot! So spending a few hundred million on dedicated anti-science think tanks publishing misleading books, fake universities targeting children with climate misinformation, pay-to-engage services, or bot networks is really pennies on the dollar for their businesses.

Deny, deceive, delay documents the many information operations and tactics used by Fossil Fuel and Carbon industries to manipulate public conversation about climate change and climate action. (Figure from Lamb, W., et al., Global Sustainability, 2020)

I recommend you read the above report because it is important and well presented.

We already outlined in Chapter 2 how these popular climate-denial networks interact with other rightwing political ideologies to create self-sustaining alternative realities. Attention is power. Being the first or loudest, or having the most engaging content, goes a long way toward convincing people that climate change is a globalist plot, especially when they have never heard the actual scientific evidence supporting what climate scientists claim.

Infusing undeclared amounts of often dark money to finance influencers, entities, and platforms that amplify narratives aligned with your business interests is the easiest and most straightforward way to manipulate public discourse on these online platforms where people spend their time. That is what they were built for. Take PragerU, an advocacy group that markets itself as a 'university' (despite not being an academic institution, granting no diplomas, and holding no classes) and posts well-produced 'lectures' (misleading content & straight propaganda) on YouTube.

PragerU's online platform was launched in part through investment from fracking billionaires Dan and Farris Wilks, and regularly platforms other key players in this network such as Bjorn Lomborg, Patrick Moore, Dinesh D'Souza and Alex Epstein. According to their main website, PragerU content has garnered over 5 billion views with 4 million average views daily. It also claims that 60% of its YouTube audience are under 35 and 70% of them 'changed their mind on at least one issue' after viewing PragerU content. — King J. et al., Institute for strategic dialogue, 2022

Part of the popular appeal of PragerU comes not from just climate misinformation, but from targeting their information products to politically conservative audiences (their niche market) by frequently playing on religious themes, immigration fears, or day-to-day political issues. Pretty odd for a supposedly ‘educational’ channel. And yet, PragerU is just one node in a network of well-financed influence operations.

Technically, CO2-emitting industries are not only within their rights, they are encouraged by social media platforms to just do some smooth behavioral advertising for their climate-destroying causes, and gee, if that happens to distort the public’s perception of scientific reality, unfortunate.

The truly remarkable part about all of this is how mundane and old these tactics are.

Step 3 from the disinformation handbook: Manufacture scientific uncertainty where none exists. Image source: Union of Concerned Scientists. The disinformation playbook

Businesses using disinformation-playbook tactics against scientific experts to quash public health or safety measures is well documented going back to at least the 1950s (just read about the tobacco industry's tactics against scientists). We, as a society, are none the wiser at spotting these tactics or discarding arguments that go against the scientific consensus, and we have to do better.

There is however one notable difference compared to the past: undermining public perception of the scientific consensus has become easier and more effective with information operations. Inauthentic entities and actors get paid to create engaging meme content, which is then paid for again to be made viral. Once attention is captured, once a topic is up there, an ecosystem of winner-takes-all dynamics makes sure that climate contrarians (paid or self-made!) with the hottest takes and the best USP get rewarded mightily for selling their anti-science contrarianism to the masses, consequences be damned. Forever, if need be.

Our current information architecture makes resolution impossible and conflicts perpetual, because conflict is engaging.

In the past, it took the CO2-emitting industries decades of hard work, hundreds of millions of dollars, sophisticated rightwing influence networks, political lobbying, and every trick from the disinformation book to barely manage to create a legislation-blocking minority (the GOP) to protect their interests against the scientific consensus on climate change in the US. The US is an important country, but still just one country.

Compare this to the mostly low-effort, bungled-together anti-vaccine grift that swept over the world recently, turning from multiple niche movements and actors into a disinformation behemoth in less than two years. A behemoth that is now kicking the scientific consensus aside like an old can of garbage, demonizing vaccines, institutions, and scientists (& hospitals, pharmacies, doctors, and nurses) during a life-threatening pandemic, of all things. This behemoth is currently destroying public health progress made over decades, and because of it we will see already-defeated diseases return to our society in new epidemics.

My absolute disdain for anti-vax influencers is likely no secret to anyone who has seen my Twitter feed. Given the attention that a Joe Rogan, Bret Weinstein, RFK Jr., or all the other anti-science shysters attract (not linking to their misleading claims; if you are truly unaware, listen to this entertaining and educational decoding podcast), is it really a surprise that the public is increasingly skeptical of, or gaslighted about, scientific authority on a scientific question such as vaccine safety?

Microtargeted influence operations aimed at dismantling scientific authority are of course not a thing of the past either. Using the already-existing anti-scientific networks built up from rightwing climate disinformation, pivoting the manipulative message from climate policy to public health sabotage is quite easy in today's world.

  • Let's say you are someone from the owner class (yep, smuggling in Marxist theory words here, which seems only appropriate given the topic) who would rather not shut down your factories while still paying workers, just because of a stupid pandemic. What if you are not okay with any % drop in profits, even if it means a % drop in lives? What can you do to make the public see eye to eye with you? I don't know, but maybe it would look like some pseudo-scientific 'let it rip' policy with a grandiose name or something.

The Great Barrington Declaration was a ‘public health’ *cough* proposal to not do anything against the spread of SARS-CoV-2, a misinformed, stupid, and anti-scientific idea that somehow *wink wink* found its way into public discourse and policy (Zenone M. et al., PLOS Global Public Health, 2022).

Right-wing billionaires sponsor a network of think tanks, fake institutes, and influencers to further their strategic aims [i.e ‘Great Barrington Declaration’] to avoid Covid restrictions and financial losses to their businesses at the cost of public health (Image Source, reporting from Adrian Egli and Allison Neitzel MD , David Gorski, Amanda D’Ambrosio and others)

Lockdowns were tough, the pandemic sucked, and many things did not go as well as we hoped. We all understand that. However, the idea that doing nothing, not even masks or distancing, as GBD advocates suggest, would actually have been better is false. For example, studies showed that lockdown measures saved millions of lives in 11 European countries alone (Berry CR., et al., Proc. Natl. Acad. Sci. USA, 2021). The cost-benefits of different NPIs, including lockdown measures, are not always easy to quantify, but the idea that outcomes would not have been much worse under the GBD policy is not based on any evidence and is vehemently contradicted by domain experts (Dyani Lewis, Nature News & Views, 2022).

The fact that the same right-wing network that promotes anti-scientific inaction on climate measures also promotes anti-scientific inaction on public health measures, with often the same tactics and talking points, should give the public some pause though, if only they were aware of the shady asymmetric actors behind it.

Online misinformation against scientific consensus is a critical determinant of perpetuating unnecessary human loss and suffering, and I do wonder:

Are democratic citizens really on board with throwing out science so that a fraction of influencers and businesses can profit mightily?

My guess is no, but again, in my experience, scientific institutions have a very small megaphone to make their concerns heard or compete for attention in the current information environment.

Do you want to know what else can go wrong when science and institutions take a back seat in societal sense-making?

You guessed it, we’ve finally arrived at one of my pet peeves:

Epistemic confrontation through strategic weaponization of conspiratorial thinking

Conspiracy theories are nothing new, but technology offers new ways for information combatants to exploit them. Carl Miller, Centre for the Analysis of Social Media. (personal communications)

Conspiracy theories are ubiquitous on social media platforms, it is almost a cliché at this point, but scientifically, it is actually not entirely obvious why.

For example, some research suggests that conspiracy theories are knowingly shared by individuals for social motives, like building connections and social status (Ren ZB. et al., researchgate, 2021), so their prevalence on social media platforms is somewhat expected.

It is noticeable how conspiratorial audiences develop strong parasocial relationships with conspiratorial influencers, not unlike cult members with their gurus. Many cognitive psychologists and anthropologists have a lot more to say about how crowdsourced misinformation plays into the dynamics and mechanisms of cults than I ever could, and how easy it is for anybody to fall for them. (Do check out the decoding-the-gurus podcast where two professors do a brilliant and entertaining job of tackling the worst offenders!)

However, a recent study also indicates that the arrival of social media did not change the prevalence of conspiratorial beliefs over time (Uscinski J. et al., PLOS One, 2022). Others have noted that conspiracy theories are nothing new and have been with us for a long time (Prooijen JW & Douglas KM, Memory Studies, 2017). What is maybe less known is that an unexpectedly large percentage of any citizenry has a predisposition to conspiratorial ideation (i.e., a conspiracy mentality) (Uscinski J. et al., PLOS One, 2022).

Contrary to our intuition about such a prevalent human trait, conspiracy mentality is not evenly spread between left- and right-leaning political parties, but rather reflects the extremism and authoritarianism of the respective political grouping, and currently most authoritarian parties are politically on the right (Imhoff R. et al., Nature Human Behavior, 2022).

Taken together, supporters of political parties that are judged as extreme on either end of the political spectrum in general terms have increased conspiracy mentality. Focusing on the position of parties on the dimension of democratic values and freedom, the link with conspiracy mentality is linear, with higher conspiracy mentality among supporters of authoritarian right-wing parties. Thus, supporters of extreme right-wing parties seem to have a consistently higher conspiracy mentality, whereas the same only counts for extreme left-wing parties of a more authoritarian makeup and with less focus on ecological and liberal values.- Imhoff R. et al., Nature Human Behavior, 2022

But if conspiracy theories and psychological attachment to gurus/leaders have been around forever, if extreme & authoritarian parties on both ends are susceptible, and if social media did not change the overall prevalence of conspiratorial ideation, why are we experiencing such a dramatic impact of conspiracy theories on our lives today?

Well, part of the answer might lie with information combatants.

Former president and leader of the republican party in the US Donald Trump is embracing the Qanon conspiracy theory in a post on his alternative social media platform ‘Truth Social’ (Source)

Political leaders and movements can use conspiracy theories to gain power, and this is not unique to the US by any means, as we have seen with the Philippines; we could also talk about Poland, Italy, and Hungary, or Myanmar, Nigeria, and Brazil, and many other nations currently in democratic decline. Making use of conspiracy theories is one of the oldest tricks in the book that authoritarian leaders and dictators use to attack opponents, galvanize followers, shift blame and responsibility, and undermine institutions that threaten their power (Ren ZB et al., Curr. Opin. Psychol., 2022). Most politicians who share conspiracy theories are usually regarded as less trustworthy, but certain political actors can create the authentic impression of an 'outsider' capable of changing the system (Green R. et al., Journal of Experimental Social Psychology, 2022). Some experiments suggest that lying demagogues appear 'authentic' to people who perceive the current system as unjust or illegitimate (Hahl O. et al., American Sociological Review, 2018).

[…] conspiracy beliefs are correlated with alienation from the political system and anomie — a feeling of personal unrest and lack of understanding of the social world. — Douglas KM. et al., Political Psychology, 2019

Interestingly, conspiracy theories themselves might present another asymmetry in which type of political actor (spoiler: it is not the friendly, normal ones) can make full use of them to further their strategic aims. Demagogues have a clear incentive and an advantage when they can weaponize available grievance, hate, or fear narratives, while their political opponents cannot.

So when we understand information spheres also as battle spaces, certain actors weaponizing crowd-sourced conspiracy theories for strategic aims is like using available terrain advantage for battle advantage. A broken info sphere makes their influence operations more effective and likely to succeed.

Another way to phrase it:

Conspiracy theories are crowd-sourced information operations lying in wait to be weaponized by certain politicians.

Leaders often spread conspiracy theories to direct the attention, emotion, and energy of followers toward a common enemy who threatens their interests, thereby galvanizing followers. Toward this end, many conspiracy theories depict a nefarious perpetrator engaging in covert activities to harm the welfare of followers. — Ren ZB et al., Curr. Opin. Psychol., 2022

While autocratic movements might arise spontaneously, the vast majority of autocratic political movements today have been flying under the radar of democratic societies for a long time. Take Italy's (prospective) new prime minister, Giorgia Meloni, a neo-fascist, grievance populist and head of Mussolini's successor party, a party that, after decades at the margins of society, found itself profiting from the current forces favoring autocrats. In Meloni's case, this took the form of a softer-looking strongwoman and demagogue with an ethno-Christian, patriotic sentiment toward Italy and a hateful, conspiratorial message against the 'establishment', 'elites', and immigrants. This is not to single out Italy; far-right parties all over Europe have been strengthened by playing exactly the same messages.

In many places around the world, we are currently observing targeted online mass mobilization around shared conspiratorial narratives, either shaping available conspiracy theories towards political ends or manufacturing new conspiracies that tap into conspiratorial audiences for a strategic purpose.

Nowhere does this conspiratorial mass mobilization seem more visceral and impactful than in the US, so we will have to look a bit under the hood of what is going on there.

  • Has anybody noticed something weird going on in the US?

January 6th and Trump's election steal myth, anti-vax conspiracy fantasies, lableak fearmongering, QAnon, white genocide, and moral panics about immigrant caravans or LGBTQ minorities; many of the most hateful and conspiratorial narratives seem to aggregate around one political party, leader, and ideology. Prima facie, it is odd to see such a diverse set of conspiratorial ideas neatly align with a large segment of the population that happens to vote for the same political party under the whip of an autocratic demagogue. Even setting aside the cult-like parasocial worship of this demagogue, this is a weird and dangerous situation.

Donald Trump was always a quite limited character with thinly veiled autocratic and anti-democratic tendencies, his ascent to political power very much a harbinger of democratic decline as well as a reflection of damage already suffered. Not to spend any more words than necessary on his central role, his actions and tactics were very much boilerplate autocratic and pretty commonplace all over the anti-democratic world.

Politicians who adopt conspiratorial strategies find this to be an especially effective tactic if their own claim to power is illegitimate or controversial. Moreover, since the exposure to conspiracy theories reduces followers’ confidence in democratic institutions, leaders may even mobilize followers to engage in violent actions that further undermine these institutions (e.g., disputing an election defeat by initiating riots or mobilizing military forces). — Ren ZB et al., Curr. Opin. Psychol., 2022

Many journalists and academics have written elegantly about the roots and impact of the January 6th insurrection, Qanon, and all the other anti-democratic conspiracy theories currently advanced by US political actors. It is easy to get lost in these details, so again, I just want to focus on and highlight some of the larger systemic patterns.

Dr. Caroline Orr Bueno is a postdoctoral fellow at the University of Maryland’s applied research laboratory for intelligence and security.

An important feature of weaponizing conspiratorial thinking in the information age has been technological targeting. As we have mentioned in our introduction, there are many sociological, psychological, and cognitive characteristics that are associated with believing and sharing misinformation, including conspiracy theories (Uscinski, J. E et al, HKS misinformation review, 2020 , Bruns H. et al., Publications Office of the European Union, 2022).

If social media does anything well, it is good at segmenting the population of users based on demographic, behavioral, or engagement characteristics.

One striking characteristic of being prone to conspiratorial ideation is the fact that it does not stop at any specific conspiracy; usually, the best predictor for believing in a specific conspiracy is already believing in other, unrelated conspiracies (Goreis A. & Voracek M, Front. Psychol., 2019).

The best predictor of believing in a conspiracy theory? Already believing in others

That is a simple enough pattern to figure out for example by the ranking algorithms on youtube that drive people down rabbit holes of ever more extreme conspiracies about a topic. Take ‘crisis actor’ content:

[…] YouTube’s role in spreading this “crisis actor” content and hosting thousands of false videos is akin to a parasitic relationship with the public. This genre of videos is especially troublesome, since the content has targeted (individual) effects as well as the potential to trigger mass public reactions.

The view count for 50 of the top mass shooting-related conspiracy videos is around 50 million. Not every single video overlaps directly with conspiracy-related subjects, but it’s worth pointing out that these 8842 videos have registered almost four billion (3,956,454,363) views. — Albright J., medium, 2018

Four billion views and endless videos about a baseless conspiracy theory are pretty crazy. If you have been living under a rock, these dynamics are why convicted conspiracy influencer and liar Alex Jones profited heavily while his promotion of 'crisis actor' conspiracy fantasies put a target on the backs of grieving parents who had lost their children. And we all lost a bit of our humanity. Alex Jones is of course a political influencer furthering right-wing ideology on all kinds of issues, not only gun legislation. But the reason he could reach so many audiences has everything to do with ranking algorithms pushing people in his direction, and with the financial rewards for influencers who create outrageous and engaging content that grabs attention, irrespective of truth, accuracy, context, or value. This is one way to weaponize the conspiratorial predisposition of people for political aims.
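The rabbit-hole dynamic driven by ranking algorithms can be sketched in a few lines of toy code. To be clear, the scoring rule, the numbers, and the preference-update step below are all my own illustrative assumptions, not YouTube's actual system: if predicted engagement peaks on content slightly more extreme than what a user already consumes, a greedy engagement-maximizing ranker walks that user, step by step, toward the most extreme end of the catalog.

```python
# Toy rabbit-hole model (illustrative assumptions, not any real platform):
# engagement is assumed to peak on content slightly more extreme than the
# user's current preference, and the ranker greedily maximizes engagement.

def predicted_engagement(user_pref, item_extremeness):
    # Assumed: engagement peaks when the item is one small step (0.1)
    # more extreme than what the user currently prefers.
    return 1.0 - abs(item_extremeness - user_pref - 0.1)

def recommend(user_pref, catalog):
    # Greedy ranking: serve the item with the highest predicted engagement.
    return max(catalog, key=lambda x: predicted_engagement(user_pref, x))

def simulate(steps=20):
    catalog = [i / 100 for i in range(101)]  # extremeness from 0.0 to 1.0
    pref = 0.0                               # user starts fully moderate
    history = []
    for _ in range(steps):
        item = recommend(pref, catalog)
        history.append(item)
        pref = 0.5 * pref + 0.5 * item       # consumption shifts preference
    return history

history = simulate()
print(round(history[0], 2), round(history[-1], 2))  # starts mild, ends extreme
```

Each recommendation nudges the preference, and the next recommendation builds on the nudged preference; after twenty rounds the simulated user is being served the most extreme content the catalog has. No single step looks dramatic, which is part of why the drift is hard to notice from the inside.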

The other way conspiracy theories can be weaponized by political actors is, of course, via influence operations. Oh sorry, that is just called political campaigning now, at least when you are part of the MAGA movement. All recent political campaigns, including Trump's, made heavy use of anti-democratic microtargeting to find voters susceptible to their message, and when your message is a conspiracy theory like the 'birtherism' myth, well, guess what type of audiences the microtargeting algorithms will deliver to you?

If you have a Custom List of three hundred thousand people, […] you can use Lookalike Audiences to find another three hundred thousand Facebook users with attributes similar to those in the first group. One of the most difficult tasks of a political campaign — distinguishing likely supporters from the undifferentiated mass of the American electorate — can now be accomplished instantly through artificial intelligence. Brad Parscale, Trump’s 2016 social media campaign manager, profiled in The New Yorker
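The Lookalike-Audience idea described above boils down to similarity search over user attribute vectors. Here is a minimal sketch under heavy assumptions: the toy attribute vectors, the plain cosine similarity, and the averaged "seed profile" are all mine for illustration; real platforms use far richer behavioral features and proprietary models.

```python
# Minimal sketch of "lookalike audience" expansion: given a seed list of
# known supporters, rank everyone else by similarity to the seed profile.
# Attributes and similarity measure are illustrative assumptions only.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def lookalike(seed_users, population, k):
    # Average the seed vectors into a single profile, then rank the
    # population by cosine similarity to that profile.
    dim = len(next(iter(seed_users.values())))
    profile = [sum(v[i] for v in seed_users.values()) / len(seed_users)
               for i in range(dim)]
    ranked = sorted(population.items(),
                    key=lambda kv: cosine(kv[1], profile), reverse=True)
    return [name for name, _ in ranked[:k]]

# Hypothetical toy attributes per user:
# [engages_with_outrage, rural, age_bucket, shares_memes]
seed = {"u1": [0.9, 0.8, 0.7, 0.9], "u2": [0.8, 0.9, 0.6, 0.8]}
pop = {
    "a": [0.85, 0.8, 0.65, 0.9],   # very similar to the seed group
    "b": [0.1, 0.2, 0.9, 0.1],     # dissimilar
    "c": [0.8, 0.7, 0.7, 0.85],    # similar
}
matches = lookalike(seed, pop, 2)
print(matches)
```

The point is not the math, which is trivial, but the asymmetry: the same few lines that let a shoe brand find likely customers let a conspiratorial campaign find the users most receptive to a grievance narrative, instantly and at platform scale.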

That is preparing for mass mobilization. The best predictor of believing a conspiracy theory is already believing in others. Today, we understand much better what actually happened, and what is still going on. Right-wing US politicians have been playing hard into the power of conspiracy theories to mobilize voters.

The MAGA political movement is weaponizing grievance, hate and fear conspiratorial narratives to speak to a core audience and to engage new audiences, inducting them to a process of radicalization and mobilization (Weinberg & Dawson, Sociology, 2020)

While unethical and dangerous, it is somewhat expected from a ruthless party and electorate who feel their current grip on power is threatened, potentially permanently, unless they subvert democracy to remain in power.

Systems such as open elections and the free press can safeguard democracy by illuminating corrupt behavior and ensuring the peaceful transition of power. Leaders may use conspiracy theories to undermine the credibility, legitimacy, and authority of these institutions, however, if they threaten their power.- Ren ZB et al., Curr. Opin. Psychol., 2022

The next steps are also somewhat predictable. No matter if it's QAnon, January 6th, or 'democrats are groomers' conspiracy fantasies, we should by now have a good intuition of where things are going. Very much as with the anti-vax behemoth, climate denial, or anything about Covid, once an emotionally engaging narrative (and what is better than fear, anger, and hate directed at a scapegoat, right?) is brought to the peak of the attention economy (platformed by, let's say, a former president), it is very hard to put the conspiracy demons back in the box.

Even worse, the ‘winner takes all’ dynamics of our attention economy will make sure that an array of subservient content for the conspiracy theory will be created by manipulative information combatants and persuasive influencers.

Those asymmetric actors will mainstream, justify, normalize and expand upon any conspiratorial narrative that reaches a wider audience.

This dynamic is very much desired by those politicians who can abuse the conspiracy theories they set into the world, or promoted to the front pages. They will use the asymmetric advantage our epistemic crisis provides them, for example by utilizing conspiratorial talking points to keep attacking their political enemies, often even intimidating them out of public conversations by exposing them to harassment, doxxing, and stochastic terrorism from whipped-up, radicalized supporters. They might abuse the power of the state to stage show trials and witch hunts to boost their profile, as we observe for example in Senator Rand Paul's attacks on Anthony Fauci. By now, the keen observer will have seen this dynamic in action multiple times, from the ever-expanding lableak conspiracies to the January 6th gaslight attempt to rewrite history. Once attention has been captured by a conspiratorial narrative, politicians can use it (almost casually) for their personal aims, even in plain, obvious campaign ads.

The systemic effect of conspiratorial political movements and their associated dehumanization of political opponents within the same larger network is corrosion and disintegration, a falling apart of the shared whole system, leading to disarray, dysfunction, and chaos. These are preconditions for civil war between the remaining components of the old system and the alternative dominance networks currently being built. Whether the US will still be a democracy after this conflict is anybody's guess.

But it does not have to come this far. The future is not deterministic and there is a lot of systemic immunity building up, in the US and democratic societies elsewhere, against the asymmetric forces, actors and behaviors trying to reshape the world in their image.

Democratic societies can still find a new stable equilibrium in the technological age.

The next chapter will offer some solutions to the epistemic crisis, but before we can go there, we have one more example to talk about: how bad it can really get.

Creating epistemic nihilism to facilitate war

How could I not talk about Russia?

Decades of asymmetric misinformation warfare have contributed to gradually shaping the Russian nation and its people of 150 million to become cynical, disengaged, or ignorant enough to allow a kleptocratic megalomaniac in power to usher in a new dark age of imperialist war, slaughter, and nuclear escalation.

Members of Ukraine’s UNA-UNSO militarized group walk among the debris of destroyed Russian military vehicles on April 5, 2022. Source: Illia Ponomarenko reporting for The Kyiv Independent

In many ways, current Russian circumstances are the product of many of the asymmetric forces and actors we have talked about, just driven to extremes and in action for a long time. Today, Russian society serves as a tragic example of what can happen to a citizenry when epistemic crisis gives way to epistemic nihilism.

But let’s start untangling the Russian system as best we can by outlining some themes. The Russian population suffered an assault on its epistemology long before social media systems were even conceived, and it did not come out of nowhere.

In complex systems, understanding initial conditions is critical to describing system behavior, and while it is not always easy to pick a starting point, the system state before the transition to the current one can be helpful. So we have to talk about the Soviet Union before talking about post-Soviet Russia. From the Stalinist regime to the final collapse of the Soviet Union, Russians lived in a state of fear, terror, and uncertainty, threatened by random purges, gulags, economic crises, and paranoid leaders, and Putin is very much a product and continuation of that heritage.

Conspiracy theories became a principal element of Russian society thinking twenty years ago as a reaction to the sudden and inexplicable collapse of the Soviet Union. — Shinar Chaim, SSRN, 2016

However, there was a short period of hope after the collapse of the Soviet empire that democratic reformers would manage to secure the stability of a young democracy. That hope ultimately failed (McFaul M., Journal of Democracy, 2021), in no small part because of the economic havoc caused by reckless privatization (“shock therapy”) at the behest of dubious American economists like Jeffrey Sachs, a power player who has nowadays turned full Putin apologist, Uighur-genocide denier, and lab-leak conspiracy theorist. But I digress…

Nearly all the post-Soviet states suffered deep and prolonged recessions after shock therapy, with poverty increasing more than tenfold.

The point is: millions of deaths of despair following the system’s collapse and instability, combined with a redistribution of wealth to a kleptocratic oligarchic elite, were bound to foster resentment and a desire for a strongman who could put the Soviet empire back together. Enter Putin.

In the name of this stability, he has consolidated power in his own person in an astounding way. In his first two terms, from 2000 to 2008, he brought down the oligarchs, thereby regaining total control of the news media and orchestrating the breakup of Yukos, the giant oil company (and jailing its chief executive, Mikhail Khodorkovsky), which returned two important power sources to the state. His loyal friends now run most of Russia’s important industries. Unfettered democracy also pointed the way to chaos, and so he developed something his advisers called “managed democracy,” providing only the semblance of popular will. ­Opposition parties were neutered, and Russians lost the ability to vote in direct elections for local or regional governments. — Steven Lee Myers, “The New Tsar, The Rise and Reign of Vladimir Putin”, 2015

With Putin’s regime came a war on critical media. Since then, many books have been written about the Kremlin’s weaponization and utilization of conspiracy theories (see e.g. books by Yablokov, Borenstein, or Abrams) as part of its “active measures”: political influence operations against its own people as well as other nations. These range from the basics of exerting control over media companies, even the newly formed tech giant Yandex (a Google competitor that was transformed into a propaganda tool), to an array of sophisticated covert information operations like “tainted leaks” (Hulcoop A. et al., Citizen Lab research report, 2017) to thwart political enemies. The details are often bizarre and dystopian. The sum total of this assault on Russia’s media infrastructure can be understood as “digital repression” (Earl J. et al., Science Advances, 2022). And the effects are chilling.

Many Russians today live with a learned cynicism towards the concept of obtainable truth, swallowing conspiratorial talking points delivered by sycophantic state media that plays on the same few emotional themes of fear, doubt, uncertainty, and patriotism (to the pain of the remaining independent journalists in Putin’s Russia who have not been bought, locked up, or killed). The average Russian believes that everybody and everything is lying to them, that nothing and nobody can be trusted, and that no matter who is in power, nothing will make a difference (Skillen D., Post-Truth and normalized lies in Russia, 2019).

“If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer. And a people that no longer can believe anything cannot make up its mind. It is deprived not only of its capacity to act but also of its capacity to think and to judge. And with such a people you can then do what you please.” — Hannah Arendt, political theorist, philosopher and holocaust survivor

This systemic nihilism towards obtainable truth, and the subsequent apathy towards politics, seems like a key pillar of Putin’s grip over Russia. Then again, I have no special expertise on offer here, and in complex systems, causal mechanisms are not linear or independent. Many Russia experts, however, have noted the sedative effect of nihilism on Russian society.

Although Russia is fascist at the top, it is not fascist through and through. […] Putin’s regime functions not by mobilizing society with the help of a single grand vision, as fascist Germany and Italy did, but by demobilizing individuals, assuring them that there are no certainties and no institutions that can be trusted. This habit of demobilization has been a problem for Russian leaders during the war in Ukraine because they have educated their citizens to watch television rather than take up arms. Even so, the nihilism that undergirds demobilization poses a direct threat to democracy. — Timothy Snyder, Foreign Affairs, 2022

Epistemic nihilism does offer an explanation for why so many Russians looked the other way, ignored, or were ready to swallow propaganda about the reasons for the “special military operation” that would see Russia invade Ukraine, a nation where many of their brothers, sisters, cousins, and relatives lived, and for why, even months into the war, they supported Putin’s war.

Where were the mass protests, civil disobedience, and Russians fleeing the country at the start of the war?

Epistemic nihilism is of course not a sustainable condition when the reality of war comes crashing in, for example in the form of a mass mobilization Russians can no longer ignore, because it is their children, not just minorities, being drafted as cannon fodder. Not that the reality on the frontline has been any less sobering for the soldiers misled about their mission.

Intercepted phone call of a Russian soldier confronted with the reality that he has been fed lies. Source: The New York Times

That the real reason for the invasion was Ukraine’s democratic transition, which threatened the political legitimacy and power of the Putin regime, is barely worth mentioning, and we will refrain from sweeping declarations about history that has not yet been written. Just support Ukraine however you can.

There is just one last point I want to make.

Putin’s regime has also acted to subvert democracies all around the world (as have other regimes; information warfare is a global phenomenon with many combatants), and this is an illustrative example of how complex systems interact with each other, forming their own meta-dynamics. What impact did covert information operations from Russia have on US media? What about Russia’s financing of far-right European parties (Futàk-Campbell, Atlantisch Perspectief, 2020)? Would they have made the political gains they did if Russia were a democracy? What about Brexit? How can we exclude the possibility that the sum total of countless information operations by asymmetric actors influenced, accelerated, or even triggered event cascades leading to the democratic backsliding we observe around the world?

In complex systems, small inputs can have disproportional output.

Intentionally or not, hateful narratives seem to be tinder in a complex world at odds with its own technological disruptions.

The gift of Russian propagandists has been to take things apart, to peel away the layers of the onion until nothing is left but the tears of others and their own cynical laughter. — Timothy Snyder, Foreign Affairs, 2022

But the complex-systems coin might have two sides, too. While democratic decay in some nations might empower democratic backsliding in others, the fight for democracy in one nation might inspire others to defend their own. That is my hope, at least.

If only we manage to find the epistemic clarity to understand the reality of our circumstances, preferably before our planet burns or we get drafted into war too…

Conclusion Chapter 3:

The technological disruption of our information infrastructure has resulted in an assault on science, education, journalism, and more generally the notion of obtainable truth.

Our epistemic crisis is a systemic weakness that favors the powerful in multiple ways to the detriment of the public, democracy and the planet, as this chapter has painfully documented.

Democracies might not survive the assault from asymmetric forces of influence in our information age, and I am certainly not alone in my worries:

Right now, the huge potential of technology to advance our societies has been undermined by the business model and design of the dominant online platforms. But we remind all those in power that true human progress comes from harnessing technology to advance rights and freedoms for all, not sacrificing them for the wealth and power of a few.

We urge rights-respecting democracies to wake up to the existential threat of information ecosystems being distorted by a Big Tech business model fixated on harvesting people’s data and attention. — Maria Ressa & Dmitry Muratov, 2021 Nobel Peace Prize laureates, calling for pro-democracy action against big tech

It is important to understand that technological change has certainly disrupted the “societal information industry”, but technology is not deterministic, nor is it the sole driving factor (complex systems, remember?!) in upholding our currently broken info sphere.

Many powerful forces and newly empowered actors, from autocratic states to capitalist platforms and businesses, all the way down to influencers, amplification dynamics, and our own lazy cognitive biases, try their hardest to entrench the current state of affairs in an unprecedented grab for our attention and our beliefs, and with them, our capability for self-determination.

It is an all-out assault on our democratic values and worldview. Yet every day, democratic citizens find themselves pushed, pulled, or sucked ever deeper into fragmented realities full of half-truths, lies, and conspiracy theories, building walls of conflict along arbitrary lines while asymmetric actors make strategic gains against the public good. We have our attention stolen and our beliefs manipulated by system dynamics and emergent phenomena we are part of but do not fully understand. We have reached a point of epistemic crisis where the public good is sabotaged both by accident and by Machiavellian interest; where large numbers of citizens are role-playing in hateful fantasy realities made up by self-serving political actors; where a populace can be driven into epistemic nihilism and cynicism to the extent that Russians discovered themselves willing to shed the blood of former brothers at the behest of an imperialist madman, to the outrageous but profitable applause of too many Western influencers and sympathizers. Is this really the authentic will of free, democratic citizens?

We are witnessing the descent into a darker future that is less egalitarian and less democratic, and that leaves us less capable of taking agency over our own lives and shaping the systems we are part of. This is neither right nor sustainable. It will lead to continued humanitarian, ecological, and environmental catastrophes, and eventual system collapse.

Is there really no way out of this gradually self-fulfilling misery course we have charted?

Chapter 4: Science as a candle in the dark

Democratic revival and an info sphere 2.0

This chapter will outline strategies to actively target, fight and neutralize asymmetric actors and threats to the common good, as well as summarize possible avenues to pursue to protect democracies in the information age.

Bad epistemology makes for bad democracies, and bad democracies are not stable.

How to mount a digital counter-offensive for democracy

First and foremost, everybody I talked to agrees that there is no single technological solution or action that could resolve the detrimental impact of our current information systems on democracy.

“There is no magic bullet to solve our epistemic crisis“ — Prof. Stephan Lewandowsky (personal communication)

Just as there is no single actor or force threatening our democratic processes (complex systems!), solutions will have to be specific, targeted, and actionable. That does not mean we cannot have a larger framework or theory from which to abstract some general principles to put into action. Think about our immune system: while there is an endless array of pathogens it has to protect against, it operates on a small number of general principles, which include capabilities to detect and destroy infection, protect the uncorrupted ‘self’, store and use memory of repeat invaders, and self-regulate its response once the attack is over (Lentz AK. et al., Nutr. Clin. Prac., 2003). This sounds like a pretty good starting framework for protecting information systems and democracy, too.

  • Detecting and destroying informational pathogens

In general, when it comes to detecting and destroying infection, we have to differentiate between harmful information products, harmful information spreaders, and harmful information system behavior.

In many ways, discussions about content moderation, ranking algorithms, and choice architectures primarily target information products, and as others have noted, it is damn difficult to identify harmful content amid the mess of human language, expression, and behavior, especially when it comes to distinguishing genuinely harmful information products from satire, humor, or criticism of said information. Misinformation is pernicious; even true information can be taken out of context, distorted, or used to manipulate. So while some automated way to detect and rein in harmful information products is necessary, no machine-learning system alone can solve the problem of harmful information products in our shared info sphere.

A more promising approach might be to target harmful information spreaders, especially once we understand that harmful information products have limited impact unless amplified and spread to many other nodes of the network, in what researchers call “information cascades” (more commonly known as something going viral). Influential spreaders (i.e. influencers) are the tipping points of harmful information cascades and bear responsibility when they contaminate the rest of the system.

However, this also means that targeting those harmful information spreaders might be an effective way to counteract the spread of misinformation. Initial scientific studies show the promise of this approach.

Take de-platforming, which is basically the removal of an information-pathogen-spreading node from our network: researchers have shown it is highly effective at reducing misinformation and even has positive knock-on effects on the rest of the network (Rauchfleisch R. et al., SSRN, 2021).
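To build intuition for why removing a single amplifying node matters so much, here is a minimal toy simulation (my own illustrative sketch, not code from the cited study; the account names and topology are made up): content spreads by breadth-first search through a follower network, and de-platforming is modeled as deleting one node before the cascade starts.

```python
from collections import deque

def cascade_size(adj, seed, removed=frozenset()):
    """Count how many accounts a piece of content reaches when every
    exposed account passes it on to all of its outgoing contacts."""
    if seed in removed:
        return 0
    seen = {seed}
    queue = deque([seed])
    while queue:
        node = queue.popleft()
        for neighbor in adj.get(node, ()):
            if neighbor not in seen and neighbor not in removed:
                seen.add(neighbor)
                queue.append(neighbor)
    return len(seen)

# Hypothetical topology: a small source account, one influencer "hub"
# that re-amplifies everything, and 100 followers reachable only via the hub.
adj = {"source": {"hub"}}
adj["hub"] = {f"follower{i}" for i in range(100)}

full_reach = cascade_size(adj, "source")                            # source + hub + 100 followers
after_deplatforming = cascade_size(adj, "source", removed={"hub"})  # cascade dies at the source
print(full_reach, after_deplatforming)  # prints: 102 1
```

Real platforms are far messier, of course, but the asymmetry is the point: a single well-connected node can account for nearly all of a cascade’s reach.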

If we spin this thought further into graph theory, we can use the network topology to identify and disrupt information spread through, for example, generalized network dismantling.

The removal or deactivation of even a small set of nodes may dismantle the network into isolated subcomponents and thereby stop the malfunctioning of a system. The effectiveness of node removal depends on the network structure and the removal strategy. — Ren XL. et al., PNAS, 2019

Graph theory, math, and epidemiological science can help combat asymmetric actors by offering insights into how to contain or dismantle anti-democratic and conspiratorial networks (Ren XL. et al., PNAS, 2019)
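To make the dismantling idea concrete, here is a small self-contained sketch (an illustration of the general principle, not the actual algorithm from Ren et al.; the cluster-and-bridge topology is invented for the example): a network of three tight clusters held together by two bridge nodes falls apart into small, isolated components once those few bridges are removed.

```python
def largest_component(adj, removed=frozenset()):
    """Size of the largest connected component after removing some nodes."""
    nodes = set(adj) - set(removed)
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:  # depth-first search over the surviving nodes
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(m for m in adj[n] if m in nodes and m not in comp)
        seen |= comp
        best = max(best, len(comp))
    return best

# Toy topology: three rings of 10 nodes each, glued together by two bridges.
adj = {f"c{k}_{i}": set() for k in range(3) for i in range(10)}
adj["bridge0"], adj["bridge1"] = set(), set()

def link(a, b):
    adj[a].add(b)
    adj[b].add(a)

for k in range(3):
    for i in range(10):
        link(f"c{k}_{i}", f"c{k}_{(i + 1) % 10}")  # ring inside each cluster
    link(f"c{k}_0", "bridge0")
    link(f"c{k}_1", "bridge1")

before = largest_component(adj)                                 # one giant component
after = largest_component(adj, removed={"bridge0", "bridge1"})  # isolated clusters
print(before, after)  # prints: 32 10
```

Removing just 2 of 32 nodes shrinks the largest connected piece from the whole network to a cluster of 10, which is the essence of why dismantling strategies target structurally critical nodes rather than many random ones.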

While approaches using this framework look promising for keeping larger democratic platforms from getting infected, isolated sub-networks remain breeding grounds for radicalization and polarization, and these fragmented realities are becoming more numerous, too.

It’s clear that this alone will be insufficient to turn the tide of autocratic infection.

  • Protecting citizens from infection

Using technological tools to remove harmful information products or limit their spread often cannot prevent citizens from coming into contact with informational pathogens. If we know about a pathogen in advance, citizens can be immunized through prebunking, basically vaccinating (inoculating) them against a specific type of information they will be exposed to. Technological solutions can help with implementing inoculation strategies, for example to build immunity against online manipulation (Roozenbeek J. et al., The Conversation, 2022).

Inoculation is an effective method to empower citizens against manipulation techniques. (Source: inoculation.science; see also: Roozenbeek J. et al., The Conversation, 2022)

The major problem I see with inoculation is the catastrophic signal-to-noise ratio of our info sphere: how will prebunking content ever compete for attention with the inherent attractiveness of emotional narratives, crafty influencers, and pay-for-play actors? Certainly not if we keep the attention economy as is.

We obviously need to look at reforming the whole online information architecture, and that requires politicians to apply pressure and regulation to these antidemocratic tech platforms.

  • Changing the tech platform dynamics

Just a few months ago, a whole array of proposed regulatory actions was agreed on by the EU (broadly falling into demands for oversight, content moderation, and consumer protection) to rein in tech platforms.

The Digital Services Act is the new EU law that aims to limit the spread of illegal content online. It establishes a new set of obligations for private actors with the aim to create a secure and safe online environment for all. It is the first time in the history of EU platform governance regulation that people’s fundamental rights are put at the forefront.- Prikova E., “The Digital Services Act: your guide to the EU’s new content moderation rules”, 2022

This is laudable and absolutely necessary; European citizens can be proud to have at least one political organization (the EU) standing up for their democratic rights and values online. That is not a given in today’s world. Also notable is the role of science in informing policymakers: I can highly recommend Lewandowsky S. et al. (Publications Office of the European Union, 2020), which grapples with many of the same problems technology poses to democracy that we outlined in this article. The Digital Services Act will come into force in 2024 and should change our digital environments for the better (quick summary here), but I am afraid it will not be sufficient.

One concern I have with political regulation is not that it cannot be effective, but that it often arrives reactively, after something has gone catastrophically wrong, or too slowly to keep up with the dynamic changes in the information sphere. If we had had the Digital Services Act in, say, 2014 instead of 2024, how different would Europe look? Would Brexit have happened, and if so, would it have happened in this hard-Brexit shape? How would the Syrian migrant crisis of 2015 have played out? Would Hungary and Poland have gone down the anti-democratic route they did? And even by 2024, what can the Digital Services Act do against new and emerging threats coming from deep fakes (Veerasamy N. et al., ICCWS, 2022) or artificially created content and actors indistinguishable from authentic users (Kreps S. et al., JEPS, 2022)?

Remember also that in Chapter 1, we talked about the need for complex systems to be flexible and adapt to arising challenges to stay alive.

It would be unrealistic for certain complex systems to have survived in competitive environments if they had not adapted to it by evolving their self-defense capabilities.

Coming from a cancer research background, I have a learned appreciation of how difficult it can be to target a competing adaptive complex system with its own dynamics and strategies (Markolin P. et al., Genomics, 2021). It is challenging to defeat a malicious outgrowth that slowly eats away at the larger whole. So you will have to excuse me for leaning a bit on the cancer analogy in this section, because it illustrates the general frameworks we need to employ to build effective immunity against the cancer of autocracy festering in our democracies.

If a complex dynamical system is globally asymptotically stable then any limited-time perturbation applied to the system will dissipate, and the system, sooner or later, will end up within its unique asymptotic attractor.- Rosenfeld S., Gene Regulation and Systems Biology, 2011

In non-complexity speak, this means that the real challenge of targeting adaptive systems like cancer is their robustness to perturbation (Rosenfeld S., Gene Regulation and Systems Biology, 2011): interventions that do not destroy the whole will allow the remaining system to rebound again and again once the interventions stop or become ineffective. Cancer can adapt, and so can asymmetric actors and alternative autocratic networks and systems.

This is to warn you that while there are short-term actions that are effective, they do not address the root causes and will be unable to deal with the autocratic cancer in our system in the long run.

Let’s consider de-platforming again, which is highly effective at reducing misinformation and even has positive knock-on effects on the rest of the network (Rauchfleisch R. et al., SSRN, 2021). Yet disinformation researchers have noticed for some time that de-platformed influencers increase the robustness of hate networks by spreading to multiple alternative platforms (Guhl J. et al., ISD report, 2022). In cancer, we might call that metastasizing. Metastases make both systemic protection and cancer treatment much harder.

We also talked about how the attention economy keeps emotional narratives artificially alive, constantly mutating and adapting them to new circumstances with the help of USP-hunting influencers, without any further input needed when attention shifts. That is why, no matter what is in the news today, commentators will find that wokeness is actually at fault. Masks? Blame wokeness! Vaccines? War? Wokeness! It will never die. Cancer cells do the same: they mutate and diversify over time (phenotypic heterogeneity); some lineages die out after a short run while others gain traction, overall making sure the system stays alive through environmental change (Ortega-Sabater S. et al., bioRxiv, 2022).

That is why there is no magic bullet; in fact, even a magic shotgun might not be enough.

So while we might need acute technological treatment in the short run, we also need cures that can defeat ever-changing autocratic systems in the long run.

For that, society’s immune system might need a software update.

Resolving our epistemic crisis

“A democracy, if we can keep it”

…to borrow from Ben Franklin’s sentiment when prompted by Elizabeth Willing Powel: holding up any form of government comes down to the people.

Because each one of us is a constituent part of a democracy, and each of us also contributes to creating the epistemic crisis, all of us have to be part of any solution. Sorry, it is not only going to be about defeating bad actors; by now you should know it is more complex than that.

This need for societal participation and empowerment is also clearly outlined in e.g. Lewandowsky S. et al., Publications Office of the European Union, 2020, and should be plainly obvious to democratic citizens.

But where to start?

When we talked about the complex functions of ‘living’ in Chapter 1, we noted a somewhat trivial observation:

Largely similar autonomous units (cells) behaving a bit differently can produce vastly different complex functions, behaviors, and emergent phenomena.

Conversely, this also means that changing emergent phenomena might only require largely similar autonomous units to just behave a bit differently.

After all the asymmetries we talked about, I think this symmetry is worth exploring.
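As a toy illustration of that symmetry (my own back-of-the-envelope model, not a result from the literature; the numbers are invented): treat sharing as a branching process in which each recipient re-shares a post to two contacts with some probability. A small shift in individual behavior moves the reproduction number R = fanout × p across the critical value of 1, and the emergent, collective outcome flips from fizzling cascades to runaway ones.

```python
def expected_reach(p_share, fanout=2, generations=20):
    """Expected cumulative audience of a post when each recipient
    re-shares it to `fanout` contacts with probability `p_share`.
    Standard branching-process arithmetic: each generation is, on
    average, (fanout * p_share) times the size of the previous one."""
    r = fanout * p_share
    return sum(r ** g for g in range(generations + 1))

# Individually, the difference is tiny: re-share 55% vs. 45% of the time.
careless = expected_reach(0.55)  # R = 1.1, supercritical: cascades keep growing
careful = expected_reach(0.45)   # R = 0.9, subcritical: cascades fizzle out
print(round(careless, 1), round(careful, 1))  # prints: 64.0 8.9
```

A ten-percentage-point change in node behavior produces a qualitatively different phenomenon at the system level, and in the subcritical regime the total reach stays bounded no matter how many generations pass. That is the hopeful reading: the same sensitivity that lets small manipulations cascade also means modestly more careful citizens could collectively starve harmful cascades.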

How can we, individually and collectively, act a bit differently to support the democracies we are part of?

It is a question I have been struggling with for a long time, and often, I find inspiration in literature. In one of my all-time favorite books, “The Demon-Haunted World”, written almost 30 years ago, Carl Sagan, the famous science communicator, talks about his foreboding vision of the future:

I have a foreboding of an America in my children’s or grandchildren’s time — when the United States is a service and information economy; when nearly all the manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness… — Carl Sagan, The Demon-Haunted world

As you might now realize, his vision hits a bit too close to home for comfort. When I first read his book, maybe eight years ago, I was baffled by the conspiratorial thinking prevailing at the time. Horoscopes, UFO abductions, psychics… Carl Sagan, then a full professor at Cornell playing a leading role in American space missions, spent great personal effort and the full arsenal of scientific tools debunking them. Why would he care? It all seemed so alien to me (pun intended). How could these obviously wrong and lunatic ideas (and their believers) ever be more than fringe? How could these conspiratorial ideas and their proponents ever exert real political power in a democracy? How could this magical thinking impact the lives of people who are way too smart to fall for it? Why did Carl Sagan even bother to put in the effort of debunking? In a way, I had the sense that these conspiracy theories were something to laugh at, or at most to feel compassion for the few people who fell for them.

It turned out I was wrong. Or more precisely: I was arrogant, ignorant, and too short-sighted to see the larger picture. Technological advancement of information-sharing technologies did not only make our communication capabilities more powerful, it also made us more vulnerable to an arsenal of information drugs, drugs few of us would ever have come in close contact with in the past but that are ubiquitous today. Nobody is immune to holding beliefs that are not supported by scientific evidence, or even contradicted by it. It is in part our human nature. Magical thinking is widespread. Intelligence is not the issue, because smarter people are just better at justifying their own irrational beliefs to themselves. A large proportion of the population is especially susceptible to conspiratorial ideation, which unfortunately lends itself to being weaponized by politicians and other powerful actors. Our human biology has not changed much in the last 40,000 years, but our society and our technology have.

We have paleolithic emotions, medieval institutions and godlike technology. — Edward O. Wilson

Carl Sagan was fond of history. In his book, he explored how humans have held supernatural beliefs for thousands of years, how they justified those beliefs, and why they held onto them, often doubling down despite the evidence against them. He foresaw not only the allure and danger of magical or conspiratorial thinking rooted in our human cognitive biases and preferences; he understood them, at a deeper level, as the threat to democracy that they are. Bad epistemology makes for bad democracies, and bad democracies are not stable.

If the light of science were ever to go out, he prophesied, we would return to the demon-haunted world we hoped to have left behind for good. Take this metaphor as you will.

For Carl Sagan, as for me, science is a critical public good for a democratic society, both through its body of knowledge and as a way of thinking. It is a tree planted many generations ago by the aspirations of our most hopeful forebears.

“Society grows great when old men plant trees whose shade they know they shall never sit in.” ― Anonymous Greek Proverb

The scientific endeavor is our only true humanity-spanning collaborative project, the motor that powers our civilization and the one method we have to ground the fragmented realities of our messy human beliefs in shared larger truths. It is the antidote to epistemic nihilism. Science also provides the necessary cool against the heat of public discussion, the pace of day-to-day commentary, and our hot-headed human immaturity that resurfaced with the new information systems.

Enlightenment is the human being’s emancipation from its self-incurred immaturity. — Immanuel Kant (translation)

That is why science, as a global public institution for humanity, is a threat to populist influencers and narratives, powerful individuals, businesses, and state actors: all forces who dominate in the current world of fragmented realities and have an interest in entrenching their separate little epistemic fiefdoms. Is it really a surprise that we observe these forces doing everything in their power to poison, subvert, or limit access to the fruits of the scientific process and its positive knock-on effects on society?

Good epistemology works like a vaccine against bad ideas: it inoculates us, makes us less susceptible to manipulation, and empowers us to make our decisions knowledgeably.

So how does the epistemic vaccine work? What metaphorical ingredients do we need? Even when talking about democracy as a dynamic complex system, the answers and solutions are not really new.

First and foremost, science is not alone in this fight (complex systems!). The public’s defense against antidemocratic forces abusing our current epistemic crisis relies on at least three interconnected factors: education, journalism, and science. The trifecta of good epistemology in the modern world. Because we are a bit more accustomed to systems thinking now, let’s quickly sketch out their systemic functions:

  • Education is the innate immune system of a democratic society. Education has the potential to thwart many information threats to democracy; it teaches humans how to think for themselves (ideally). From history (those who forget history are bound to repeat it) to media literacy (how to identify propaganda and manipulation tactics), from STEM to social science (how the world and humans function), from liberal arts to literature (what makes us human, and what we should want for ourselves and others), education allows individuals to get a baseline grip on the world and their place in it, thereby giving us agency over our lives. Education also outlines the borders of uncertainty between what is known and what is made up. A solid education is like a strong innate immune system: it might not protect against some very viral threats, but at least it protects us against most of the half-baked informational pathogens (think flat-eartherism) that surround us.

Public interest journalism ought to become an adaptive immune system against information threats

  • Independent journalism is a signal-processing hub and a critical feedback loop for society’s robustness against information combatants. Journalism in a democracy has the duty to filter, disseminate, and qualify information to make it accessible and useful for the wider citizenry. Gatekeeping for good information is not censorship; it is noise filtering. We talked about our catastrophic signal-to-noise ratio, and public-interest journalism has the duty to help. Free speech and the free flow of information are currently threatened because they are drowned out by noise, often deliberate and motivated, as we have seen in chapter 2. In the 21st century, even more important than freedom of speech might be freedom from manipulation, and that requires access to reliable information and institutions that protect us from becoming casualties of information warfare. I believe public-interest journalism ought to become part of an adaptive immune system against information threats to democratic society. (How to thread that needle might be a topic for another time.)
  • Science is the central control hub when it comes to information: it has the inherent authority to create, assert, dispute, and correct information, and it is thus the ultimate arbiter for resolving informational conflicts and contradictions, defining our shared reality. Only when information and reality perceptions are grounded in scientific reality will we be able to ensure that productive cooperation towards shared goals remains possible for any democratic society.

The public’s defense against antidemocratic forces abusing the epistemic crisis: education, journalism, and science.

We have to empower, strengthen, and upgrade these epistemic institutions collectively, and put them in charge of information and information-distribution systems rather than making them subservient to the economic and political whims of the attention economy.

It is certainly possible, but it will require all of us to do a little bit better too. Small changes can have disproportionate outcomes, after all.

Conclusion Chapter 4

Our information systems are in dire need of a democratic software upgrade for the digital age. Infosphere 2.0 will contain technological enhancements to the already existing democratic mechanisms that kept our infospheres stable through transformational change and never-ending environmental challenges.

There are multiple approaches for doing so, from using the power of learning systems to detect and destroy informational pathogens, to cutting out infected amplifiers, to reshaping network structures to reduce the risk of epidemic spread.
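One of these approaches, cutting out infected amplifiers to reshape the network, can be illustrated with a toy simulation. The sketch below is a hypothetical, minimal model (all names and parameters are invented for illustration, not a description of any real platform): it builds a follower network dominated by a few high-degree “amplifier” accounts, then compares how far a simple independent-cascade spread travels with and without those amplifiers.

```python
import random

def make_hub_network(n_hubs=5, followers_per_hub=50, seed=1):
    """Toy follower network: a few hub 'amplifiers', each linked to many accounts."""
    edges = {}
    total_nodes = n_hubs + n_hubs * followers_per_hub
    for node in range(total_nodes):
        edges[node] = set()
    nxt = n_hubs
    for hub in range(n_hubs):
        for _ in range(followers_per_hub):
            edges[hub].add(nxt)
            edges[nxt].add(hub)
            nxt += 1
    # Hubs also connect to each other, so content can jump between communities.
    for a in range(n_hubs):
        for b in range(n_hubs):
            if a != b:
                edges[a].add(b)
    return edges

def outbreak_size(edges, removed, p=0.5, trials=200, seed=2):
    """Average fraction of remaining nodes reached by an independent-cascade spread."""
    rng = random.Random(seed)
    alive = [n for n in edges if n not in removed]
    total = 0.0
    for _ in range(trials):
        start = rng.choice(alive)
        infected, frontier = {start}, [start]
        while frontier:
            node = frontier.pop()
            for nb in edges[node]:
                # Each exposure passes the 'pathogen' on with probability p.
                if nb not in removed and nb not in infected and rng.random() < p:
                    infected.add(nb)
                    frontier.append(nb)
        total += len(infected) / len(alive)
    return total / trials

net = make_hub_network()
hubs = set(range(5))  # the high-degree amplifier accounts
print(outbreak_size(net, removed=set()))  # large cascades through the hubs
print(outbreak_size(net, removed=hubs))   # spread collapses without amplifiers
```

The design point is that removal is targeted, not random: pruning a handful of structurally central nodes changes the epidemic dynamics of the whole network far more than removing many peripheral ones would.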

Technological interventions and initiatives can also help to inoculate democratic citizens on a large scale against informational pathogens and manipulations.

The most important technological intervention we need, however, is not found on these platforms but applies to these platforms. We need regulation, from transparency and oversight to anti-trust and consumer protection; we need beneficial algorithms, data protection, and taxation. The European Digital Services Act is a necessary start to bring democratic values back into these authoritarian companies and aristocratic systems, but it will be a prolonged fight. Democracy usually is.

Even more importantly, we humans need a democratic software upgrade too. Democracy presupposes an equality of influence, yet even before our voices are drowned out by the noise on social media, or overwritten by information combatants, we lose our agency and decision autonomy to systems we don’t fully understand. Even worse, systems that too many of us do not even care to understand.

We live in a society exquisitely dependent on science and technology, in which hardly anyone knows anything about science and technology. — Carl Sagan

The whole point of the Enlightenment, the scientific revolution, and our educational and epistemic institutions was to empower individuals to find a way out of their self-imposed immaturity, to take agency over their lives, and to make informed decisions. We need epistemic clarity about reality, for without it, our agency will be blunted and our freedom limited.

It is public-interest education, journalism and science that truly make individuals in a democratic society free, not the right to cast a vote or to elect a representative. So that part is on us, collectively.

And that hard work has just started.

Epilogue:

So, you might ask, after all these ills our technologically empowered epistemic crisis brings to the outside world and democracy, why have we not at least shut down these destructive social media platforms for good? Why have we not yet tackled big tech, as Nobel laureate Maria Ressa is demanding? Why have we humans not stopped participating in the failed social experiment online? Or at least shut down the anti-democratic conspiracy theories flourishing on it?

Unfortunately, we cannot travel back in time, only move forward. It is also worth noting that the attention economy goes beyond social media, shutting platforms down would not make outlets like Fox News go away, would not dismantle alternative hate networks, nor would journalism magically regain the authoritative role and quality it once had.

There are circumstances beyond technology that make people susceptible to conspiratorial thinking and acting, that motivate and drive them towards political extremism, polarization, and authoritarianism. The cat is out of the bag, so to speak, and powerful actors will continue to abuse any and all information systems in the 21st century to create fragmented realities to their benefit. They have seen their opportunities and learned their lessons, so the real question is: Why have we not learned ours?

I do not have a great answer for that, but I believe it is largely because we have created a world that has few things to offer to the vast numbers of people who find themselves trapped in fragmented realities they were often nudged towards by the engagement algorithms of these manipulative social media platforms.

The addictive dopamine kicks from sharing, responding, and liking are one thing. However, the psychological dependence on confirmation (and other cognitive) biases to make sense of our uncertain world, and the feelings of control, community, and belonging, can all be found on social media as well, albeit in the worst and most abusive way possible: for the extraction of our data and at the cost of our sanity and often our humanity.

We are not only the products, but we are also dependent on the attention economy and many cling to its distractions to avoid another crisis of modern times; a quest for meaning and purpose, a quest for belonging, and a quest for being part of something bigger than ourselves. These are fundamental needs that make us human, yet how many avenues does modern society offer for average people to participate and self-actualize meaningfully?

This already too-long article cannot also cover the flawed meritocracy of our times, the inequality of opportunity, or the lack of hopeful narratives in a world heading towards dystopias written by technocratic, political, or financial elites and systems we have little or no control over. We should make no mistake: when we find ourselves pulled towards customized fragmented realities, or when we get sucked into believing conspiracy theories, it is not solely because of their inherent attractiveness; we are also pushed towards them by other complex systems, interactions, and mechanisms.

In nature, no complex system stands alone. It would be naive to believe that we can switch off, or even dramatically change, the fragmented realities we live in without changing the lived reality of the people currently stuck in them without other options. Education, journalism, and science have the power to elevate us out of our limited perspectives and return us to a shared reality, if we let them. There are no technocratic interventions that will solve our human problems for us; they can only aid us in our endeavors.

So even fixing something small and technical like the obviously flawed social media ranking algorithms pushing conspiratorial garbage will require a struggle on multiple fronts, inside and outside of the complex information system itself.

It would be naive to believe we can dramatically change the landscape of fragmented realities without changing the lived reality of people currently stuck in it

Scientists, journalists and citizens have outlined several promising strategies to turn the tide on democratic decline. Some politicians are acting on it, but they will need their citizens to help and participate.

There is always hope.

Dynamical complex systems do not have to rest in equilibrium until an unprecedented, dramatic shock or disruption kicks them out of it. We do not need autocratic strongmen to change the system; they will only make it worse. Truly understanding this means that we do not have to succumb to violent revolutions, wars, or catastrophes for systems to change. I reject that proposition, and so should you, because evolving systems such as ourselves, and democracies, are non-linear, chaotic, and complex. Nobody can say for certain how they will play out in the future, and small inputs can have disproportionate outcomes. Collective actions matter in such systems. In the rarest of cases, even the (metaphorical) unexpected beat of a butterfly’s wing can ripple forward in time and be the impetus for changing the system we are all part of.
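The butterfly metaphor has a precise mathematical counterpart: sensitive dependence on initial conditions. A minimal sketch using the logistic map (a textbook chaotic system, chosen purely for illustration) shows how a perturbation of one part in a billion grows until the two trajectories bear no resemblance to each other.

```python
def logistic_step(x, r=4.0):
    """One iteration of the logistic map x -> r*x*(1-x); chaotic at r=4."""
    return r * x * (1 - x)

def trajectory(x0, steps=60):
    """Iterate the map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic_step(xs[-1]))
    return xs

a = trajectory(0.2)
b = trajectory(0.2 + 1e-9)  # the 'butterfly wing': a one-in-a-billion nudge
gap = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {gap[0]:.1e}, largest gap: {max(gap):.2f}")
```

The gap roughly doubles with every iteration, so within a few dozen steps the microscopic difference saturates at the size of the system itself; this is exactly why long-term prediction fails in such systems while small, early inputs still matter.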

Most likely none of us will ever be that butterfly causing dramatic change, and yet, wouldn’t we be foolish to collectively give up flying when a better tomorrow is possible?

In non-linear complex systems, individual or collective actions inevitably shape the future. For better or worse. So I say let’s try to do better, fellow humans, with science, ingenuity, and compassion for our shared humanity.

Please share this article if you think it is worth it.

Disclaimer:

Please consider that no single article can summarize all the valuable scientific contributions that scientists, journalists, and citizens have made in support of investigating the interface of technology and democracy. It is a large and growing body of evidence, and often a paradox wrapped in a contradiction inside an irony, as Prof. Stephan Lewandowsky puts it.

My contribution to this topic was to focus on explaining key scientific ideas and arguments for non-experts, connecting different fields of thought under a complex-systems framework. While I went to great lengths to avoid misrepresentations, I cannot always control how my words will be interpreted, nor am I a domain expert in all the fields covered here. If uncertainties arise from my simplifications, omissions for brevity, or bad analogies, I advise you to first consult the primary literature for clarification rather than presume there is an obvious mistake in the science or the reasoning of scientists. My goal is not to influence policy, nor to influence your political views, but to educate and equip citizens with conceptual tools and new perspectives to make sense of the world we inhabit.

This article took a lot of time and effort to conceptualize, research, and produce, actually almost irresponsibly so, given that I do not monetize my scicomm and certainly do not plan to start now.

I see this work as a public good that I send out into the void of the internet in hopes others will get inspired to act.

You are also invited to deepen this work or just derive satisfaction from understanding our chaotic modern world a bit better.

So feel free to use, share, or build on top of this work; I just ask you to attribute it properly (Creative Commons CC-BY-NC 4.0).

Cite this work:

Markolin P., “Asymmetric power in the information age”, Medium, November 10, 2022

Direct access link: https://protagonist-science.medium.com/1a6d6f2634c6?source=friends_link&sk=4c7abb9354138ca629858102e0ace8bc

Also, since this topic is close to my heart, I’d be happy to hear your thoughts. I’d be even happier to hear what you plan on doing next.

If you feel this work should have some compensation, please consider donating in support of the Ukrainian people, who do their utmost to defend themselves and democracy in wider Europe from the terrorist assault of Putin’s Russia. Decades of asymmetric misinformation warfare have contributed to gradually shaping the Russian nation and people of 150 million to become cynical, disengaged, or ignorant enough to allow a kleptocratic madman in power to usher in a new dark age of imperialist war, slaughter, and nuclear escalation. While absolutely terrifying, this is merely one outcome of a broken info sphere & epistemic nihilism on a population scale. Let’s not find out all the others.

When nothing is true, everything becomes possible

But the buck has to stop here. With us.

References:

Bessi A. et al., PLoS ONE, 2016

Bruns H. et al., Publications Office of the European Union, 2022

Konrad Adenauer Stiftung, 2020

Boese VA. et al., Democratization, 2022

Pew Research, 2019

Whitson JA. et al., Journal of Experimental Social Psychology, 2015

Willy C. et al., European Journal of Trauma, 2003

Kross E. et al., Trends in Cognitive Sciences, 2021

Meyers RA, 2012

Elliott E. and Kiel DL, University of Michigan Press, 1996

Mazzocchi F., EMBO reports, 2008

Gasparatos A., Environmental Impact Assessment Review, 2008

Grosz MP. et al., Perspectives on Psychological Science, 2020

Wiesner K., European Journal of Physics, 2019

Hornung G. et al., PLOS Computational Biology, 2008

Wang SSH. et al., PNAS, 2021

Jiang B., IMF Economic Review, 2021

Lan G. et al., Nature Physics, 2012

Cowan NJ. et al., Integrative & Comparative Biology, 2014

Del Vecchio D. et al., J. R. Soc. Interface, 2016

Acemoglu D. & Robinson JA., The American Economic Review, 2001

Bueno de Mesquita B. et al., MIT Press, 2003

Besley T. and Kudamatsu M., 2008

Gallagher M. & Hanson J, 2009

Leon G., Public Choice, 2014

Gerschewski, J., Perspectives on Politics, 2018

Linde & Ekman, European Journal of Political Research, 2003

Boccaletti S. et al., Physics Reports, 2006

Bhardwaj N. et al., PNAS, 2010

Toelstede, B. , Rationality and Society, 2020

Alonso FR et al., IEEE, 2015

Aldhaheri S, MDPI, 2020

Jackson MO., PNAS, 2015

Gibler & Wolford, 2006

Powell et al., Democratization, 2018

Kakar, M. Hassan, University of California Press, 1995

Murtazashvili JB., Journal of Democracy, 2022.

Kozyreva A. et al., Psychological Science in the Public Interest, 2020

Tsimring L., Rep. Prog. Phys., 2014

Junge K. et al., Systems Research and Behavioral Science, 2020

Tyloo M. et al., Phys. Rev. E, 2019

Leonard NE. et al., PNAS, 2021

Levin SA. et al., PNAS, 2021

Krasodomski-Jones A. et al., Demos, 2019

Lewandowsky S. & Pomerantsev P., Memory, Mind & Media, 2022

Benkler Y. et al., Network Propaganda, 2018

Matz SC. et al., PNAS, 2017

CASM technology report, 2022

Lamb, W., et al., Global Sustainability, 2020

King J. et al., Institute for strategic dialogue, 2022

Union of Concerned Scientists. The disinformation playbook

Zenone M. et al., PLOS Global Public Health, 2022

Berry CR., et al., Proc. Natl. Acad. Sci. USA, 2021

Lewis D., Nature News & Views, 2022

Ren ZB. et al., researchgate, 2021

Uscinski J. et al., PLOS One, 2022

van Prooijen JW & Douglas KM, Memory Studies, 2017

Imhoff R. et al., Nature Human Behavior, 2022

Ren ZB et al., Curr. Opin. Psychol., 2022

Green R. et al., Journal of Experimental Social Psychology, 2022

Hahl O. et al., American Sociological Review, 2018

Douglas KM. et al., Political Psychology, 2019

Uscinski, J. E et al, HKS misinformation review, 2020

Goreis A. & Voracek M, Front. Psychol., 2019

Albright J., Medium, 2018

Weinberg & Dawson, Sociology, 2020

McFaul M., Journal of Democracy, 2021

Steven Lee Myers, “The New Tsar, The Rise and Reign of Vladimir Putin”, 2015

Hulcoop A. et al., Citizen Lab Research report, 2017

Earl J. et al., Science Advances, 2022

Skillen D., “Post-Truth and normalized lies in Russia”, 2019

Futàk-Campbell, Atlantisch Perspectief, 2020

Snyder T., Foreign Affairs, 2022

Lentz AK., et al. Nutr. Clin. Prac., 2003

Rauchfleisch A. et al., SSRN, 2021

Ren XL. et al., PNAS, 2019

Roozenbeek J. et al., The Conversation, 2022

Pirkova E., “The Digital Services Act: your guide to the EU’s new content moderation rules”, 2022

Lewandowsky S. et al., Publications Office of the European Union, 2020

Veerasamy N. et al., ICCWS, 2022

Kreps S. et al., JEPS, 2022

Markolin P. et al., Genomics, 2021

Rosenfeld S., Gene Regulation and Systems Biology, 2011

Guhl J. et al., ISD report, 2022

Ortega-Sabater S., et al., biorxiv, 2022

Lorenz-Spreen P. et al., Nature Human Behavior, 2022

Acknowledgments

I would like to thank Prof. Stephan Lewandowsky & Carl Miller for discussions, as well as Prof. Matthew Browne, Dr. Laura de Vargas Roditi & Chris Boutte for giving critical feedback on a draft version of this article.

PS: Despite writing opinionated pieces and sharing them online, I try hard not to be a person who prides himself on coming up with unique “hot takes” (I leave that to influencers…) on a topic that many smart people have already worked on, and where much of our knowledge is already available. While marketable in the attention economy, re-inventing the wheel is a stupid waste of time, and having amateur influencers sell their USP-optimized “hot take” about how ‘we all should try square wheels for a change’ encapsulates much of what I hate about the current epistemic crisis. Expertise is important in a civilized society.

In that sense, I prefer being the rather less popular “cold take” guy. If you have found the above suggestions of strengthening our defenses and institutions boring and predictable, well, tough luck.

It had to be said, is agreed upon by experts, and is the most direct path forward we have.

Defending education, sponsoring and reviving public-interest journalism, and sticking to science remain brilliant strategies for maintaining an information sphere compatible with democracy.

It might even be foundational to democracy. These interventions are, however, a struggle to implement properly. They are the bitter and boring pills we have to make our entertainment-addicted, immature society swallow to get healthy again, every day from here on forward, and that will demand hard work, time, money, and resources from all of us. There are no quick and easy fixes. It’s up to us.

So how is that cold take for you?

