Richard Golian

Born in 1995. Charles University alum. Head of Performance at Mixit. 10+ years in marketing and data.





Hi, I am Richard. On this blog, I share thoughts, personal stories — and what I am working on. I hope this article brings you some value.

Manipulation of Attention Through Outrage, Anger and Fear

Political manipulation: outrage and fear tactics

By Richard Golian


You see it. You hear it. You read it. Every day.

It triggers outrage. Anger. Or fear.

Then something new appears — even stronger.

And then again. And again. And again.

Very often, this is not random.

It is a system.

A system of working with attention, built on an understanding of the weaknesses of human nature — which I have written about before — and on detailed data about how we behave and make decisions.

The goal is not to make people believe a specific thing.

The goal is to leave them without the space to focus on anything.

Political, economic and other interest groups, as well as powerful individuals, use this daily. They approach it scientifically, at an expert level, backed by large amounts of data.

It is a system of manipulation. However that may sound, I cannot describe it any other way.

We are being systematically manipulated.

Systematic Manipulation Based on Science

In expert circles, this mechanism is described through a combination of concepts:

  • Agenda-setting means that the media do not determine what we think, but what we think about.
  • Attention economy refers to an environment where human attention is the scarcest resource.
  • Outrage-driven content is content designed to trigger strong emotional reactions — most often anger or fear.

Systematic Manipulation Based on Data

At the same time, there is data.

Data about what people search for, read and react to is collected at scale and traded. It is divided into segments — groups defined by shared interests, concerns and behavioural patterns.

This data can be analysed and targeted. It shows what people are focused on, what they fear and how they are likely to respond.

Communication can then be aligned with these patterns — not randomly, but based on data.
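The segmentation described above can be sketched in a few lines of code. This is a minimal illustration, not any real platform's system: the users, topics and segment names are all invented, and real segmentation uses far richer signals.

```python
# A toy sketch of audience segmentation from behavioural traces.
# All data and segment names here are hypothetical illustrations.
from collections import defaultdict

# Behavioural traces: user -> topics they engaged with
engagements = {
    "user_a": ["immigration", "crime", "crime"],
    "user_b": ["energy_prices", "inflation"],
    "user_c": ["crime", "immigration"],
}

# Segments defined by shared concerns
segment_rules = {
    "security_concerned": {"immigration", "crime"},
    "economy_concerned": {"energy_prices", "inflation"},
}

def assign_segments(engagements, rules):
    segments = defaultdict(list)
    for user, topics in engagements.items():
        for segment, keywords in rules.items():
            # A user joins a segment if most of their engagement matches it
            overlap = sum(1 for topic in topics if topic in keywords)
            if overlap / len(topics) > 0.5:
                segments[segment].append(user)
    return dict(segments)

print(assign_segments(engagements, segment_rules))
# -> {'security_concerned': ['user_a', 'user_c'],
#     'economy_concerned': ['user_b']}
```

Once users sit in segments like these, communication can be matched to each group's documented concerns rather than broadcast identically to everyone.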

How and Why Manipulation Works

When new emotional content keeps arriving, a simple effect emerges: attention constantly shifts between stimuli and cannot focus on anything meaningful.

The same patterns tend to repeat:

  • controversial statements that immediately dominate the media space,
  • rapid topic switching that prevents deeper discussion,
  • strong emotional framing — “threat”, “attack”, “crisis”,
  • shifting attention at the moment when systemic problems should be addressed,
  • creating a constant sense of conflict and urgency.

In these cases, it is not just about content.

It is about timing, repetition and emotion.

Manipulation Without the Feeling of Being Manipulated

The most effective manipulation today is not about what we think.

It is about what we pay attention to.

If someone can decide which topics we see, in what order, in what emotional framing, and how long we stay with them, then they no longer need to directly change our opinions.

It is enough to shape the environment in which we form them.

And we feel as if we decided on our own.

This mechanism exists and works.

Understanding it is the first condition for having any chance against it.

How Data Is Used in Manipulation: The Cambridge Analytica Case

Cambridge Analytica worked with data from tens of millions of Facebook users, which were used for political purposes. The way this data was collected became the subject of official investigations — the UK’s data protection authority confirmed that personal data had been obtained and processed without sufficient consent, and the US Federal Trade Commission later fined Facebook a record $5 billion for privacy violations.

This was not just basic demographic data.

It was behavioural data — what people like, share and what content they consume.

Research has shown that such digital traces can be used to estimate personality traits with considerable accuracy. In other words, online behaviour can be used to predict what type of content a specific person is likely to respond to.

Based on this, it became possible to create psychological profiles of individuals and tailor communication to them. This approach, known as microtargeting, has been documented for example in reports by the UK Parliament on disinformation and political advertising.

In practice, this meant that the same political message could exist in multiple versions, each tailored to a specific type of person. Different groups of people were shown different versions — some emphasising threat or risk, others loss, injustice or identity.

Everyone saw something different.

Not because reality was different, but because communication was adapted to the expected reaction.

The key was not personalisation itself, but how the content was optimised.

According to testimonies of former employees and investigation findings, content was tested and adjusted based on how people reacted — not based on truth or quality, but on its ability to generate a response.

Research shows that emotionally charged content, especially content triggering anger and fear, spreads faster and generates more engagement than neutral content.
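The test-and-adjust loop described in the testimonies can be sketched as a simple explore/exploit process. Everything here is a hypothetical illustration: the framings, the response rates and the epsilon-greedy strategy are my assumptions, chosen only to show how optimising for reaction works in principle.

```python
# A toy sketch of optimising content for reaction rather than truth:
# variants of one message are tested, and the variant that generates
# the most engagement gets shown more often. All numbers are invented.
import random

random.seed(42)

variants = ["threat framing", "loss framing", "injustice framing"]
# Hypothetical probability that a shown variant triggers a reaction
true_response_rate = {
    "threat framing": 0.30,
    "loss framing": 0.10,
    "injustice framing": 0.20,
}

shown = {v: 0 for v in variants}
reacted = {v: 0 for v in variants}

def pick_variant(epsilon=0.1):
    # Mostly exploit the best-performing variant, occasionally explore
    if random.random() < epsilon or all(n == 0 for n in shown.values()):
        return random.choice(variants)
    return max(variants, key=lambda v: reacted[v] / shown[v] if shown[v] else 0)

for _ in range(5000):
    v = pick_variant()
    shown[v] += 1
    if random.random() < true_response_rate[v]:
        reacted[v] += 1

best = max(variants, key=lambda v: shown[v])
print(best)  # with these rates, the threat framing typically wins out
```

Note what the loop never asks: whether any variant is accurate. It only measures which one provokes a response, which is exactly the optimisation target the investigations described.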

This case is therefore not just about data or one company.

It is a concrete example of a mechanism that connects behavioural data, psychological models and digital platforms into a system that optimises content for reaction.

Human attention cannot be sustained long-term, but it can be captured repeatedly — and the easiest way to capture it is through emotion.

If someone knows what a person watches, reacts to and fears, they can show them content that triggers that reaction. Not once, but repeatedly.

Such a system does not work by convincing everyone of one big idea.

It works by showing each person something that affects them — and thereby capturing their attention.

Cambridge Analytica does not only reveal a privacy issue.

It reveals something more fundamental.

Content can be designed not to inform, but to trigger a reaction.

And if it is optimised for anger, fear and outrage, it becomes a tool that significantly shapes our attention — what we choose to focus on.

Manipulating Emotions Through Content: The Facebook Experiment

In 2012, Facebook conducted an experiment, published in 2014 in the Proceedings of the National Academy of Sciences, in which it modified the content shown to approximately 689,000 users without their knowledge.

This was not about changing features or design, but about changing what type of posts users saw in their feed. Some were systematically shown fewer negative posts, others more. The goal was to test whether this would influence how they themselves communicated and what emotions they expressed.
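The design is simple enough to sketch. This is only an illustration of the principle, not the study's actual implementation: the posts, sentiment labels and filtering function are invented.

```python
# A toy sketch of the feed manipulation described above: the same pool
# of posts, filtered so one group sees fewer negative items.
import random

posts = [
    {"text": "Great news about the park renovation", "sentiment": "positive"},
    {"text": "Another depressing economic report", "sentiment": "negative"},
    {"text": "Local team wins the cup", "sentiment": "positive"},
    {"text": "Flood damage worse than expected", "sentiment": "negative"},
]

def build_feed(posts, drop_negative_prob, rng):
    """Return a feed where each negative post is omitted with some probability."""
    feed = []
    for post in posts:
        if post["sentiment"] == "negative" and rng.random() < drop_negative_prob:
            continue  # this user simply never sees the post
        feed.append(post)
    return feed

rng = random.Random(0)
reduced_feed = build_feed(posts, drop_negative_prob=1.0, rng=rng)
control_feed = build_feed(posts, drop_negative_prob=0.0, rng=rng)
print(len(reduced_feed), len(control_feed))  # prints: 2 4
```

Both groups drew from the same reality; only the filter differed. The study then measured whether the filtered group's own posts shifted in emotional tone.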


Summary

The most effective manipulation does not tell you what to think. It determines what you think about. Behavioural data, emotional targeting, and algorithmically optimised content shape public opinion — not through force, but by controlling the environment in which we form our views. From Cambridge Analytica to Facebook's emotional contagion experiment, the mechanisms are concrete and well-documented.

Sources

Facebook fined $5 billion for privacy violations: Federal Trade Commission (FTC)
Study on moral-emotional content spreading on social networks: Proceedings of the National Academy of Sciences (PNAS)

If you have any thoughts, questions, or feedback, feel free to drop me a message at mail@richardgolian.com.


Common questions on this article's topic

What is agenda-setting and how is it used to manipulate public attention?
Agenda-setting is a concept from media studies which holds that the media do not determine what people think, but what they think about. By selecting which topics receive coverage and how prominently they are presented, media and political actors can steer public attention toward certain issues and away from others — without needing to explicitly persuade anyone of a particular viewpoint.
What is the attention economy?
The attention economy describes an environment in which human attention is the scarcest resource. In a world of infinite content, the ability to capture and hold someone's focus has become more valuable than the information itself. Political and commercial actors compete for this attention, often using emotionally charged content because it generates the strongest and most immediate reactions.
How did Cambridge Analytica use personal data for political manipulation?
Cambridge Analytica collected behavioural data from tens of millions of Facebook users — not just demographics, but information about what people liked, shared, and consumed. This data was used to build psychological profiles and deliver politically targeted content tailored to individual personality types. Different groups saw different versions of the same message, each optimised to trigger a specific emotional response such as fear, anger, or a sense of injustice.
Can social media algorithms influence our emotions without us realising it?
Yes. A study conducted by Facebook in 2012 and published in 2014 in the Proceedings of the National Academy of Sciences involved approximately 689,000 users whose News Feeds were modified to show more or fewer negative posts. Users exposed to more negative content began posting more negatively themselves. The study demonstrated that emotions can spread through algorithmically curated content alone — without any direct interaction between people.
What is the firehose of falsehood strategy?
The firehose of falsehood is a strategic communication approach that relies not on one convincing message, but on a high volume of different, often contradictory claims spread rapidly and repeatedly. The goal is not to persuade but to overwhelm — to keep shifting attention from one claim to another so that no single issue can be examined in depth. The response to the MH17 crash in 2014, where dozens of mutually contradictory narratives appeared almost simultaneously, is a well-documented example.
How can I recognise when my attention is being manipulated?
Key patterns include: controversial statements that dominate the media space immediately, rapid switching between topics that prevents deeper analysis, strong emotional framing using words like threat or crisis, attention shifts at moments when systemic problems should be addressed, and a constant sense of conflict and urgency. If you notice your emotional state changing rapidly in response to a stream of content — especially toward anger or fear — it may be a sign that the content is optimised for reaction rather than information.