Hi, I am Richard. On this blog, I share thoughts, personal stories — and what I am working on. I hope this article brings you some value.
Manipulation of Attention Through Outrage, Anger and Fear
Political manipulation: outrage and fear tactics
You see it. You hear it. You read it. Every day.
It triggers outrage. Anger. Or fear.
Then something new appears — even stronger.
And then again. And again. And again.
Very often, this is not random.
It is a system.
A system of working with attention, built on an understanding of the weaknesses of human nature — which I have written about before — and on detailed data about how we behave and make decisions.
The goal is not to make people believe a specific thing.
The goal is to leave them without the space to focus on anything.
This is used daily by political, economic and other interest groups, as well as by powerful individuals. It is handled at a scientific, expert level and backed by large amounts of data.
It is a system of manipulation. However that may sound, I cannot describe it any other way.
We are being systematically manipulated.
Systematic Manipulation Based on Science
In expert circles, this mechanism is described through a combination of concepts:
- Agenda-setting means that the media do not determine what we think, but what we think about.
- Attention economy refers to an environment where human attention is the most scarce resource.
- Outrage-driven content is content designed to trigger strong emotional reactions — most often anger or fear.
Systematic Manipulation Based on Data
At the same time, there is data.
Data about what people search for, read and react to is collected at scale and traded. It is divided into segments — groups defined by shared interests, concerns and behavioural patterns.
This data can be analysed and targeted. It shows what people are focused on, what they fear and how they are likely to respond.
Communication can then be aligned with these patterns — not randomly, but based on data.
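To make the idea concrete, here is a small Python sketch of what such segmentation can look like in principle. Everything in it is invented for illustration: the users, the scores and the framings. Real systems work with far richer behavioural data and with models learned from it, not with three hand-written records.

```python
# A toy sketch of the segmentation step described above. All values here are
# invented for illustration only.

from collections import defaultdict

# Simplified behavioural records: what each user engages with, and how strongly.
users = [
    {"id": 1, "topic_scores": {"economy": 0.9, "migration": 0.2}},
    {"id": 2, "topic_scores": {"economy": 0.1, "migration": 0.8}},
    {"id": 3, "topic_scores": {"economy": 0.8, "migration": 0.3}},
]

def dominant_concern(user):
    """Segment key: the topic this user engages with most."""
    return max(user["topic_scores"], key=user["topic_scores"].get)

# 1. Divide the audience into segments by shared behavioural patterns.
segments = defaultdict(list)
for user in users:
    segments[dominant_concern(user)].append(user)

# 2. Align the framing of the same underlying message with each segment.
framing_by_segment = {
    "economy": "Your savings are at risk.",            # loss framing
    "migration": "Our way of life is under threat.",   # threat framing
}

for segment, members in segments.items():
    message = framing_by_segment.get(segment, "Something has to change.")
    print(f"segment={segment}: {len(members)} users -> {message}")
```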
How and Why Manipulation Works
When new emotional content keeps arriving, a simple effect emerges: attention constantly shifts between stimuli and cannot focus on anything meaningful.
The same patterns tend to repeat:
- controversial statements that immediately dominate the media space,
- rapid topic switching that prevents deeper discussion,
- strong emotional framing — “threat”, “attack”, “crisis”,
- shifting attention at the moment when systemic problems should be addressed,
- creating a constant sense of conflict and urgency.
In these cases, it is not just about content.
It is about timing, repetition and emotion.
Manipulation Without the Feeling of Being Manipulated
The most effective manipulation today is not about what we think.
It is about what we pay attention to.
If someone can decide which topics we see, in what order, in what emotional framing, and how long we stay with them, then they no longer need to directly change our opinions.
It is enough to shape the environment in which we form them.
And we feel as if we decided on our own.
This mechanism exists and works.
Understanding it is the first condition for having any chance against it.
How Data Is Used in Manipulation: The Cambridge Analytica Case
Cambridge Analytica worked with data from tens of millions of Facebook users, which were used for political purposes. The way this data was collected became the subject of official investigations — the UK’s data protection authority confirmed that personal data had been obtained and processed without sufficient consent, and the US Federal Trade Commission later fined Facebook a record $5 billion for privacy violations.
This was not just basic demographic data.
It was behavioural data — what people like, share and what content they consume.
Research has shown that such digital traces can be used to estimate personality traits with considerable accuracy. In other words, online behaviour can be used to predict what type of content a specific person is likely to respond to.
Based on this, it became possible to create psychological profiles of individuals and tailor communication to them. This approach, known as microtargeting, has been documented for example in reports by the UK Parliament on disinformation and political advertising.
In practice, this meant that the same political message could exist in multiple versions, each tailored to a specific type of person. Different groups of people were shown different versions — some emphasising threat or risk, others loss, injustice or identity.
Everyone saw something different.
Not because reality was different, but because communication was adapted to the expected reaction.
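To illustrate the mechanism, here is a deliberately simplified Python sketch of that targeting step: a crude score is estimated from digital traces and used to pick one of several versions of the same message. The page names, weights and variants are invented for the example and do not describe any real model.

```python
# A simplified sketch of microtargeting: estimate a score from digital traces,
# then choose the message version expected to resonate with that profile.
# Weights, pages and variants are hypothetical.

# Hypothetical weights linking liked pages to a "threat sensitivity" score.
LIKE_WEIGHTS = {
    "crime_watch_group": 0.6,
    "travel_blog": -0.3,
    "local_news": 0.2,
}

# Each variant carries a minimum score; the first threshold that is met wins.
MESSAGE_VARIANTS = [
    (0.5, "Variant A: emphasises threat and security."),
    (0.0, "Variant B: emphasises loss and injustice."),
    (float("-inf"), "Variant C: emphasises identity and belonging."),
]

def estimate_trait(liked_pages):
    """Stand-in for a personality model estimated from digital traces."""
    return sum(LIKE_WEIGHTS.get(page, 0.0) for page in liked_pages)

def pick_variant(liked_pages):
    """Choose the message version expected to resonate with this profile."""
    score = estimate_trait(liked_pages)
    for threshold, variant in MESSAGE_VARIANTS:
        if score >= threshold:
            return variant

print(pick_variant(["crime_watch_group", "local_news"]))  # threat framing
print(pick_variant(["travel_blog"]))                       # identity framing
```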
The key was not personalisation itself, but how the content was optimised.
According to testimonies of former employees and investigation findings, content was tested and adjusted based on how people reacted — not based on truth or quality, but on its ability to generate a response.
Research shows that emotionally charged content, especially content triggering anger and fear, spreads faster and generates more engagement than neutral content.
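The optimisation itself can be as simple as an explore-and-exploit loop. The sketch below simulates one with invented numbers: several framings of the same story are shown, reactions are counted, and the loop gradually concentrates on whichever framing provokes the most responses.

```python
# A minimal simulation of "optimising for reaction". The reaction rates are
# made up; the point is the mechanism, not the numbers.

import random

random.seed(42)

# Hypothetical true reaction rates per framing (unknown to the optimiser).
TRUE_REACTION_RATE = {
    "neutral report": 0.02,
    "fear framing": 0.06,
    "angry framing": 0.08,
}

shown = {variant: 0 for variant in TRUE_REACTION_RATE}
reacted = {variant: 0 for variant in TRUE_REACTION_RATE}

def observed_rate(variant):
    return reacted[variant] / shown[variant] if shown[variant] else 0.0

EXPLORE = 0.1  # fraction of impressions spent trying other framings

for _ in range(10_000):
    if random.random() < EXPLORE:
        variant = random.choice(list(TRUE_REACTION_RATE))      # try something else
    else:
        variant = max(TRUE_REACTION_RATE, key=observed_rate)   # best so far
    shown[variant] += 1
    if random.random() < TRUE_REACTION_RATE[variant]:
        reacted[variant] += 1

for variant, count in shown.items():
    print(f"{variant}: shown {count} times, observed rate {observed_rate(variant):.3f}")
```

With these made-up rates, most impressions end up going to the angriest framing, not because it is the most accurate, but because it reacts best.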
This case is therefore not just about data or one company.
It is a concrete example of a mechanism that connects behavioural data, psychological models and digital platforms into a system that optimises content for reaction.
Human attention cannot be sustained long-term, but it can be captured repeatedly — and the easiest way to capture it is through emotion.
If someone knows what a person watches, reacts to and fears, they can show them content that triggers that reaction. Not once, but repeatedly.
Such a system does not work by convincing everyone of one big idea.
It works by showing each person something that affects them — and thereby capturing their attention.
Cambridge Analytica does not only reveal a privacy issue.
It reveals something more fundamental.
Content can be designed not to inform, but to trigger a reaction.
And if it is optimised for anger, fear and outrage, it becomes a tool that significantly shapes our attention — what we choose to focus on.
Manipulating Emotions Through Content: The Facebook Experiment
In 2012, Facebook conducted an experiment, published two years later in the Proceedings of the National Academy of Sciences, in which it modified the content shown to approximately 700,000 users without their knowledge.
This was not about changing features or design, but about changing what type of posts users saw in their feed. Some users were systematically shown fewer negative posts, others fewer positive ones. The goal was to test whether this would influence how they themselves communicated and what emotions they expressed.
Summary
Sources
Study on moral-emotional content spreading on social networks: Proceedings of the National Academy of Sciences (PNAS)
If you have any thoughts, questions, or feedback, feel free to drop me a message at mail@richardgolian.com.