
Manipulation of Attention Through Outrage, Anger and Fear

Richard Golian · 1,074 reads
Hi, I am Richard. On this blog, I share thoughts, personal stories — and what I am working on. I hope this article brings you some value.

You see it. You hear it. You read it. Every day.

It triggers outrage. Anger. Or fear.

Then something new appears — even stronger.

And then again. And again. And again.

Very often, this is not random.

It is a system.

A system of working with attention, built on an understanding of the weaknesses of human nature — which I have written about before — and on detailed data about how we behave and make decisions.

The goal is not to make people believe a specific thing.

The goal is to leave them without the space to focus on anything.

This is used daily by political, economic and other interest groups, as well as by powerful individuals. It is developed at a scientific and expert level and backed by large amounts of data.

It is a system of manipulation — however that may sound, I cannot describe it otherwise.

We are being systematically manipulated.

Systematic Manipulation Based on Science

In expert circles, this mechanism is described through a combination of concepts:

  • Agenda-setting means that the media do not determine what we think, but what we think about.
  • Attention economy refers to an environment where human attention is the scarcest resource.
  • Outrage-driven content is content designed to trigger strong emotional reactions — most often anger or fear.

Systematic Manipulation Based on Data

At the same time, there is data.

Data about what people search for, read and react to is collected at scale and traded. Audiences are divided into segments — groups defined by shared interests, concerns and behavioural patterns.

This data can be analysed and used for targeting. It shows what people are focused on, what they fear and how they are likely to respond.

Communication can then be aligned with these patterns — not randomly, but based on data.
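
To make this concrete, here is a minimal sketch of how behavioural data might be grouped into segments. The features and numbers are invented for illustration; no real platform's pipeline is this simple.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical behavioural features per user:
# [political-content share, fear-topic engagement, reaction speed]
users = np.array([
    [0.9, 0.8, 0.2],
    [0.8, 0.7, 0.3],
    [0.1, 0.2, 0.9],
    [0.2, 0.1, 0.8],
    [0.5, 0.9, 0.5],
    [0.4, 0.8, 0.6],
])

# Group users into segments with similar concerns and habits
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(users)
print(segments)  # e.g. [1 1 0 0 2 2] -- each label is one targetable segment
```

Each segment can then be addressed with communication matched to its patterns.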

How and Why Manipulation Works

When new emotional content keeps arriving, a simple effect emerges: attention constantly shifts between stimuli and cannot focus on anything meaningful.

The same patterns tend to repeat:

  • controversial statements that immediately dominate the media space,
  • rapid topic switching that prevents deeper discussion,
  • strong emotional framing — “threat”, “attack”, “crisis”,
  • shifting attention at the moment when systemic problems should be addressed,
  • creating a constant sense of conflict and urgency.

In these cases, it is not just about content.

It is about timing, repetition and emotion.

Manipulation Without the Feeling of Being Manipulated

The most effective manipulation today is not about what we think.

It is about what we pay attention to.

If someone can decide which topics we see, in what order, in what emotional framing, and how long we stay with them, then they no longer need to directly change our opinions.

It is enough to shape the environment in which we form them.

And we feel as if we decided on our own.

This mechanism exists and works.

Understanding it is the first condition for having any chance against it.

How Data Is Used in Manipulation: The Cambridge Analytica Case

Cambridge Analytica worked with data from tens of millions of Facebook users, which were used for political purposes. The way this data was collected became the subject of official investigations — the UK’s data protection authority confirmed that personal data had been obtained and processed without sufficient consent, and the US Federal Trade Commission later fined Facebook a record $5 billion for privacy violations.

This was not just basic demographic data.

It was behavioural data — what people like and share, and what content they consume.

Research has shown that such digital traces can be used to estimate personality traits with considerable accuracy. In other words, online behaviour can be used to predict what type of content a specific person is likely to respond to.

Based on this, it became possible to create psychological profiles of individuals and tailor communication to them. This approach, known as microtargeting, has been documented for example in reports by the UK Parliament on disinformation and political advertising.

In practice, this meant that the same political message could exist in multiple versions, each tailored to a specific type of person. Different groups of people were shown different versions — some emphasising threat or risk, others loss, injustice or identity.
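
A minimal sketch of that idea: estimate a trait from behavioural traces, then serve the framing that profile is expected to react to. The training data, the trait and the message variants are all hypothetical; this shows the shape of microtargeting, not anyone's actual model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: columns are page likes
# (politics, true crime, wellness); label 1 = engaged with
# fear-framed content in the past
likes = np.array([[1, 1, 0], [1, 1, 1], [0, 0, 1],
                  [0, 1, 0], [1, 0, 0], [0, 0, 0]])
engaged_with_fear = np.array([1, 1, 0, 1, 0, 0])

model = LogisticRegression().fit(likes, engaged_with_fear)

# The same political message in two emotional framings (both invented)
VARIANTS = {
    "fear":      "They are coming for what you have.",
    "injustice": "You follow the rules. They never do.",
}

def pick_variant(user_likes):
    # Predict which framing this profile is likely to react to
    p_fear = model.predict_proba([user_likes])[0, 1]
    return VARIANTS["fear" if p_fear > 0.5 else "injustice"]

print(pick_variant([1, 1, 0]))  # a profile resembling past fear-responders
```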

Everyone saw something different.

Not because reality was different, but because communication was adapted to the expected reaction.

The key was not personalisation itself, but how the content was optimised.

According to testimonies of former employees and investigation findings, content was tested and adjusted based on how people reacted — not based on truth or quality, but on its ability to generate a response.

Research shows that emotionally charged content, especially content triggering anger and fear, spreads faster and generates more engagement than neutral content.
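
That optimisation loop is simple enough to sketch. The following epsilon-greedy bandit keeps serving whichever headline variant earns the most clicks; the variants and click-through rates are invented, and truth appears nowhere in the objective.

```python
import random

random.seed(1)

# Hypothetical headline variants for the same story, with invented
# "true" click-through rates: the emotional framings simply perform better
variants = ["neutral report", "outrage framing", "fear framing"]
true_ctr = [0.02, 0.08, 0.10]
shows, clicks = [0, 0, 0], [0, 0, 0]

def serve(epsilon=0.1):
    # Explore occasionally; otherwise exploit the best observed engagement.
    # Truthfulness and quality are never part of this objective.
    if random.random() < epsilon or 0 in shows:
        i = random.randrange(3)
    else:
        i = max(range(3), key=lambda k: clicks[k] / shows[k])
    shows[i] += 1
    clicks[i] += random.random() < true_ctr[i]

for _ in range(10_000):
    serve()

print(dict(zip(variants, shows)))  # the emotional variants dominate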

This case is therefore not just about data or one company.

It is a concrete example of a mechanism that connects behavioural data, psychological models and digital platforms into a system that optimises content for reaction.

Human attention cannot be sustained long-term, but it can be captured repeatedly — and the easiest way to capture it is through emotion.

If someone knows what a person watches, reacts to and fears, they can show them content that triggers that reaction. Not once, but repeatedly.

Such a system does not work by convincing everyone of one big idea.

It works by showing each person something that affects them — and thereby capturing their attention.

Cambridge Analytica does not only reveal a privacy issue.

It reveals something more fundamental.

Content can be designed not to inform, but to trigger a reaction.

And if it is optimised for anger, fear and outrage, it becomes a tool that significantly shapes our attention — what we choose to focus on.

Manipulating Emotions Through Content: The Facebook Experiment

Facebook conducted an experiment in 2012 — published in 2014 in the Proceedings of the National Academy of Sciences — in which it modified the content shown to approximately 689,000 users without their knowledge.

This was not about changing features or design, but about changing what type of posts users saw in their feed. Some were systematically shown fewer negative posts, others more. The goal was to test whether this would influence how they themselves communicated and what emotions they expressed.
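
A toy simulation of that design, under a deliberately crude contagion assumption (the mood of what you see nudges the mood of what you write), can make the setup concrete. It is not the study's actual method, scale or data:

```python
import random

random.seed(0)

def feed_for(in_treatment, posts, removal_rate=0.5):
    # Treatment group: a share of negative posts is withheld from the feed
    if not in_treatment:
        return posts
    return [p for p in posts
            if p != "negative" or random.random() > removal_rate]

def next_post(feed):
    # Crude contagion assumption: seen negativity raises written negativity
    negativity = feed.count("negative") / len(feed)
    return "negative" if random.random() < negativity else "positive"

posts = ["negative"] * 50 + ["positive"] * 50
for in_treatment in (False, True):
    written = [next_post(feed_for(in_treatment, posts)) for _ in range(1000)]
    group = "reduced-negativity feed" if in_treatment else "unmodified feed"
    print(group, written.count("negative") / 1000)
```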


Sources

  • Federal Trade Commission (FTC): Facebook fined $5 billion for privacy violations
  • Proceedings of the National Academy of Sciences (PNAS): study on the spread of moral-emotional content on social networks

Summary

The most effective manipulation does not tell you what to think. It determines what you think about. Behavioural data, emotional targeting, and algorithmically optimised content shape public opinion — not through force, but by controlling the environment in which we form our views. From Cambridge Analytica to Facebook's emotional contagion experiment, the mechanisms are concrete and well-documented.

Common questions about this topic

What is agenda-setting and how is it used to manipulate public attention?
Agenda-setting is a concept from media studies which holds that the media do not determine what people think, but what they think about. By selecting which topics receive coverage and how prominently they are presented, media and political actors can steer public attention toward certain issues and away from others — without needing to explicitly persuade anyone of a particular viewpoint.

What is the attention economy?
The attention economy describes an environment in which human attention is the scarcest resource. In a world of infinite content, the ability to capture and hold someone's focus has become more valuable than the information itself. Political and commercial actors compete for this attention, often using emotionally charged content because it generates the strongest and most immediate reactions.

How did Cambridge Analytica use personal data for political manipulation?
Cambridge Analytica collected behavioural data from tens of millions of Facebook users — not just demographics, but information about what people liked, shared, and consumed. This data was used to build psychological profiles and deliver politically targeted content tailored to individual personality types. Different groups saw different versions of the same message, each optimised to trigger a specific emotional response such as fear, anger, or a sense of injustice.

Can social media algorithms influence our emotions without us realising it?
Yes. A study conducted by Facebook in 2012 and published in 2014 in the Proceedings of the National Academy of Sciences involved approximately 689,000 users whose News Feeds were modified to show more or fewer negative posts. Users exposed to more negative content began posting more negatively themselves. The study demonstrated that emotions can spread through algorithmically curated content alone — without any direct interaction between people.

What is the firehose of falsehood strategy?
The firehose of falsehood is a strategic communication approach that relies not on one convincing message, but on a high volume of different, often contradictory claims spread rapidly and repeatedly. The goal is not to persuade but to overwhelm — to keep shifting attention from one claim to another so that no single issue can be examined in depth. The response to the MH17 crash in 2014, where dozens of mutually contradictory narratives appeared almost simultaneously, is a well-documented example.

How can I recognise when my attention is being manipulated?
Key patterns include: controversial statements that dominate the media space immediately, rapid switching between topics that prevents deeper analysis, strong emotional framing using words like threat or crisis, attention shifts at moments when systemic problems should be addressed, and a constant sense of conflict and urgency. If you notice your emotional state changing rapidly in response to a stream of content — especially toward anger or fear — it may be a sign that the content is optimised for reaction rather than information.
Richard Golian

If you have any thoughts, questions, or feedback, feel free to drop me a message at mail@richardgolian.com.
