We Do Not Think, We Just Consume.
Lately, I have been noticing something new in job interviews—candidates come in "prepared," but in a strange way. They get an hour to prepare, and they use it to pull as much information as possible from ChatGPT. Then they confidently present it to me. They mention a metric that, according to them, is crucial for evaluating advertising performance. But when I ask them what the metric actually tells us, they do not know. They have no idea how to calculate it.
I have never had so many interviews where I had to teach candidates the meaning of a key metric in online advertising. Never.
But let us be clear—this is not the same as when calculators were introduced. You can work with someone who does not understand logarithms. But you cannot seriously discuss or collaborate with someone who does not even understand which two numbers to divide to get meaningful insight in advertising. You just cannot.
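The article never names the metric the candidates stumbled over, so any concrete formula here is my own assumption, but the classic advertising metrics really are just one number divided by another. A minimal sketch, using common examples (click-through rate, cost per click, return on ad spend) with hypothetical figures:

```python
# The metric in the article is unnamed; CTR, CPC, and ROAS are common
# illustrations of the kind of "two numbers to divide" it alludes to.

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: the share of ad impressions that got clicked."""
    return clicks / impressions

def cpc(spend: float, clicks: int) -> float:
    """Cost per click: average spend per click received."""
    return spend / clicks

def roas(revenue: float, spend: float) -> float:
    """Return on ad spend: revenue earned per unit of money spent."""
    return revenue / spend

print(ctr(120, 10_000))        # 0.012 -> a 1.2% click-through rate
print(roas(5400.0, 1800.0))    # 3.0  -> every euro spent returned three
```

Knowing the formula is the trivial part; the point of the article is that a candidate should also be able to say what a CTR of 1.2% or a ROAS of 3.0 actually tells you about a campaign.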
And it is not just interviews. More and more, I feel like real thinking is fading away. People do not form their own ideas anymore; they just adopt prepackaged opinions that flood them from all directions. Information is instantly accessible, and it requires no effort from us.
Many believe they have broad knowledge because they follow multiple sources. But I have my doubts. Real thinking takes effort. Thinking deeply about something means spending time questioning, challenging established views, and arriving at your own conclusions.
When Even Criticism Is Just Another Form of Consumption
One of the strange things about our time is that even criticism has turned into a kind of consumption.
I see it clearly in debates on serious social issues – take Covid, for example. Entire crowds jumped into criticising one expert opinion or another, often with great confidence, while having only a shallow understanding of the topic.
They felt the need to criticise, to take a stand – but without having arguments of their own. So they just adopted someone else’s critical take.
We repeat another person’s objections and feel like we have been thinking. But what we have really done is just accepted a different ready-made opinion. That is not critical thinking.
I get the sense it is connected to a fear of independent thinking.
