Richard Golian

Born in 1995. Charles University alum. Head of Performance at Mixit. 10+ years in marketing and data.





Hi, I am Richard. On this blog, I share thoughts, personal stories — and what I am working on. I hope this article brings you some value.

I built my own analytics with the help of AI – cookie-free and GDPR-compliant


By Richard Golian


Not long ago, I realised I wanted a better understanding of my blog’s traffic.

I like clean and accurate data. But traditional analytics tools like Google Analytics come with serious downsides: inaccurate numbers (due to blocked consent requests and various tracking blockers), the need for a cookie banner, and questionable transparency around privacy.

That is why I decided to build my own analytics – with pure, unfiltered data and full respect for the privacy of my visitors. And since we are living in 2025, I built it with the help of generative AI.

When it comes to programming, I have always considered myself an eternal beginner, ever since I started with web development at the age of 12. But AI has opened up skills and opportunities for me that I never imagined I would have. I have already written here on my blog about how AI helps me improve my coding skills, explaining its steps and decisions along the way. Thanks to that, my horizons have expanded far beyond what I thought possible. One of the results is my own custom-built analytics tool.

Defining the Goals and First Prompts

When building a project like this today, you really need to have a clear idea of what you want it to do and why. Just saying "I want my own analytics" is not enough.

So we started by defining exactly what I wanted to measure: daily traffic, traffic sources, purchases of my premium articles, and a deeper analysis of visitor types (humans vs. suspicious activity vs. bots).
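The article does not show its actual database design, but a minimal sketch of a schema for those four goals might look like this (table and column names are hypothetical, sqlite3 used purely for illustration):

```python
import sqlite3

# Hypothetical minimal schema for the metrics listed above: daily traffic,
# traffic sources, premium purchases, and visitor type classification.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE pageviews (
    day          TEXT NOT NULL,   -- e.g. '2025-06-01'
    path         TEXT NOT NULL,   -- article URL path
    referrer     TEXT,            -- traffic source (bare domain only)
    visitor_type TEXT NOT NULL    -- 'human' | 'suspicious' | 'bot'
);
CREATE TABLE purchases (
    day  TEXT NOT NULL,
    path TEXT NOT NULL            -- which premium article was bought
);
""")

# Daily traffic from real humans, one row per day.
conn.execute("INSERT INTO pageviews VALUES "
             "('2025-06-01', '/analytics', 'news.ycombinator.com', 'human')")
daily = conn.execute(
    "SELECT day, COUNT(*) FROM pageviews "
    "WHERE visitor_type = 'human' GROUP BY day"
).fetchall()
print(daily)  # [('2025-06-01', 1)]
```

Nothing in these tables identifies an individual visitor, which is what makes the cookie-free approach possible in the first place.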

[Image: Richard Golian analytics, an analytics tool powered by artificial intelligence]

At first, creating the initial tables and charts seemed easy. But it did not take long before the first real challenges emerged.

The Issues I Had to Solve

The first SQL queries we used were slow and inefficient. They dragged the whole website down. We gradually tuned them, replaced wasteful LIKE comparisons with exact matches, and optimised the logic to only process the data we really needed – and suddenly, it started to fly.
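The article does not publish the actual queries, but the kind of change described, swapping a wasteful LIKE comparison for an exact match that an index can serve, looks roughly like this (sqlite3 and the column names are illustrative assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pageviews (path TEXT, referrer TEXT)")
conn.executemany("INSERT INTO pageviews VALUES (?, ?)",
                 [("/analytics", "google.com"), ("/analytics", "t.co")])

# Slow pattern: LIKE with a leading wildcard cannot use an index,
# so the database scans every row on every dashboard load.
slow = conn.execute(
    "SELECT COUNT(*) FROM pageviews WHERE referrer LIKE '%google%'"
).fetchone()[0]

# Faster pattern: normalise the referrer at write time (store the bare
# domain), then filter with an exact match served by an index.
conn.execute("CREATE INDEX idx_referrer ON pageviews (referrer)")
fast = conn.execute(
    "SELECT COUNT(*) FROM pageviews WHERE referrer = 'google.com'"
).fetchone()[0]

print(slow, fast)  # 1 1
```

The results are identical here, but on a large table the exact-match version avoids a full scan, which is the difference between dragging a site down and having the queries "fly".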


Summary

Google Analytics was showing inaccurate numbers due to blocked consent requests and tracking blockers, so I built my own cookie-free, GDPR-compliant analytics system with the help of AI. It measures daily traffic, traffic sources, premium-article purchases, and bot activity. AI expanded what an eternal beginner like me can build.

If you have any thoughts, questions, or feedback, feel free to drop me a message at mail@richardgolian.com.

Newsletter

New articles to your inbox

Common questions on this article's topic

Why is Google Analytics inaccurate?
Google Analytics relies on JavaScript tracking that can be blocked by ad blockers, privacy extensions, and consent banners. Research shows that sites with consent banners miss approximately 20% of visitor data, and on tech-savvy audiences the figure can reach nearly 60%. In the article, these accuracy problems — combined with the need for cookie banners and concerns about privacy — motivated building a custom solution from scratch.
Is it possible to build web analytics without cookies?
Yes. Cookie-free analytics works by collecting aggregated data without tracking individual users. Tools like Plausible, Fathom, and Umami demonstrate this approach commercially. In the article, a custom cookie-free system was built that tracks daily traffic, sources, and premium purchases without storing personal data — making it naturally GDPR-compliant with no cookie banner required.
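As a rough illustration of the aggregate-only idea (not the article's actual code), a cookie-free pageview counter can simply increment per-day, per-page counters server-side, storing no client state and nothing about the individual:

```python
from collections import Counter
from datetime import date

# Hypothetical in-memory store: (day, path) -> pageview count.
# No cookie, no IP address, and no user id is ever written.
counts: Counter = Counter()

def record_pageview(path: str) -> None:
    """Called once per server-side page render; keeps only aggregates."""
    counts[(date.today().isoformat(), path)] += 1

record_pageview("/analytics")
record_pageview("/analytics")
print(counts[(date.today().isoformat(), "/analytics")])  # 2
```

Because nothing here can single out a person, there is no personal data to consent to, which is why no cookie banner is needed.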
What percentage of web traffic comes from bots?
By 2024, bot traffic exceeded human traffic for the first time in a decade, accounting for approximately 51% of all web traffic. Bad bots specifically represented about 37% of global web traffic. In the article, distinguishing real human visitors from bots and suspicious activity was a key challenge, solved by flagging suspicious visits and tracking confirmed bot behaviour separately.
Can someone with limited programming experience build custom analytics using AI?
In the article, this is exactly what happened. Starting as a self-described eternal beginner in programming, the custom analytics system was built entirely with AI assistance. AI did not just generate code — it explained each step, worked through problems, and taught better approaches along the way. Research shows that developers using AI coding assistants can be up to 55% more productive, making complex projects feasible for non-specialists.
How does privacy-first analytics work without tracking individuals?
In the article, the approach is straightforward: all traffic data is anonymised and contains no personal information. There is no need to know that a specific person from a specific city read a specific article. What matters is how many real humans visited, how many chose to buy premium content, and what the main traffic sources were. Sensitive calculations and database logic are kept in a protected area that no regular visitor can access.
What were the biggest technical challenges in building custom analytics?
In the article, three main challenges are described. First, initial SQL queries were slow and dragged the website down — solved by replacing inefficient comparisons with exact matches and optimising query logic. Second, attribution without individual tracking required a temporary anonymous session identifier. Third, separating real visitors from bots and suspicious traffic required a classification system with clear definitions and separate tracking.
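The article does not describe its session mechanism in detail, but one common way to build a temporary anonymous identifier, popularised by privacy-focused tools such as Plausible, is a salted hash of IP and user agent whose salt rotates daily, so the identifier cannot outlive one day or be reversed. A sketch of that idea (the function and its parameters are assumptions, not the article's implementation):

```python
import hashlib
import secrets
from datetime import date

# The salt rotates daily; once yesterday's salt is discarded, old
# identifiers can never be recomputed or linked back to a visitor.
DAILY_SALT = secrets.token_hex(16)

def session_id(ip: str, user_agent: str) -> str:
    """Temporary anonymous identifier, stable only within one day."""
    raw = f"{DAILY_SALT}|{date.today().isoformat()}|{ip}|{user_agent}"
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

# The same visitor on the same day gets the same id, which is enough
# to attribute a purchase to a traffic source without tracking anyone.
a = session_id("203.0.113.7", "Mozilla/5.0")
b = session_id("203.0.113.7", "Mozilla/5.0")
print(a == b)  # True
```

The identifier is never stored alongside anything personal, and after the salt rotates it becomes meaningless, which is what makes attribution compatible with the no-individual-tracking goal.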