The Algorithm Made Me Do It
- Shreya Nautiyal


A teenage boy in Guldborgsund, Denmark, is searching YouTube for a pirated stream of the newly released Project Hail Mary. Five videos later, he is watching a man in a crisp Oxford shirt calmly making the case that immigration to Europe is eroding Western civilisation. The boy does not pause to think. He does not resist the idea. He now actively searches for content that feeds this narrative. By the third day, the algorithm, having registered his latent preferences with inhuman precision, feeds him a steady stream of content that echoes and amplifies the same idea.
This is an online rabbit hole of ideologically biased content. It is invisible, frictionless and algorithmically driven.
The Myth: The Algorithm Did It
The popular narrative wants algorithms to take the blame. Algorithms tend to create ‘filter bubbles’ that limit an individual’s exposure to counter-narratives. Terrorist organisations like the Islamic State strategically appropriate pop culture references to render their messaging legible and draw young people into this bubble. Social media networks, with their countless profiles, further amplify these narratives and normalise the glorification of extremist leaders. Alongside this, they disseminate graphic and violent material to intimidate and desensitise viewers. What follows is sheer terror disguised as mainstream social media content.
That so many people are self-radicalising now has less to do with the potency of any ideology and more to do with the speed at which belonging is manufactured online. A process of radicalisation that once unfolded over months or years now typically takes days or even hours, driven largely by the prevalence of extremist short-form propaganda.
The Truth: The Need to Belong
In August 2024, Austrian police conducted raids in the small town of Ternitz, where three suspects were believed to have been radicalised through TikTok. These individuals had been exposed to videos by Islamist influencers glorifying jihad. One of them subsequently created a Telegram channel to propagate these ideas further and expand the network. This episode is not captured adequately by the metaphor of a rabbit hole. It is rather a trap door: sudden and irreversible. The immediate question, then, is not simply why radicalisation takes place but why this generation appears so reachable. While scholars may argue that algorithms accelerate exposure to extremist material, such claims risk overstating technological determinism. Algorithms do not create the appetite; they serve what is readily available. Nor do they operate in a social vacuum.
Europe’s deeper crisis is not merely digital or recent; it is rooted in a growing deficit of belonging. Online platforms and social networking applications provide more than information; they offer recognition. They give a sense of visibility to individuals who believe their identity is marginalised. The transition from alienation to active participation in extremist action is gradual but consequential. These affectively charged online networks promise meaning and purpose, substituting immersion in tightly bound platforms for participation in everyday social life. This is precisely what extremists offer. They do not simply recruit; they offer belonging.
Thus, to frame online engagement with radicalising content as inherently addictive or pathological misses the point. These platforms are less like an addictive drug than an open-door venue: a space one enters, often voluntarily, in search of meaningful connection. Europe’s policy response, however, has largely focused on regulating the venue rather than confronting the underlying reality: why so many individuals feel they have nowhere else to belong.
Image: Wikimedia Commons/Doyle of London.


