
The Rise of “Synthetic Fetish Economies”: Artificial Intelligence and the Next Phase of Online Pornography

Last month, the BBC reported on a growing cluster of Instagram accounts posting AI-generated images and videos of women with disabilities – including Down’s syndrome, amputations and even fabricated “conjoined twins” – in highly sexualised scenarios designed to attract followers and direct traffic toward monetised adult platforms.


At first glance, these profiles might appear to be little more than another grotesque curiosity of the internet. Yet the phenomenon points toward something more structural. Over the past year, while tracking similar accounts appearing across Instagram and adjacent platforms, I have begun to think of these networks as the early emergence of “synthetic fetish economies” – environments in which marginalised identities are not represented by real individuals but simulated, sexualised, and subsequently monetised through generative AI systems.


The scale these accounts can reach is striking. One profile claiming to depict “conjoined twins” reportedly amassed around 400,000 followers despite having joined Instagram only in late 2025. In the course of documenting similar material, I have also encountered accounts built around even more surreal personas – including a fabricated influencer presenting herself as a three-breasted woman, followed by over 850,000 users and accompanied by comment sections filled with seemingly oblivious engagement from both real users and complicit bots.


What distinguishes these emerging synthetic personas from earlier forms of online pornography is not simply that they are artificial. It is the economic environment in which they circulate. Online sexual content markets have long been shaped by intense competition for attention, in which novelty, extremity and niche differentiation become commercially advantageous. Researchers and law-enforcement officials increasingly describe this as an “escalating pathway”, in which repeated exposure to online pornography can gradually push audiences toward more extreme material as novelty wears off and algorithms recommend adjacent content. 


In saturated digital markets, attention becomes the scarce commodity. Sexual content creators therefore compete not only through volume, but through differentiation: increasingly niche categories, more extreme scenarios, and spectacles designed to break through the noise of an already crowded ecosystem.


Recent viral controversies illustrate this dynamic. OnlyFans creator Lily Phillips drew global attention after filming a stunt in which she had sex with 101 men in a single day, while another performer, Bonnie Blue, claimed to have slept with more than 1,000 men in 12 hours for content filmed for online distribution. Whether framed as empowerment, performance art, or outrage-bait marketing, these events demonstrate the same underlying logic: in an attention economy, extremity itself becomes a competitive advantage.


Yet even these spectacles remain constrained by the realities of physical bodies, labour, and risk. Generative AI removes those constraints.


Where escalation in traditional pornography required real performers willing to participate in ever more extreme scenarios, synthetic media allows operators to fabricate sexual personas entirely on demand. Bodies, identities and scenarios can now be engineered for hyper-specific niches – simulated disability, exaggerated anatomy, impossible biological features – and reproduced indefinitely across platforms. The result is not simply more pornographic content but the emergence of a fundamentally different market structure, in which sexualised identities themselves become programmable assets.


Social media platforms then perform the scaling. Recommendation systems are designed to detect engagement signals – clicks, watch time, comments – and distribute content accordingly, allowing increasingly specialised niches to find receptive audiences. In practice, this means that even highly unusual sexual personas can rapidly accumulate large followings once engagement begins to cluster around them. 


Reporting by news organisations suggests this phenomenon is already spreading. In an investigation separate from the BBC’s, ITV News identified a network of Instagram accounts using AI “Down syndrome” filters to create sexualised videos of real women and direct viewers toward subscription platforms such as OnlyFans. In one case, content posted by a 16-year-old girl was altered using an AI disability filter and reposted with suggestive captions, including “Do you find my Down syndrome attractive?”


Such examples illustrate how synthetic manipulation increasingly intersects with a form of digital identity fraud. Ordinary social media posts made by celebrities and members of the general public alike can be extracted from their original context, modified through generative tools and republished across networks of accounts that present the altered individual as someone they are not. A person who posted a routine video online may find their likeness repurposed as a pornified character designed to attract attention and drive traffic toward monetised adult platforms.


What emerges is an ecosystem in which identities themselves become manipulable digital assets. Faces, bodies and personal images can be altered, replicated and redistributed across algorithmically driven attention markets, often without the knowledge or consent of the individuals whose likeness forms the raw material of the content.


The implications extend beyond fetish markets. Similar technological dynamics are already visible in debates surrounding deepfake pornography and the rising threat posed by AI-generated child sexual abuse imagery. In these contexts, synthetic media introduces a further complication: it can sever the evidentiary link between image, victim and offender that traditionally underpins investigation and prosecution. When sexualised images can be generated without a corresponding victim, law enforcement agencies face a more ambiguous evidentiary landscape – one in which attribution, intent and harm become significantly harder to establish.


Seen in this light, the Instagram accounts highlighted by the BBC represent more than merely offensive curiosities. More plausibly, they are an early signal of a broader transformation in digital sexual economies. Generative AI, platform attention markets and monetised adult content are beginning to converge in ways that allow entirely synthetic identities to be created, scaled and exploited for profit.


This development poses a challenge not only for platform governance but for how harm itself is conceptualised in the digital environment. Regulatory frameworks have historically been built around identifiable victims, perpetrators and acts of abuse. Synthetic media disrupts this structure. When sexualised identities can be fabricated on demand, distributed algorithmically and monetised through subscription platforms, the boundary between simulation and exploitation becomes far harder to police.


A deeper concern is how these systems may interact with the dynamics of escalation already witnessed in online pornography consumption. When recommendation algorithms reward novelty and generative tools allow sexualised personas to be produced without constraint, the cycle of differentiation and intensification becomes easier to sustain. Over time, the effects are unlikely to remain confined to digital markets alone. As algorithmically amplified sexual content continues to shape expectations around intimacy, power and consent, its influence may increasingly spill into relationships, social norms and, in some cases, the pathways that lead toward more harmful or criminal forms of sexual behaviour.


What begins as synthetic spectacle may therefore mark the next stage in a system where the economics of attention steadily reshape sexual behaviour itself.



