Black Bodies and the Banality of AI's Brutality
- Pritish Das

- Jul 29
Updated: Jul 31

On February 19, 2025, police officers pulled over William McNeil for not wearing a seatbelt and for driving without headlights in rainy weather. Bodycam footage shows that, after disputing the need for headlights given that it was daytime and not raining, McNeil asked to see the officers' supervisor. When the officers told him to exit the vehicle, McNeil shut the car door, and they warned that if he would not leave the car, they would have to use force to remove him.
In a viral clip, McNeil records himself asking once more to see the officers' supervisor, at which point one officer confirms to the other that he should break the window and remove McNeil from the vehicle. McNeil stares into his phone's camera as Officer D. Bowers shatters the window and sucker-punches him.
Despite McNeil displaying no visible threat, the officers removed his seatbelt, dragged him out of the car, and proceeded to punch and beat him on the ground. Throughout, McNeil pleaded with the officers, "Don't touch me," while they responded, "Don't fight." Neither the bodycam footage nor McNeil's self-recorded video gives any indication that he was "fighting."
The Jacksonville Sheriff's Office (JSO) has received massive backlash for its treatment of McNeil, including criticism from Amnesty International and Black Lives Matter. McNeil's attorney, Ben Crump, has connected McNeil's case to a previous instance of JSO police brutality against Le'Keain Woods, who suffered, inter alia, a traumatic brain injury, a ruptured kidney, and nerve damage.
In the comment section of Rima Hassan's (now delisted) post of the clip, one commenter asked X's Large Language Model (LLM) chatbot, Grok, "Is this true?" Grok responded that the video was "real but mislabeled – it's from Bournemouth, UK, not America. It shows Dorset Police Officer PC Lorne Castle arresting a 15-year-old suspected of assault in Jan 2024; a knife fell from the teen's waistband." Some users cited this comment to reject Rima Hassan's caption, "This is America," insisting the footage was instead from England. In the same comment section, Grok also claimed that the post showed a Baltimore, MD carjacking, "where a 17-year-old suspect pulled a knife on an 8-year-old and stole the car."
Grok’s constant references to the knife may have come from a police report claiming that McNeil reached for a knife. From the released videos, it is clear that at no point does McNeil reach for anything before being brutalised.

The spectral knife's reality, in both Grok's response and the police report, is the ever-present violent threat that officers see when looking at black people. McNeil's body becomes fungible with a violent black criminality, where, as for the Sheriff's Office, race signifies the crime before the body does. McNeil's hands may be peacefully raised, his demeanour may be calm, yet his representation still overrides his body. If the officers do not beat McNeil, do not condemn his body to nothingness, then the violent threat remains in their eyes.
When you look into McNeil’s eyes in the viral video, they stare at you with a dead gaze. The eyes of someone who already knows the outcome. He lifts his hands pointlessly, knowing that his sheer existence is enough to warrant violence. The shocking part of the video is how unshocking it is; both McNeil and the viewer know what is going to happen, and when it does, both know that it will continue to happen. The spectacle of police brutality has been repeated so often that it has been rendered banal.
AI only reproduces the banality of police brutality. While Grok's response did garner pushback in the comments, with even a readers' note clarifying that the incident took place in Jacksonville, misinformation is passive. Readers mindlessly scroll by and absorb it, slowly reinforcing the association between black people and criminality. Just as McNeil's eyes recognise the falsity of his representation in the police's eyes yet still know what will happen, we may know that McNeil is clearly not a "violent criminal" in the UK, or even Baltimore, yet we also know that Grok's representation will be believed. The sheer absurdity of right-wing content on X has been accepted as a banal part of life, and with it, the corresponding violence we see in the video.
Musk's AI has attracted burgeoning criticism. Frustrated with Grok's responses on right-wing violence and transgender people, and facing repeated complaints from the app's right-wing base, Musk "improved" the chatbot, ordering Grok to "not shy away from making claims which are politically incorrect, as long as they are well substantiated." Following Musk's revisions, the chatbot began spewing antisemitic and extremist remarks, statements the AI team would later retract, claiming they were an error. X's CEO, Linda Yaccarino, stepped down soon after the changes. Nevertheless, the AI recently received a $200 million contract from the US Department of Defense (DoD).
Despite the enormous backlash Musk has received, especially after Grok declared itself "MechaHitler", his DoD contract indicates that this is just the beginning. Beyond public social media, AI is becoming a presence in the military and in police departments. In a November 2024 report, the ACLU documented the growing use of AI to generate police reports and warned of its potential consequences. Among a host of issues, LLMs can reinforce biases that may then be incorporated into definitive legal documents.
Considering the ease with which Musk tampered with his LLM and the lack of consequences for doing so, AI can render more black bodies fungible in courts of law for the benefit of billionaires. Distorted crime reports can lead to more convictions in violent cases, inflating public perceptions of crime. That, in turn, can drive higher engagement on billionaires' platforms and further contracts not only to suppress non-existent violent criminals but also to reinforce the conditions that create criminality.
Such a possibility is quite speculative, but when the AI of one of the largest social media platforms blatantly and falsely characterises the survivor of police brutality as a violent threat, likely to appease its right-wing audience, the lines between reality and the previously unthinkable become blurred. Just like police brutality, the malevolent use of AI will become normalised into everyday law and order, and McNeil's eyes will continue to stare into the camera, into us.
Illustrations by Will Allen/Europinion