Navigating the surveillance state for trans liberation
The 2024 Paris Olympics were meant to be a celebration of athletic might, yet they became a political battleground where the politics of identity were as fiercely contested as the sports themselves. The Games showcased an unsettling moment when Algerian boxer Imane Khelif faced a torrent of bigotry for allegedly not appearing “woman enough”. The likes of JK Rowling, Logan Paul and tech libertarian Elon Musk were amongst the many who amplified the disinformation campaign against Khelif, which spiralled into a public spectacle that not only cast doubt on Khelif's identity but also painted a stark picture of the broader systemic issues at play. Khelif is not trans, but the furore at the Paris Olympics over her gender identity mirrors the disturbing ideologies that trans people have long endured at the hands of transvestigators, who intrusively scrutinise appearances in their witch hunt to “expose” the next trans individual. Even popstar Taylor Swift did not escape scrutiny as people zoomed in on her ‘bulge’ under her navy blue bathers. It's called a mons pubis, people.
An example of how Visage Technologies classifies whether a face is feminine or masculine. Screenshot from https://visagetechnologies.com/gender-detection/
And this is what Automated Gender Recognition (AGR) seeks to automate. AGR attempts to correlate physical attributes with gender identity, a premise that contemporary research has already challenged. In an editorial, the journal Nature argues that anatomy does not definitively determine someone's gender. It highlights the complexity of sex and gender as spectrums encompassing a variety of biological, psychological, and cultural factors, and it emphasises the inaccuracy and harm of using certain characteristics to assign gender, noting that gender identity involves more than just visible or genetic markers. Another paper worth mentioning is that of Daphna Joel, who finds that human brains exhibit a “mosaic” of features, some more common in females than in males and vice versa. This undermines the notion that there are distinctly male or female brains, highlighting the complexity and variability of brain characteristics across genders.
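To make the critique concrete, here is a deliberately minimal sketch of the assumption such systems bake in. The names, features and threshold are hypothetical, not drawn from Visage Technologies or any real product; the point is simply that the output space is hard-coded to two labels, so every face is forced into a binary regardless of who the person actually is.

```python
# A minimal, hypothetical sketch of the assumption an AGR pipeline bakes in.
# No real model or vendor API is used: the point is that the output space is
# hard-coded to two labels, so the system must collapse any face into one of them.

from dataclasses import dataclass


@dataclass
class AGRPrediction:
    label: str         # constrained to "female" or "male" by design
    confidence: float  # the model's certainty, not a fact about the person


def classify_face(face_features: list[float]) -> AGRPrediction:
    """Stand-in for a trained binary classifier over facial measurements."""
    # A real system would run a neural network here; this placeholder score
    # only illustrates that whatever is computed ends up binned into exactly
    # two categories.
    score = sum(face_features) / max(len(face_features), 1)
    if score >= 0.5:
        return AGRPrediction(label="male", confidence=score)
    return AGRPrediction(label="female", confidence=1.0 - score)


if __name__ == "__main__":
    # Any input, however ambiguous, still yields a binary verdict.
    print(classify_face([0.42, 0.61, 0.55]))
```

However sophisticated the model behind it, the design choice is the same: a two-label output that admits no other answer.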
By embedding these flawed assumptions into its algorithms, AGR technology institutionalises the discriminatory practices that sparked such controversy at the Olympics. Not only does it systematically enforce a flawed understanding of gender, it also leverages surface-level data to make profound decisions about people's identities. The repercussions are real: at best, entrenched bias and exclusion; at worst, death. At airports, AGR's potential to out transgender individuals could be especially harrowing, particularly for those living in countries where their existence carries the threat of capital punishment. And as these tools rapidly advance and spread, we risk them falling into the hands of governments and people who already use transgender individuals as scapegoats to foster fear or to justify discriminatory policies for political gain. We are beyond a question of privacy here, and even past merely acknowledging risk. We are venturing into realms where the very existence of transgender people is at stake.
Outside airport security, the reach of AGR could extend into everyday spaces such as public restrooms. And this is not just speculative: AGR is already being employed in various contexts, such as the Giggle for Girls app, the dating app L’App, and even a restaurant in Oslo that targets ads based on gender, showing men pizza and women salad. In areas with deeply conservative values, like the American Bible Belt, the deployment of AGR systems in these spaces to enforce gender norms is a real possibility. The city of Odessa, Texas, has already introduced a $10,000 bounty for reporting transgender individuals who use bathrooms that correspond to their gender identity. This illustrates a troubling trend in which AGR could be used to enforce discriminatory laws and stoke fear among transgender communities.
The application of AGR in public security systems, online platforms, and even everyday consumer technologies frames trans bodies as subjects of suspicion and scrutiny. This invasive oversight is driven by the marriage of the state and capital, which seek to monitor and control societal norms, including rigid adherence to gender binaries. In this setting, trans people are perceived as deviations, bugs in the system if you will, and consequently treated as threats to social or public order, as seen in the arrests of transgender people in many countries including Malaysia, Indonesia, India and the Philippines.
Transgender people, by their very existence, challenge these rigid gender norms and, by extension, the division of labour that underpins many economic and social policies. Capitalism relies heavily on the gender binary to sustain the nuclear family model, which in turn supports the reproduction of existing power structures. Angela Davis wrote a strong critique of this in her book ‘Women, Race and Class’. She highlights that Black women were subjected to a dual exploitation, both as labourers and as reproducers of more slaves, which was crucial to the perpetuation of the slave economy. This exploitation was not just a byproduct of slavery but a deliberate effort to uphold and benefit from racist and sexist economic structures. The implications of this history are vast and echo in the many ways modern capitalist societies continue to exploit bodies considered ‘other’, whether through racial, gender, or sexual discrimination. The surveillance and control of trans bodies through technologies like AGR is a continuation of this legacy, in which certain bodies are monitored and regulated more strictly to conform to existing social norms that benefit the capitalist system.
Surveillance becomes a tool to 'correct' deviations within the system. It allows those in power to determine whose lives are deemed livable and whose are not, enforcing a normative standard from which deviation must be monitored and controlled. For trans people, this can mean the difference between visibility and erasure, between recognition and death. The utilisation of surveillance technologies not only strips individuals of their agency but also places them at an increased risk of violence and discrimination. When the state and societal institutions possess the tools to 'watch' and 'correct,' they wield a significant power that transforms surveillance from a simple security measure into a tool of social control.
We are past the point of reform. Surveillance, as it exists in the very depths of the capitalist inferno, is and always has been a tool for the subjugation of one class by another. Any reform of surveillance may appear to arise from below, but it still fundamentally revolves around power dynamics. As Brazilian educator Paulo Freire poignantly noted, “When education is not liberating, the dream of the oppressed is to become the oppressor.” No amount of diversifying the faces of those who operate the surveillance apparatus will alter its intrinsic function as an instrument of control. It is a tool designed not just to watch but to maintain and enforce the status quo, to keep existing power structures intact. Changing the operators does not change the fundamental operation of the system itself, which is to exert power over others, to reduce complex identities into manageable data points that can be controlled and manipulated. Thus, the challenge is not to reform surveillance but to dismantle the systems that necessitate and propagate it.
As we reflect on this reality today, on Transgender Day of Remembrance, we are compelled to honour the memory of those who have suffered and died under the weight of such oppressive mechanisms. Today, as we remember, we must also act and strive for a world where surveillance is no longer a tool of oppression but where safety and respect for all identities are upheld without the need for invasive oversight. Let this day serve as a call to action to build a more just and equitable society, one that truly honours the diverse tapestry of human experience and fiercely protects it from the corrosive effects of unwarranted surveillance.