Even in 2018, most queers know something about hiding. We know why you don’t have to be a criminal to need the cover of darkness. So do refugees and migrants, people with dark skin, and people who wear hijabs or turbans. And while women in many countries today don’t need to hide it when they leave the house unaccompanied or go on a date with a man they aren’t married to, they might still want to hide it if they get an abortion. They might just want to be able to escape the notice of an angry ex-husband. And they might remember that even if you’re not doing anything wrong, you can still have reasons to want privacy.
Unfortunately, all over the world in major cities, there are cameras at every intersection, hovering on the edges of buildings, and staring at you on buses or trains. This technology is often cloaked in the neutral language of “public safety” and “smart cities,” but the dangerous and intersecting ways it affects racial and religious minorities, LGBTQ people, immigrants, refugees, and women are still ignored in the praises sung by governments, fearful residents, and camera manufacturers alike. Local legislation, security practices, and sometimes simply fearlessness can change the shape of surveillance locally even when national governments cannot be budged – and women would do well to take that fight seriously.
What is video surveillance?
Video surveillance is more powerful than ever–networked, high-definition, and often controlled from a large central control room that combines myriad other forms of data as well. It can be combined with other information and technology including:
- Automated License Plate Readers that can read and transmit to databases thousands of license plate numbers a minute
- Facial Recognition Technology, including mobile facial recognition devices such as camera-equipped sunglasses
- Other biometrics such as gait recognition, tattoo recognition, and more
- Sensors embedded in roadways
- Cell phone tracking through Bluetooth, open Wi-Fi networks, and cell site simulators (aka IMSI catchers)
- Social media surveillance
- Algorithmic and artificial intelligence assisted analysis of social media feeds
These technologies combined allow governments, and sometimes private corporations, to track individuals comprehensively. For example, advocates estimate that there are over 500,000 cameras in London, fed to central control rooms – some scattered throughout London’s boroughs, and some focused on London’s underground system. London police have been using automated license plate readers since the 2012 Olympics, as well as cell phone tracking and social media surveillance software. London police also recently expanded a trial of facial recognition systems across the city – despite a staggering 98% error rate.
One of the most disturbing examples of the end game of comprehensive surveillance is China’s “social credit” system. This is a system that combines China’s vast surveillance systems with judgments about the worthiness of particular individuals to participate in society. Video surveillance is an integral part of the system, and it’s not just looking for illegal or frowned-upon behavior like disagreeing with the government. It could take into account ordinary decisions like buying more junk food and fewer vegetables at the grocery store. This system has already had consequences: “in recent months the Chinese state has blocked millions of people from booking flights and high-speed trains.”
Imagine this level of surveillance watching over every marginalized community. Now imagine that power in the hands of the fervent nationalists and racists who are gaining power across the globe today in Brazil, the United States, Italy, and even in Germany.
A radical and intersectional power analysis of surveillance
Intersectional discussions of surveillance, or even projects that focus on how surveillance affects marginalized groups, have not been mainstream – perhaps because so many writers on surveillance are not sitting at any particular intersections. Cisgender white men from Anglophone and Western European countries, some of the least marginalized people in the world, have often been the face of discussions about surveillance.
It should not even be a question whether surveillance affects different groups differently, reflecting the privilege and power of the society that creates it. In the US, examples of discriminatory surveillance abound, from the wholesale targeting of Arab and Muslim communities by the New York Police Department to the FBI’s focus on the Civil Rights movement – in particular the Black Panthers – during its notorious COINTELPRO program of the 60s and 70s. In fact, in 2017 Foreign Policy revealed that the FBI had created the term “Black Identity Extremist.” As Malkia Cyril put it, the FBI “invented a brand new label and a brand new threat” that it could turn on the Black Lives Matter movement. This was just one in a long string of revelations about how law enforcement surveilled Black Lives Matter activists, and it won’t be the last.
But that’s not enough. Professor Kimberlé Crenshaw coined the term intersectionality to describe how different systems of power “intersect” to affect specific groups. She pointed out that existing theories of antidiscrimination and feminism did not take into account the substantively different experiences of black women. Being a black woman is not always black + woman. As Crenshaw says, “sometimes, they experience discrimination as Black women – not the sum of race and sex discrimination, but as Black women.” Each of the axes of discrimination and misogyny acts upon the others.
It shouldn’t be that hard to see these intersections. One of the means of fighting surveillance is arming oneself with knowledge about the threats one faces through security trainings. These trainings are often rooted in the concept of “risk analysis,” or “threat modeling.” This technique looks at the specific adversaries and threats faced by an individual or group in order to narrow them down and make it easier to choose security tools and tactics. When done honestly, this can make discrimination, as well as intersections, quite obvious.
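To make the exercise concrete, the threat-modeling process described above can be sketched in a few lines of code. This is only a toy illustration under assumed data: the adversaries, capability descriptions, likelihood scores, and mitigations below are hypothetical examples, not recommendations for any real situation.

```python
# A minimal threat-modeling sketch. Each threat pairs an adversary with a
# capability and a rough likelihood estimate for one specific person's
# situation; mitigations are then chosen only for the threats that person
# is actually likely to face, rather than for every imaginable threat.

from dataclasses import dataclass

@dataclass
class Threat:
    adversary: str    # who might act against you
    capability: str   # what they can actually do
    likelihood: int   # rough 1-5 estimate for this person's situation

# Hypothetical threats for one person's situation
threats = [
    Threat("local police", "street cameras + facial recognition", 4),
    Threat("angry ex-partner", "social media monitoring", 3),
    Threat("foreign intelligence", "zero-day exploits", 1),
]

# Hypothetical mitigations keyed by capability
mitigations = {
    "street cameras + facial recognition": "learn camera locations; vary routes",
    "social media monitoring": "lock down accounts; audit old posts",
    "zero-day exploits": "keep devices updated",
}

def plan(threats, mitigations, min_likelihood=3):
    """Narrow down: address the most likely threats first, not all of them."""
    relevant = [t for t in threats if t.likelihood >= min_likelihood]
    return {t.adversary: mitigations[t.capability]
            for t in sorted(relevant, key=lambda t: -t.likelihood)}

print(plan(threats, mitigations))
```

The point of the sketch is the narrowing step: two people with identical devices but different adversaries and likelihoods end up with very different plans, which is exactly where the discrimination and intersections become visible.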
Case studies: Welfare and trans women on the street
This can be seen in the surveillance of welfare recipients in the United States. The discussion of surveillance in the United States has focused on how “mass surveillance” affects “ordinary Americans” (code for white, middle-class Americans). Nathalie Maréchal writes about how welfare recipients have been left out of that discussion even though they have been heavily surveilled for decades. She notes that “as black Americans fought to access welfare programs, public perception of the average welfare recipient shifted from the virtuous white widow heroically raising her children alone to the lazy, promiscuous, deviant (and wholly imaginary) black ‘welfare queen’ mythologized by Ronald Reagan.” This perception has allowed wholesale surveillance of the behaviors and bodies of welfare recipients; welfare recipients have to account for how they spend their days. When they get food-related benefits, their purchases are assessed and analyzed using big data. Single mothers can be required to “identify their children’s biological father,” and agencies can “require unwed minors to live with a parent or guardian.” States also “have the discretion to deny benefits to unmarried teenage mothers.”
This kind of surveillance of body and behavior is not limited to the United States, either. In Germany, special financial assistance is available for single mothers – but this assistance has been denied to women who could not identify the father because of “one night stands.” What’s more, recipients of aid at jobcenters experience a general level of surveillance, including home visits, but this also takes on a gendered – and racial – lens. Home visits have been justified by the concern that “there are citizens who apply for social assistance here and then fly to Turkey and work there as normal.” And a few years ago a questionnaire from a jobcenter in Stade circulated that asked detailed questions about a woman’s sex life. The woman’s support was denied when she refused to fill it out. The questionnaire was withdrawn, but as her lawyers asked, was this really an isolated case?
Let’s think about how this might play out in the streets when it comes to video surveillance. In the United States, around 20% of trans people have unstable housing or are homeless, meaning trans people are especially vulnerable to street surveillance in the first place. As advocate Tamika Spellman points out, “Many survival sex workers are trans women of color who have been denied access to employment, housing, and other resources due to discrimination.” Sex work plays an important part in keeping queer communities alive, but regardless of whether one is doing sex work or not, being constantly “profiled as being engaged in sex work, public lewdness, or other sexual offenses” can be incredibly dangerous. As Spellman points out, “Being targeted by police often means being criminalized for things that aren’t actually illegal or wrong.” In fact, “policing tactics [like using possession of condoms as evidence of prostitution] that hyper-sexualize LGBT people, and presume guilt or dishonesty based on sexual orientation or gender identity, are deployed by law enforcement every day.” It means harassment from police, sometimes aided by street-level surveillance. And for Black or Indigenous trans women, that interaction will be far more dangerous, as police are three times as likely to kill Black and Native American people as white people, and these shootings often begin with seemingly benign police contact. That’s in addition to the threat of sexual violence from the police. As DC organization DECRIMNOW points out, “Sexual violence is the second most common form of police misconduct, after excessive force,” yet another reason why “[f]or trans people doing [or assumed to be doing] sex work, police and law enforcement are often threats, not protectors, of their safety.”
We are not docile bodies
Philosopher Michel Foucault was hardly a feminist. In fact, he was most likely a misogynist. But he did create a concept that remains relevant today and has been employed by myriad feminist scholars, perhaps most notably Judith Butler. In Discipline and Punish, Foucault describes the many ways in which power shapes human existence, turning us into “docile bodies” that internalize what modern society requires of us.
Has surveillance turned us into docile bodies? Not exactly. That is perhaps why trans people are so often the face of deviance in society – because we refuse to conform our appearance and behavior to suit society, even when we know we are being captured by cameras. But it is important to see that self-censorship and fear are part of the goal of ubiquitous surveillance.
Where now? Seize the power
First and foremost, don’t internalize video surveillance. Knowing what is out there doesn’t mean you shouldn’t leave your house. Instead, educate yourself. Local groups all over the world have cataloged and mapped cameras and other street-level surveillance technologies in their communities – see, for example, the Lucy Parsons Primer on Chicago surveillance, the ICU Oakland Camera Map project, and the “Surveillance under Surveillance” project.
Second, although there are few tools that can effectively fight most street-level surveillance technologies, a general understanding of security practices can help. Check out this “guide to security guides.” Also check out the excellent work over at GenderIT, which includes thoughtful articles and links to tools. Consider how video surveillance interacts with other information streams that you have more control over, such as social media. Take control of your information.
Finally, communities have to fight back against this kind of surveillance before it happens whenever possible. There are a number of encouraging examples of this in the United States, perhaps most notably in Oakland, California. When you discover a proposal to install more surveillance cameras or related technology, do some research on the technology or enlist the help of civil society. Point out the error rates of facial recognition technology, and the numbers showing how little cameras actually appear to help. Adapt limitations from other cities, or even other countries, to challenge unlimited access to these technologies.
And remember: we are not docile bodies.