Feminism affects everyone


Digitization affects everyone, but not everyone benefits equally.

From a feminist perspective, we therefore call for equal access to the Internet and digital content, protection against online violence, and the creation of non-discriminatory spaces. We demand the right to personal data, to privacy, to data security and data protection. We call for and promote a critical digital public sphere and a sustainable copyright policy.

Digital Violence

We need to fight digital violence. Digital violence is a form of discrimination that aims to exclude people through sexist, racist, homophobic, transphobic or other inhuman hate speech. It is the violent continuation of discrimination. Digital violence undermines freedom of expression and poses a threat to democracy. It includes identity theft, rumours and false allegations, intimidation and coercion, insults, threats, stalking, defamation, doxing, swatting and threats of rape. Feminist positions are often targeted by digital violence; this is what we call "silencing". There are well-organized communities built upon anti-feminism in the area of gaming, in the context of Reddit's nerd supremacy, in right-wing extremist and right-wing populist milieus, and in incel forums.

Surveillance

We need to fight unauthorized mass surveillance. We are being watched every step of the way, whether we travel by public transport, withdraw money, shop online or query search engines. We are observed by various actors: the state, private security service providers, multinational corporations and, not least, ourselves. In public spaces, our mere presence is taken as consent to video surveillance. Surveillance in public spaces comes with the promise of greater security, and feminist demands for the prevention of violence against women in public spaces are often used as legitimation. But greater security always means greater control, and marginalized groups are the ones most affected by it. For LGBTQI* people, surveillance carries a much higher risk.

Big Data

We need to develop feminist AI. Autonomous driving, household robotics and voice assistants – the buzzword AI pops up nearly everywhere. One thing is clear: technology in general, and algorithmic processes in particular, cannot be conceived of without reference to power and domination. It is precisely for this reason that these systems must be viewed critically, evaluated and redeveloped against the background of feminist perspectives and values. The basic formula of these algorithms must therefore be: if AI, then feminist. Algorithms and artificial intelligence can enable or help people if, for example, they detect tumours on X-ray images with much greater accuracy and speed than would be possible for humans. But artificial intelligence can also restrict or discriminate against people if, for example, AI decides whether a person is creditworthy or receives health insurance. Neither the underlying data nor the technologies are neutral. Discriminatory stereotypes, which have already manifested themselves in the world and thus in the data, are (unconsciously) transferred into the code. A lack of transparency then leads to a consolidation and intensification of discrimination.

This article was first published (12th November 2019) online via hiig.de and is part of the publication "Critical Voices, Visions and Vectors for Internet Governance".