Eurograd message

Message posted on 18/06/2020

Reminder: CfP: Special issue on feminist data protection, Internet Policy Review, 26/06/2020

CfP: Special issue of Internet Policy Review on 'feminist data protection'

Deadline for abstracts: 26 June 2020

More information and dates:

Topic and relevance
The notion of data protection is now an integral part of legal and
political discourse in Europe, as exemplified by the inclusion of a
right to data protection in the EU’s Charter of Fundamental Rights. Yet
there have been relatively few engagements in thinking and framing data
protection from an explicitly feminist perspective. This stands in stark
contrast to the notion of privacy, with which data protection is often
conflated and which has been the subject of extensive feminist critique
and exploration, particularly insofar as it relates to the distinction
between public and private spheres (e.g., Allen, 1988; MacKinnon, 1989;
Bhattacharjee, 1997; DeCew, 2015; Weinberg, 2017). The starting point of
this Special Issue is that the notion of data protection, once
disentangled from privacy (González Fuster, 2014), warrants further
examination from a perspective of intersectional (Crenshaw, 1991) feminism.

Data protection may be understood by considering the power imbalance
with which individuals are confronted when data about them are processed
(de Hingh, 2018): public and private entities can collect data without
individuals’ knowledge, and it is hardly possible for individuals to
control their data once they have been collected (Steinmüller et al., 1971). The
processing of data thereby creates inherent risks for individuals –
particularly so for those already marginalised or subject to
discrimination (e.g., Noble, 2018; Chander, 2020; Gandy, 2010; Guzik,
2009; de Vries, 2010) – and may further entrench the unequal distribution
of power in our societies. Thus, data protection, like feminism, aims at
theorising and changing structural inequalities and power relations.

Scope of the Special Issue

We wish to discuss these structural issues as well as potential answers
through the lens of emancipatory approaches, such as feminist, queer,
Marxist, post-colonial, critical race or disability studies
perspectives, from all relevant disciplines. Contributions focussing on
the intersection of different oppressive structures (i.e. not only
gender but also racialisation, class, marginalisation of religious
minorities, etc.) are particularly welcome.

We invite submissions on the topic of feminist and other emancipatory
approaches to data protection addressing the power imbalance faced by
individuals, especially with regard to discrimination, marginalisation
and oppression. We are interested in a wide variety of perspectives on
the intersections between feminism and data protection, both in Europe
and beyond, whether they are focused on mutual critique or on how either
can benefit from the other and what their common or related aims could
be (Peña & Varon, 2019). Topics of interest include, but are not limited to:

- Data protection and privacy: How can we analyse the relation between these
“positive” notions in one discourse and the negative image of private
spaces whose “legal vacuum” facilitates the exploitation of structural
inequalities? How can these notions be brought into a dialogue, and
which lessons can be learnt from history (Lake, 2016)?

- Data activism, data justice, digital rights, and feminism: Around
which issues are feminist initiatives concerned with data processing
practices emerging, in Europe and worldwide? What are the tensions or
intersections between such initiatives and data protection?

- Countering illegitimate data processing: How are women and
marginalised groups targeted in the political economy of data gathering
(McEwen, 2018)? How can they profit from the networking effects of
social networks in order to organise while being protected from the
fallout inherent in their capitalist business models, i.e. tracking and
profiling?

- Surveillance: How is technology developed and used to oppress certain
groups (Browne, 2015)? What are the dangers disproportionately affecting
women, especially women of colour, in the context of surveillance
(Gandy, 2010; Guzik, 2009; Lyon, 2003)? How could or should surveillance
be avoided, subverted or countered?

- Artificial intelligence (AI) and ‘big data’: Should these practices be
conceived of as a form of automated and inherent discrimination or as
tools for visualising and countering existent discrimination? What
biases are built into them, and what are their regulatory effects
(Buolamwini & Gebru, 2018)? And how does data protection fit into
proposed ways forward (e.g., ‘AI ethics’)?

- Online gender ascription: How is information about gender being
collected and processed (Bivens, 2017)? Which parameters determine
gender ascription in so-called Automated Gender Recognition (AGR)
technologies? Is the gender identity of individuals (including
non-binary persons) respected, and how could data protection law further
this cause?

- Practices of categorisation and (mis)representation: How are gendered
categories constructed, by which actors, and what is their impact,
particularly for oppressed groups such as women or trans and non-binary
people (Keyes, 2019)? How are biases and stereotypes built into data
systems, and how should we respond to this? How can algorithms and
protocols not only be designed but also used in line with principles of
fairness and non-discrimination?

- Data processing and identity formation: What role do notions such as
the (male) gaze, visibility, hiding, deception, outing, and
performativity play in the context of data processing and reproduction
of gender norms and gendered identities (Abu-Laban, 2015; González
Fuster et al., 2015; Beauchamp, 2019)? Can and should data protection
intervene in such processes?

- Data subjects and rights: Can we rethink notions of data protection
law in ways which go beyond the neoliberal focus on the ostensibly
gender-neutral, self-determining individual? How can data subject rights
be effectively complemented (e.g., by group rights or other languages of
resistance) without falling into identity traps?

Special Issue Editors
Regina Ammicht Quinn, Andreas Baur, Felix Bieker, Gloria González
Fuster, Marit Hansen, Jens T. Theilen

For any editorial inquiry, please email us at:

Andreas Baur
Research Associate
International Centre for Ethics in the Sciences and Humanities (IZEW)
University of Tübingen · Wilhelmstr. 19 · 72074 Tübingen · Germany
Phone +49 7071 29-77988
PhD Candidate
Amsterdam Institute for Social Science Research (AISSR)
University of Amsterdam
