Processing Citizenship. Digital registration of migrants as co-production of individuals and Europe

by Annalisa Pelizza

 

The “Processing Citizenship” project was funded in late 2016 as a Starting Grant by the European Research Council (ERC). Launched in March 2017, it is interested in how migration enacts Europe. As the project’s homepage (http://processingcitizenship.eu) puts it, this question can be answered legally and politically, as most policy-makers, sociologists and journalists do, or technically: how do data infrastructures for processing migrants and refugees co-produce individuals and Europe?

The project aims to extend to non-European citizens the study of how the digital circulation of data assets about populations and territory is re-enacting European governance along new boundaries (Pelizza, 2016). Historically, data infrastructures on populations and territories have contributed to the formation of the most powerful techno-social assemblage for knowledge handling – the nation-state (Agar, 2003; Foucault, 2007; Mitchell, 1991; Mukerji, 2011). The project asks how contemporary data infrastructures for processing migrants and refugees at the border, as well as inside Europe, shape the European order. As such, the project aspires to contribute to technology studies on the infrastructural construction of Europe (Misa and Schot, 2005).

“Processing Citizenship” is hosted by the Science, Technology and Policy Studies department (STePS) of the Faculty of Behavioural, Management and Social Sciences at the University of Twente. It is thus deeply embedded in the STS core tradition of the department, while addressing a new research field – governance by technologies – from a mid-term, transnational perspective.

Between summer and fall 2017, the Principal Investigator, Annalisa Pelizza, will be joined by an interdisciplinary team of five, including anthropologists, computer scientists and sociologists. Despite (or, more likely, thanks to!) the differences in background, the common goal is to re-articulate the two main approaches to migration studies – i.e., ethnographic interest in migrants’ own experience and political science’s focus on policy challenges – by stressing how technological artefacts and infrastructures for “processing alterity” mediate the co-production of migrants and polities (Pelizza, under review). By “processing” we refer to the set of bureaucratic procedures through which the individual Other and institutional actors (i.e., loci of power, be they Member States, Europe or incipient hybrid networks of agencies at different scales) are co-produced through the mediation of data infrastructures.

Drawing upon the “Vectorial Glance” research framework, which conceives of government digitization as an entry point for detecting incipient transformations in the order of authority (Pelizza, 2016), “Processing Citizenship” looks at data infrastructures as interfaces that can reveal transformations in late modern governance. Following the STS tradition, infrastructures as interfaces are conceived of as crystallizing relational processes. They are therefore both methodologically and theoretically relevant. Methodologically, recognizing data infrastructures as interfaces allows us to treat them as analytical sites in which broader, heterogeneous processes become visible. Theoretically, it introduces a performative understanding that is missing in mainstream explanations of information technologies as causes of state disassembling.

 

The measure of alterity

The project is meant to shed light on how three types of identity are co-produced: migrants’ identities, polities and territory. The first set of questions asks which aspects of migrants’ lives are measured, entered into the systems and come to constitute their digital identities when they deal with European actors.

Early evidence reveals a proliferation of databases, not only at European borders but at every stage of alterity processing. Diverse information systems are run by diverse organizations (e.g., international organizations, national and local reception facilities, NGOs, medical organizations, European agencies), support diverse policies (e.g., combating trafficking, preventing disease outbreaks, asylum) and underpin diverse identity-building techniques. The European Commission’s Eurodac and DubliNet systems, for example, both deal with asylum applications and contain asylum seekers’ fingerprints. However, they record slightly different data: while Eurodac is a hit/no-hit system and records only minimal data such as name and fingerprints, DubliNet also carries more ‘soft’ data about a person’s life.

Different databases enact migrants in different ways: as individuals or as populations, as members of a family or as potential workers, as vulnerable persons or as potential perpetrators. Yet it is only by comparing data models that such differences become relevant, and our team has encountered an unexpected gap in the contemporary literature on the analysis of ontologies as texts (Bowker and Star, 1999). We are thus working towards developing new analytical methods in this field.
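By way of illustration, the following minimal sketch compares two hypothetical and deliberately simplified data models; all field names are our own illustrative assumptions, not the actual Eurodac or DubliNet schemas. The analytical point is that the diff between field lists is where different enactments of the person become readable: what one system omits, another makes mandatory.

```python
# A minimal sketch of comparing two data models as texts. The field names are
# hypothetical simplifications, not the actual Eurodac or DubliNet schemas.

# Hypothetical, deliberately simplified data models of two registration systems.
MODEL_A = {"person_id", "fingerprint_hash", "sex", "member_state_of_entry"}
MODEL_B = {"person_id", "fingerprint_hash", "name", "family_links",
           "health_conditions", "travel_route", "vulnerability_flags"}

def compare_models(a, b):
    """Return which fields are shared and which are exclusive to each model."""
    return {
        "shared": sorted(a & b),
        "only_in_a": sorted(a - b),
        "only_in_b": sorted(b - a),
    }

for key, fields in compare_models(MODEL_A, MODEL_B).items():
    print(f"{key}: {fields}")
```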

In this first stage of investigation, we are also interested in the chain of artefacts deployed at Hotspots that translate previous identities into new, European-readable ones. This line of investigation is key in light of recent developments in the European migration landscape. The goal of the so-called “Hotspot approach”, introduced in 2015, is to operationally support frontline Member States (i.e., Greece and Italy) in “swiftly identify[ing], register[ing] and fingerprint[ing] incoming migrants” (Commission, 2015a: 1). Hotspots are thus the first step in the procedure that sorts migrants into three alternative paths: “relocation” or “resettlement” to another Member State (for those identified as in clear need of international protection), or “return” to the country of legal residence (for those who are not deemed in need of protection). They can be conceived of as “routers” that create “early entrenchments” (Star and Lampland, 2009) in the sorting of individuals: liminal situations in which past identities are assessed and translated into proto-decisions.

It is evident that routers do not work in a vacuum. Which material devices “speak for” the previous identity of the individual, and which database categorizations are decisive in being granted a future European identity, are crucial questions that recall the material nature of such decisions. While EU policy documents mention specific criteria for relocation, resettlement and return, these criteria might be partially “lost in translation” when it comes to embedding policy into the different materiality of digital information systems; conversely, new technical rigidities may be introduced. For “Processing Citizenship”, there is a need to keep track of such trans-material shifts.
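As a hedged illustration of such a trans-material shift, the sketch below shows how a nuanced policy formulation (“in clear need of international protection”) might be flattened into a rigid rule once embedded in an information system. The field names and the threshold are illustrative assumptions only, not the actual relocation criteria or the logic of any real system.

```python
# Hypothetical sketch of a policy judgement flattened into a database rule.
# Field names and the 0.75 threshold are illustrative assumptions only, not
# the actual relocation criteria or the logic of any real system.
from dataclasses import dataclass

@dataclass
class RegistrationRecord:
    nationality: str
    recognition_rate: float    # aggregate statistic attached to a nationality
    fingerprints_matched: bool

def proto_decision(record: RegistrationRecord) -> str:
    """A rigid, machine-readable stand-in for 'in clear need of protection'."""
    if record.recognition_rate >= 0.75 and record.fingerprints_matched:
        return "relocation_candidate"
    return "return_procedure"

# The circumstances a caseworker might weigh (family ties, health, credibility
# of testimony) have no field here: this is the kind of technical rigidity
# that a policy-to-system translation can introduce.
print(proto_decision(RegistrationRecord("XX", 0.80, True)))   # relocation_candidate
print(proto_decision(RegistrationRecord("YY", 0.60, True)))   # return_procedure
```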

A further interest in how migrants’ identities are shaped concerns migrants’ own “dis-inscriptions” (Akrich and Latour, 1992). How do migrants interact with officers and data infrastructures? This point raises a series of questions about the status of migrants. What information would migrants need in order to act in the new context? Which possibilities are foreseen for individuals to define, protect and release their digital identities? The way identities are crafted can enable or, conversely, constrain migrants’ potential for action. As Schinkel (2009) has noted, identities forced onto groups can also have empowering effects. “Processing Citizenship” thus asks which – if any – potential for action is enabled by the way migrants interact with their identities as “inscribed” in information systems.

 

Novel orders of governance

The second set of questions investigates how European polities are shaped by alterity processing. According to studies on IT-enforced borders, biometrics has marked a shift from border management to identity processing. Nation states are said to have lost control over physical borders. Access to welfare and redistribution rights has replaced territorial access as the bone of contention (Engbersen, 2003). As Amoore and De Goede (2008: 176) have put it, “the physical jurisdictional border seeps into data and databases.”

On the other hand, border studies have contested universalizing arguments about the disappearance of state boundaries (Paasi, 2005). By acknowledging the cultural and sociological “thickness” of boundaries, they have recognised state borders as important devices for attributing meaning to state institutions. Especially after 9/11 and the war on terror, state borders are seen as regaining a key role in political studies.

For “Processing Citizenship”, however, the point is not so much to establish whether nation states retain more or less control over their physical borders, but to investigate which loci of power are constituted by bureaucratic practices of data circulation. As historians of technology have recalled, the construction of infrastructural Europe was characterized by the proliferation of new, non-governmental actors (Schot and Schipper, 2011). Which loci of power are emerging from practices of alterity processing? A revised version of the nation state, perhaps with sub-national units granted new powers? A more centralized configuration of Europe? Or even a novel distributed techno-social network made of public agencies and private contractors at different scales? Understanding how data about migrants and refugees are collected and how they circulate across European, national and local agencies is one way to answer these questions, and one that reveals unexpected de facto geographies. As the latter are not easily represented on maps, “Processing Citizenship” plans to develop new forms of visualization for such geographies.

Current European responses to migration are indeed not only sorting migrants out, but also activating multi-level institutional dynamics. On the one hand, European institutions are asking Member States for common standards, protocols and classification systems. The rationale is that if Europe wants to keep the Schengen system going, it has to strengthen its outer borders, and data gathered at those borders must be standardized and made available Europe-wide. On the other hand, Member States might try to resist technical standardization. For example, in September 2015 the European Commission adopted 40 infringement decisions against Member States that did not register migrants at EU borders (Commission, 2015b). Here, the definition of “registration” is crucial: at the European Commission level it usually refers to registration in European databases, but in other contexts it might well refer to national databases, which are not always interoperable with European infrastructures. This evidence suggests that access to databases is an important factor in defining new types of boundaries that do not necessarily coincide with existing political and administrative ones.
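To give a sense of what non-interoperability can look like in practice, here is a minimal, purely hypothetical sketch of a check between a national record and a Europe-wide schema; all field names are assumptions introduced for illustration, not the actual national or European data models.

```python
# Hypothetical sketch of an interoperability check between a national record
# and a Europe-wide schema. All field names are illustrative assumptions, not
# the actual national or European data models.

EU_SCHEMA = ["person_id", "fingerprint_hash", "nationality_code", "registration_date"]

national_record = {
    "id_nazionale": "IT-2015-00042",   # stored under a different name
    "impronte": "ab83f0",              # fingerprints in a different format
    "cittadinanza": "Eritrea",         # free text rather than a country code
}

def missing_fields(record, schema):
    """Fields required by the Europe-wide schema that the national record lacks."""
    return [field for field in schema if field not in record]

print(missing_fields(national_record, EU_SCHEMA))
# -> ['person_id', 'fingerprint_hash', 'nationality_code', 'registration_date']
```

Even where the same information exists on both sides, registration in a national system does not automatically count as registration Europe-wide: names, formats and codes must first be mapped onto one another.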

 

Conclusion – A history of the present?

All in all, by positioning itself as a new chapter in the studies on the infrastructural construction of Europe, “Processing Citizenship” ultimately aims to conduct a history of the present. In order to explain this ambition, let us conclude with a quote from Foucault:

“History is a given way for a society to acknowledge and process a mass of documents from which it can no longer separate itself […] Traditionally, history tried to memorise the monuments of the past, transforming them into documents. […] Today, history is that activity that transforms documents into monuments” (Foucault, 1969: 15).

We suggest that analysing alterity processing as part of Europe-building is a way to keep track of how documents are transformed into monuments. While histories of technology can methodologically rely on that form of textual reproduction of memory which is the archive, in the case of “Processing Citizenship” – which deals with not-yet-stabilized developments – the methodological function of the archive is fulfilled by oral memories (collected through interviews), practices (accessed through observation), legislative and design documents, and data logs.

The reason for keeping track of the transformation of documents into monuments lies in the fact that data infrastructures are mainly developed by contractors who, not being bound by public service duties, are unlikely to see added value in creating archives, not even when it comes to practices of population ordering that are expected to have a say in how Europe is going to be built. In this sense, we suggest that “Processing Citizenship” and other similar projects that look not at data per se, but at the architectures for data collection, translation and circulation, are attempting to conduct “histories of the present”.