Message posted on 08/03/2018

Digital Methods Summer School 2018 -- Call for Participation

Dear all,

The Digital Methods Initiative (DMI) will host its 12th annual Digital Methods Summer School from July 2-13, 2018 at the University of Amsterdam, the Netherlands. Below please find the call for participation.

This year's theme is "Retraining the machine: Addressing algorithmic bias". The deadline for applications is May 4, 2018. More information is available at bit.ly/dmi18-ss-call or by email at summerschool@digitalmethods.net.

Best regards,

Fernando van der Vlist
Research Associate, Collaborative Research Centre "Media of Cooperation", University of Siegen
Research Associate, Digital Methods Initiative, University of Amsterdam
Lecturer, New Media and Digital Culture, University of Amsterdam
--

# CALL FOR PARTICIPATION
# DIGITAL METHODS SUMMER SCHOOL 2018
# JULY 2-13, 2018
# UNIVERSITY OF AMSTERDAM

# RETRAINING THE MACHINE
# ADDRESSING ALGORITHMIC BIAS

--

## DIGITAL METHODS SUMMER SCHOOL
<br>
This year's Digital Methods Summer School is dedicated to approaches to studying so-called machine bias. Discussions have focused on how to hold algorithms accountable for discrimination in the results they output, as in the notorious cases of query results for 'professional hair' (white women's hairdos) and 'unprofessional hair' (black women's hairdos). Recently, it was found that search engine image results for 'pregnancy' and 'unwanted pregnancy' are similarly divided, with the pregnancy queries returning white-skinned women (mainly bellies, privileging the baby over the woman), while 'unwanted pregnancy' returns women of diverse ethnicities. These are new variations on classic, and still urgent, search engine critiques (once known as 'googlearchies') which questioned the hierarchies built into rankings, asking who is being authorised by the engine to provide the information. That work moves forward at the Summer School, building on examinations of the volatility of engine results, as in the Issue Dramaturg project, which put on display the drama of websites rising and falling in their rankings after algorithmic updates meant to fight spam but having unintended, epistemological consequences. More recently, Facebook newsfeeds have been the subject of critique for their privileging and burying mechanisms, however much they -- like the engine returns preceding them -- are not easily captured and documented. Saving engine results has been against the terms of service; making derivative works out of engine results also breaks the user contract. Saving, or recording, social media (newsfeed) rolls seems even less practicable given that feeds are even more personalised, presumably resisting generalisable findings. User surveys pointing out unexpected newsfeed results have led to calls for 'algorithmic auditing', a precursor to machine bias critique. As reported in the technical press, querying social media ad interfaces shows highly segmented audiences (including racist ones, such as publics to target for 'jew haters', among other available keyword audiences for sale).
<br>
These ad interface results could be repurposed to show which population segments (as defined by the platforms) are driving the content choices reflected in the results served. How large are these discriminatory segments? Capturing, auditing, or repurposing results are diagnostic practices, identifying the circumstances under which machines could or ought to be retrained. The larger question, however, concerns how to retrain the machine. One approach lies in query design -- fashioning queries so as to 're-bias' the results. Others concern corpus development: in stock photography, for example, there have been efforts to reimagine ('re-image') women (as in the well-known case of Getty Images' 'Lean In Collection'), however much the images are often used out of context, as has been found. Yet another approach concerns training and maturing research accounts to trigger controlled algorithmic responses.
<br>
The Digital Methods Summer School is interested in contributing not only to interpretations of celebrated cases of algorithmic or machine bias, but also in providing diagnostic, query-related, research account and corpus-building research practices that address the matter more conceptually.
<br>
Expanding the case study collection is also of interest; age discrimination in Facebook ad interfaces (an American theme) is a telling recent case study of in-built rather than organic machine bias, but the international landscape may contribute more cases to bias detection, as is the aim of the Summer School. On Twitter, feminist bots strive to keep the #metoo space serious now that spam has arrived. Which other practices of remaining on topic may be found, and how may their successes and complications be characterised? There is also the question of the ramifications of conceptual contributions to re-biasing for big data science. Which practical contributions could be made to big data critique?
<br>
## APPLICATIONS: KEY DATES

To apply for the Digital Methods Summer School 2018, please use the University of Amsterdam Summer School form. If that form is not working, please send (i) a one-page letter explaining how digital methods training would benefit your current work, (ii) a CV (with full postal address), (iii) a copy of your passport (details page only), (iv) a headshot photo, and (v) a 100-word bio (to be included in the Summer School welcome package). Mark your application 'DMI Training Certificate Program' and send it to summerschool@digitalmethods.net.
<br>
* 4 May: Deadline for applications.
* 7 May: Notifications. Accepted participants will receive a welcome package in mid June, which includes a reader, a day-to-day schedule, and a face book of all participants.
* 18 June: Deadline for Summer School fee payments. Participants must send proof of payment by this date.
<br>
The Summer School fee is EUR 895. The programme is open to PhD candidates and motivated scholars as well as to research master's students and advanced master's students. Data journalists, artists, and research professionals are also welcome to apply. Accepted applicants will be informed of the bank transfer details upon notice of acceptance to the Summer School on 7 May. Note: University of Amsterdam students are exempt from tuition and should state on the application form (under tuition fee remarks) that they wish to apply for a fee waiver. Please also provide your student number.
<br>
Any questions may be addressed to the Summer School coordinators, Esther Weltevrede and Fernando van der Vlist, at summerschool@digitalmethods.net. Informal queries may be sent to this email address as well.
_______________________________________________
EASST's Eurograd mailing list
Eurograd (at) lists.easst.net
            