Mathematician & Interdisciplinary PhD candidate
Research at the intersection of Mathematics, Legal Philosophy, and Science and Technology Studies
me [at] paolalopez [dot] eu
- PhD candidate | Legal Philosophy | University of Vienna | click
- Visiting Researcher | Politics of Digitalization | WZB Berlin Social Science Center | click
- Researcher | The Sustainable Computing Lab | click
Lopez, Paola (2019): Reinforcing Intersectional Inequality via the AMS Algorithm in Austria. Conference Proceedings of the STS Conference Graz 2019, 6th–7th May 2019, pp. 289–309. DOI: 10.3217/978-3-85125-668-0-16
Published article available here: click
Preprint available here: LOPEZ_Preprint
Abstract. This paper examines the so-called AMS Algorithm from a mathematical perspective for a non-mathematical audience: the algorithmic system constitutes a predictive model that the Public Employment Service Austria (AMS) will use from 2020 onwards to algorithmically classify job-seekers into three groups, each with different access to AMS support resources, according to their predicted chances on the labour market. Since the features gender, age, childcare responsibilities, disability and citizenship are explicitly implemented in the model and are thus linked to the availability of resources, this algorithmic system must be considered highly problematic. The paper is part of an ongoing research project, and it identifies three conceptual building blocks of the AMS Algorithm that all rest on human decisions and in which obvious societal bias can be located. Furthermore, the model serves as an illustrative example for the larger question of what can be expected when predictions are based solely on data that describes the past: if such predictions result in unquestioned and confirmatory measures, such as the redistribution of resources, inequality may be reproduced and reinforced. When these measures are applied to vulnerable and highly dependent target groups, such as job-seekers, the effects are even more drastic: in a first step, these predictive models depict the reality of discrimination; in a second step, they normatively reinforce it as a supposedly objective fact; and in a third step, they return it to the social sphere by means of the resulting measures.
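The score-and-threshold logic described in the abstract (a predictive model maps a person's features to a score, and cut-offs on that score assign one of three resource groups) can be sketched as follows. This is a minimal, hypothetical illustration: the logistic form, the feature names, the weights and the cut-off values are all invented for exposition and are not the actual AMS model or its coefficients.

```python
# Hypothetical sketch of a score-and-threshold classification, loosely
# modelled on the system described in the abstract. All numbers and
# feature names are invented for illustration only.
import math

# Invented coefficients: each binary feature shifts the predicted
# "labour-market integration chance" up or down. Explicit demographic
# features such as these are precisely what makes such a model problematic,
# because they directly tie group membership to resource access.
WEIGHTS = {
    "intercept": 0.5,
    "recent_employment": 0.7,
    "is_female": -0.3,
    "age_over_50": -0.6,
    "childcare_duties": -0.4,
    "disability": -0.5,
    "non_citizen": -0.4,
}

def predicted_chance(person: dict) -> float:
    """Logistic score in (0, 1) computed from binary features."""
    z = WEIGHTS["intercept"] + sum(
        w for feature, w in WEIGHTS.items()
        if feature != "intercept" and person.get(feature, False)
    )
    return 1.0 / (1.0 + math.exp(-z))

def classify(person: dict) -> str:
    """Map the score to one of three groups via invented cut-offs."""
    p = predicted_chance(person)
    if p >= 0.66:
        return "high chances"
    if p >= 0.25:
        return "medium chances"
    return "low chances"
```

The sketch makes the paper's core point concrete: flipping a single demographic feature changes the score and can move a person across a group boundary, so the model's past-derived weights directly redistribute present resources.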
In the Media
- Netzpolitik.org, December 22, 2020 [German] click
- Futurezone.at, December 21, 2020 [German] click
- Futurezone.at, August 24, 2020 [German] click
- Arbeit&Wirtschaft Blog, June 15, 2020 [German] click
- Futurezone.at, February 24, 2020 [German] click
- Podcast Episode | NPP Podcast, Netzpolitik.org, November 30, 2019 [German] click
- Netzpolitik.org, October 10, 2019 [German] click
- AlgorithmWatch.org, October 6, 2019 [English] click
- Futurezone.at, October 3, 2019 [German] click
- Kurier, September 28, 2019 in print [German] click
- Futurezone.at, September 27, 2019 [German] click
Algorithmic inequality. Intersectional divisions in the digital society
STS Conference Graz | postponed to May 3-5, 2021 | click
Call for Papers. In recent years we have seen a new wave of politicization of digital technology, from the critique of the monopoly positions of digital platform companies (Srnicek 2016) to the interdisciplinary examinations of unfair and opaque algorithmic decision-making systems (Angwin et al. 2016, Chouldechova 2017). As ubiquitous computing environments, machine learning based prediction systems and the platformization of economic and political life advance into more and more aspects of society, a myriad of new interactions and entanglements with dynamics of social exclusion and systemic inequality are evolving. Examples of the many pressing issues are race and gender biases in facial recognition systems (Buolamwini/Gebru 2018), new economic insecurities arising from the spread of crowd work (Gerber/Krzywdzinski 2019), discrimination embedded in the prediction systems used in welfare allocation (Eubanks 2018, Lopez 2019) and the amplification of existing disparities through feedback loops in predictive policing (Ensign et al. 2018). STS has a long tradition of analyzing the relation between structural inequalities and scientific and technological practices (e.g. Hess et al. 2017); the challenge now lies in updating these approaches to grasp the current shifts in power structures and inequalities implied by digital technologies guiding various governance and coordination practices. For instance, the wide range of STS work on classification and inequality (Bowker/Star 2000) can be transferred to present-day data-driven techniques, and the emerging scholarship on digital STS (Vertesi/Ribes 2019) promises fruitful perspectives on social inequalities. In order to strengthen a genuine STS perspective on the new configurations of intersectional inequalities arising from the current phase of the digital transformation, we invite contributions that address some of the following questions:
- How and where do digital artifacts and practices exacerbate, reinforce, mitigate or transform existing inequalities?
- Which kinds of reconfigurations of intersecting axes of gendered, racialized and economic vulnerabilities can be observed or are to be expected in the near future?
- Which new dimensions of structural inequality are emerging from the role of digital technology in our current societies, and which existing dimensions are merely rendered visible by digital technologies?
- In which ways do existing technologies that originally aimed to counteract individual human biases actually amplify existing societal biases?
- How does the very materiality of data-based infrastructures relate to global inequalities?
- How can earlier discourses around a “digital divide” and more recent surveillance discourses be brought together, showing that visibility, as well as representation, is always ambivalent?
- How can perspectives like xenofeminism (Hester 2018) and concepts like surveillance capitalism (Zuboff 2019) be integrated into science and technology studies?
- How can technical terms related to the modern, probabilistic AI paradigm, such as “accuracy” and “prediction”, be approached through a critical STS analysis?
- How can we conceptually grasp the individualization of group-based inequalities through the platform user paradigm of “personalization” and which ramifications are arising for social justice movements?
- How can we use established perspectives and methodologies to study digital inequalities and which new analytical lenses need to be developed?
Co-Founder of AK MatriX
The AK MatriX (Arbeitskreis Mathematik trans- und interdisziplinär | Working Group on Trans- and Interdisciplinarity in Mathematics) studies the societal and political issues that arise when mathematical methods are applied in social contexts. click