“Machines That Fail Us”, Episode 3: Errors and biases: tales of algorithmic discrimination

In the third episode of “Machines That Fail Us” we focus on how civil society organizations are dealing with AI errors and algorithmic injustice in Europe, and how they are actively pushing for a fairer and more just future with these technologies. The societal impact of artificial intelligence is now a political topic of primary interest, and regulators and policymakers, especially in Europe, are progressively integrating human rights safeguards into tech policies and regulations. Yet issues of data justice and equality in relation to the use of artificial intelligence and algorithms were not always on the table.

Civil society organizations have been pushing for more attention to the frequently controversial implications of these technologies, underlining the numerous social justice issues they raise. As we argue with “Machines That Fail Us,” AI errors aren’t just simple technical glitches; they represent deeper, systemic issues intertwined with broader societal concerns. How has civil society responded to these challenges, and how is the struggle for algorithmic justice progressing? The third episode of “Machines That Fail Us” explores these questions together with Angela Müller, Executive Director of AlgorithmWatch CH and Head of AlgorithmWatch’s Policy & Advocacy team. Host: Dr. Philip Di Salvo.


The “Machines That Fail Us” podcast is made possible thanks to a grant provided by the Swiss National Science Foundation (SNSF)’s “Agora” scheme. The podcast is produced by The Human Error Project team in cooperation with the Communication office of the Universität St. Gallen (HSG), with postproduction curated by Podcastschmiede. Philip Di Salvo, who works as a researcher and lecturer at the HSG’s Institute for Media and Communications Management and has been part of The Human Error Project since 2022, is the main host of the podcast. Episodes will be released on the HSG website, on The Human Error Project website, and on all major audio and podcasting platforms.