• Home
  • The Project
  • The Team
  • Research Diary
  • Research Findings
  • “Machines That Fail Us” Podcasts
  • Contact
The Human Error Project
  • AI Errors and the Human Body

    Over the last six months, multiple AI errors have made headlines, with algorithms accused of being racist, sexist, or harmful in their ‘reading of the human body’. If we want to really…

    February 2, 2021
    Algorithmic bias
  • Why Data Feminism Matters

    Gender bias is one of many forms of algorithmic discrimination. However, decades of feminist scholarship have shown that gender bias in digital environments is neither new, nor only about questions…

    November 20, 2020
    Algorithmic bias, Gender Bias

Recent Posts

  • “Machines That Fail Us”, Season 2, Episode 3: Who governs AI?
  • “Machines That Fail Us”, Season 2, Episode 2: The Hidden Human Labor Behind AI Systems
  • “Machines That Fail Us”, Season 2, Episode 1: How AI Powers Disinformation
  • When machines fail us: discussing the human error of AI
  • “Machines That Fail Us”, Episode 5: The shape of AI to come

Categories

  • AI accountability
  • AI bias
  • AI Discourses
  • AI errors
  • AI Ethics
  • Algorithmic bias
  • Anthropology and AI
  • Data Justice
  • Datafied Citizen
  • Digital Profiling
  • Facial Recognition
  • Gender Bias
  • Human nature and AI
  • Human rights
  • Machines That Fail Us
  • Project Team
  • Publications
  • Research Diary
  • Research Findings
  • Talks
  • Uncategorized

Privacy Policy

    © Copyright 2024 The Human Error Project. Developed by Untangled Web