• Home
  • The Project
  • The Team
  • Research Diary
  • Research Findings
  • “Machines That Fail Us” Podcasts
  • Contact
The Human Error Project
  • Why Data Feminism Matters

    Gender bias is one of many forms of algorithmic discrimination. However, decades of feminist scholarship have shown that gender bias in digital environments is neither new, nor is it only about questions…

    November 20, 2020
    Algorithmic bias, Gender Bias

Recent Posts

  • “Machines That Fail Us”, Season 2, Episode 2: The Hidden Human Labor Behind AI Systems
  • “Machines That Fail Us”, Season 2, Episode 1: How AI Powers Disinformation
  • When machines fail us: discussing the human error of AI
  • “Machines That Fail Us”, Episode 5: The shape of AI to come
  • “Machines That Fail Us”, Episode 4: Building different AI futures

Categories

  • AI accountability
  • AI bias
  • AI Discourses
  • AI errors
  • AI Ethics
  • Algorithmic bias
  • Anthropology and AI
  • Data Justice
  • Datafied Citizen
  • Digital Profiling
  • Facial Recognition
  • Gender Bias
  • Human nature and AI
  • Human rights
  • Machines That Fail Us
  • Project Team
  • Publications
  • Research Diary
  • Research Findings
  • Talks
  • Uncategorized

Privacy Policy

    © Copyright 2024 The Human Error Project. Developed by Untangled Web