• Home
  • The Project
  • The Team
  • Research Diary
  • Research Findings
  • “Machines That Fail Us” Podcasts
  • Contact
The Human Error Project
  • Reporting AI Errors? How News Media Sensationalizes AI Fallacy in Human Profiling

    by Rahi Patra

    AI systems and algorithmic logics are never 100% accurate. Even at 97% accuracy there is always a 3% margin of error, and these errors – when applied to…

    October 30, 2023
    AI Discourses, AI errors, AI Ethics, Data Justice, Human nature and AI, Research Findings, Uncategorized

Recent Posts

  • “Machines That Fail Us”, Season 2, Episode 2: The Hidden Human Labor Behind AI Systems
  • “Machines That Fail Us”, Season 2, Episode 1: How AI Powers Disinformation
  • When machines fail us: discussing the human error of AI
  • “Machines That Fail Us”, Episode 5: The shape of AI to come
  • “Machines That Fail Us”, Episode 4: Building different AI futures

Categories

  • AI accountability
  • AI bias
  • AI Discourses
  • AI errors
  • AI Ethics
  • Algorithmic bias
  • Anthropology and AI
  • Data Justice
  • Datafied Citizen
  • Digital Profiling
  • Facial Recognition
  • Gender Bias
  • Human nature and AI
  • Human rights
  • Machines That Fail Us
  • Project Team
  • Publications
  • Research Diary
  • Research Findings
  • Talks
  • Uncategorized

Privacy Policy

    © Copyright 2024 The Human Error Project. Developed by Untangled Web