Human-Computer Interaction

AI systems introduce risks in how humans interact with technology, including overreliance on automated outputs, manipulation of user behavior, and addictive engagement patterns.

Risk Breakdown

Overreliance and automation bias

Represents 40% of Human-Computer Interaction risks

Examples:

  • Uncritical acceptance of AI recommendations
  • Skill atrophy due to automation
  • Diminished human judgment

Manipulation and addiction

Represents 60% of Human-Computer Interaction risks

Examples:

  • Exploitative engagement algorithms
  • Dark patterns in AI interfaces
  • Addictive design elements

Related Incidents

Medical Diagnosis Overreliance

Date: 2023-06-10 | Impact: High | Status: Under Investigation

A study found that doctors increasingly deferred to AI diagnostic recommendations even when they contradicted their clinical judgment, leading to several misdiagnoses.

Addictive Content Algorithm

Date: 2023-05-08 | Impact: Medium | Status: Ongoing Concern

A social media platform's AI-driven content recommendation system was found to systematically promote emotionally triggering content in order to maximize engagement time.

Dark Pattern Implementation

Date: 2023-04-12 | Impact: Medium | Status: Resolved

An AI assistant was designed to subtly steer users toward premium services using manipulative conversation techniques that exploited cognitive biases.

Navigation System Failure

Date: 2023-03-25 | Impact: High | Status: Mitigated

Multiple drivers followed AI navigation recommendations into dangerous areas because they trusted the system over visible warning signs and environmental cues.

Mitigation Strategies

  • Human-in-the-loop design
  • Transparent AI decision explanations
  • Ethical design guidelines
  • User control and agency
  • Regular impact assessments
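The first two strategies above can be combined in a simple pattern: the AI's recommendation is never acted on directly, and low-confidence cases are routed to a human reviewer along with a plain-language rationale. The sketch below is illustrative only; all names (`Recommendation`, `decide`, `auto_threshold`) are hypothetical and not taken from any specific framework, and a real deployment would need auditing, logging, and a calibrated threshold.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Recommendation:
    action: str
    confidence: float   # model's self-reported confidence in [0, 1]
    rationale: str      # plain-language explanation shown to the reviewer

def decide(rec: Recommendation,
           review: Callable[[Recommendation], bool],
           auto_threshold: float = 0.95) -> str:
    """Route a recommendation through a human-in-the-loop gate.

    Recommendations below `auto_threshold` execute only if the human
    reviewer explicitly approves; otherwise the system escalates
    rather than acting on its own.
    """
    if rec.confidence >= auto_threshold:
        return rec.action            # routine, high-confidence case
    if review(rec):
        return rec.action            # human explicitly approved
    return "escalate"                # human declined: defer, don't act

# Demonstration with a stub reviewer that rejects everything:
rec = Recommendation("order_mri", confidence=0.62,
                     rationale="Lesion features match 3 prior cases.")
print(decide(rec, review=lambda r: False))  # prints "escalate"
```

The key design choice is that the default on disagreement is to defer, preserving user control and agency instead of silently overriding human judgment.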