Tag: emotional AI

  • From the Erosion of Human Bonds to the Future Workforce: Growing Up With AI

What happens to human bonds when artificial intelligence stops being merely a tool and starts occupying an intimate place in our emotional, educational, and professional lives? This is the underlying question running through the interview you are about to read, a deep, unsettling, and urgently relevant conversation with Rachida Bouaiss. Rachida is a strategist, storyteller, …

  • Securing Tomorrow: Navigating Cyber Risk in the Age of Generative AI

In this interview, we explore the technical frontiers of AI security with Quentin Cozette, a specialist in cybersecurity and digital governance whose work bridges low-level hardware vulnerabilities and high-level regulatory challenges. Drawing on his experience in information systems security, risk management, and emerging AI regulation, Quentin offers a rare perspective that combines technical depth with …

  • Can AI Regulate (or Dysregulate) Your Nervous System?

As emotionally responsive artificial intelligence becomes increasingly embedded in daily life, a deeper question is beginning to surface. Beyond productivity, convenience, or entertainment, how do these systems interact with the human nervous system? What happens when an algorithm begins to simulate presence, comfort, validation, and even love? From AI companions to chat-based emotional support systems …

  • The Rise of Emotional Dark Patterns: When AI Says ‘I Love You’

By Ana Catarina De Alencar. 1. Manipulative Design in AI Companions: Relational AI technologies, including AI companions like Replika, Character.ai, and Chai, represent a significant shift in human-computer interaction. These systems are explicitly marketed as emotional partners or friends, engaging users in conversations that simulate empathy, affection, and understanding. While this design is often justified as …