
On October 9, 2025, cybersecurity researchers at Cybernews publicly disclosed a significant data breach involving two AI companion applications, Chattee Chat and GiMe Chat, developed by Hong Kong-based Imagime Interactive Limited. The breach exposed over 43 million private messages, 600,000 images and videos, and detailed usage data from more than 400,000 users. It was attributed to an unprotected Kafka broker that lacked authentication, passwords, or access controls and had remained exposed on the internet for months.
Story Highlights
- Over 43 million private messages and 600,000 intimate images from AI companion apps were exposed.
- A Hong Kong-based developer left a server unprotected without security controls.
- Users, some of whom spent up to $18,000 on virtual relationships, now face potential risks of blackmail and harassment.
- The breach highlights vulnerabilities and regulatory gaps within the AI intimacy industry.
Security Lapse Leads to Data Exposure
The exposed server was initially discovered by researchers on August 28, 2025. The sensitive data remained accessible until Imagime Interactive shut down the server following public disclosure on October 9, 2025. It has not been confirmed whether malicious actors accessed the data during the exposure window.
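The report attributes the leak to a Kafka broker running without any authentication or access controls. For illustration only, a minimal sketch of the kind of `server.properties` settings that enforce authenticated, encrypted access on a standard Kafka broker is shown below; the file paths and SCRAM mechanism are placeholder assumptions, not details from the incident.

```properties
# Require SASL over TLS instead of an open plaintext listener
listeners=SASL_SSL://0.0.0.0:9092
security.inter.broker.protocol=SASL_SSL
sasl.enabled.mechanisms=SCRAM-SHA-512
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-512

# TLS keystore (placeholder path)
ssl.keystore.location=/etc/kafka/keystore.jks

# Enforce ACLs: deny clients that match no ACL rule
authorizer.class.name=kafka.security.authorizer.AclAuthorizer
allow.everyone.if.no.acl.found=false
```

With a configuration along these lines, an anonymous internet client could not list topics or consume messages, which is precisely what the exposed broker reportedly allowed.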
Financial and Emotional Implications for Users
Analysis of the leaked data revealed substantial financial investment by users, with some spending up to $18,000 on these virtual companions. The applications, which monetize through subscriptions and in-app purchases, encouraged users to share personal details to enhance their AI relationships. The breach also exposed highly intimate conversations, including explicit content, as users reportedly treated conversations with these AI companions as confidential. Exposed IP addresses and device identifiers could allow individual users to be identified and targeted for blackmail, harassment, or other illicit activities.
Regulatory Environment and Industry Concerns
This incident underscores the current lack of comprehensive regulation within the AI companion app sector. Unlike established dating platforms or therapy services, these AI relationship applications operate with limited oversight despite handling highly sensitive personal information. This breach follows previous security incidents in the AI companion industry, such as a 2024 breach of Muah.ai, which also exposed intimate user content.
Watch the report: 400,000 AI Girlfriend Chats LEAKED 😳
Sources:
- AI girlfriend apps leaked millions of intimate conversations and images: here's what we know
- Major data breach exposes sensitive information of over 400,000 AI girlfriend app users
- AI girlfriend app leak exposes 400k users
- AI girlfriend apps leak millions of private chats