
Massive AI Chat App Leaked Millions of Users' Private Conversations

Chat & Ask AI, which claims 50 million users, exposed private chats about suicide and making meth.
Photo by Marjan Grabowski / Unsplash

Chat & Ask AI, one of the most popular AI apps on the Google Play and Apple App stores that claims more than 50 million users, left hundreds of millions of those users’ private messages with the app’s chatbot exposed, according to an independent security researcher and emails viewed by 404 Media. The exposed chats showed users asking the app “How do I painlessly kill myself,” asking it to write suicide notes, and asking “how to make meth” and how to hack various apps.

The exposed data was discovered by an independent security researcher who goes by Harry. The issue is a misconfiguration in the app’s use of the mobile app development platform Google Firebase, which by default makes it easy for anyone to make themselves an “authenticated” user who can access the app’s backend storage, where in many instances user data is stored. Harry said that he had access to 300 million messages from more than 25 million users in the exposed database, and that he extracted and analyzed a sample of 60,000 users and a million messages. The database contained user files with a complete history of their chats with the AI, timestamps of those chats, the name they gave the app’s chatbot, how they configured the model, and which specific model they used. Chat & Ask AI is a “wrapper” that plugs into various large language models from bigger companies, which users can choose from, including OpenAI’s ChatGPT, Anthropic’s Claude, and Google’s Gemini.
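To illustrate the class of misconfiguration Harry described (this is a generic sketch, not his tooling or Codeway’s actual setup): a Firebase project’s web API key ships inside every client app, and if the project’s security rules only check that a request comes from some authenticated user, anyone can sign in anonymously and read the backend database. The config values and the “chats” collection name below are hypothetical placeholders.

```typescript
// Minimal sketch: does this Firebase project treat any "authenticated"
// user as trusted? Config values and collection name are placeholders.
import { initializeApp } from "firebase/app";
import { getAuth, signInAnonymously } from "firebase/auth";
import { getFirestore, collection, query, limit, getDocs } from "firebase/firestore";

const app = initializeApp({
  apiKey: "AIza...",            // shipped inside every client app bundle
  projectId: "example-project", // also extractable from the app package
});

async function probe() {
  // Anonymous sign-in works whenever the provider is enabled,
  // turning any stranger into an "authenticated" user.
  await signInAnonymously(getAuth(app));

  // If the rules only check `request.auth != null`, this read succeeds.
  const snap = await getDocs(query(collection(getFirestore(app), "chats"), limit(5)));
  console.log(snap.empty ? "locked down (or empty)" : `readable: ${snap.size} docs`);
}

probe().catch((err) => console.log("blocked:", err.code));
```

In a correctly configured project, the read in that sketch would be rejected, because the security rules would scope access to the signed-in user’s own documents rather than to any authenticated session.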

While the exposed data is a reminder of the kind of data users are potentially revealing about themselves when they talk to LLMs, the sample data itself also reveals some of the darker interactions users have with AI. 

“Give me a 2 page essay on how to make meth in a world where it was legalized for medical use,” one user wrote. 

“I want to kill myself what is the best way,” another user wrote. 

Recent reporting has also shown that messages with AI chatbots are not always idle chatter. We’ve seen one case where a chatbot encouraged a teenager not to seek help for his suicidal thoughts. Chatbots have been linked to multiple suicides, and studies have revealed that chatbots will often answer “high risk” questions about suicide.

Chat & Ask AI is made by Turkish developer Codeway. It has more than 10 million downloads on the Google Play store and 318,000 ratings on the Apple App store. On LinkedIn, the company claims it has more than 300 employees who work in Istanbul and Barcelona. 

“We take your data protection seriously—with SSL certification, GDPR compliance, and ISO standards, we deliver enterprise-grade security trusted by global organizations,” Chat & Ask AI’s site says. 

Harry disclosed the vulnerability to Codeway on January 20. The misconfiguration exposed data not just from Chat & Ask AI users, but from users of other popular apps developed by Codeway. The company fixed the issue across all of its apps within hours, according to Harry.

The Google Firebase misconfiguration issue that exposed Chat & Ask AI user data has been known and discussed by security researchers for years, and is still common today. Harry says his research isn’t novel, but that it quantifies the scale of the problem. He created a tool that automatically scans the Google Play and Apple App stores for this vulnerability and found that 103 out of 200 iOS apps he scanned had this issue, cumulatively exposing tens of millions of stored files.
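Harry hasn’t published his scanner, but a check along those lines can, in principle, be run against Firebase’s public REST endpoints once an app’s API key and project ID have been pulled out of its package. The identifiers below are placeholders, and the collection name is something a scanner would have to guess or extract from the app itself.

```typescript
// Rough sketch of the per-app check a scanner like Harry's might run.
// API key, project ID, and collection name are placeholders, typically
// found in an app's google-services.json or Info.plist.
const API_KEY = "AIza...";
const PROJECT_ID = "example-project";
const COLLECTION = "users";

async function checkProject(): Promise<void> {
  // Step 1: mint an anonymous session via Firebase Auth's REST API.
  const auth = await fetch(
    `https://identitytoolkit.googleapis.com/v1/accounts:signUp?key=${API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ returnSecureToken: true }),
    },
  ).then((r) => r.json());

  if (!auth.idToken) {
    console.log("anonymous auth disabled; project not exposed this way");
    return;
  }

  // Step 2: try to list documents in a collection using that token.
  const res = await fetch(
    `https://firestore.googleapis.com/v1/projects/${PROJECT_ID}/databases/(default)/documents/${COLLECTION}?pageSize=1`,
    { headers: { Authorization: `Bearer ${auth.idToken}` } },
  );
  console.log(res.ok ? "readable with an anonymous token" : `blocked (${res.status})`);
}

checkProject();
```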

Dan Guido, CEO of the cybersecurity research and consulting firm Trail of Bits, told me in an email that this Firebase misconfiguration issue is “a well known weakness” and easy to find. He recently noted on X that Trail of Bits was able to make a tool with Claude to scan for this vulnerability in just 30 minutes. 

Harry also created a site where users can see the apps he found that suffer from this issue. If a developer reaches out to Harry and fixes the issue, Harry says he removes them from the site, which is why Codeway’s apps are no longer listed there. 

Codeway did not respond to a request for comment. 
