Don’t date robots — their privacy policies are terrible

Illustration by Alex Castro / The Verge

Talkie Soulful Character AI, Chai, iGirl: AI Girlfriend, Romantic AI, Genesia – AI Friend & Partner, Anima: My Virtual AI Boyfriend, Replika, Anima: AI Friend, Mimico – Your AI Friends, EVA AI Chat Bot & Soulmate, and CrushOn.AI are not just the names of 11 chatbots ready to play fantasy girlfriend — they’re also potential privacy and security risks.

A report from Mozilla examined those AI companion apps and found that many are intentionally vague about the AI training behind the bot, where users' data comes from, how that data is protected, and what the companies' responsibilities would be in the event of a data breach. Only one app (Genesia) met Mozilla's minimum standards for privacy.

Wired says the AI companion apps reviewed by Mozilla “have been downloaded more than 100 million times on Android devices.”

“To be perfectly blunt, AI girlfriends are not your friends. Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you,” writes Misha Rykov in the report. For example, the CrushOn.AI app says in its privacy policy that it may collect sexual health information, prescribed medication, and gender-affirming care data.

Several of the apps also tout mental health benefits. Take Romantic AI, which says it's “here to maintain your mental health.” But inside its terms and conditions, it says, “Romantiс AI MAKES NO CLAIMS, REPRESENTATIONS, WARRANTIES, OR GUARANTEES THAT THE SERVICE PROVIDE [sic] A THERAPEUTIC, MEDICAL, OR OTHER PROFESSIONAL HELP.”

Image: Romantic AI
The Romantic AI website promising it “is here to maintain your MENTAL HEALTH.”

Another chatbot maker, Replika, has expanded beyond just AI companionship to make Tomo, a wellness and talk therapy app with an AI guide that brings the user to a virtual zen island. Since I tried the app, Tomo has published a privacy policy, echoing what I was told by Replika CEO Eugenia Kuyda last month: “We don’t share any information with any third parties and rely on a subscription business model. What users tell Tomo stays private between them and their coach.”

Still, Italy banned Replika last year, prohibiting the company from using the personal data of Italian users because the bot “may increase the risks for individuals still in a developmental stage or in a state of emotional fragility,” according to Reuters.

The internet has been rife with people seeking connections with digital avatars since well before the rise of generative AI. Even OpenAI, which expressly forbids users from creating AI assistants that “foster romantic relationships,” couldn't stop people from building AI girlfriend chatbots on the GPT Store.

People continue to crave connection and intimacy, even if the other person happens to be powered by an AI model. But as Mozilla put it, don’t share anything with the bots that you don’t want other people to know.