
AI girlfriends will only break your heart, privacy experts warn

Replika brought back its erotic roleplay feature on Friday.

A survey of the burgeoning AI romance app space revealed a scary truth. The chatbots foster “toxicity” and relentlessly pry user data, a Mozilla Foundation study found. One app can collect info on users’ sexual health, prescriptions, and gender-affirming care.

There’s a potentially dangerous reality looming beneath the veneer of AI romance, according to a new Valentine’s Day-themed study, which concluded that the chatbots can be a privacy nightmare.

Internet nonprofit The Mozilla Foundation took stock of the burgeoning landscape, reviewing 11 chatbots and concluding that all were untrustworthy — falling within the worst category of products it reviews for privacy.

“Although they are marketed as something that will enhance your mental health and well-being,” researcher Misha Rykov wrote of romantic chatbots in the report, “they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”

According to the foundation’s survey of the space, 73% of the apps don’t share how they manage security vulnerabilities, 45% allow weak passwords, and all but one (Eva AI Chat Bot & Soulmate) share or sell personal data.

Furthermore, the privacy policy for CrushOn.AI states it can collect information on users’ sexual health, prescription meds, and gender-affirming care, per the Mozilla Foundation.

Some apps offer chatbots whose character descriptions include violence or underage abuse, while others warn that the bots could be unsafe or hostile.

The Mozilla Foundation noted that in the past, apps had encouraged dangerous behavior, including suicide (Chai AI) and an assassination attempt on the late Queen Elizabeth II (Replika).

Eva AI Chat Bot & Soulmate and CrushOn.AI didn’t respond to Business Insider’s request for comment. A representative for Replika told BI: “Replika has never sold user data and does not, and has never, supported advertising either. The only use of user data is to improve conversations.”

For those who find the prospect of AI romance impossible to resist, the Mozilla Foundation urges several precautions, including not saying anything you wouldn’t want a colleague or family member to read, using a strong password, opting out of AI training, and limiting the app’s access to other mobile features such as your location, microphone, and camera.

“You shouldn’t have to pay for cool new technologies with your safety or your privacy,” the report concluded.

Read the original article on Business Insider
