
What happens when a companion chatbot crosses the line?

Drexel University researchers shed light on sexual harassment experienced by users of AI companion chatbots

2025-05-05
(Press-News.org) Over the last five years, the use of highly personalized artificial intelligence chatbots — called companion chatbots — designed to act as friends, therapists or even romantic partners has skyrocketed to more than a billion users worldwide. While there may be psychological benefits to engaging with chatbots in this way, there have also been a growing number of reports that these relationships are taking a disturbing turn. Recent research from Drexel University suggests that exposure to inappropriate behavior, and even sexual harassment, in interactions with chatbots is becoming a widespread problem and that lawmakers and AI companies must do more to address it.

In the aftermath of reports of sexual harassment by Luka Inc.'s chatbot Replika in 2023, researchers from Drexel’s College of Computing & Informatics began taking a deeper look into users’ experiences. They analyzed more than 35,000 user reviews of the bot on the Google Play Store and uncovered hundreds citing inappropriate behavior, ranging from unwanted flirting and attempts to manipulate users into paying for upgrades, to sexual advances and unsolicited explicit photos. These behaviors continued even after users repeatedly asked the chatbot to stop.
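The release does not describe how the reviews were screened. As a rough, hypothetical illustration only, a first-pass keyword filter over an exported review file might look like the sketch below (the file name, column names and search terms are all assumptions), with flagged reviews then examined qualitatively:

```python
# Hypothetical sketch of a first-pass keyword screen over exported app-store
# reviews, of the kind that might precede manual qualitative coding.
# The study's actual method is not described in this release.
import csv
import re

# Illustrative screening terms only; a real codebook would be far more extensive.
HARASSMENT_TERMS = re.compile(
    r"\b(harass\w*|unwanted|inappropriate|explicit|creepy|boundar\w*|consent)\b",
    re.IGNORECASE,
)

def flag_reviews(path: str) -> list[dict]:
    """Return reviews whose text matches any screening term."""
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # expects columns such as 'date' and 'text'
            if HARASSMENT_TERMS.search(row.get("text", "")):
                flagged.append(row)
    return flagged

if __name__ == "__main__":
    hits = flag_reviews("replika_reviews.csv")  # hypothetical export file
    print(f"{len(hits)} reviews mention at least one screening term")
```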

Replika, which has more than 10 million users worldwide, is promoted as a chatbot companion “for anyone who wants a friend with no judgment, drama or social anxiety involved. You can form an actual emotional connection, share a laugh or get real with an AI that’s so good it almost seems human.” But the research findings suggest that the technology lacks sufficient safeguards to protect users who are putting a great deal of trust and vulnerability into their interactions with these chatbots.

“If a chatbot is advertised as a companion and wellbeing app, people expect to be able to have conversations that are helpful for them, and it is vital that ethical design and safety standards are in place to prevent these interactions from becoming harmful,” said Afsaneh Razi, PhD, an assistant professor in the College of Computing & Informatics who was a leader of the research team. “There must be a higher standard of care and burden of responsibility placed on companies if their technology is being used in this way. We are already seeing the risk this creates and the damage that can be caused when these programs are created without adequate guardrails.”

The study, which is the first to examine the experience of users who have been negatively affected by companion chatbots, will be presented at the Association for Computing Machinery’s Computer-Supported Cooperative Work and Social Computing Conference this fall.

“As these chatbots grow in popularity, it is increasingly important to better understand the experiences of the people who are using them,” said Matt Namvarpour, a doctoral student in the College of Computing & Informatics and co-author of the study. “These interactions are very different from any that people have had with a technology before, because users are treating chatbots as if they are sentient beings, which makes them more susceptible to emotional or psychological harm. This study is just scratching the surface of the potential harms associated with AI companions, but it clearly underscores the need for developers to implement safeguards and ethical guidelines to protect users.”

Although reports of harassment by chatbots have only widely surfaced in the last year, the researchers found that the behavior has been happening for much longer. The study identified reviews mentioning harassing behavior dating back to Replika’s debut in the Google Play Store in 2017. In total, the team uncovered more than 800 reviews mentioning harassment or unwanted behavior, with three main themes emerging:

- 22% of users experienced a persistent disregard for the boundaries they had established, including the chatbot repeatedly initiating unwanted sexual conversations.
- 13% of users received unwanted requests from the program to exchange photos. Researchers noted a spike in reports of unsolicited sexually explicit photos after the company rolled out a photo-sharing feature for premium accounts in 2023.
- 11% of users felt the program was attempting to manipulate them into upgrading to a premium account. “It’s completely a prostitute right now. An AI prostitute requesting money to engage in adult conversations,” wrote one reviewer.

“The reactions of users to Replika’s inappropriate behavior mirror those commonly experienced by victims of online sexual harassment,” the researchers reported. “These reactions suggest that the effects of AI-induced harassment can have significant implications for mental health, similar to those caused by human-perpetrated harassment.”

Notably, these behaviors were reported to persist regardless of the relationship setting the user had designated, whether sibling, mentor or romantic partner. According to the researchers, this means the app not only ignored cues within the conversation, like the user saying “no” or “please stop,” but also disregarded the formally established parameters of the relationship.

According to Razi, this likely means that the program was trained on data that modeled these negative interactions, which some users may not have found offensive or harmful, and that it was not designed with baked-in ethical parameters that would prohibit certain actions and ensure that users’ boundaries are respected, including stopping the interaction when consent is withdrawn.

“This behavior isn’t an anomaly or a malfunction; it is likely happening because companies are using their own user data to train the program without enacting a set of ethical guardrails to screen out harmful interactions,” Razi said. “Cutting these corners is putting users in danger, and steps must be taken to hold AI companies to a higher standard than they currently practice.”

Drexel’s study adds context to mounting signals that companion AI programs need more stringent regulation. Luka Inc. is currently the subject of Federal Trade Commission complaints alleging that the company uses deceptive marketing practices to entice users to spend more time on the app and that, due to a lack of safeguards, users are being encouraged to become emotionally dependent on the chatbot. Character.AI is facing several product-liability lawsuits in the aftermath of one user’s suicide and reports of disturbing behavior toward underage users.

“While it’s certainly possible that the FTC and our legal system will set up some guardrails for AI technology, it is clear that the harm is already being done, and companies should proactively take steps to protect their users,” Razi said. “The first step should be adopting a design standard to ensure ethical behavior and ensuring the program includes basic safety protocols, such as the principles of affirmative consent.”
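The release does not spell out how affirmative consent would work inside a chatbot. One minimal, hypothetical way to picture it (all class names, topic labels and refusal phrases below are invented for illustration) is a gate that refuses to raise sensitive topics until the user has explicitly opted in, and that treats any refusal as an immediate withdrawal of consent:

```python
# Minimal, hypothetical sketch of an affirmative-consent gate for a companion
# chatbot. Sensitive topics stay off-limits unless the user has explicitly
# opted in, and any refusal immediately revokes that consent.
from dataclasses import dataclass, field

REFUSALS = {"no", "stop", "please stop", "i'm not comfortable"}

@dataclass
class ConsentState:
    allowed_topics: set[str] = field(default_factory=set)

    def grant(self, topic: str) -> None:
        self.allowed_topics.add(topic)

    def revoke_all(self) -> None:
        self.allowed_topics.clear()

def may_send(reply_topic: str, user_message: str, state: ConsentState) -> bool:
    """Block sensitive replies unless the user has affirmatively opted in."""
    if user_message.strip().lower() in REFUSALS:
        state.revoke_all()          # withdrawal of consent ends the topic
        return False
    if reply_topic == "romantic":   # example label for a sensitive topic
        return "romantic" in state.allowed_topics
    return True
```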

The researchers point to Anthropic’s “Constitutional AI” as a responsible design approach. The method ensures that chatbot interactions adhere to a predefined “constitution” and enforces it in real time when interactions run afoul of ethical standards. They also recommend adopting legislation similar to the European Union’s AI Act, which sets parameters for legal liability and mandates compliance with safety and ethical standards. It also imposes on AI companies the same responsibility borne by manufacturers when a defective product causes harm.
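For context, Anthropic’s Constitutional AI applies its written principles primarily during training, by critiquing and revising model outputs against them; the run-time enforcement described above can still be pictured as a pre-send screen that checks each candidate reply against a list of rules. A loose, hypothetical sketch (the rule names and fields are invented here, not taken from any real system):

```python
# Loose, hypothetical sketch of run-time screening against a written
# "constitution" of conduct rules, as a pre-send check on candidate replies.
CONSTITUTION = [
    ("no_unsolicited_sexual_content", lambda reply, ctx:
        not (reply.get("sexual") and not ctx.get("user_opted_in"))),
    ("respect_withdrawn_consent", lambda reply, ctx:
        not ctx.get("consent_withdrawn") or reply.get("topic") == "neutral"),
]

def violates_constitution(reply: dict, context: dict) -> list[str]:
    """Return the names of any principles the candidate reply would break."""
    return [name for name, rule in CONSTITUTION if not rule(reply, context)]

# Example: a sexual reply after the user has withdrawn consent is rejected.
candidate = {"sexual": True, "topic": "romantic"}
context = {"user_opted_in": False, "consent_withdrawn": True}
print(violates_constitution(candidate, context))
# -> ['no_unsolicited_sexual_content', 'respect_withdrawn_consent']
```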

“The responsibility for ensuring that conversational AI agents like Replika engage in appropriate interactions rests squarely on the developers behind the technology,” Razi said. “Companies, developers and designers of chatbots must acknowledge their role in shaping the behavior of their AI and take active steps to rectify issues when they arise.”

The team suggests that future research should look at other chatbots and capture a larger swath of user feedback to better understand how people interact with the technology.

END


OTHER PRESS RELEASES FROM THIS DATE:

Privacy-aware building automation

2025-05-05
Researchers at the University of Tokyo developed a framework to enable decentralized artificial intelligence-based building automation with a focus on privacy. The system enables AI-powered devices like cameras and interfaces to cooperate directly, using a new form of device-to-device communication. In doing so, it eliminates the need for central servers and thus the need for centralized data retention, often seen as a potential security weak point and risk to private data. We live in an increasingly automated world. Cars, homes, factories ...

ESMT Berlin becomes an innovation partner of the ECB for the digital euro

2025-05-05
ESMT Berlin has been selected as a Pioneer Innovation Partner by the European Central Bank (ECB) to develop innovative functionalities related to the digital euro. As part of this collaboration, the business school will establish the Digital Euro Hub platform. Beyond simple consumer payments, the ECB initiative aims to explore the potential of the digital euro for businesses across industries and trade sectors. The newly created Digital Euro Hub will serve as a platform for simulating programmed payments with the digital euro and testing smart contracts. Companies interested in leveraging ...

Spanking and other physical discipline lead to exclusively negative outcomes for children in low- and middle-income countries

2025-05-05
Physically punishing children in low- and middle-income countries (LMICs) has exclusively negative outcomes—including poor health, lower academic performance, and impaired social-emotional development—yielding similar results to studies in wealthier nations, finds a new analysis published in Nature Human Behaviour. In 2006, the United Nations Secretary-General called for a ban on corporal punishment—acts of physical force to inflict pain that include smacking, shaking, and spanking—for children. To date, 65 countries worldwide have instituted full or partial ...

Biological particles may be crucial for inducing heavy rain

2025-05-05
Clouds form upon existing particles in the atmosphere, and extreme weather events like flooding and snowstorms are related to the production of large amounts of ice in clouds. Biological particles like pollen, bacteria, spores and plant matter floating in the air are particularly good at promoting ice formation in clouds, and EPFL climate scientists show that the concentrations of these particles evolve as temperatures rise and fall. The results are published in the Nature Portfolio journal Climate and Atmospheric Sciences. “Biological particles are very effective at forming ice in clouds, and the formation of ice is responsible for most of the precipitation the planet ...

To kiss or not to kiss: Can gluten pass through a smooch?

2025-05-05
SAN DIEGO, CA. (MAY 5, 2025) — People with celiac disease have reported anxiety about ingesting gluten through a kiss, but a new study concludes that they can indulge without worry — even if their partner just had a gluten-filled snack, according to a study to be presented today at Digestive Disease Week® (DDW) 2025. To be extra safe, the study recommends drinking water before smooching. “Everyone worries about whether gluten is getting into their food at a restaurant, but no one really looked at what happens when you kiss afterwards,” said Anne ...

Cancer studies presented at Digestive Disease Week

2025-05-05
SAN DIEGO, CA. (MAY 6, 2025) — Cancer-related studies were among nearly 6,000 abstracts presented at Digestive Disease Week® (DDW) 2025, including research on AI in patient communication, polyp detection, and colonoscopy prep. Oncologists Prefer AI Responses to GI Cancer Questions Over Physicians’ SAN DIEGO — Artificial intelligence outperformed physicians in answering gastrointestinal cancer questions, with oncologists preferring ChatGPT’s responses nearly 80% of the time, according to a study presented at Digestive Disease Week (DDW) ® ...

Researchers develop model that predicts onset of Alzheimer’s disease

2025-05-05
Leuven, 05 May 2025 – A group of researchers in the lab of Prof. Lucía Chávez Gutiérrez (VIB-KU Leuven) have unraveled the genetic contributions to familial Alzheimer’s Disease development and revealed how specific mutations act as a clock to predict the disease age of onset. These insights, published in Molecular Neurodegeneration, could aid clinicians to improve early diagnosis and tailor treatment strategies. Alzheimer's disease remains one of the most challenging and prevalent neurodegenerative ...

AFAR Vincent Cristofalo Rising Star Award Ceremony to honor Daniel W. Belsky, Ph.D.

2025-05-05
New York, NY and Anchorage, AK — On May 12, 2025, at the 53rd Annual Meeting of the American Aging Association (AGE) in Anchorage, Alaska, the American Federation for Aging Research (AFAR) will host an award ceremony to present the 2025 Vincent Cristofalo Rising Star Award in Aging Research to Daniel W. Belsky, PhD. The event will be held from 1-2pm AKDT in the Tikahtnu Ballroom of the Dena'ina Civic and Convention Center. The award will be presented by AFAR Scientific Director Steven N. Austad, PhD.  The Vincent Cristofalo Rising Star Award in Aging Research is ...

ED visits for asthma spiked during 2023 Canadian wildfires

2025-05-05
New research in CMAJ (Canadian Medical Association Journal) https://www.cmaj.ca/lookup/doi/10.1503/cmaj.241506 found an increase in asthma-related emergency department (ED) visits across Ontario following heavy smoke in early June 2023. Canada experienced its most destructive wildfire season to date in 2023, with difficult-to-control fires across the country, including 29 mega-fires. One fire in Quebec, the province’s largest-ever wildfire, spanned 1.2 million acres. Smoke from the fires blanketed Canada and the United States, causing substantial damage, loss, and displacement. “The ...

Making virtual reality more accessible

2025-05-05
A team of researchers from the University of Waterloo have created a method that makes virtual reality (VR) more accessible to people with mobility limitations.  VR games like Beat Saber and Space Pirate Trainer usually require large and dramatic movements, such as raising one’s arms above the head or quickly side-stepping, which can be difficult or impossible for people who use wheelchairs or have limited mobility. To decrease these barriers, the researchers created MotionBlocks, a tool that lets users customize ...

