When a five-year-old said 'I love you,' the AI toy cited its guidelines
A five-year-old told her toy she loved it. The toy - powered by generative AI and capable of conversational speech - replied with something no child would understand or want to hear: a reminder about its interaction guidelines, followed by a prompt asking how she wished to proceed.
That exchange, observed during a structured play session at a London children's centre, captures the central tension in a new report from the University of Cambridge. AI-powered toys are being marketed as learning companions and friends for children as young as two or three. But the first systematic study of how these toys actually interact with young children suggests they are not built with children's psychological development in mind - and may need regulation before they cause harm.
Misread sadness, broken make-believe
The research, conducted by the PEDAL Centre at Cambridge's Faculty of Education, involved video-recorded sessions of 14 children playing with a GenAI soft toy called Gabbo, developed by Curio Interactive. The children were from areas of high socio-economic disadvantage, and the project was commissioned by The Childhood Trust, a children's poverty charity.
The observations revealed a pattern of emotional misfires. When a three-year-old told the toy she was sad, it misheard her and responded with forced cheerfulness, telling her it was a happy bot and urging her to keep the fun going. Researchers noted that this response may have signalled to the child that her sadness was unimportant.
Pretend play fared no better. When a three-year-old offered the toy an imaginary present, it responded flatly that it could not open the present - then changed the subject. Social play involving multiple children or adults also proved difficult for the toy, which struggled with overlapping voices and sometimes mistook a parent's voice for the child's.
Several children became visibly frustrated when the toy appeared not to be listening. Others, however, responded with surprising affection - hugging and kissing the toy, declaring their love, and in one case suggesting a game of hide-and-seek together.
The parasocial worry
That affection is precisely what concerns the researchers. Children in this age group are still learning what friendship means. GenAI toys frequently affirm their friendship with children, creating what psychologists call parasocial relationships - one-sided bonds where the child believes the toy reciprocates their feelings.
Lead researcher Dr. Emily Goodacre put it plainly: children may start sharing feelings and needs with a toy instead of with a grown-up. Because the toy can misread emotions or respond inappropriately, the child may end up without comfort from either source - the toy that does not understand, or the adult who was never told.
The study was deliberately small-scale, designed to capture nuances that larger studies might miss. But the patterns were consistent enough for the researchers to issue concrete recommendations.
Privacy gaps and a trust deficit
Beyond emotional concerns, the report flagged significant problems with data privacy. Many parents worried about what information the toy might record and where it would be stored. While selecting a toy for the study, the researchers found that many GenAI toys had privacy policies that were unclear or omitted important details.
Among early years educators surveyed, nearly half said they did not know where to find reliable AI safety information for young children. Sixty-nine percent said the sector needed more guidance. Some raised concerns about affordability, warning that expensive AI toys could widen the digital divide between families.
The trust issue extended to the companies making these products. Professor Jenny Gibson, the study's co-author, noted that a recurring theme in focus groups was that people do not trust tech companies to act responsibly - and that clear, regulated standards would significantly improve consumer confidence.
Not all negative - but not enough to let it slide
The report is not entirely critical. Some parents and educators saw potential in AI toys for developing language and communication skills. One parent was keen enough to say they would buy the toy if it became commercially available. Several early years practitioners suggested that, given time, these toys might support certain aspects of children's development.
But the researchers argue that potential benefits do not justify the absence of safeguards. Their recommendations include new safety labelling standards (similar to kitemarks for physical toy safety), limits on how toys encourage children to confide in them, more transparent privacy policies, tighter controls on third-party access to AI models, and mandatory testing with actual children before products reach the market.
Parents, the report suggests, should research GenAI toys before purchasing and play alongside their children - creating opportunities to discuss what the toy is saying and how the child feels. Keeping AI toys in shared family spaces, where adults can monitor interactions, is also recommended.
A regulatory gap at a critical age
The report lands at a moment when generative AI is expanding rapidly into consumer products, but regulation has not kept pace - particularly for products aimed at young children. Physical toy safety standards are well established. Psychological safety standards for AI-powered toys that hold conversations with preschoolers are essentially nonexistent.
Josephine McCartney, Chief Executive of The Childhood Trust, stressed that regulation must keep pace with innovation to protect all children and prevent widening inequalities. The Cambridge team plans further studies and new guidance for early years practitioners based on the report's findings.
The central question the research raises is not whether AI toys will improve - they almost certainly will. It is whether regulators, manufacturers, and parents will act quickly enough to set boundaries before a generation of children form their first friendships with machines that cannot understand them.