Technology 2026-03-19

Robots that admit they have no feelings outperform those that fake it

A Chicago study finds that straightforward, emotionless robot tutors teach social-emotional skills just as well as — and sometimes better than — robots with scripted personalities

One robot tells a fourth-grader it feels nervous before a big test. The other robot tells the same child, plainly, that it has no feelings at all — but it can help the child think through what nervousness means. Both are sitting in a Chicago public school classroom. Both are running lessons on empathy, conflict resolution, and problem-solving. And according to new research, the honest robot does just as well. On some measures, it does better.

The fictional friend vs. the factual tool

The assumption in educational robotics has long been intuitive: give a robot a personality, a backstory, simulated emotions, and children will engage more deeply. It seems especially logical for social-emotional learning, or SEL — the curriculum that teaches kids to recognize and manage emotions, resolve conflicts, and build relationships. If you want a child to talk about feelings, shouldn't the robot model having them?

Lauren Wright, a PhD student at the University of Chicago's Department of Computer Science, wasn't so sure. Working under Assistant Professor Sarah Sebo, Wright assembled a team that included Chicago Public Schools teachers, administrators, and Kiljoong Kim at Chapin Hall to design a study that would test that assumption directly.

The approach was unusual from the start. Rather than arriving with a prototype and asking teachers to use it, the researchers spent months observing SEL instruction in CPS classrooms and interviewing teachers about what actually worked and what didn't. Only then did they begin designing how robots might help.

Fifty-two students, three conditions, one surprise

The experiment divided students into three groups. One group worked with robots programmed with fictional emotional dialogue — the robot claimed to have friends, described feeling scared or happy, and shared personal anecdotes. A second group worked with robots that spoke only in factual terms, openly acknowledging they had no feelings or personal experiences. The third group received their standard SEL curriculum with no robot involvement at all.

The lessons themselves drew from Second Step, a widely used SEL curriculum in CPS classrooms. The research team adapted group lesson plans into personalized one-on-one conversations between each student and a robot, while teachers continued working with the rest of the class.

Both robot groups outperformed the classroom-only group in mastering SEL concepts. That much was expected — individualized attention tends to beat whole-class instruction. But the factual robots, the ones with no pretense of inner life, often prompted deeper engagement. Students working with the straightforward robots used more lesson-specific vocabulary and problem-solving language than those interacting with the emotionally scripted versions.

Why fake feelings may have backfired

Wright has a theory about what went wrong with the fictional approach. When a robot claims to feel nervous or excited, it introduces a competing focus. The child may spend cognitive effort evaluating the robot's emotional performance — Does it really feel that? Is it like my nervousness? — rather than reflecting on the SEL concepts themselves.

The factual robot sidesteps that distraction entirely. It functions as a thinking tool rather than a social partner. And for nine- and ten-year-olds who are still developing their capacity for abstract emotional reasoning, that clarity may matter more than warmth.

The findings push back against a common design pattern in educational robotics. As Wright put it in presenting the results, the assumption that a fictional personality will always lead to better outcomes deserves scrutiny. Common doesn't mean optimal.

A timely result for the AI-in-schools debate

The research arrives at a moment when parents, educators, and policymakers are increasingly worried about children forming parasocial relationships with AI systems. Chatbots that simulate friendship. Voice assistants that say "I'm happy to help." The concern is that these interactions blur the line between tool and companion in ways that may not serve children's development.

The Chicago team's results offer a data point in that debate. If robots can deliver measurable learning gains without pretending to be something they're not, the case for emotional mimicry weakens. The factual approach isn't just ethically tidier — it performs.

Sebo has been careful to frame the technology as a supplement, not a substitute. The robots aren't replacing teachers. They're extending a teacher's reach, giving individual children one-on-one attention during moments when the teacher is working with the full class. In schools where SEL instruction happens once a week in a whole-group format, many students simply don't get the personalized engagement that makes the lessons stick.

Winning recognition, raising questions

The work was presented at the 2026 ACM/IEEE International Conference on Human-Robot Interaction in Edinburgh, where it received the Best Paper Award — the conference's top honor for research impact and innovation. That recognition signals the broader field is paying attention to the question of whether emotional performance in robots is always desirable.

Still, this is a single study with 52 students in one city's school system. The sample is small enough that individual classroom dynamics, teacher enthusiasm, or the particular cultural context of CPS could shape the results. The researchers used one specific curriculum (Second Step) and one age group (fourth grade). Whether factual robots outperform fictional ones with younger children, older adolescents, or different SEL frameworks remains an open question.

The study also doesn't address long-term retention. Students showed better vocabulary mastery immediately, but whether that translates into lasting behavioral change — the ultimate goal of SEL — isn't something a short-term experiment can answer.

The robot that doesn't pretend

There's something quietly radical about a robot that tells a child, in effect: I don't have feelings. Let's talk about yours. In a technology landscape that trends relentlessly toward anthropomorphism — toward making machines seem more human — the Chicago team is making a case for the opposite. Less pretense. More honesty. And, it turns out, comparable or better results.

For teachers in CPS and beyond, the practical implication is straightforward. Robot-assisted SEL doesn't require sophisticated emotional performance. A tool that knows its limitations, and says so, can still help a child learn what empathy means.

Source: University of Chicago Department of Computer Science. Research by Lauren Wright and Sarah Sebo, presented at the 2026 ACM/IEEE International Conference on Human-Robot Interaction (HRI), Edinburgh, Scotland. Best Paper Award recipient. Partners: Chicago Public Schools, Chapin Hall (Kiljoong Kim). Curriculum adapted from Second Step SEL program.