Ben-Gurion U. studies show promise using drones to elicit emotional responses
BEER-SHEVA, Israel...June 2, 2021 - As drones become more ubiquitous in public spaces, researchers at Ben-Gurion University of the Negev (BGU) have conducted the first studies examining how people respond to various emotional facial expressions depicted on a drone, with the goal of fostering greater social acceptance of these flying robots.
The research, which was presented recently at the virtual ACM Conference on Human Factors in Computing Systems, reveals how people react to common facial expressions superimposed on drones.
"There is a lack of research on how drones are perceived and understood by humans, which is vastly different than ground robots." says Prof. Jessica Cauchard together with Viviane Herdel of BGU's Magic Lab, in the BGU Department of Industrial Engineering & Management. "For the first time, we showed that people can recognize different emotions and discriminate between different emotion intensities."
BGU researchers conducted two studies using a set of rendered robotic facial expressions on drones that convey basic emotions. The faces use four core facial features: eyes, eyebrows, pupils, and mouth. The results showed that five emotions (joy, sadness, fear, anger, surprise) can be recognized with high accuracy in static stimuli, and four emotions (joy, surprise, sadness, anger) in dynamic videos. Disgust was the only emotion that was poorly recognized.
