Systematic review of multimodal physiological signals from wearable sensors for affective computing
Affective computing, proposed by Picard in 1997, aims to endow computational systems with the ability to recognize, interpret, and respond to human emotions. Early studies relied primarily on behavioral cues, such as facial expressions and tone of voice, for modelling affective states.
Affective computing has now entered a new phase: wearable devices can continuously acquire multimodal physiological signals from multiple sensor channels that differ in sampling frequency, physiological origin, and signal characteristics.
In ...