Course 30: Multimodal Detection of Affective States: A Roadmap from Brain-Computer Interfaces, Face-Based Emotion Recognition, Eye Tracking and Other Sensors
Contribution & Benefit: This course presents devices and explores methodologies for multimodal detection of affective states, along with a discussion of the presenter's experiences using them in both learning and gaming scenarios.
Abstract » A novel aspect of designing interactions between people and computers, tied to securing user satisfaction, is the capability of systems to adapt to their individual users by showing empathy. Being empathetic implies that the computer is able to recognize the user's affective states and understand their implications. Automatic detection of affective states requires the computer to sense information; to process and integrate information from several sources, ranging from brain-wave signals and biofeedback readings, through gesture recognition, to posture and pressure sensing; and to apply algorithms and data-processing tools to infer the user's affective states.
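One common way to combine several such sources is late fusion: each modality produces its own probability estimate over a set of affective states, and the estimates are averaged. The sketch below illustrates the idea only; the state labels, modality names, and numbers are invented for this example and do not come from any specific device or from the course materials.

```python
# Minimal late-fusion sketch over hypothetical affective states.
# All labels and probability values below are illustrative.
STATES = ["engaged", "bored", "frustrated"]

def fuse(predictions):
    """Average per-modality probability vectors into one fused estimate."""
    n = len(predictions)
    return [sum(p[i] for p in predictions) / n for i in range(len(STATES))]

eeg_probs  = [0.6, 0.3, 0.1]   # e.g. from a brain-computer interface
face_probs = [0.5, 0.2, 0.3]   # e.g. from face-based emotion recognition
skin_probs = [0.7, 0.2, 0.1]   # e.g. from a skin-conductance sensor

fused = fuse([eeg_probs, face_probs, skin_probs])
best = STATES[max(range(len(STATES)), key=lambda i: fused[i])]
print(best, fused)
```

Averaging is only one fusion rule; weighted combinations or a classifier trained on the concatenated per-modality features are common alternatives.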
Through this course, attendees will:
a) Learn about sensing approaches used to detect affective states: brain-computer interfaces, face-based emotion recognition systems, eye-tracking systems, and physiological sensors (including skin-conductance, posture, and pressure sensors).
b) Understand the pros and cons of each of these sensing approaches.
c) Learn about the data that is gathered from each device and understand its characteristics.
d) Learn about approaches and tools to pre-process, synchronize, and analyze data.
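A recurring step behind item d) is synchronizing streams sampled at different rates, typically by aligning each sample from one sensor with the most recent reading from another. The sketch below shows this with Python's standard library; the sensor names, rates, and values are hypothetical, chosen only to illustrate the alignment step.

```python
from bisect import bisect_right

# Hypothetical timestamped samples from two sensors at different rates.
# All names and values are illustrative, not tied to any device API.
eda_t  = [0.00, 0.25, 0.50, 0.75, 1.00]        # 4 Hz skin-conductance timestamps (s)
eda_uS = [2.1, 2.3, 2.2, 2.6, 2.5]             # readings in microsiemens

gaze_t = [0.00, 0.10, 0.20, 0.30, 0.40, 0.50]  # 10 Hz eye-tracker timestamps (s)

def latest_before(ts, series_t, series_v):
    """Return the sensor value recorded at or before time ts."""
    i = bisect_right(series_t, ts) - 1
    return series_v[i] if i >= 0 else None

# Align each gaze sample with the most recent EDA reading -- a common
# first step before extracting windowed features for affect models.
aligned = [latest_before(t, eda_t, eda_uS) for t in gaze_t]
print(aligned)  # [2.1, 2.1, 2.1, 2.3, 2.3, 2.2]
```

In practice the same nearest-previous-timestamp join is often done with dataframe tooling rather than by hand, but the logic is the one shown here.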
This course is open to researchers, practitioners, and educators interested in incorporating affective computing as part of their adaptive and personalized technology toolbox.
The presentation will be a mix of enthusiastic instruction with demonstrations and exercises, all aimed at making the topic concrete, memorable, and actionable.