Multimodal Evaluation of Human-Robot Interaction (HRI)


MuMMER project


In MuMMER (MultiModal Mall Entertainment Robot), we address the important and growing market of consumer robotics by developing a humanoid robot (based on Aldebaran's Pepper platform) able to engage and interact autonomously and naturally in the dynamic environment of a public shopping mall, providing an engaging and entertaining experience to the general public. Using collaborative co-design methods, we will work together with stakeholders, including customers, retailers, and business managers, to develop truly engaging robot behaviours. Crucially, our robot will exhibit behaviour that is socially appropriate, combining speech-based interaction with non-verbal communication and human-aware navigation. To support this behaviour, we will develop and integrate new methods from audiovisual scene processing, social-signal processing, high-level action selection, and human-aware robot navigation.
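
To make the component roles concrete, here is a minimal sketch (not the MuMMER implementation) of the kind of perception-to-action pipeline the paragraph describes: fused audiovisual and social-signal estimates feed a high-level action selector that chooses between speech, navigation, and entertainment behaviours. All class, field, and threshold names are hypothetical illustrations.

    from dataclasses import dataclass
    from enum import Enum, auto


    class Action(Enum):
        GREET = auto()      # speech + gesture toward a nearby person
        APPROACH = auto()   # human-aware navigation toward a person
        ENTERTAIN = auto()  # quiz, joke, or other entertainment routine
        IDLE = auto()


    @dataclass
    class SocialState:
        """Fused output of audiovisual scene + social-signal processing."""
        person_detected: bool
        engagement: float    # estimated engagement in [0, 1]
        distance_m: float    # distance to the nearest person


    def select_action(state: SocialState) -> Action:
        """High-level action selection over the fused social state."""
        if not state.person_detected:
            return Action.IDLE
        if state.engagement > 0.7:
            return Action.ENTERTAIN
        if state.distance_m > 2.0:
            return Action.APPROACH  # navigate while respecting social distances
        return Action.GREET


    # Example: a nearby, moderately engaged visitor triggers a greeting.
    print(select_action(SocialState(True, engagement=0.5, distance_m=1.5)))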



Multimodal Dialogue with a Robotic Head


This project was performed with a group of collaborators during the eNTERFACE'13 workshop. It explored a novel experimental setup for building a spoken, multimodally rich, human-like multiparty tutoring agent. A setup was developed and a corpus was collected to support a dialogue system platform for exploring verbal and nonverbal tutoring strategies in multiparty spoken interactions with embodied agents. The dialogue task centered on two participants working together to solve a card-ordering game. A tutor sat with the participants, helping them perform the task and organizing and balancing their interaction. Multimodal signals, captured and automatically synchronized by several audio-visual capture technologies, were coupled with manual annotations to build a situated model of the interaction based on the participants' personalities, their temporally changing state of attention, and their conversational engagement and verbal dominance, and on how these correlated with the verbal and visual feedback, turn management, and conversation-regulatory actions generated by the tutor.
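
A hedged sketch of how such a situated interaction model might be represented: per-participant, time-stamped annotation streams that can be correlated with the tutor's time-aligned regulatory actions. The field names below are illustrative assumptions, not the workshop corpus schema.

    from dataclasses import dataclass, field


    @dataclass
    class ParticipantState:
        participant_id: str
        personality: dict                      # e.g. trait questionnaire scores
        attention: list = field(default_factory=list)   # (t, gaze target) pairs
        engagement: list = field(default_factory=list)  # (t, score) pairs
        dominance: list = field(default_factory=list)   # (t, score) pairs


    @dataclass
    class TutorAction:
        t: float        # timestamp (seconds) in the recording
        kind: str       # "feedback", "turn_management", or "regulation"
        modality: str   # "verbal" or "visual"


    @dataclass
    class Session:
        participants: list  # two ParticipantState instances
        tutor_actions: list # time-aligned TutorAction records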



Computational Model of Robotic Emotions


Emotional expressions can benefit social communication between humans and robots. In this project, I present a computational model of emotions for a non-humanoid robot.
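
As one illustration of the idea (a minimal sketch, not the model from this project), a two-dimensional valence-arousal emotion state can be mapped onto expression channels a non-humanoid robot typically has, such as LED colour and motion speed. The decay rate and mapping constants below are illustrative assumptions.

    from dataclasses import dataclass


    @dataclass
    class EmotionState:
        valence: float  # -1 (negative) .. 1 (positive)
        arousal: float  # -1 (calm)     .. 1 (excited)

        def decay(self, rate: float = 0.1) -> None:
            """Drift back toward a neutral resting state over time."""
            self.valence *= 1.0 - rate
            self.arousal *= 1.0 - rate


    def express(state: EmotionState) -> dict:
        """Map the emotion state to non-humanoid expression parameters."""
        hue = 120 * (state.valence + 1) / 2           # red (0) .. green (120)
        speed = 0.5 + 0.5 * (state.arousal + 1) / 2   # slower when calm
        return {"led_hue_deg": hue, "motion_speed": speed}


    e = EmotionState(valence=0.6, arousal=0.8)
    print(express(e))  # e.g. greenish LEDs, fast movements
    e.decay()          # emotion fades toward neutral between stimuli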