Abstract

This chapter presents a fully automated real-time character-based interface in which a scriptable, affective humanoid 3D agent interacts with the user. Special care has been taken to enable natural multimodal user-agent interaction: communication is accomplished via text, image and voice (natural language). Our embodied agents are equipped with an emotional state which can be modified throughout the conversation with the user and which depends on the emotional state detected from the user's facial expressions. This nonverbal affective information is interpreted by the agent, which responds empathetically by adapting its voice intonation, facial expression and answers. These agents have been used as virtual presenters, home-automation (domotic) assistants and pedagogical agents in different applications, and the results are promising.

The chapter focuses on two main aspects: the capture of the user's emotional state from webcam images, and the development of a natural-language (Spanish) dialog system that also takes emotional aspects into account. The facial expression recognizer is based on the tracking of facial features and on an emotional classification method built on the theory of evidence and Ekman's classification of basic emotions. From a set of distances and angles extracted from the user's images, and a set of thresholds defined through the analysis of a sufficiently broad image database, the classification results are acceptable, and recent developments have enabled us to improve success rates. The utility of this kind of information is clear: the general vision is that, if a computer could recognize a user's emotions, human-computer interaction would become more natural, enjoyable and productive.

The dialog system has been developed so that the user can ask questions, give commands or ask the agent for help. It is based on the recognition of patterns, with fixed answers associated with each pattern.
These answers, however, vary depending on the virtual character's emotional state, or may undergo random variations so that the user does not get the impression of repetition if the conversation goes on for a long time. Special attention has also been paid to adding an emotional component to the synthesized voice in order to reduce its artificial nature. Voice emotions also follow Ekman's classification and are modeled by modifying volume, speed and pitch.

Several research lines remain open. Regarding Maxine, the next steps are:
- to allow not only facial expressions but also body postures to be affected by the agent's emotional state,
- to use the user's emotional information in a more sophisticated way: the computer could offer help and assistance to a confused user or try to cheer up a frustrated user, and hence react in ways more appropriate than simply ignoring the user's affective states, as most current interfaces do,
- to consider not only emotion models but also personality models for the virtual agents,
- to give the system learning mechanisms, so that it can modify its display rules based on what appears to work for a particular user and improve its responses while interacting with that user, and
- to carry out a proper validation of the Maxine system and its characters.
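The chapter does not give the recognizer's implementation details, but the combination of per-feature evidence (distances and angles against thresholds) under the theory of evidence can be sketched with Dempster's rule of combination. Everything below is a hypothetical illustration: the emotion set follows Ekman's six basic emotions plus neutral, while the feature names, threshold values and confidence masses are invented for the example.

```python
from itertools import product

# Ekman's six basic emotions plus a neutral state (assumed label set).
EMOTIONS = frozenset({"joy", "sadness", "anger", "fear",
                      "disgust", "surprise", "neutral"})

def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) with
    Dempster's rule, normalizing away the conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict between sources")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def feature_evidence(value, threshold, supported, confidence=0.8):
    """Map one facial measurement against its threshold to a mass
    function: if it exceeds the threshold, `confidence` mass supports
    the given emotion subset; otherwise all mass stays on the full
    frame (total ignorance)."""
    if value > threshold:
        return {frozenset(supported): confidence, EMOTIONS: 1.0 - confidence}
    return {EMOTIONS: 1.0}

# Hypothetical measurements: mouth-corner distance and eyebrow angle.
m_mouth = feature_evidence(1.3, 1.0, {"joy", "surprise"})
m_brow = feature_evidence(0.2, 0.5, {"anger"})  # below threshold: ignorance
m = dempster_combine(m_mouth, m_brow)
best = max(m, key=m.get)  # emotion subset with the highest combined mass
```

In this toy run only the mouth feature fires, so the combined belief concentrates on the {joy, surprise} subset; a real system would combine evidence from many tracked features before committing to a single emotion.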
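The pattern-based dialog with emotion-dependent and randomly varied answers could be sketched as a rule table keyed by the agent's emotional state. The patterns, answer strings and emotion labels below are all made up for illustration; they are not Maxine's actual rule base.

```python
import random
import re

# Hypothetical pattern table: regex -> candidate answers per agent emotion.
RULES = [
    (re.compile(r"\bhello\b|\bhi\b", re.I), {
        "joy":     ["Hello! Great to see you!", "Hi there, wonderful day!"],
        "sadness": ["Hello...", "Oh, hi."],
        "neutral": ["Hello.", "Hi, how can I help?"],
    }),
    (re.compile(r"\bhelp\b", re.I), {
        "joy":     ["I'd love to help! What do you need?"],
        "sadness": ["I can try to help, I suppose."],
        "neutral": ["Sure, tell me what you need."],
    }),
]

FALLBACK = {"joy": ["Tell me more!"], "sadness": ["Hm."],
            "neutral": ["I did not understand."]}

def answer(user_text, agent_emotion="neutral", rng=random):
    """Match the first pattern and pick a randomly varied answer for the
    agent's current emotional state, falling back to 'neutral' answers
    for emotions the rule does not cover."""
    for pattern, by_emotion in RULES:
        if pattern.search(user_text):
            choices = by_emotion.get(agent_emotion) or by_emotion["neutral"]
            return rng.choice(choices)
    return rng.choice(FALLBACK.get(agent_emotion, FALLBACK["neutral"]))
```

The random choice among several equivalent answers is what keeps a long conversation from feeling repetitive, while the per-emotion answer sets let the same pattern produce a cheerful or a subdued reply.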
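The emotional voice component described above (volume, speed and pitch modified per Ekman emotion) amounts to a small prosody table applied on top of the synthesizer's base settings. The numeric offsets below are invented placeholders, not the values used in the chapter's system.

```python
# Hypothetical prosody offsets per emotion, applied to a base voice:
# (volume gain in dB, speed multiplier, pitch shift in semitones).
PROSODY = {
    "joy":      (+3.0, 1.15, +2.0),
    "sadness":  (-3.0, 0.85, -2.0),
    "anger":    (+6.0, 1.10, +1.0),
    "fear":     (+0.0, 1.20, +3.0),
    "disgust":  (-1.0, 0.95, -1.0),
    "surprise": (+2.0, 1.25, +4.0),
    "neutral":  (0.0, 1.0, 0.0),
}

def apply_emotion(base_volume_db, base_speed, base_pitch_st, emotion):
    """Return (volume, speed, pitch) for the synthesizer, with the
    base settings offset by the agent's current emotion; unknown
    emotions leave the voice unchanged."""
    dv, ms, dp = PROSODY.get(emotion, PROSODY["neutral"])
    return base_volume_db + dv, base_speed * ms, base_pitch_st + dp
```

A sad agent would then speak slightly quieter, slower and lower-pitched than its neutral baseline, which is the kind of modulation the chapter uses to make the synthesized voice less artificial.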

Keywords:
Embodied cognition, Embodied agent, Facial expression, Human-computer interaction, Computer science, Gesture, Communication, Cognitive science, Psychology, Artificial intelligence

Metrics

Cited By: 9
FWCI (Field Weighted Citation Impact): 0.83
Refs: 32
Citation Normalized Percentile: 0.75


Topics

Social Robot Interaction and HRI
Social Sciences →  Psychology →  Social Psychology
Emotion and Mood Recognition
Social Sciences →  Psychology →  Experimental and Cognitive Psychology
Face recognition and analysis
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

JOURNAL ARTICLE
Affective Interfaces of Embodied Conversational Agents
Yi-Chen Hsu
Journal: International Journal of Affective Engineering, Year: 2013, Vol: 12 (2), Pages: 71-78

JOURNAL ARTICLE
Interaction with embodied conversational agents
Lewis Johnson
Year: 2005, Pages: 10-10

BOOK-CHAPTER
Effective Tutoring with Affective Embodied Conversational Agents
Sharon G. Moyo, Paul Piwek
Frontiers in Artificial Intelligence and Applications, Year: 2009

JOURNAL ARTICLE
Nonverbal interaction in embodied conversational agents
Year: 2005, Pages: 503-503