Facial and Bodily Expressions for Control and Adaptation of Games (ECAG '11)

Workshop program

13:30 Opening (Anton Nijholt and Ronald Poppe)
13:35 Invited Talk: Jacob Whitehill (UCSD)
Measures of Non-verbal Behavior in Education-related Scenarios
14:20 An Efficient Approach to Smile Detection
Caifeng Shan (Philips Research)
14:40 Approaches for Global-based Action Representations for Games and Action Understanding
Md. Atiqur Rahman Ahad, Joo Kooi Tan, Hyoungseop Kim and Seiji Ishikawa (Kyushu Institute of Technology)
15:00 Coffee break
15:30 Invited Talk: Evan Suma (ICT-USC)
Integrating Full Body Interaction with Virtual Environments and Serious Games
16:15 Motion Capture for Realtime Control of Virtual Actors in Live, Distributed, Theatrical Performances
Joe Geigel and Marla Schweppe (Rochester Institute of Technology)
16:35 HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems
Jie Shen, Wenzhe Shi and Maja Pantic (Imperial College London / University of Twente)
16:55 Closing

Workshop themes

Expressivity in the human body and face can serve to control or adapt the interaction with a system. Examples include the Xbox Kinect, which uses body movements to control game characters; gesture-based interaction with a robot in a home environment; and a tutoring application that adapts its teaching strategy based on detected frustration or boredom. In these examples, observations of the face and body are used in different ways, depending on whether the user takes the initiative to control the interaction or the application takes the initiative to adapt itself to the user. Hence, we look at:
  • Control: The user consciously produces facial expressions, head movements or body gestures to control a game. This includes commands that allow navigation in the game environment or that allow movements of game characters or changes in their appearances (e.g. showing similar facial expressions on the character's face, transforming body gestures to emotion-related or to emotion-guided activities).

  • Adaptation: The gamer's spontaneous facial expressions and body poses are interpreted and used to adapt the game to the supposed affective state of the gamer. This adaptation can affect the appearance of the game environment, the interaction modalities, the experience and engagement, the narrative and the strategy that is followed by the game or the game actors.

Paper submission

We are soliciting papers that discuss research in this area, with a focus on applications. We consider the domains of entertainment, robot control, and (serious) gaming and simulation. In addition to video-based observation, we also consider other means of input, including multi-modal approaches. Technical papers, survey papers, and empirical papers are all eligible. Authors are invited to submit papers following the page limits and formatting guidelines of the main conference (see www.fg2011.org). Each paper will be refereed by three reviewers. Submissions will be handled through the ECAG'11 conference management system at https://cmt.research.microsoft.com/ECAG2011/.

All FG Workshop papers will be archived by IEEE Xplore. They will also be included in the DVD conference proceedings.


Participants can register for the workshop or for the FG conference including all workshops, at www.fg2011.org. Early registration ends February 20.

Important dates

Paper submission: Closed
Author notification: Closed
Camera-ready due: Closed
FG conference: March 21-25, 2011
Workshop: March 25, 2011

Workshop organizers

Anton Nijholt (University of Twente, the Netherlands)
Ronald Poppe (University of Twente, the Netherlands)

Programme committee

Vasilis Argyriou (Kingston University, UK)
Nadia Berthouze (University College London, UK)
Rob Blaauboer (Logica, Amsterdam, the Netherlands)
Tony Brooks (Aalborg University, Esbjerg, Denmark)
David England (Liverpool John Moores University, UK)
Yun Fu (University at Buffalo (SUNY), USA)
Mitsuru Ishizuka (University of Tokyo, Japan)
Ben Kröse (University of Amsterdam, the Netherlands)
Christopher Peters (Coventry University, UK)
Mannes Poel (University of Twente, the Netherlands)
Gang Qian (Arizona State University, USA)
Md. Atiqur Rahman Ahad (University of Dhaka, Bangladesh)
Dennis Reidsma (University of Twente, the Netherlands)
Ben Schouten (Technische Universiteit Eindhoven, the Netherlands)
Nicu Sebe (University of Trento, Italy)
Marten den Uyl (Vicarvision, Amsterdam, the Netherlands)
Remco Veltkamp (Utrecht University, the Netherlands)
Stefanos Zafeiriou (Imperial College, London, UK)
Huang Zhiyong (I2R, Singapore)


For questions about the workshop, please contact the organizers at ecag2011@ewi.utwente.nl.
