Abstract: The Reactive Virtual Trainer (RVT) is an Embodied Conversational Agent capable of presenting physical exercises to be performed by a human, monitoring the user, and providing feedback at different levels.


Keywords: ECA, parameterized animation, fitness, multimodal coordination




To find out how tempo affects motion dynamics, we analyze optical motion capture recordings of subjects performing exercises.

Computer vision

This work was done as a Bachelor Referaat assignment by Frank Zijlstra. The Virtual Trainer is likely to be used in a home entertainment environment: only one webcam is available, lab conditions should not be required for the analysis to work, and, to provide timely feedback, the analysis must run in real time.
A silhouette-based approach can meet these requirements. The silhouette is extracted from the video recording, and body parts are identified in it in real time.
These body parts are then matched to the body parts of a reference silhouette to provide feedback on how to correct the pose.
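A minimal sketch of such a pipeline is shown below. It is illustrative only (the function names, the background-subtraction step, and the overlap score are assumptions, not the actual student implementation): the silhouette is obtained by thresholding the difference between a frame and a static background, and the pose is scored by how much of a reference silhouette the user covers.

```python
# Illustrative sketch of a silhouette pipeline (hypothetical, not the
# actual implementation): extract the user's silhouette by background
# subtraction, then score it against a reference silhouette.

def silhouette(frame, background, threshold=30):
    """Binary mask marking pixels that differ from the static background."""
    return [[abs(p - b) > threshold for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

def pose_error(mask, reference):
    """Fraction of reference-silhouette pixels the user fails to cover:
    0.0 is a perfect match, 1.0 means no overlap at all."""
    ref_pixels = missed = 0
    for mrow, rrow in zip(mask, reference):
        for m, r in zip(mrow, rrow):
            if r:
                ref_pixels += 1
                if not m:
                    missed += 1
    return missed / ref_pixels if ref_pixels else 0.0
```

In a real system the error would be computed per body part (e.g. per arm) rather than globally, so that the trainer can say which limb to correct.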

Multi-modal generation

We use the multi-modal synchronization language BML to specify the synchronization between metronome beats, speech and clapping motion.
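A schematic fragment in the BML style is embedded below. It is hedged: the element and sync-point vocabulary (speech with inline sync points, a gesture whose stroke is constrained to one of them) follows the BML design described in Vilhjálmsson et al. 2007, but this exact fragment and its identifiers are illustrative, not taken from the Virtual Trainer's actual scripts. The snippet only checks that the fragment is well-formed XML.

```python
import xml.etree.ElementTree as ET

# Schematic, illustrative BML-style fragment: a clap's stroke phase is
# synchronized with the word "two" in the counted-out speech.
bml_script = """
<bml id="exercise1">
  <speech id="count1" start="beat1:start">
    <text>one, <sync id="two"/>two, <sync id="three"/>three</text>
  </speech>
  <gesture id="clap1" type="clap" stroke="count1:two"/>
</bml>
"""

# A behaviour planner would resolve these constraints into concrete
# timings; here we only parse the fragment to confirm it is valid XML.
root = ET.fromstring(bml_script)
```

The point of such a specification is that the planner, not the author, resolves the absolute timing, so the same script adapts when the metronome (or music) tempo changes.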

Beat detection

By playing the user's preferred music during the exercise, exercising can be made more enjoyable. We analyze the music to find the beats, so that the exercise, and thus the movement of the trainer, can be aligned to the tempo of the music. For this we borrow the beat detection algorithm from our Virtual Conductor, which implements methods described in the publications of Anssi Klapuri. This algorithm detects the tempo and beats in the music played during the exercise.

Exercise corpus

Motion-captured fitness exercises can be found in the HMI mocap database.

Parameterized animation

Using our animation framework, we define exercises by describing the movement paths of limbs. Movements can be accentuated by adapting their amplitude or velocity profile.
Parameterized animation (1Mb).
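The idea can be sketched as follows. This is a hypothetical example, not the framework's actual API (the function names, the sinusoidal path, and the ease-in/ease-out retiming are assumptions): a limb path is a function of a normalized movement phase, and amplitude and velocity profile are exposed as parameters.

```python
import math

def velocity_profile(phase, shape="ease"):
    """Map a uniform phase in [0,1] to a retimed phase in [0,1].
    'linear' keeps constant velocity; 'ease' accelerates then decelerates."""
    if shape == "linear":
        return phase
    return (1.0 - math.cos(math.pi * phase)) / 2.0  # sinusoidal ease-in/out

def arm_raise(phase, amplitude=1.0, shape="ease"):
    """Hand height (arbitrary units) along an arm-raise path: up and back
    down over one repetition. Increasing the amplitude accentuates the
    movement without changing its timing."""
    p = velocity_profile(phase, shape)
    return amplitude * math.sin(math.pi * p)
```

Because timing and spatial extent are independent parameters, the same exercise definition can be performed larger, faster, or with a different accent, and still stay aligned to the metronome or music beat.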

Publications related to this showcase are:


Book Chapters

H. van Welbergen, Z.M. Ruttkay and B. Varga Informed Use of Motion Synthesis Methods. pp. 132-143. In Motion in Games, A. Egges, A. Kamphuis and M.H. Overmars (eds), Springer Verlag, Berlin, 2008 [pdf] [bibTeX] [e-prints] 

Contributions to Proceedings

Z.M. Ruttkay and H. van Welbergen Elbows higher! Performing, observing and correcting exercises by a Virtual Trainer. in Proceedings of the Eighth International Conference on Intelligent Virtual Agents, Lecture Notes in Artificial Intelligence, volume 5208, Springer Verlag, London, pp. 409-416, 2008 [pdf] [bibTeX] [e-prints]

H. van Welbergen Using motion capture data to generate and evaluate motion models for real-time computer animation. in Proceedings of Measuring Behavior 2008, 6th International Conference on Methods and Techniques in Behavioral Research, A.J. Spink, M.R. Ballintijn, N.D. Bogers, F. Grieco, L.W.S. Loijens, L.P.J.J. Noldus, G. Smit and P.H. Zimmerman (eds), Noldus Information Technology bv, Maastricht, pp. 26-27, 2008 [pdf] [bibTeX] [e-prints] 



Z.M. Ruttkay and H. van Welbergen Let's shake hands! On the coordination of gestures of humanoids. in Artificial and Ambient Intelligence, April 02 2007, Newcastle University, Newcastle upon Tyne, UK, pp. 164-168, 2007 [bibTeX]

A. Nijholt, D. Reidsma, Z.M. Ruttkay, H. van Welbergen and P. Bos Nonverbal and Bodily Interaction in Ambient Entertainment. in Proceedings workshop on Fundamentals of Verbal and Non-verbal Communication and the Biometrical Issue, A.M. Esposito, M. Bratanic, E. Keller and M. Marinaro (eds), NATO Security through Science Series, E: Human and Societal Dynamics, volume 18, IOS Press, Amsterdam, pp. 343-348, 2007 [pdf] [bibTeX] [e-prints] 

H.H. Vilhjálmsson, N. Cantelmo, J. Cassell, N.E. Chafai, M. Kipp, S. Kopp, M. Mancini, S.C. Marsella, A.N. Marshall, C. Pelachaud, Z.M. Ruttkay, K.R. Thórisson, H. van Welbergen and R.J. van der Werf The Behavior Markup Language: Recent Developments and Challenges. in Proceedings of the 7th International Conference on Intelligent Virtual Agents, C. Pelachaud, J.-C. Martin, E. Andre, G. Collet, K. Karpouzis and D. Pelé (eds), Lecture Notes in Computer Science, volume 4722, Springer, Berlin, pp. 99-120, 2007 [pdf] [bibTeX] [e-prints] 

H. van Welbergen and Z.M. Ruttkay On the parametrization of clapping. in Proceedings of the 7th International Workshop on Gesture in Human-Computer Interaction and Simulation, M. Sales Dias and R. Jota (eds), ADETTI, Lisboa, Portugal, pp. 20-21, 2007 [pdf] [bibTeX] [e-prints] 



Z.M. Ruttkay, J. Zwiers, H. van Welbergen and D. Reidsma Towards a Reactive Virtual Trainer. in Proceedings of the 6th International Conference on Intelligent Virtual Agents, J. Gratch, M. Young, R.S. Aylett, D. Ballin and P. Olivier (eds), Lecture Notes in Computer Science, volume 4133, Springer Verlag, Berlin, pp. 292-303, 2006 [pdf] [bibTeX] [e-prints]

Z.M. Ruttkay and H. van Welbergen On the timing of gestures of a Virtual Physiotherapist. in 3rd Central European Multimedia and Virtual Reality Conference, C.S. Lanyi (ed.), Pannonian University Press, Hungary, pp. 219-224, 2006 [pdf] [bibTeX] [e-prints]