Abstract: The Virtual Dancer is an interactive dancing agent. It dances together with the user, aligning appropriate dance moves in real time to the beat of the music and adapting its style to what it observes from the user through real-time computer vision. By alternating patterns of following the user with taking the lead with new moves, the system attempts to achieve a mutual dancing interaction.


Keywords: ECA, beat tracking, virtual rap dancer, audio/video input, virtual reality


The Virtual Dancer is an interactive dancing agent that dances together with the user to the beat of the music. It adapts its performance to the behaviour of the human user, who is observed using real-time computer vision. Below, some elements of the system are discussed.


Beat detection

A prerequisite for a virtual dancer is the ability to interpret the music and find the beats to which the dance should be aligned. For this we implemented the beat detection algorithm described in the publications of Anssi Klapuri, which detects the tempo and the beat positions in the music played for the dancer.
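Klapuri's algorithm is considerably more sophisticated than what fits here, but the core idea of tempo detection can be illustrated with a minimal sketch: autocorrelate an onset-strength envelope and pick the lag with the strongest periodicity. The function name, parameters, and synthetic envelope below are illustrative assumptions, not part of the actual system:

```python
import numpy as np

def estimate_tempo(onset_envelope, frame_rate, min_bpm=60, max_bpm=180):
    """Estimate tempo (BPM) by autocorrelating an onset-strength envelope.

    onset_envelope: 1-D array of onset strengths, one per analysis frame.
    frame_rate: analysis frames per second.
    """
    env = np.asarray(onset_envelope, dtype=float)
    env = env - env.mean()
    # Full autocorrelation; keep non-negative lags only.
    acf = np.correlate(env, env, mode="full")[len(env) - 1:]
    # Convert the BPM search range to a lag range (in frames).
    min_lag = int(frame_rate * 60.0 / max_bpm)
    max_lag = int(frame_rate * 60.0 / min_bpm)
    best_lag = min_lag + int(np.argmax(acf[min_lag:max_lag + 1]))
    return 60.0 * frame_rate / best_lag

# Synthetic envelope: one pulse every 0.5 s (i.e. 120 BPM) at 100 frames/s.
env = np.zeros(1000)
env[::50] = 1.0
print(estimate_tempo(env, frame_rate=100))  # → 120.0
```

A real onset envelope would come from spectral-flux analysis of the audio; Klapuri's method additionally tracks the tempo and beat phase over time rather than from a single autocorrelation.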

Beat-aligned dance moves

Once the beat is known, the virtual dancer should be able to dance along. For this we constructed a database of many different dance moves, collected using motion capture or created manually using parameterized animation (see also the videos below). The Virtual Dancer selects the most appropriate dance moves from the database, given the observed movement characteristics of the human dancer. These moves are then timed to the beat by locally warping the timing of each move so that its beat positions match those in the music. For example, in a complex clapping animation the clap points are aligned exactly to the predicted beat times, so the dancer claps to the beat of the music. The transition from one move to the next is made using an IK-generated stepping motion and interpolation techniques.
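The local time warping can be pictured as a piecewise-linear mapping from animation time to playback time, pinned at the beat markers. This is a minimal sketch of that idea (the function and the annotated beat arrays are assumptions for illustration):

```python
import numpy as np

def warp_time(t, move_beats, music_beats):
    """Map a time t in the original animation to playback time.

    move_beats: increasing beat markers annotated in the dance move
                (e.g. the clap points), in animation seconds.
    music_beats: the corresponding beat times predicted for the song.
    Piecewise-linear interpolation pins each marker to its music beat.
    """
    return float(np.interp(t, move_beats, music_beats))

move_beats  = [0.0, 1.0, 2.0]   # beat points annotated in the captured move
music_beats = [0.0, 0.9, 1.8]   # beats predicted by the beat tracker
print(warp_time(1.0, move_beats, music_beats))  # → 0.9 (clap lands on the beat)
print(warp_time(0.5, move_beats, music_beats))  # → 0.45
```

Frames between markers are stretched or compressed uniformly, so the move keeps its internal shape while every marked event lands exactly on a predicted beat.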


Observation of human dance partner

Meanwhile, the system observes the movements of the human dance partner using the computer vision system ParleVision. Using several robust processors, the system extracts global characteristics of the movements of the human dancer, such as how much (s)he moves around or how much (s)he waves with the arms. These characteristics are then used to select moves from the database that are in some way "appropriate" to the dancing style of the human dancer.
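ParleVision's processors are not shown here, but one such global characteristic, the overall amount of body movement, can be sketched as the mean frame-to-frame displacement of the tracked body centroid (the function name and the sample track are illustrative assumptions):

```python
import numpy as np

def movement_activity(centroids):
    """Global activity of the dancer: mean frame-to-frame displacement of
    the tracked body centroid, in pixels per frame.

    centroids: sequence of (x, y) positions, one per video frame.
    """
    c = np.asarray(centroids, dtype=float)
    steps = np.linalg.norm(np.diff(c, axis=0), axis=1)  # per-frame motion
    return float(steps.mean())

# A dancer stepping 5 px sideways every frame.
track = [(x, 100.0) for x in range(0, 50, 5)]
print(movement_activity(track))  # → 5.0
```

A scalar like this (and a similar one for arm waving) is enough to rank the moves in the database by how well they match the user's current energy level.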

Interactive dancing

By alternating patterns of following the user with taking the lead with new moves, the system attempts to achieve a mutual dancing interaction in which human and virtual dancer influence each other. Finding the appropriate nonverbal interaction patterns that achieve this is one of the longer-term issues addressed in the context of this showcase.
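The alternation itself can be pictured as a simple two-state schedule over successive moves; the function and phase lengths below are a hypothetical sketch of the idea, not the system's actual interaction model:

```python
def interaction_schedule(num_moves, follow_len=3, lead_len=2):
    """Sketch of lead/follow alternation over a sequence of dance moves.

    The dancer first follows the user for follow_len moves (picking moves
    matched to the observed style), then takes the lead for lead_len moves
    (introducing new moves, inviting the user to follow), and repeats.
    """
    schedule, mode, remaining = [], "follow", follow_len
    for _ in range(num_moves):
        schedule.append(mode)
        remaining -= 1
        if remaining == 0:  # switch phase
            mode = "lead" if mode == "follow" else "follow"
            remaining = lead_len if mode == "lead" else follow_len
    return schedule

print(interaction_schedule(8))
# → ['follow', 'follow', 'follow', 'lead', 'lead', 'follow', 'follow', 'follow']
```

In the real system the switch would not be a fixed count but would depend on whether the user actually picks up the dancer's lead, which is exactly the open question the showcase investigates.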


The work on the Virtual Dancer has been presented in several videos, some of which can be downloaded here. The first video was produced for the CHI 2006 interactivity session. The second was a submission for GALA2005, where the system did reasonably well in the final rankings. The other videos show the process of animating the dance motions captured in a Vicon lab and an example of parameterized animation. Note: the videos are usually quite large (10-30 MB).
CHI 2006 Interactivity submission video (19 MB)
GALA2005 submission video (23 MB)
Motion capture and animation for the Virtual Dancer (10 MB)
Parameterized animation (1 MB)
Poster (8 MB)

Publications related to this showcase are:


Contributions to Proceedings

D. Reidsma, E. Dehling, H. van Welbergen, J. Zwiers and A. Nijholt. Leading and following with a virtual trainer. In Proceedings of the 4th International Workshop on Whole Body Interaction in Games and Entertainment, D. England, A. El Rhalibi and A.D. Cheok (eds), University of Liverpool, Liverpool, pp. 4, 2011 [pdf] [bibTeX] [e-prints]



D. Reidsma, K.P. Truong, H. van Welbergen, D. Neiberg, S. Pammi, I.A. de Kok and B. van Straalen. Continuous Interaction with a Virtual Human. In Proceedings of the 6th International Summer Workshop on Multimodal Interfaces (eNTERFACE’10), A.A. Salah and T. Gevers (eds), University of Amsterdam, Amsterdam, pp. 24-39, 2010 [pdf] [bibTeX] [e-prints]



M. ter Maat, R.M. Ebbers, D. Reidsma and A. Nijholt. Beyond the Beat: Modelling Intentions in a Virtual Conductor. In INTETAIN '08: Proceedings of the 2nd international conference on INtelligent TEchnologies for interactive enterTAINment, ACM Digital Library, ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering), Brussels, Belgium, pp. 12-21, 2008 [pdf] [bibTeX] [e-prints]


Journal Papers

H. van Welbergen, A. Nijholt, D. Reidsma and J. Zwiers. Presenting in Virtual Worlds: Towards an Architecture for a 3D Presenter explaining 2D-Presented Information. IEEE Intelligent Systems, J. Hendler, D. Goren-Bar and O. Mayora-Ibarra (eds), 21(5):47-53, 2006 [pdf] [bibTeX] [e-prints]

Contributions to Proceedings

A. Nijholt, D. Reidsma and H. van Welbergen. Establishing Rapport with a Virtual Dancer. In Proceedings (re)Actor: The First International Conference on Digital Live Art, J.G. Sheridan and A. Bayliss (eds), University of Leeds, London, pp. 2, 2006 [pdf] [bibTeX] [e-prints]

D. Reidsma, A. Nijholt, R.W. Poppe, R.J. Rienks and G.H.W. Hondorp. Virtual Rap Dancer: Invitation to Dance. In CHI '06 extended abstracts on Human factors in computing systems, ACM, pp. 263-266, 2006 [pdf] [bibTeX] [e-prints]

D. Reidsma, H. van Welbergen, R.W. Poppe, P. Bos and A. Nijholt. Towards Bi-directional Dancing Interaction. In Proc. of 5th International Conference on Entertainment Computing, R. Harper, M. Rauterberg and M. Combetto (eds), Lecture Notes in Computer Science, volume 4161, Springer Verlag, Berlin, pp. 1-12, 2006 [pdf] [bibTeX] [e-prints]



D. Reidsma, A. Nijholt, R.J. Rienks and G.H.W. Hondorp. Interacting with a Virtual Rap Dancer. In Intetain: INtelligent TEchnologies for interactive enterTAINment, M. Maybury, O. Stock and W. Wahlster (eds), Lecture Notes in Computer Science, volume 3814, Springer Verlag, Berlin, pp. 134-143, 2005 [pdf] [bibTeX] [e-prints]