The Virtual Dancer is an interactive dancing agent. It dances together
with the user, aligning the appropriate dance moves in real time to the
beat of the music, adapting its style to what it observes from the
user through real time computer vision. By alternating patterns of
following the user with taking the lead with new moves,
the system attempts to achieve a mutual dancing interaction.
Keywords: ECA, beat tracking, virtual rap dancer, audio/video input, virtual reality

Below, some elements of the system are discussed.
A prerequisite for a virtual dancer is the ability to interpret the
music to find the beats to which the dance should be aligned. For this
we implemented the beat detection algorithm described in the
publications of Anssi Klapuri. This algorithm detects the tempo
and the beat positions in the music played for the dancer.
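As a rough illustration of the idea behind tempo detection (not Klapuri's actual algorithm, which is considerably more sophisticated), a minimal estimator can autocorrelate an onset-strength envelope and pick the strongest periodicity in a plausible tempo range; the function name and parameters below are assumptions for the sketch:

```python
import numpy as np

def estimate_tempo(onset_envelope, frame_rate, min_bpm=60.0, max_bpm=180.0):
    """Estimate tempo (BPM) by autocorrelating an onset-strength envelope.

    onset_envelope: 1-D array of onset strengths, one value per analysis frame.
    frame_rate: number of analysis frames per second.
    """
    env = onset_envelope - onset_envelope.mean()
    # Periodic onsets produce autocorrelation peaks at multiples of the beat period.
    acf = np.correlate(env, env, mode="full")[len(env) - 1:]
    # Convert the BPM search range into a lag range (in frames).
    min_lag = int(frame_rate * 60.0 / max_bpm)
    max_lag = int(frame_rate * 60.0 / min_bpm)
    best_lag = min_lag + int(np.argmax(acf[min_lag:max_lag + 1]))
    return 60.0 * frame_rate / best_lag
```

For a synthetic envelope with an onset every 0.5 seconds at 100 frames per second, the estimator returns 120 BPM, as expected for that pulse spacing.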
Once the beat is known, the virtual dancer should be able to dance
along. For this we constructed a database of many different dance
moves, collected using motion capture or created manually using
parameterized animation (see also the videos listed below). The
virtual dancer selects the most appropriate dance moves
from the database, given the observed movement characteristics of
the human dancer. These moves are then timed to the beat by locally
warping each move's timing so that the beat positions in the
move match those in the music. For example, in a complex clapping
animation, the clap-points are aligned exactly to the predicted beat
times, so the dancer will clap to the beat of the music. The
transition from one move to the other is made using an IK-generated
stepping motion and interpolation techniques.
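The local time warping described above amounts to a piecewise-linear remapping of the move's timeline so that its annotated beat points land exactly on the predicted beat times. A minimal sketch, assuming keyframes and beat annotations given as timestamps (the function name and data are illustrative, not the actual system code):

```python
import numpy as np

def warp_to_beats(keyframe_times, move_beats, music_beats):
    """Locally warp a move's timeline so that its annotated beat points
    (move_beats) land exactly on the predicted beat times of the music
    (music_beats, one per move beat).

    keyframe_times: original timestamps of the move's keyframes (seconds).
    Returns warped keyframe timestamps, piecewise-linearly interpolated
    between consecutive beat anchors.
    """
    return np.interp(keyframe_times, move_beats, music_beats)
```

For instance, a move with clap points at 0.5 s and 1.5 s played against predicted beats at 0.6 s and 1.4 s has its keyframes at [0.5, 1.0, 1.5] remapped to [0.6, 1.0, 1.4]: the claps hit the beats exactly, and the frames in between are stretched or compressed proportionally.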
Meanwhile, the system observes the movements of the human dance
partner using the computer vision system ParleVision.
Using several robust processors, the system extracts global
characteristics of the human dancer's movements, such as how much
(s)he moves around or how much (s)he waves with the arms. Such
characteristics can then be used to select moves from the database
that are in some way "appropriate" to the dancing style of the human
dancer.
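A minimal sketch of such characteristic-based selection might tag each database move with the same global characteristics and pick the move nearest to the observed values; the move names, tag names, and the Euclidean distance measure below are all hypothetical, not the system's actual processors or database:

```python
import math

# Hypothetical move database: each move tagged with global movement
# characteristics on a 0..1 scale (illustrative values only).
MOVES = {
    "sway":      {"body_movement": 0.2, "arm_waving": 0.1},
    "step_clap": {"body_movement": 0.5, "arm_waving": 0.6},
    "jump_wave": {"body_movement": 0.9, "arm_waving": 0.9},
}

def select_move(observed, moves=MOVES):
    """Pick the move whose characteristic tags are closest (Euclidean
    distance) to the characteristics observed from the human dancer."""
    def dist(tags):
        return math.sqrt(sum((observed[k] - tags[k]) ** 2 for k in observed))
    return min(moves, key=lambda name: dist(moves[name]))
```

An energetically waving user (high values for both characteristics) would thus be answered with the most energetic move in the database, while a calm user gets a calm one.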
By alternating patterns of following the user with taking
the lead with new moves, the system attempts to achieve a mutual
dancing interaction where both human and virtual dancer influence each
other. Finding the appropriate nonverbal interaction patterns to
achieve this is one of the longer-term issues being addressed
in the context of this showcase.
The work on the Virtual Dancer has been presented in several videos,
some of which are downloadable from here. The first video was produced
for the CHI 2006 interactivity session. The second movie
was a submission for GALA2005, where the system did not do badly in the
final rankings. The other movies show the process of
animating the dance motions captured in a Vicon lab and an example of
parameterized animation. Note: the movies are usually quite large.
CHI 2006 Interactivity Submission (19Mb).
GALA2005 submission video (23Mb).
Motion capture and animation for the Virtual Dancer (10Mb).
Parameterized animation (1Mb).
D. Reidsma, H. van Welbergen, R.W. Poppe, P. Bos and A. Nijholt. Towards Bi-directional Dancing Interaction. In Proc. of the 5th International Conference on Entertainment Computing, R. Harper, M. Rauterberg and M. Combetto (eds), Lecture Notes in Computer Science, volume 4161, Springer Verlag, Berlin, ISBN 3-540-45259-1, ISSN 0302-9743, pp. 1-12, 2006.

D. Reidsma, A. Nijholt, R.J. Rienks and G.H.W. Hondorp. Interacting with a Virtual Rap Dancer. In INTETAIN: INtelligent TEchnologies for interactive enterTAINment, M. Maybury, O. Stock and W. Wahlster (eds), Lecture Notes in Computer Science, volume 3814, Springer Verlag, Berlin, ISBN 978-3-540-30509-5, pp. 134-143, 2005.

M. ter Maat, R.M. Ebbers, D. Reidsma and A. Nijholt. Beyond the Beat: Modelling Intentions in a Virtual Conductor. In INTETAIN '08: Proceedings of the 2nd International Conference on INtelligent TEchnologies for interactive enterTAINment, ACM Digital Library, ICST, Brussels, Belgium, ISBN 978-963-9799-13-4, pp. 12-21, 2008.

D. Reidsma, K.P. Truong, H. van Welbergen, D. Neiberg, S. Pammi, I.A. de Kok and B. van Straalen. Continuous Interaction with a Virtual Human. In Proceedings of the 6th International Summer Workshop on Multimodal Interfaces (eNTERFACE'10), A.A. Salah and T. Gevers (eds), University of Amsterdam, Amsterdam, ISBN 978-90-5776-213-0, pp. 24-39, 2010.

D. Reidsma, E. Dehling, H. van Welbergen, J. Zwiers and A. Nijholt. Leading and following with a virtual trainer. In Proceedings of the 4th International Workshop on Whole Body Interaction in Games and Entertainment, D. England, A. El Rhalibi and A.D. Cheok (eds), University of Liverpool, Liverpool, pp. 4, 2011.