DanceDJ: A 3D Dance Animation Authoring System for Live Performance

Naoya Iwamoto, Takuya Kato, Hubert P. H. Shum, Ryo Kakitsuka, Kenta Hara and Shigeo Morishima
Proceedings of the 2017 International Conference on Advances in Computer Entertainment Technology (ACE), 2017

 Best Paper Award


Abstract

Dance is an important component of live performance for expressing emotion and presenting visual context. Human dance performances typically require expert knowledge of choreography and professional rehearsal, which are too costly for casual entertainment venues and clubs. Recent advances in character animation and motion synthesis have made it possible to synthesize virtual 3D dance characters in real-time. The major problem in existing systems is the lack of an intuitive interface for real-time dance control. We propose a new system called DanceDJ to solve this problem. Our system consists of two parts. The first is an underlying motion analysis system that evaluates motion features, including dance features such as posture and movement tempo, as well as audio features such as music tempo and structure. As a pre-process, given a database of dance motions, our system evaluates the quality of candidate timings for connecting and switching between different dance motions. During run-time, we provide a control interface that offers visual guidance. We observe that disk jockeys (DJs) effectively control the mixing of music using a DJ controller, and therefore adopt a DJ controller for controlling dancing characters. This allows DJs to transfer their skills from music control to dance control using a similar hardware setup. We map different motion control functions onto the DJ controller and visualize the timing of natural connection points, so that the DJ can effectively govern the synthesized dance motion. We conducted two user experiments to evaluate the user experience and the quality of the dance character. Quantitative analysis shows that our system performs well in both motion control and simulation quality.
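The pre-processing stage described above scores candidate points for switching between dance motions using both pose similarity and musical timing. A minimal sketch of this idea is given below; all function and parameter names (`pose_distance`, `transition_candidates`, `threshold`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def pose_distance(pose_a, pose_b):
    """Euclidean distance between two poses given as (num_joints, 3) arrays."""
    return float(np.linalg.norm(pose_a - pose_b))

def transition_candidates(clip_a, clip_b, beats, fps=30, threshold=2.0):
    """Return frames of clip_a where switching to the start of clip_b is
    both visually smooth (similar pose) and musically aligned (on a beat).

    clip_a, clip_b: arrays of shape (num_frames, num_joints, 3)
    beats: beat times in seconds, e.g. from a music tempo analysis
    """
    beat_frames = {int(round(t * fps)) for t in beats}
    first_pose_b = clip_b[0]
    candidates = []
    for frame, pose in enumerate(clip_a):
        # A frame qualifies if it lands on a beat and its pose is close
        # to the pose that the next clip starts from.
        if frame in beat_frames and pose_distance(pose, first_pose_b) < threshold:
            candidates.append(frame)
    return candidates
```

At run-time, such candidate frames could be visualized on the controller interface so the performer knows when a switch will look natural; the actual feature set and scoring in the paper are richer than this pose-plus-beat heuristic.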

Downloads

YouTube

Citations

BibTeX

@inproceedings{iwamoto17dancedj,
 author={Iwamoto, Naoya and Kato, Takuya and Shum, Hubert P. H. and Kakitsuka, Ryo and Hara, Kenta and Morishima, Shigeo},
 booktitle={Proceedings of the 2017 International Conference on Advances in Computer Entertainment Technology},
 series={ACE '17},
 title={DanceDJ: A 3D Dance Animation Authoring System for Live Performance},
 year={2017},
 month={12},
 pages={653--670},
 numpages={18},
 doi={10.1007/978-3-319-76270-8_46},
 isbn={978-3-319-76270-8},
 location={London, UK},
}

RIS

TY  - CONF
AU  - Iwamoto, Naoya
AU  - Kato, Takuya
AU  - Shum, Hubert P. H.
AU  - Kakitsuka, Ryo
AU  - Hara, Kenta
AU  - Morishima, Shigeo
T2  - Proceedings of the 2017 International Conference on Advances in Computer Entertainment Technology
TI  - DanceDJ: A 3D Dance Animation Authoring System for Live Performance
PY  - 2017
Y1  - 2017/12
SP  - 653
EP  - 670
DO  - 10.1007/978-3-319-76270-8_46
SN  - 978-3-319-76270-8
ER  - 

Plain Text

Naoya Iwamoto, Takuya Kato, Hubert P. H. Shum, Ryo Kakitsuka, Kenta Hara and Shigeo Morishima, "DanceDJ: A 3D Dance Animation Authoring System for Live Performance," in ACE '17: Proceedings of the 2017 International Conference on Advances in Computer Entertainment Technology, pp. 653-670, London, UK, Dec 2017.

Supporting Grants

The Ministry of Education, Culture, Sports, Science and Technology

MEXT Top Global University Project Scholarship (Ref: ): £5,500, Co-Applicant (PI: Prof. Shigeo Morishima, Japanese Partner)
Received from The Ministry of Education, Culture, Sports, Science and Technology, 2015-2016
Project Page

Similar Research

Naoya Iwamoto, Hubert P. H. Shum, Wakana Asahina and Shigeo Morishima, "Automatic Sign Dance Synthesis from Gesture-Based Sign Language", Proceedings of the 2019 ACM SIGGRAPH Conference on Motion, Interaction and Games (MIG), 2019
Wakana Asahina, Naoya Iwamoto, Hubert P. H. Shum and Shigeo Morishima, "Automatic Dance Generation System Considering Sign Language Information", Proceedings of the 2016 ACM SIGGRAPH Posters, 2016
Edmond S. L. Ho, Hubert P. H. Shum, He Wang and Li Yi, "Synthesizing Motion with Relative Emotion Strength", Proceedings of the 2017 ACM SIGGRAPH Asia Workshop on Data-Driven Animation Techniques (D2AT), 2017
Hubert P. H. Shum, Ludovic Hoyet, Edmond S. L. Ho, Taku Komura and Franck Multon, "Preparation Behaviour Synthesis with Reinforcement Learning", Proceedings of the 2013 International Conference on Computer Animation and Social Agents (CASA), 2013
Hubert P. H. Shum and Edmond S. L. Ho, "Real-Time Physical Modelling of Character Movements with Microsoft Kinect", Proceedings of the 2012 ACM Symposium on Virtual Reality Software and Technology (VRST), 2012
Liuyang Zhou, Lifeng Shang, Hubert P. H. Shum and Howard Leung, "Human Motion Variation Synthesis with Multivariate Gaussian Processes", Computer Animation and Virtual Worlds (CAVW) - Proceedings of the 2014 International Conference on Computer Animation and Social Agents (CASA), 2014
He Wang, Edmond S. L. Ho, Hubert P. H. Shum and Zhanxing Zhu, "Spatio-Temporal Manifold Learning for Human Motions via Long-Horizon Modeling", IEEE Transactions on Visualization and Computer Graphics (TVCG), 2021

Last updated on 14 April 2024