[Recruitment]
Associate/Assistant Professor in Artificial Intelligence for Space-Enabled Technologies, Durham University

We are looking for applicants in Artificial Intelligence, Computer Vision, Edge Computing, Digital Twins, Human-Computer Interaction, User Modelling, Robotics or Resilient Computing, with potential or achievements in informing space applications.

The post holder will enjoy 1) a permanent (equivalent to US tenured) position at a top 100 university, 2) significantly reduced teaching, 3) a fully-funded PhD studentship, 4) a travel budget, and 5) the opportunity for a 2-year fully-funded Post-Doc.

Data-Driven Crowd Motion Control with Multi-Touch Gestures

Yijun Shen, Joseph Henry, He Wang, Edmond S. L. Ho, Taku Komura and Hubert P. H. Shum
Computer Graphics Forum (CGF), 2018

Invited presentation at Eurographics 2019
Impact Factor: 2.5
Citations: 14 (according to Google Scholar, 2023)

Abstract

Controlling a crowd using multi-touch devices appeals to the computer games and animation industries, as such devices provide a high dimensional control signal that can effectively define the crowd formation and movement. However, existing works relying on pre-defined control schemes require the users to learn a scheme that may not be intuitive. We propose a data-driven gesture-based crowd control system, in which the control scheme is learned from example gestures provided by different users. In particular, we build a database with pairwise samples of gestures and crowd motions. To effectively generalize the gesture style of different users, such as the use of different numbers of fingers, we propose a set of gesture features for representing a set of hand gesture trajectories. Similarly, to represent crowd motion trajectories of different numbers of characters over time, we propose a set of crowd motion features that are extracted from a Gaussian mixture model. Given a run-time gesture, our system extracts the K nearest gestures from the database and interpolates the corresponding crowd motions in order to generate the run-time control. Our system is accurate and efficient, making it suitable for real-time applications such as real-time strategy games and interactive animation controls.
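The run-time control step described above — retrieving the K nearest example gestures and interpolating their paired crowd motions — can be sketched as follows. This is a simplified illustration, not the paper's implementation: the feature vectors here are synthetic placeholders, and the paper's gesture and GMM-based crowd-motion features would stand in for them; `knn_interpolate` and the inverse-distance weighting are assumptions for the sketch.

```python
import numpy as np

def knn_interpolate(query, gesture_feats, crowd_feats, k=3, eps=1e-8):
    """Blend the crowd motions paired with the K nearest gestures.

    A sketch of the retrieval-and-interpolation step: distances are
    plain Euclidean, and blending weights are inverse-distance
    normalized to sum to one (weighting scheme is an assumption).
    """
    dists = np.linalg.norm(gesture_feats - query, axis=1)
    idx = np.argsort(dists)[:k]          # K nearest database entries
    w = 1.0 / (dists[idx] + eps)         # closer gestures weigh more
    w /= w.sum()
    return (w[:, None] * crowd_feats[idx]).sum(axis=0)

# Toy database: 10 example gestures (4-D features) paired with
# crowd-motion feature vectors (6-D), all synthetic for illustration.
rng = np.random.default_rng(0)
G = rng.normal(size=(10, 4))   # stand-ins for gesture features
C = rng.normal(size=(10, 6))   # stand-ins for crowd-motion features
blend = knn_interpolate(G[0], G, C, k=3)
```

When the query exactly matches a database gesture, the inverse-distance weights make that example dominate, so the returned crowd motion reproduces the stored one — a sanity property any such interpolation scheme should have.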

Downloads

YouTube

Citations

BibTeX

@article{shen18datadriven,
 author={Shen, Yijun and Henry, Joseph and Wang, He and Ho, Edmond S. L. and Komura, Taku and Shum, Hubert P. H.},
 journal={Computer Graphics Forum},
 title={Data-Driven Crowd Motion Control with Multi-Touch Gestures},
 year={2018},
 volume={37},
 number={6},
 pages={382--394},
 numpages={14},
 doi={10.1111/cgf.13333},
 issn={1467-8659},
 publisher={John Wiley and Sons Ltd.},
 address={Chichester, UK},
}

RIS

TY  - JOUR
AU  - Shen, Yijun
AU  - Henry, Joseph
AU  - Wang, He
AU  - Ho, Edmond S. L.
AU  - Komura, Taku
AU  - Shum, Hubert P. H.
T2  - Computer Graphics Forum
TI  - Data-Driven Crowd Motion Control with Multi-Touch Gestures
PY  - 2018
VL  - 37
IS  - 6
SP  - 382
EP  - 394
DO  - 10.1111/cgf.13333
SN  - 1467-8659
PB  - John Wiley and Sons Ltd.
ER  - 

Plain Text

Yijun Shen, Joseph Henry, He Wang, Edmond S. L. Ho, Taku Komura and Hubert P. H. Shum, "Data-Driven Crowd Motion Control with Multi-Touch Gestures," Computer Graphics Forum, vol. 37, no. 6, pp. 382-394, John Wiley and Sons Ltd., 2018.

Supporting Grants

Northumbria University

Postgraduate Research Scholarship: £65,000, Principal Investigator
Received from Faculty of Engineering and Environment, Northumbria University, UK, 2015-2018
Project Page

Similar Research

Joseph Henry, Hubert P. H. Shum and Taku Komura, "Interactive Formation Control in Complex Environments", IEEE Transactions on Visualization and Computer Graphics (TVCG), 2014
Joseph Henry, Hubert P. H. Shum and Taku Komura, "Environment-Aware Real-Time Crowd Control", Proceedings of the 2012 ACM SIGGRAPH/Eurographics Symposium on Computer Animation (SCA), 2012
Adam Barnett, Hubert P. H. Shum and Taku Komura, "Coordinated Crowd Simulation with Topological Scene Analysis", Computer Graphics Forum (CGF), 2016

Last updated on 17 February 2024