EURASIP Journal on Advances in Signal Processing
Visual Human Motion Understanding in the Wild
March 1, 2019
Visual human motion understanding is a key computer vision task that aims to understand the placement, trajectories, and future actions of humans in an unconstrained natural scene. This field includes the core tasks of finding people in a scene through person detection, segmentation, and pose estimation; understanding movement through people tracking; and recognizing behaviors through motion trajectories. Recently, there has been increasing interest in this topic from the academic vision community, as well as from industry, due to its applications in a number of fields. For example, safe mobile robot navigation, including autonomous driving, depends on robots (cars) being able to recognize where nearby pedestrians are and what they might do next. Likewise, human motion understanding is essential for smart surveillance, athletic performance analysis, virtual reality, and human-computer interaction, among other applications.
Although research on human motion understanding in computer vision is invaluable to both academia and industry, many fundamental problems remain unsolved, e.g., robust object detection and tracking, and unconstrained activity recognition. Recently, machine learning algorithms, notably deep learning, have been successfully applied in this area for object tracking and for activity modeling and recognition, and have shown promising results in real-world applications such as human-computer interaction and autonomous driving. This progress positions the community to develop and exploit effective machine learning algorithms that address the fundamental issues in human motion understanding.
- Prof. Shengping Zhang, Harbin Institute of Technology, China
- Dr. Huiyu Zhou, University of Leicester, United Kingdom
- Prof. Xiangyuan Lan, Hong Kong Baptist University, Hong Kong
- Dr. Lei Zhang, University of Pittsburgh, United States
- Dr. Christophoros Nikou, University of Ioannina, Greece