This workshop aims to stimulate submissions on computational models and systems for the automated detection,
measurement, and prediction of movement qualities from behavioural signals,
based on multi-layer parallel processes at non-linearly stratified temporal dimensions,
an important research challenge for future multimodal systems. We invite contributions towards novel
computational models and technologies for human movement analysis and prediction that go beyond the well-known
motion capture paradigm. Future motion capture and movement analysis systems will be endowed with completely new functionality,
enabling a novel generation of time-aware multisensory motion perception and prediction systems.
Contributions on computational models, multimodal systems, and experiments addressing the core topics above are welcome,
as well as application scenarios including healing, therapy and rehabilitation, entertainment,
performing arts (music, dance), and the active experience of cultural content.
Contributions fostering a multidisciplinary discussion are especially encouraged.
Specific topics include, but are not limited to, the following:
- Multi-layer and multi-temporal scale automated movement analysis and prediction
- Multi-time multimodal systems
- Multi-temporal scale automated movement segmentation
- Multi-time models of entrainment and non-verbal social signals
- Individual and group motor signatures
- Cognitive neuroscience models of movement perception and prediction
- Applications in therapy and rehabilitation
- Applications in performing arts
- Applications in active experience of multimedia cultural content
Contributions will be accepted in the form of long papers (8 pages), short papers (4 pages), or extended abstracts (2 pages), and will be selected through a blind peer-review process. Authors will be given 20 minutes to present long papers and 10 minutes for short papers. Abstracts will be presented during a poster session shared with the other ACM ICMI workshops.
Each author may submit no more than 2 papers.
Authors are invited to submit via EasyChair at the following link:
EASYCHAIR > CFP > MSMT2020
For long and short papers, please follow the guidelines available at: ICMI.ACM.ORG
For posters, authors are invited to submit extended abstracts (2 pages max., references included).
The workshop is partially supported by
the EU H2020 FET Proactive project EnTimeMent (GA 824160).