ICMI is the premier international forum for multidisciplinary research on multimodal human-human and human-computer interaction, interfaces, and system development.
Workshop proposals: March 1, 2012
Special session proposals: March 7, 2012
Paper & demo submission: May 4, 2012
Author notification: July 23, 2012
Camera-ready: August 20, 2012
Main Conference: Oct 23-25, 2012
Workshops: Oct 22, 26, 2012
Call for Papers: ICMI 2012 will feature a single-track main conference including: keynote speakers,
technical full and short papers (including oral and poster presentations), special sessions,
demonstrations, exhibits and doctoral spotlight papers. Topics of interest include, but are not limited to:
Multimodal interaction processing
Machine learning, pattern recognition, and signal processing approaches for the analysis and modeling of multimodal interaction between people and among the different modalities within people; adaptation and multimodal input fusion and output generation, addressing any combination of: vision, gaze, audio, speech, smell/olfaction, taste, gestures, pen, haptic and tangible, bio-signals such as brain activity and skin conductivity.
Interactive systems and applications
Mobile and ubiquitous systems, automotive and navigation systems, human-robot and human-virtual agent interaction, virtual and augmented reality, education, authoring, entertainment, gaming, telepresence, assistive and prosthetic systems, brain-computer interfaces, universal access, healthcare, biometry, intelligent environments, meeting analysis and meeting spaces, indexing, retrieval and summarization, etc.
Data, evaluation, and standards for multimodal interactive systems
Design issues, principles, best practices, and authoring techniques for human-machine interfaces using any combination of multiple input and/or output modalities. Architectures; assessment techniques and methodologies; corpora; annotation and browsing of multimodal interactive data; W3C and other standards for multimodal interaction and interfaces; evaluation techniques for multimodal interactive systems.
Modeling human communication patterns
The modalities and applications named above drive a need for multimodal models of human-human and human-machine communication, including verbal and non-verbal interaction, affordances of different modalities, multimodal discourse and dialogue modeling, modeling of culture as it pertains to multimodality, long-term multimodal interaction, multimodality in social and affective interaction, and multimodal social signal processing.