ISMAR 2016, the premier conference for Augmented Reality (AR) and Mixed Reality (MR), will be held in beautiful Mérida, Yucatán, Mexico.

ISMAR is responding to the recent explosion of commercial and research activity in AR, MR, and Virtual Reality (VR) by continuing the expansion of scope it began several years ago. ISMAR 2016 will cover the full range of technologies encompassed by the MR continuum, from interfaces in the real world to fully immersive experiences. This range goes far beyond the traditional definition of AR, which focused on precise 3D tracking, visual display, and real-time performance.

We specifically invite contributions from fundamental areas such as Computer Graphics, Human-Computer Interaction, Psychology, Computer Vision, and Optics, showing how these areas contribute to advancing AR/VR/MR technology.

This year, we again issue an open call for Program Committee members, in the hope of further increasing transparency and widening the conference's scope.

Submission Details

There is a single paper submission category, with papers ranging from 4 to 10 pages. Papers ready for journal publication will be published directly in a special issue of IEEE Transactions on Visualization and Computer Graphics (TVCG). Other accepted papers will be published in the ISMAR proceedings. Papers will be judged on their contribution per page, rather than on length alone.

  • All accepted papers will be orally presented at the ISMAR conference.
  • All accepted papers will have the opportunity to be presented as a demo.
  • All accepted papers will have the opportunity to be presented as a poster.
  • All accepted papers will be archived in the IEEE Xplore digital library.

Poster submissions will be accepted as usual; the submission deadline will be announced later.

Topics of Interest

All topics relevant to AR, VR, and MR are of interest. These include, but are not limited to:

Information Presentation

  • Mediated and diminished reality
  • Multisensory rendering, registration, and synchronization
  • Photorealistic and non-photorealistic rendering
  • Real-time and non-real-time interactive rendering
  • Visual, aural, haptic, and olfactory augmentation

Input

  • Acquisition of 3D video and scene descriptions
  • Calibration and registration (of sensing systems)
  • Location sensing technologies (of any kind, including non-real-time)
  • Projector-camera systems
  • Sensor fusion
  • Smart spaces
  • Touch, tangible and gesture interfaces
  • Video processing and streaming
  • Visual mapping
  • Wearable sensors, ambient-device interaction

Output

  • Display hardware, including 3D, stereoscopic, and multi-user
  • Live video stream augmentation (e.g., in robotics and broadcast)
  • Wearable actuators and augmented humans
  • Wearable and situated displays (e.g., eyewear, smart watches, pico-projectors)

User Experience Design

  • Collaborative interfaces
  • Technology acceptance and social implications
  • Therapy and rehabilitation
  • Usability studies and experiments
  • Visual analytics and entertainment

Human Performance and Perception

  • Interaction techniques
  • Learning and training
  • Multimodal input and output
  • Perception of virtual objects

System Architecture

  • Content creation and management
  • Distributed and collaborative architectures
  • Online services
  • Real-time performance issues
  • Scene description and management issues
  • Wearable and mobile computing

Applications

  • Architecture
  • Art, cultural heritage, education and training
  • Entertainment, broadcast
  • Industrial, military, emergency response
  • Medical
  • Personal information systems

Contact

ISMAR 2016 Science & Technology Program Co-Chairs: scitech_chairs[at]ismar2016.org

  • Wolfgang Broll, Technische Universität Ilmenau, Germany
  • Hideo Saito, Keio University, Japan
  • J. Edward Swan II, Mississippi State University, USA