ISMAR 2014 - Sep 10-12 - Munich, Germany


ISMAR Posters in Session "Posters Presentation - Batch 1"

All posters and demonstrations will be presented in a "One Minute Madness" session on Wednesday, 11:30 AM - 1:00 PM in the main conference hall (HS1). Afterwards, they will be on display throughout the entire conference, as individually arranged by the authors. Special, guaranteed presentation times are arranged in the following three batches:

Posters Presentation - Batch 1
Date & Time: Wednesday, September 10, 3:15 pm - 3:45 pm
Location: Magistrale
Posters:
The Posture Angle Threshold between Airplane and Window Frame Metaphors
Authors:
Marcus Tönnis, Sandro Weber, Gudrun Klinker
Description:
Integrating different mental models and their corresponding interaction techniques for navigation and object manipulation is a central concern when building user interfaces for a wide variety of potential users. Using a user-controlled, egocentric hand-held device, we aim to integrate the three major interaction techniques identified so far, based on the metaphors of a steering wheel, a toy airplane, and a window frame. The toy airplane and the window frame in particular contradict each other, leaving open the question of the postural angle of the hand-held device at which the users' mental model changes.
View management for webized mobile AR contents
Authors:
Jungbin Kim, Joohyun Lee, Byounghyun Yoo, Sangchul Ahn, Heedong Ko
Description:
Information presentation techniques are as important in augmented reality applications as they are in traditional desktop (WIMP) and web user interfaces. This paper introduces view management for a webized mobile augmented reality platform. We use a webized mobile AR browser called Insight that separates the application logic, including the tracking engine, from the AR content, so that the view management logic and the contents are easy to reuse. In addition, the view management can accommodate the in-situ context of an AR application.
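View management in AR commonly amounts to deciding where annotation labels may appear on screen so that they stay legible and do not occlude one another. As a rough illustration of that kind of logic (not taken from the poster, and independent of Insight's actual API), a minimal greedy label-placement pass might look like this:

```python
# Illustrative sketch only: a greedy label-placement pass of the kind a
# view-management component might run each frame. The data model (anchors,
# label sizes, candidate offsets) is hypothetical, not Insight's actual API.

def rects_overlap(a, b):
    """Axis-aligned rectangles given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_labels(anchors, label_sizes,
                 offsets=((10, 10), (10, -30), (-90, 10), (-90, -30))):
    """For each anchor point, pick the first candidate offset whose label
    rectangle does not collide with any label placed so far."""
    placed = []
    for (ax, ay), (w, h) in zip(anchors, label_sizes):
        chosen = None
        for dx, dy in offsets:
            rect = (ax + dx, ay + dy, w, h)
            if not any(rects_overlap(rect, p) for p in placed):
                chosen = rect
                break
        # fall back to the default offset if every candidate collides
        placed.append(chosen if chosen else (ax + offsets[0][0], ay + offsets[0][1], w, h))
    return placed

# Example: three annotation anchor points projected to screen coordinates.
print(place_labels([(100, 100), (110, 105), (300, 200)], [(80, 20)] * 3))
```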
MOBIL: A Moments based Local Binary Descriptor
Authors:
Abdelkader Bellarbi, Samir Otmane, Nadia Zenati-Henda, Samir Benbelkacem
Description:
In this paper, we propose an efficient and fast binary descriptor, called MOBIL (MOments based BInary differences for Local description), which compares not just intensities but also the geometric properties of sub-regions by employing moments. This approach offers high distinctiveness against affine transformations and appearance changes. The experimental evaluation shows that MOBIL achieves good performance in terms of low computational complexity and high recognition rate compared to state-of-the-art real-time local descriptors.
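To make the general idea of a moments-based binary descriptor concrete, the following minimal sketch splits a patch into sub-regions, computes low-order geometric moments per region, and binarizes pairwise comparisons between regions. The grid size and choice of moments are assumptions made for brevity, not the authors' exact MOBIL formulation:

```python
# Illustrative sketch only: one way a moments-based binary descriptor could be
# formed; not the authors' exact MOBIL method.
import numpy as np

def region_moments(region):
    """Low-order geometric moments (m00, m10, m01, m11) of a sub-region."""
    h, w = region.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return np.array([region.sum(),
                     (xs * region).sum(),
                     (ys * region).sum(),
                     (xs * ys * region).sum()], dtype=np.float64)

def binary_descriptor(patch, grid=4):
    """Split the patch into grid x grid sub-regions, compute moments per region,
    then binarize by comparing every pair of regions feature-by-feature."""
    h, w = patch.shape
    regions = [region_moments(patch[i * h // grid:(i + 1) * h // grid,
                                    j * w // grid:(j + 1) * w // grid])
               for i in range(grid) for j in range(grid)]
    bits = []
    for a in range(len(regions)):
        for b in range(a + 1, len(regions)):
            bits.extend((regions[a] > regions[b]).astype(np.uint8))
    return np.array(bits)  # matched with Hamming distance at query time

patch = np.random.default_rng(0).random((32, 32))
print(binary_descriptor(patch).shape)  # C(16, 2) pairs * 4 moments = 480 bits
```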
A Preliminary Study on Altering Surface Softness Perception using Augmented Color and Deformation
Authors:
Parinya Punpongsanon, Daisuke Iwai, Kosuke Sato
Description:
Choosing the appropriate soft or hard material is important when designing a product such as a sofa or a bed, but the choice is typically limited by the number of physical materials the designer owns. Pseudo-haptic feedback is an alternative that enables designers to roughly simulate material properties (e.g., softness, hardness) by generating only a visual illusion. However, the current technique is limited to video see-through augmented reality, in which the user interacts in a real space while looking at a virtual space. This paper explores the possibility of realizing pseudo-haptic feedback for touching objects in spatial augmented reality. We investigate and compare the effects of visually superimposing projected graphics onto the surface of a touched object and onto the fingernail/finger for changing the tactile perception of the surface. The potential of our method is discussed through a preliminary user study.
Combining Multi-touch and Device Movement in Mobile Augmented Reality Manipulations
Authors:
Asier Marzo, Benoît Bossavit, Martin Hachet
Description:
Three input modalities for manipulation techniques in Mobile Augmented Reality have been compared. The first employs only multi-touch input. The second uses the movements of the device. The third is a hybrid approach combining the two previous modalities. A user evaluation (N=12) on a 6 DOF docking task suggests that combining multi-touch input and device movement offers the best results in terms of task completion time and efficiency. Nonetheless, using the device alone is more intuitive and performs worse only for large rotations. Given that mobile devices increasingly support movement tracking, the presented results encourage adding device movement as an input modality.
Touch Gestures for Improved 3D Object Manipulation in Mobile Augmented Reality
Authors:
Philipp Tiefenbacher, Andreas Pflaum, Gerhard Rigoll
Description:
This work presents three techniques for 3D manipulation on mobile touch devices, taking the specifics of mobile AR scenes into account. We compare the common direct manipulation technique with two indirect techniques that use only the thumbs to perform the transformations. The evaluation of the manipulation variants is conducted in a mixed reality (MR) environment that takes advantage of the controlled conditions of a full virtual reality (VR) system. A study with 18 participants shows that the two-thumb method outperforms the other techniques, performing better with respect to both total manipulation time and total number of gestures.
Towards Mobile Augmented Reality for the Elderly
Authors:
Daniel Kurz, Anton Fedosov, Stefan Diewald, Jörg Güttler, Barbara Geilhof, Matthias Heuberger
Description:
Mobility and independence are key aspects of self-determined living in today's world, and demographic change presents the challenge of retaining these aspects for the aging population. Augmented Reality (AR) user interfaces might support the elderly, for example, when navigating as pedestrians or by explaining how devices and mobility aids work and how they are maintained. This poster reports on the results of practical field tests in which elderly subjects tried handheld AR applications. The main finding is that common handheld AR user interfaces are not suited to the elderly because they require the user to hold up the device so that the back-facing camera captures the object or environment about which digital information is to be presented. Tablet computers are too heavy and do not provide sufficient grip to hold them up over a long period of time. One possible alternative is using head-mounted displays (HMDs). We present the promising results of a user test evaluating whether elderly people can deal with AR interfaces on a lightweight HMD. We conclude with an outlook on improved handheld AR user interfaces that do not require continuously holding up the device and that we hope are better suited to the elderly.
A Mobile Augmented Reality System to Assist Auto Mechanics
Authors:
Darko Stanimirovic, Nina Damasky, Sabine Webel, Dirk Koriath, Andrea Spillner, Daniel Kurz
Description:
Ground-breaking technologies and the innovative design of upcoming vehicles introduce complex maintenance procedures for auto mechanics. To present these procedures in an intuitive manner, the Mobile Augmented Reality Technical Assistance (MARTA) project was initiated. The goal was to create an Augmented Reality-aided application running on a tablet computer that shows maintenance instructions superimposed on a live video feed of the car. The most important aspects of the project, presented here, are robust image-based tracking of specular surfaces using both edge and texture features, and the software framework. The resulting application is deployed and used productively to support maintenance of the Volkswagen XL1 vehicle across the world.
Indirect Augmented Reality Considering Real-World Illumination Change
Authors:
Fumio Okura, Takayuki Akaguma, Tomokazu Sato, Naokazu Yokoya
Description:
Indirect augmented reality (IAR) uses pre-captured omnidirectional images and offline superimposition of virtual objects to achieve high-quality geometric and photometric registration. At the same time, IAR causes inconsistency between the real world and the pre-captured image. This paper describes the first-ever study focusing on this temporal inconsistency in IAR. We propose a novel IAR system that reflects real-world illumination change by selecting an appropriate image from a set of images pre-captured under various illumination conditions. Results of a public experiment show that the proposed system can improve realism in IAR.
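As a rough illustration of the kind of selection step such a system needs (an assumed matching cue, not the authors' actual criterion), one could pick the pre-captured image whose overall luminance best matches the live camera view:

```python
# Illustrative sketch only: selecting the pre-captured omnidirectional image
# whose illumination best matches current conditions. Mean luminance of the
# live frame is an assumed matching cue chosen for brevity.
import numpy as np

def mean_luminance(rgb):
    """Approximate perceptual luminance of an RGB image with values in [0, 1]."""
    return float(np.dot(rgb.reshape(-1, 3).mean(axis=0), [0.299, 0.587, 0.114]))

def select_precaptured(live_frame, precaptured):
    """Return the index of the pre-captured image whose mean luminance is
    closest to that of the live camera frame."""
    target = mean_luminance(live_frame)
    return min(range(len(precaptured)),
               key=lambda i: abs(mean_luminance(precaptured[i]) - target))

rng = np.random.default_rng(0)
database = [rng.random((64, 128, 3)) * s for s in (0.3, 0.6, 0.9)]  # dark .. bright
print(select_precaptured(rng.random((48, 64, 3)) * 0.5, database))
```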
Non-Parametric Camera-Based Calibration of Optical See-Through Glasses for Augmented Reality Applications
Authors:
Martin Klemm, Harald Hoppe, Fabian Seebacher
Description:
This work describes a camera-based method for the calibration of optical See-Through Glasses (STGs). A new technique is introduced that calibrates every single display pixel of the STGs in order to overcome the disadvantages of a parametric model: unlike a parametric model, a non-parametric model can also map arbitrary distortions. The new generation of STGs using waveguide-based displays will exhibit larger arbitrary distortions due to the characteristics of their optics. First tests show better accuracies than in previous works. Because cameras are placed behind the displays of the STGs, no error-prone user interaction is necessary, and it is shown that a high-accuracy tracking device is not needed for a good calibration. A camera mounted rigidly on the STGs is used to find the relations between the system components. Furthermore, this work elaborates on the necessity of a second, subsequent calibration step that adapts the STGs to a specific user; first tests confirm that this subsequent step is necessary.
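To illustrate what a non-parametric, per-pixel calibration can amount to in practice (an assumption-level sketch, not the calibration procedure from the poster), one can store a calibrated viewing ray for every display pixel in a lookup table and interpolate it at render time:

```python
# Illustrative sketch only: a per-pixel lookup table of viewing rays as an
# alternative to a single parametric (pinhole + distortion) display model.
# The table layout and bilinear interpolation are assumptions.
import numpy as np

H, W = 720, 1280
# ray_table[y, x] = unit 3D viewing-ray direction measured for display pixel (x, y),
# e.g. obtained with a camera placed behind the display during calibration.
ray_table = np.zeros((H, W, 3))
ray_table[..., 2] = 1.0  # placeholder: all rays pointing straight ahead

def ray_for_subpixel(x, y):
    """Bilinearly interpolate the calibrated ray at a sub-pixel display position."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, W - 1), min(y0 + 1, H - 1)
    fx, fy = x - x0, y - y0
    r = ((1 - fx) * (1 - fy) * ray_table[y0, x0] + fx * (1 - fy) * ray_table[y0, x1] +
         (1 - fx) * fy * ray_table[y1, x0] + fx * fy * ray_table[y1, x1])
    return r / np.linalg.norm(r)

print(ray_for_subpixel(640.5, 360.25))
```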
Visual-Inertial 6-DOF Localization for a Wearable Immersive VR/AR System
Authors:
Ludovico Carozza, Frédéric Bosché, Mohamed Abdel-Wahab
Description:
We present a real-time visual-inertial localization approach that can be directly integrated into a wearable immersive system for simulation and training. In this context, while CAVE systems typically require a complex and expensive set-up, our approach relies on visual and inertial information provided by a consumer monocular camera and an Inertial Measurement Unit (IMU) embedded in a wearable stereoscopic HMD. 6-DOF localization is achieved through image registration with respect to a 3D map of descriptors of the training room and robust tracking of visual features. We propose a novel, efficient, and robust pipeline based on state-of-the-art image-based localization and sensor-fusion approaches, which uses robust orientation information from the IMU to cope with fast camera motion and to limit motion jitter. The proposed system runs at 30 fps on a standard PC and requires very limited set-up for its intended application.
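As a minimal illustration of blending IMU and visual orientation estimates (a complementary-filter style sketch under assumed inputs, not the authors' actual fusion pipeline):

```python
# Illustrative sketch only: blend an IMU-derived orientation (low latency, drifts)
# with a vision-derived orientation (accurate, jittery, occasionally lost).
# Quaternion nlerp and the blend weight are assumptions made for brevity.
import numpy as np

def nlerp(q_a, q_b, t):
    """Normalized linear interpolation between two unit quaternions (w, x, y, z)."""
    if np.dot(q_a, q_b) < 0:          # take the shorter arc
        q_b = -q_b
    q = (1 - t) * q_a + t * q_b
    return q / np.linalg.norm(q)

def fuse_orientation(q_imu, q_vision, vision_ok, alpha=0.1):
    """Pull the IMU orientation gently toward the visual estimate while tracking
    is healthy; otherwise trust the IMU alone until tracking recovers."""
    return nlerp(q_imu, q_vision, alpha) if vision_ok else q_imu

q_imu = np.array([0.999, 0.02, 0.0, 0.0]); q_imu /= np.linalg.norm(q_imu)
q_vis = np.array([1.0, 0.0, 0.0, 0.0])
print(fuse_orientation(q_imu, q_vis, vision_ok=True))
```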
Exploring Social Augmentation Concepts for Public Speaking using Peripheral Feedback and Real-Time Behavior Analysis
Authors:
Ionut Damian, Chiew Seng Sean Tan, Tobias Baur, Johannes Schöning, Kris Luyten, Elisabeth André
Description:
Non-verbal and unconscious behaviors play an important role in efficient human-to-human communication but are often undervalued when training people to become better communicators. This is particularly true for public speakers, who must not only behave according to social etiquette but do so while generating enthusiasm and interest among dozens, if not hundreds, of other people. In this paper we propose the concept of social augmentation using wearable computing, with the goal of giving users the ability to continuously monitor their performance as communicators. To this end we explore interaction modalities and feedback mechanisms that lend themselves to this task.
