ISMAR 2014 - Sep 10-12 - Munich, Germany


ISMAR Posters in Session "Posters Presentation - Batch 3"

All posters and demonstrations will be presented in a "One Minute Madness" session on Wednesday, 11:30 AM - 1:00 PM, in the main conference hall (HS1). Afterwards they will be on display throughout the entire conference, as individually arranged by the authors. Special, guaranteed presentation times are arranged in the following three batches:

Posters Presentation - Batch 3
Date & Time :  Friday, September 12, 10:30 am - 11:00 am
Location : Magistrale
Posters : 
Classifications of Augmented Reality Uses in Marketing
Author:
Ana Javornik
Description:
Existing uses of augmented reality (AR) in marketing have not yet been discussed according to the different contexts of consumption and marketing functions they fulfill. This research investigates which uses of AR have emerged so far in marketing and proposes a classification schema for them, based on the intensity of the augmentation and on marketing functions. Such differentiation is needed in order to better understand the dynamics of augmentation of physical surroundings for commercial purposes and consequently to distinguish between different consumer experiences.
A single co-lived augmented world or many solipsistic fantasies?
Author:
Nicola Liberati
Description:
The aim of this paper is to determine the difference, in augmented reality (AR), between the creation of one single world and the creation of multiple worlds in terms of the “reality” of the augmented world created by the device. From a phenomenological perspective, this work analyses what kinds of relations between subjects and worlds these two different kinds of augmented worlds create. It will become clear that the production of such an augmented world cannot be achieved by technical means alone, but has to confront larger problems.
CI-Spy: Using Mobile-AR for Scaffolding Historical Inquiry Learning
Authors:
Gurjot Singh, Doug Bowman, Todd Ogle, David Hicks, David Cline, Eric Ragan, Aaron Johnson, Rosemary Zlokas
Description:
Learning how to think critically, analyze sources, and develop an evidence-based account is central to learning history. Often, learners struggle to understand inquiry, to apply it in specific scenarios, and to remain engaged while learning it. This paper discusses preliminary design of a mobile AR system that explicitly teaches inquiry strategies for historical sources and engages students to practice in an augmented real-world context while scaffolding their progression. The overarching question guiding our project is how and to what extent AR technologies can be used to support learning of critical inquiry strategies and processes.
AR for the comprehension of linear perspective in the Renaissance masterpiece The Holy Trinity (Masaccio, 1426)
Author:
Giovanni Landi
Description:
This work shows how augmented reality can help the comprehension of the geometric construction of the perspective in the Holy Trinity fresco (Masaccio, 1426) in the church of Santa Maria Novella in Florence, opening new scenarios in the demonstration of the subtle underlying geometric principles. There has been a long debate about the actual geometrical fidelity of the “fake architecture” painted in perspective in Masaccio’s fresco. Recent studies demonstrated that Masaccio’s perspective is one of the finest applications of Brunelleschi’s lesson on linear perspective and its geometric construction. We here use AR to follow, step by step, the geometric construction of Masaccio’s perspective. This approach takes advantage of the possibilities that AR offers to explore in real time the relations between 3D forms and their 2D representations.
An Augmented and Virtual Reality System for Training Autistic Children
Authors:
Lakshmi Prabha Nattamai Sekar, Alexandre Santos, Dimitar Mladenov, Olga Beltramello
Description:
Autism, or Autism Spectrum Disorder (ASD), is a pervasive developmental disorder causing impairment in thinking, feeling, hearing, speaking and social interaction. For this reason, children with autism need special training in order to increase their ability to learn new skills and knowledge. These children have a propensity to be attracted by technological devices, especially virtual animations. The interest of this research work is to explore and study the use of an Augmented and Virtual Reality (AR/VR) system for training children with ASD based on Applied Behavior Analysis (ABA) techniques. This system assists in teaching children new pictures or objects along with the associated keyword or matching sentence in an immersive way with fast interaction. The preliminary prototype demonstrates satisfactory performance of the proposed AR/VR system in laboratory conditions.
Social Panoramas Using Wearable Computers
Authors:
Carolin Reichherzer, Alaeddin Nassani, Mark Billinghurst
Description:
In this paper we describe the concept of Social Panoramas, which combine panorama images, Mixed Reality, and wearable computers to support remote collaboration. We have developed a prototype that allows panorama images to be explored in real time between a Google Glass user and a remote tablet user. The prototype uses a variety of cues to support awareness and to enable pointing and drawing. We conducted a study to explore whether these cues can increase Social Presence. The results suggest that increased interaction does not increase Social Presence, but tools with a higher perceived usability show an improved sense of presence.
Device vs. User Perspective Rendering in Google Glass AR Applications
Authors:
João Paulo Lima, Rafael Roberto, João Marcelo Teixeira, Veronica Teichrieb
View Independence in Remote Collaboration Using AR
Authors:
Matthew Tait, Mark Billinghurst
Description:
This poster presents an Augmented Reality (AR) system for remote collaboration that allows a remote user to navigate a local user’s scene, independently from their viewpoint. This is achieved by using a 3D scan and reconstruction of the user’s environment. The remote user can place virtual objects in the scene that the local user views through a head mounted display (HMD), helping them place real objects. A user study tested how the amount of remote view independence affected collaboration. We found that increased view independence led to faster task completion, more user confidence, and a decrease in verbal communication, but object placement accuracy remained unchanged.
A Reconstructive See-Through Display
Author:
Ky Waegel
Description:
The two most common display technologies used in augmented reality head-mounted displays are optical see-through and video see-through. In this paper I demonstrate a third alternative: reconstructive see-through. By using a commodity depth camera to construct a dense 3D model of the world and rendering this to the user, distracting latency and position discrepancies between real and virtual objects can be reduced.
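The core of such a reconstructive pipeline is back-projecting each depth pixel into a 3D point that can then be re-rendered from the user's eye position. A minimal sketch, assuming a simple pinhole camera model (the function and parameter names are illustrative, not taken from the paper):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image into an HxWx3 array of 3D points.

    Assumes a pinhole camera with focal lengths (fx, fy) and principal
    point (cx, cy) in pixels; `depth` holds Z along the optical axis.
    """
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (us - cx) * z / fx
    y = (vs - cy) * z / fy
    return np.stack([x, y, z], axis=-1)

# A flat wall 2 m away, seen by a toy 4x4 camera:
pts = depth_to_points(np.full((4, 4), 2.0), fx=2.0, fy=2.0, cx=1.5, cy=1.5)
```

Rendering the reconstructed points from the display's viewpoint rather than the camera's is what lets such a system reduce the position discrepancy between real and virtual objects.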
Representing Degradation of Real Objects Using Augmented Reality
Authors:
Takuya Ogawa, Manabe Yoshitsugu, Noriko Yata
Description:
Much research in augmented reality (AR) attempts to match the textures of virtual objects with the real world. However, the textures of real objects must also be rendered consistently with virtual information. This paper proposes a method for representing the degradation of real objects in virtual time. Real-world depth information, used to build three-dimensional models of real objects, is captured by an RGB-D camera. The degradation of real objects is then represented by superimposing a degradation texture onto the real object.
Interactive Deformation of Real Objects
Authors:
Jungsik Park, Byung-Kuk Seo, Jong-Il Park
Description:
This paper presents a method for interactive deformation of a real object. Our method uses a predefined 3D model of the target object for tracking and deformation. The camera pose relative to the target object is estimated using 3D model-based tracking. The object region in the camera image is obtained by projecting the 3D model onto the image plane with the estimated camera pose; a texture map is extracted from this region and mapped onto the 3D model. The texture-mapped model is then rendered on a mesh deformed by the user via a Laplacian operation. Experimental results demonstrate that our method lets users interact with real 3D objects in real scenes, rather than with augmented virtual content.
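A Laplacian-style deformation of this kind can be sketched as a least-squares problem: preserve the mesh's differential (Laplacian) coordinates while softly pinning handle vertices to the positions the user drags them to. A minimal 2D sketch with a uniform "umbrella" Laplacian (the tiny line-strip mesh and the constraint weight are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def laplacian_deform(V, nbrs, handles, w=10.0):
    """Deform vertices V (n x d) so that the rest-pose Laplacian
    (differential) coordinates are preserved in least squares while
    handle vertices are softly pinned to target positions.

    nbrs:    {vertex index: list of neighbour indices} (uniform weights).
    handles: {vertex index: target position}, enforced with weight w.
    """
    n = len(V)
    L = np.eye(n)
    for i, ns in nbrs.items():
        for j in ns:
            L[i, j] = -1.0 / len(ns)
    delta = L @ V                      # rest-pose differential coordinates
    rows, rhs = [], []
    for idx, pos in handles.items():
        r = np.zeros(n)
        r[idx] = w                     # soft positional constraint row
        rows.append(r)
        rhs.append(w * np.asarray(pos, dtype=float))
    A = np.vstack([L] + rows)
    b = np.vstack([delta] + [np.atleast_2d(p) for p in rhs])
    V_def, *_ = np.linalg.lstsq(A, b, rcond=None)
    return V_def

# Straight 5-vertex strip on the x-axis; pin one end, drag the other up.
V = np.array([[0., 0.], [1., 0.], [2., 0.], [3., 0.], [4., 0.]])
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
V_def = laplacian_deform(V, nbrs, {0: (0.0, 0.0), 4: (4.0, 2.0)})
```

Because the constraints are soft, a larger `w` pins the handles more exactly at the cost of concentrating the Laplacian residual near them; the interior of the strip bends smoothly between the two handles.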
Contextually Panned and Zoomed Augmented Reality Interactions Using COTS Heads Up Displays
Authors:
Alex Hill, Harrison Leach
Description:
Commercial off-the-shelf heads-up displays with onboard cameras and processing power have recently become available. Evaluations of a naive implementation of video see-through augmented reality suggest that their small displays and off-axis cameras present usability problems. We panned and zoomed a composited video feed on the Google Glass device to center the augmented reality context within the display and to give the appearance of a fixed distance to the content. We pilot-tested the panned and zoomed display against a naive implementation and found that users preferred the view-stabilized version.
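The pan-and-zoom step described above amounts to cropping the composited video around the point of interest and rescaling the crop to the display. A minimal nearest-neighbour sketch (the crop convention and parameter names are illustrative assumptions, not Glass-specific code):

```python
import numpy as np

def pan_zoom(frame, center, zoom, out_shape):
    """Crop `frame` around `center` (row, col) with magnification `zoom`,
    then resample (nearest neighbour) to `out_shape` (rows, cols)."""
    h, w = frame.shape[:2]
    oh, ow = out_shape
    ch, cw = int(oh / zoom), int(ow / zoom)     # source crop size
    cy, cx = center
    y0 = min(max(cy - ch // 2, 0), h - ch)      # clamp crop to frame bounds
    x0 = min(max(cx - cw // 2, 0), w - cw)
    ys = (np.arange(oh) * ch / oh).astype(int) + y0
    xs = (np.arange(ow) * cw / ow).astype(int) + x0
    return frame[np.ix_(ys, xs)]

# Mark one pixel, then centre a 2x-zoomed 50x50 view on it:
frame = np.zeros((100, 100))
frame[30, 40] = 1.0
view = pan_zoom(frame, center=(30, 40), zoom=2.0, out_shape=(50, 50))
```

Recomputing `center` each frame from the tracked AR content is what keeps the content view-stabilized near the middle of the display.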
Interacting with your own hands in a fully immersive MR system
Authors:
Franco Tecchia, Giovanni Avveduto, Marcello Carrozzino, Raffaello Brondi, Massimo Bergamasco, Leila Alem
Description:
This poster introduces a fully immersive Mixed Reality system we have recently developed, in which the user is free to walk inside a virtual scenario while wearing an HMD. The novelty of the system lies in the fact that users can see and use their real hands, by means of a Kinect-like camera mounted on the HMD, to naturally interact with virtual objects. Our working hypothesis is that introducing a photorealistic capture of users' hands into a coherently rendered virtual scenario induces a strong feeling of presence and embodiment without the need for a synthetic 3D-modelled avatar as a representation of the self. We also argue that the users' ability to grasp and manipulate virtual objects with their own hands not only provides an intuitive interaction experience, but also improves self-perception as well as perception of the environment.
AIBLE: An Inquiry-Based Augmented Reality Environment for teaching astronomical phenomena
Authors:
Stéphanie Fleck, Gilles Simon, Christian Bastien
Description:
We present an inquiry-based augmented reality (AR) learning environment (AIBLE) designed for teaching basic astronomical phenomena in elementary classrooms (children 8-11 years old). The novelty of this environment lies in its combination of Inquiry-Based Science Education and didactic principles with AR features. The environment was user-tested by 69 pupils in order to assess its impact on learning. The main results indicate that AIBLE provides new opportunities for identifying learners’ problem-solving strategies.
