Keywords
UIST2.0 Archive - 20 years of UIST

augmented

augmented laboratory note-book

In Proceedings of UIST 2002

The missing link: augmenting biology laboratory notebooks (p. 41-50)

augmented paper

In Proceedings of UIST 2008

Iterative design and evaluation of an event architecture for pen-and-paper interfaces (p. 111-120)

Abstract

This paper explores architectural support for interfaces combining pen, paper, and PC. We show how the event-based approach common to GUIs can apply to augmented paper, and describe additions to address paper's distinguishing characteristics. To understand the developer experience of this architecture, we deployed the toolkit to 17 student teams for six weeks. Analysis of the developers' code provided insight into the appropriateness of events for paper UIs. The usage patterns we distilled informed a second iteration of the toolkit, which introduces techniques for integrating interactive and batched input handling, coordinating interactions across devices, and debugging paper applications. The study also revealed that programmers created gesture handlers by composing simple ink measurements. This desire for informal interactions inspired us to include abstractions for recognition. This work has implications beyond paper - designers of graphical tools can examine API usage to inform iterative toolkit development.
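The event-based routing this abstract describes can be sketched in a few lines. This is an invented illustration, not the toolkit's actual API: printed regions on a page are registered with handlers, and incoming pen strokes are hit-tested and dispatched, mirroring the GUI event model the paper adapts to paper.

```python
# Illustrative sketch (hypothetical API, not the paper's toolkit): an
# event dispatcher for pen-and-paper interfaces. Printed regions register
# handlers; digitized pen strokes are routed to the region containing
# them, like hit-testing in a GUI toolkit.

from dataclasses import dataclass
from typing import Callable, List, Tuple


@dataclass
class Stroke:
    points: List[Tuple[float, float]]  # digitized pen samples on the page


@dataclass
class Region:
    name: str
    x: float
    y: float
    w: float
    h: float
    handler: Callable[[Stroke], None]

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


class PaperEventDispatcher:
    def __init__(self) -> None:
        self.regions: List[Region] = []

    def add_region(self, region: Region) -> None:
        self.regions.append(region)

    def dispatch(self, stroke: Stroke) -> str:
        # Route the stroke to the first region containing its first sample.
        x0, y0 = stroke.points[0]
        for region in self.regions:
            if region.contains(x0, y0):
                region.handler(stroke)
                return region.name
        return "unhandled"


# Usage: a checkbox region whose handler records the stroke's sample count.
fired = []
dispatcher = PaperEventDispatcher()
dispatcher.add_region(
    Region("checkbox", 10, 10, 20, 20, lambda s: fired.append(len(s.points)))
)
result = dispatcher.dispatch(Stroke([(15, 15), (18, 22)]))
```

Batched input (e.g. synchronizing a digital pen's stored strokes) would feed the same `dispatch` path after the fact, which is one of the integration problems the paper's second toolkit iteration addresses.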

augmented reality

In Proceedings of UIST 1995

The world through the computer: computer augmented interaction with real world environments (p. 29-36)

In Proceedings of UIST 1995

Retrieving electronic documents with real-world objects on InteractiveDESK (p. 37-38)

In Proceedings of UIST 1997

HoloWall: designing a finger, hand, body, and object sensitive wall (p. 209-210)

In Proceedings of UIST 1997

Audio aura: light-weight audio augmented reality (p. 211-212)

In Proceedings of UIST 1997

The metaDESK: models and prototypes for tangible user interfaces (p. 223-232)

In Proceedings of UIST 1998

Of Vampire mirrors and privacy lamps: privacy management in multi-user augmented environments (p. 171-172)

In Proceedings of UIST 1999

Real-world interaction using the FieldMouse (p. 113-119)

In Proceedings of UIST 1999

Linking and messaging from real paper in the Paper PDA (p. 179-186)

In Proceedings of UIST 2000

System lag tests for augmented and virtual environments (p. 161-170)

In Proceedings of UIST 2001

View management for virtual and augmented reality (p. 101-110)

In Proceedings of UIST 2002

The missing link: augmenting biology laboratory notebooks (p. 41-50)

In Proceedings of UIST 2002

An annotated situation-awareness aid for augmented reality (p. 213-216)

In Proceedings of UIST 2004

DART: a toolkit for rapid design exploration of augmented reality experiences (p. 197-206)

In Proceedings of UIST 2005

Moveable interactive projected displays using projector based tracking (p. 63-72)

In Proceedings of UIST 2005

Supporting interaction in augmented reality in the presence of uncertain spatial knowledge (p. 111-114)

In Proceedings of UIST 2007

Hybrid infrared and visible light projection for location tracking (p. 57-60)

Abstract

A number of projects within the computer graphics, computer vision, and human-computer interaction communities have recognized the value of using projected structured light patterns for the purposes of doing range finding, location dependent data delivery, projector adaptation, or object discovery and tracking. However, most of the work exploring these concepts has relied on visible structured light patterns, resulting in a caustic visual experience. In this work, we present the first design and implementation of a high-resolution, scalable, general purpose invisible near-infrared projector that can be manufactured in a practical manner. This approach is compatible with simultaneous visible light projection and integrates well with future Digital Light Processing (DLP) projector designs -- the most common type of projector today. By unifying both visible and non-visible pattern projection into a single device, we can greatly simplify the implementation and execution of interactive projection systems. Additionally, we can inherently provide location discovery and tracking capabilities that are unattainable using other approaches.
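The location-discovery idea behind projected structured light, whether visible or near-infrared, can be illustrated with temporally multiplexed Gray-code patterns: each projector column is lit according to one bit of its Gray code per frame, so a light sensor that records the on/off sequence over successive frames can decode which column it sits in. The pattern scheme below is a generic textbook technique, not necessarily the encoding used by this particular system.

```python
# Illustrative sketch of location discovery via temporally multiplexed
# Gray-code structured light: one projected frame per bit plane; a sensor
# decodes its column index from the sequence of on/off observations.

def gray_encode(n: int) -> int:
    # Standard binary-reflected Gray code.
    return n ^ (n >> 1)


def gray_decode(g: int) -> int:
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n


def pattern_frames(num_columns: int, num_bits: int):
    # frames[b][col] is True when column `col` is lit in bit-plane frame b.
    return [
        [bool((gray_encode(c) >> b) & 1) for c in range(num_columns)]
        for b in range(num_bits)
    ]


def decode_observation(bits) -> int:
    # A sensor reconstructs its column index from the observed sequence.
    g = 0
    for b, lit in enumerate(bits):
        g |= int(lit) << b
    return gray_decode(g)


frames = pattern_frames(num_columns=8, num_bits=3)
# Simulate a photosensor sitting under projector column 5 for 3 frames.
observed = [frames[b][5] for b in range(3)]
col = decode_observation(observed)
```

Gray codes are the usual choice here because adjacent columns differ in only one bit, so a sensor near a column boundary misreads its position by at most one column.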

In Proceedings of UIST 2007

LucidTouch: a see-through mobile device (p. 269-278)

Abstract

Touch is a compelling input modality for interactive devices; however, touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements they wish to work with. In this paper, we present LucidTouch, a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device. The key to making this usable is what we call pseudo-transparency: by overlaying an image of the user's hands onto the screen, we create the illusion of the mobile device itself being semi-transparent. This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand. LucidTouch also supports multi-touch input, allowing users to operate the device simultaneously with all 10 fingers. We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front, due to reduced occlusion, higher precision, and the ability to make multi-finger input.
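At its core, the pseudo-transparency effect is per-pixel alpha blending of a captured hand image over the UI framebuffer. The sketch below illustrates only that compositing step, with made-up image sizes and an assumed blend factor; the real system also has to capture, segment, and register the hand image.

```python
# Illustrative sketch of pseudo-transparency: alpha-blend an image of the
# user's hands (captured behind the device) over the UI framebuffer so the
# device appears semi-transparent. Resolutions and alpha are assumptions.

import numpy as np


def pseudo_transparent(ui: np.ndarray, hands: np.ndarray, alpha: float = 0.35) -> np.ndarray:
    # alpha controls how strongly the hand image shows through the UI.
    blended = (1.0 - alpha) * ui.astype(float) + alpha * hands.astype(float)
    return blended.clip(0, 255).astype(np.uint8)


ui = np.full((4, 4, 3), 200, dtype=np.uint8)     # light-grey UI pixels
hands = np.full((4, 4, 3), 40, dtype=np.uint8)   # dark hand silhouette
out = pseudo_transparent(ui, hands, alpha=0.5)
```

Keeping alpha well below 1 is what lets the hands remain visible as a cue for targeting without actually occluding the interface content.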

In Proceedings of UIST 2008

Foldable interactive displays (p. 287-290)

Abstract

Modern computer displays tend to be fixed in size, rigid, and rectilinear, rendering them insensitive to the visual area demands of an application or the desires of the user. Foldable displays offer the ability to reshape and resize the interactive surface at our convenience, and even permit us to carry a very large display surface in a small volume. In this paper, we implement four interactive foldable display designs using image projection with low-cost tracking, and explore display behaviors using orientation sensitivity.

In Proceedings of UIST 2010

Gilded gait: reshaping the urban experience with augmented footsteps (p. 185-188)

Abstract

In this paper we describe Gilded Gait, a system that changes the perceived physical texture of the ground, as felt through the soles of users' feet. Ground texture, in spite of its potential as an effective channel of peripheral information display, has so far been paid little attention in HCI research. The system is designed as a pair of insoles with embedded actuators, and utilizes vibrotactile feedback to simulate the perceptions of a range of different ground textures. The discreet, low-key nature of the interface makes it particularly suited for outdoor use, and its capacity to alter how people experience the built environment may open new possibilities in urban design.

In Proceedings of UIST 2010

Combining multiple depth cameras and projectors for interactions on, above and between surfaces (p. 273-282)

Abstract

Instrumented with multiple depth cameras and projectors, LightSpace is a small room installation designed to explore a variety of interactions and computational strategies related to interactive displays and the space that they inhabit. LightSpace cameras and projectors are calibrated to 3D real world coordinates, allowing for projection of graphics correctly onto any surface visible by both camera and projector. Selective projection of the depth camera data enables emulation of interactive displays on un-instrumented surfaces (such as a standard table or office desk), as well as facilitates mid-air interactions between and around these displays. For example, after performing multi-touch interactions on a virtual object on the tabletop, the user may transfer the object to another display by simultaneously touching the object and the destination display. Or the user may "pick up" the object by sweeping it into their hand, see it sitting in their hand as they walk over to an interactive wall display, and "drop" the object onto the wall by touching it with their other hand. We detail the interactions and algorithms unique to LightSpace, discuss some initial observations of use and suggest future directions.
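The calibration the abstract mentions, mapping depth cameras and projectors into shared 3D room coordinates, rests on two standard steps: back-projecting a depth pixel through a pinhole camera model, then applying a rigid transform into the room frame. The sketch below shows those two steps with made-up intrinsics and an assumed camera mounting; it is a generic geometry illustration, not LightSpace's actual calibration code.

```python
# Illustrative sketch of depth-camera-to-room calibration: back-project a
# depth pixel with a pinhole model, then apply a rigid (rotation +
# translation) transform into a shared room coordinate frame. The
# intrinsics (fx, fy, cx, cy) and the transform are made-up values.

from typing import List, Tuple

Vec3 = Tuple[float, float, float]


def backproject(u: float, v: float, depth: float,
                fx: float, fy: float, cx: float, cy: float) -> Vec3:
    # Pinhole model: pixel (u, v) at range `depth` (metres) to camera coords.
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)


def to_world(p: Vec3, rotation: List[List[float]], translation: Vec3) -> Vec3:
    # Rigid transform: world = R * p + t (camera frame -> room frame).
    return tuple(
        sum(rotation[r][c] * p[c] for c in range(3)) + translation[r]
        for r in range(3)
    )


# A pixel at the optical centre maps straight down the optical axis.
p_cam = backproject(u=320, v=240, depth=2.0, fx=580.0, fy=580.0, cx=320.0, cy=240.0)
# Assumed mounting: identity rotation, camera offset 2.5 m along z.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
p_world = to_world(p_cam, identity, (0.0, 0.0, 2.5))
```

Once every camera and projector shares this room frame, graphics can be warped so they land correctly on any surface that both a camera and a projector can see, which is what enables the cross-surface interactions the abstract describes.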

augmented reality (ar)

computer augmented environment

In Proceedings of UIST 1995

The world through the computer: computer augmented interaction with real world environments (p. 29-36)

In Proceedings of UIST 1997

Pick-and-drop: a direct manipulation technique for multiple computer environments (p. 31-39)

paper augmented digital document

In Proceedings of UIST 2003

Paper augmented digital documents (p. 51-60)