Numerous methods have been proposed that allow mobile devices to determine where they are located (e.g., home or office) and, in some cases, predict what activity the user is currently engaged in (e.g., walking, sitting, or driving). While useful, this sensing currently tells only part of a much richer story. To allow devices to act most appropriately in the situations they encounter, it would also be very helpful to know about their placement - for example, whether they are sitting on a desk, hidden in a drawer, placed in a pocket, or held in one's hand - as different device behaviors may be called for in each of these situations. In this paper, we describe a simple, small, and inexpensive multispectral optical sensor for identifying materials in proximity to a device. This information can be used in concert with other contextual cues, such as location, to estimate, for example, that the device is "sitting on the desk at home" or "in the pocket at work". This paper discusses several potential uses of this technology, as well as results from a two-part study, which indicate that this technique can detect placement with 94.4% accuracy across real-world placement sets.
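The idea of combining material identification with location context can be sketched as follows. This is an illustrative example only, not the paper's implementation: the reflectance profiles, wavelength count, and nearest-neighbor matching are all invented for demonstration.

```python
# Illustrative sketch (not the paper's actual sensor pipeline): classify a
# nearby material from a hypothetical multispectral reflectance reading,
# then combine the result with a location label to describe placement.
import math

# Hypothetical reference reflectance profiles, one value per LED wavelength.
MATERIAL_PROFILES = {
    "wood desk":    [0.61, 0.55, 0.48, 0.52],
    "denim pocket": [0.22, 0.25, 0.31, 0.19],
    "hand":         [0.45, 0.38, 0.35, 0.41],
}

def classify_material(reading):
    """Nearest-neighbor match of a reading against the reference profiles."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(MATERIAL_PROFILES, key=lambda m: dist(reading, MATERIAL_PROFILES[m]))

def describe_placement(reading, location):
    """Fuse the material estimate with a separately sensed location label."""
    material = classify_material(reading)
    return f"on/against {material} at {location}"

print(describe_placement([0.60, 0.54, 0.50, 0.51], "home"))
```

A real system would train per-material classifiers over many calibrated spectral samples; the nearest-neighbor match here simply illustrates the fusion of the two context sources.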
When designing context-aware applications, it is difficult for designers in the studio or lab to envision the contextual conditions that will be encountered at runtime. Designers need a tool that can create or re-create naturalistic contextual states and transitions, so that they can evaluate an application under expected contexts. We have designed and developed RePlay: a system for capturing and playing back sensor traces representing scenarios of use. RePlay contributes to research on ubicomp design tools by embodying a structured approach to the capture and playback of contextual data. In particular, RePlay supports capturing naturalistic data through Capture Probes, encapsulating scenarios of use through Episodes, and exploratory manipulation of scenarios through Transforms. Our experiences using RePlay in internal design projects illustrate its potential benefits for ubicomp design.
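The capture/Episode/Transform structure can be illustrated with a minimal sketch. The class and method names below are invented for illustration and do not reflect RePlay's actual API: an episode records timestamped sensor samples, a transform derives a manipulated copy, and playback feeds the samples to a handler.

```python
# Illustrative sketch of RePlay-style capture, transformation, and playback
# (names and API invented for this example, not RePlay's real interface).
from dataclasses import dataclass, field

@dataclass
class Episode:
    """A captured scenario: a list of (timestamp, sensor, value) samples."""
    samples: list = field(default_factory=list)

    def capture(self, timestamp, sensor, value):
        """Record one sample, as a Capture Probe might during field use."""
        self.samples.append((timestamp, sensor, value))

    def transform_timescale(self, factor):
        """Return a new episode with timestamps scaled (e.g., 2x slower),
        one example of an exploratory Transform."""
        scaled = Episode()
        scaled.samples = [(t * factor, s, v) for t, s, v in self.samples]
        return scaled

    def play(self, handler):
        """Replay samples in time order to the application under test."""
        for t, sensor, value in sorted(self.samples):
            handler(t, sensor, value)

episode = Episode()
episode.capture(0.0, "location", "home")
episode.capture(1.5, "accelerometer", "walking")

slowed = episode.transform_timescale(2.0)
slowed.play(lambda t, s, v: print(f"{t:.1f}s {s}={v}"))
```

A real playback tool would schedule samples against a wall clock and inject them through the platform's sensor APIs; the sketch only shows the structured separation of capture, transformation, and replay.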
The venerable desktop metaphor is beginning to show signs of strain in supporting modern knowledge work. In this paper, we examine how the desktop metaphor can be re-framed, shifting away from a low-level (and increasingly obsolete) focus on documents and applications toward an interface based on the creation of, and interaction with, manually declared, semantically meaningful activities. We begin by unpacking some of the foundational assumptions of desktop interface design, then describe an activity-based model for organizing the desktop interface grounded in theories of cognition and observations of real-world practice, and identify a series of high-level system requirements for interfaces that use activity as their primary organizing principle. Based on these requirements, we present the novel interface design of Giornata, a prototype activity-based desktop interface, and share initial findings from a longitudinal deployment of the Giornata system in a real-world setting.