Community Repository Search Results

resource research Media and Technology
Finger-based touch input has become a major interaction modality for mobile user interfaces. However, due to the low precision of finger input, small user interface components are often difficult to acquire and operate on a mobile device. It is even harder when the user is on the go and unable to pay close attention to the interface. In this paper, we present Gesture Avatar, a novel interaction technique that allows users to operate existing arbitrary user interfaces using gestures. It leverages the visibility of graphical user interfaces and the casual interaction of gestures. Gesture Avatar…
TEAM MEMBERS: Hao Lü, Yang Li
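The core idea of Gesture Avatar, that a drawn gesture is matched to an on-screen target by both shape and position, can be sketched as below. The function and data names are hypothetical, not from the paper's implementation; recognition of the ink into a character is assumed to have already happened.

```python
import math

def pick_target(widgets, gesture_char, gesture_center):
    """Hypothetical sketch of Gesture Avatar's target matching: prefer
    widgets whose label starts with the recognized gesture character,
    then break ties by distance from the gesture's center."""
    candidates = [w for w in widgets
                  if w[0].lower().startswith(gesture_char.lower())]
    if not candidates:
        return None
    gx, gy = gesture_center
    return min(candidates,
               key=lambda w: math.hypot(w[1][0] - gx, w[1][1] - gy))

widgets = [("Play", (40, 300)), ("Pause", (90, 300)), ("Stop", (140, 300))]
print(pick_target(widgets, "s", (150, 310))[0])  # → Stop
```

Combining label match with gesture position is what lets a large, sloppy stroke stand in for a small, hard-to-hit widget.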
resource research Media and Technology
Multi-Touch technology provides a successful gesture based Human Computer Interface. The contact and gesture recognition algorithms of this interface are based on full hand function and, therefore, are not accessible to many people with physical disability. In this paper, we design a set of command-like gestures for users with limited range and function in their digits and wrist. Trajectory and angle features are extracted from these gestures and passed to a recurrent neural network for recognition. Experiments are performed to test the feasibility of the gesture recognition system and determine…
TEAM MEMBERS: Yu Yuan, Ying Liu, Kenneth Barner
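The abstract describes extracting trajectory and angle features from each gesture before recognition. A minimal sketch of one plausible feature set (per-step segment length and turning angle) is below; the exact features used in the paper may differ.

```python
import math

def angle_features(points):
    """Convert a touch trajectory (list of (x, y) samples) into a
    sequence of (step_length, turning_angle) features -- the kind of
    per-step sequence a recurrent network can consume. This feature
    choice is an illustrative assumption, not the paper's."""
    feats = []
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = points[i - 1], points[i], points[i + 1]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        # Wrap the angle difference into [-pi, pi].
        turn = math.atan2(math.sin(a2 - a1), math.cos(a2 - a1))
        step = math.hypot(x2 - x1, y2 - y1)
        feats.append((step, turn))
    return feats

# A straight horizontal swipe yields zero turning angle at every step.
print(angle_features([(0, 0), (1, 0), (2, 0), (3, 0)]))  # → [(1.0, 0.0), (1.0, 0.0)]
```

Angle-based features are attractive here because they are invariant to where on the screen the gesture is drawn, which matters for users with limited range of motion.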
resource research Media and Technology
Creating multiple prototypes facilitates comparative reasoning, grounds team discussion, and enables situated exploration. However, current interface design tools focus on creating single artifacts. This paper introduces the Juxtapose code editor and runtime environment for designing multiple alternatives of both application logic and interface parameters. For rapidly comparing code alternatives, Juxtapose introduces selectively parallel source editing and execution.
TEAM MEMBERS: Björn Hartmann, Loren Yu, Abel Allison, Yeonsoo Yang, Scott R. Klemmer
resource research Media and Technology
This paper explores the interactive possibilities enabled when the barrel of a digital pen is augmented with a multi-touch sensor. We present a novel multi-touch pen (MTPen) prototype and discuss its alternate uses beyond those of a standard stylus, such as allowing new touch gestures to be performed using the index finger or thumb and detecting how users grip the device as a mechanism for mode switching. We also discuss the hardware and software implementation challenges in realizing our prototype, and showcase how one can combine different grips (tripod, relaxed tripod, sketch, wrap) and…
TEAM MEMBERS: Jim Spadaccini, Hyunyoung Song, Hrvoje Benko, Francois Guimbretiere, Shahram Izadi, Xiang Cao, Ken Hinckley
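The grip-as-mode-switch idea can be sketched very crudely: classify the grip from how much of the barrel's touch sensor is covered, then map the grip to a pen mode. The coverage thresholds, grip heuristic, and mode table below are all hypothetical; the MTPen prototype uses far richer sensing than this.

```python
def classify_grip(contact_rows):
    """Toy grip classifier over a binarized capacitance map from the pen
    barrel (list of rows of 0/1). Coverage thresholds are assumptions,
    chosen only to illustrate grip -> mode switching."""
    total = len(contact_rows) * len(contact_rows[0])
    coverage = sum(sum(row) for row in contact_rows) / total
    if coverage > 0.6:
        return "wrap"    # whole hand around the barrel
    if coverage > 0.25:
        return "sketch"
    return "tripod"      # fingertip grip, little barrel contact

MODES = {"tripod": "ink", "sketch": "highlight", "wrap": "pan"}
grid = [[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 0, 0]]
print(MODES[classify_grip(grid)])  # coverage ≈ 0.83 → "wrap" → pan
```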
resource research Media and Technology
Zooming user interfaces are increasingly popular on mobile devices with touch screens. Swiping and pinching finger gestures anywhere on the screen manipulate the displayed portion of a page, and taps open objects within the page. This makes navigation easy but limits other manipulations of objects that would be supported naturally by the same gestures, notably cut and paste, multiple selection, and drag and drop. A popular device that suffers from this limitation is Apple’s iPhone. In this paper, we present Bezel Swipe, an interaction technique that supports multiple selection, cut, copy…
TEAM MEMBERS: Volker Roth, Thea Turner
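Bezel Swipe's central test is simple to state: a drag counts as a mode-setting gesture only if it begins in a thin band at the screen edge (the bezel) and moves inward, so ordinary mid-screen pans and taps keep their usual meaning. A minimal sketch, with a guessed bezel width and screen size:

```python
BEZEL = 8  # px band along the screen edge; the width is an assumption

def is_bezel_swipe(touch_down, touch_up, screen_w=320, screen_h=480):
    """Sketch of the Bezel Swipe discriminator: True only when the
    drag starts on the bezel band and then moves inward."""
    x0, y0 = touch_down
    x1, y1 = touch_up
    started_on_bezel = (x0 < BEZEL or x0 > screen_w - BEZEL or
                        y0 < BEZEL or y0 > screen_h - BEZEL)
    moved_inward = abs(x1 - x0) + abs(y1 - y0) > BEZEL
    return started_on_bezel and moved_inward

print(is_bezel_swipe((2, 240), (60, 240)))     # from the left edge → True
print(is_bezel_swipe((160, 240), (220, 240)))  # mid-screen pan → False
```

Because the trigger region is spatial rather than temporal, the technique needs no timeouts and does not conflict with the zooming UI's existing swipe and pinch gestures.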
resource evaluation Media and Technology
During its first year, more than 1500 people signed up to be a part of Open Exhibits. Participation ranged from reading blog posts, to trying a few software modules or using Open Exhibits software to develop actual exhibition components. This report highlights findings about the emerging community and trends in Open Exhibits participation.
TEAM MEMBERS: Jim Spadaccini, Rockman Et Al.
resource research Media and Technology
Most current multi-touch capable interactive user interfaces for tabletops are built from custom toolkits that are decoupled from, and on top of, the “Desktop” provided by the underlying Operating System. However, this approach requires that each individual touch system build its own suite of touch capable custom applications (such as photo browsers), usually resulting in limited functionality. In this paper, we propose a software architecture for supporting and integrating multi-touch capability on existing desktop systems, where multi-touch and multiple single pointer input can be used…
TEAM MEMBERS: Kelvin Cheng, Benjamin Itzstein, Paul Sztajer, Markus Rittenbruch
resource research Media and Technology
Despite the considerable quantity of research directed towards multitouch technologies, a set of standardized UI components have not been developed. Menu systems provide a particular challenge, as traditional GUI menus require a level of pointing precision inappropriate for direct finger input. Marking menus are a promising alternative, but have yet to be investigated or adapted for use within multitouch systems. In this paper, we first investigate the human capabilities for performing directional chording gestures, to assess the feasibility of multitouch marking menus. Based on the positive…
TEAM MEMBERS: Julian Lepinski, Tovi Grossman, George Fitzmaurice
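A directional chording gesture, as the abstract describes it, is several fingers each flicking in a direction, with the combination of directions selecting a menu item. The quantization step can be sketched as below; the slice layout (0 = East, proceeding counterclockwise in math coordinates) and the pair-of-strokes chord encoding are illustrative assumptions.

```python
import math

def stroke_direction(start, end, n_slices=8):
    """Quantize one finger's stroke into one of n_slices directional
    menu slices, marking-menu style."""
    angle = math.atan2(end[1] - start[1], end[0] - start[0]) % (2 * math.pi)
    width = 2 * math.pi / n_slices
    return int((angle + width / 2) // width) % n_slices

def chord_item(strokes):
    """A multitouch marking-menu selection: each finger's stroke picks
    a slice, and the tuple of slices names the item."""
    return tuple(stroke_direction(s, e) for s, e in strokes)

# Two fingers both flicking East (slice 0):
print(chord_item([((0, 0), (10, 0)), ((0, 20), (10, 20))]))  # → (0, 0)
```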
resource research Media and Technology
Modern mobile phones can store a large amount of data, such as contacts, applications and music. However, it is difficult to access specific data items via existing mobile user interfaces. In this paper, we present Gesture Search, a tool that allows a user to quickly access various data items on a mobile phone by drawing gestures on its touch screen. Gesture Search contributes a unique way of combining gesture-based interaction and search for fast mobile data access. It also demonstrates a novel approach for coupling gestures with standard GUI interaction. A real world deployment with mobile…
TEAM MEMBERS: Yang Li
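Gesture Search's visible behavior, narrowing the result list as each handwritten character is recognized, can be sketched as a prefix filter. Handwriting recognition itself (ink to characters) is out of scope here; `recognized` stands in for its output, and the matching rule is an assumption for illustration.

```python
def gesture_search(items, recognized):
    """Sketch of incremental filtering: keep items where any word starts
    with the query built from the recognized gesture characters."""
    query = "".join(recognized).lower()
    return [item for item in items
            if any(word.lower().startswith(query) for word in item.split())]

contacts = ["Anne Marie", "Andrew Ng", "Maria Chen"]
print(gesture_search(contacts, ["m", "a"]))  # → ['Anne Marie', 'Maria Chen']
```

Because results shrink after every character, users usually reach the target item after drawing only one or two gestures, which is the source of the speedup the abstract claims.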
resource research Media and Technology
In this paper, we propose Objects, Containers, Gestures, and Manipulations (OCGM, pronounced like Occam’s Razor) as universal foundational metaphors of Natural User Interfaces. We compare OCGM to existing paradigms using SRK behavior classification and early childhood cognitive development, and justify the “universal” and “foundational” descriptors based upon cognitive linguistics and universal grammar. If adopted, OCGM would significantly improve the conceptual understanding of NUIs by developers and designers and ultimately result in better NUI applications.
TEAM MEMBERS: Ron George, Joshua Blake
resource research Media and Technology
Proton is a novel framework that addresses both of these problems. Using Proton, the application developer declaratively specifies each gesture as a regular expression over a stream of touch events. Proton statically analyzes the set of gestures to report conflicts, and it automatically creates gesture recognizers for the entire set. To simplify the creation of complex multitouch gestures, Proton introduces gesture tablature, a graphical notation that concisely describes the sequencing of multiple interleaved touch actions over time.
TEAM MEMBERS: Jim Spadaccini, Kenrick Kin, Björn Hartmann, Tony DeRose, Maneesh Agrawala
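Proton's key move, specifying each gesture as a regular expression over a stream of touch events, can be illustrated with ordinary `re` patterns. The one-letter event encoding below (D = down, M = move, U = up) is a simplification: Proton's real tokens also carry touch IDs and hit-target attributes, and its analyzer detects conflicts statically rather than by trying patterns in order.

```python
import re

# Hypothetical single-touch gestures as regexes over an event stream.
TAP  = re.compile(r"DM{0,2}U")   # down, little or no movement, up
DRAG = re.compile(r"DM{3,}U")    # down, sustained movement, up

def recognize(stream):
    """Match a finished touch-event stream against the gesture set."""
    s = "".join(stream)
    if TAP.fullmatch(s):
        return "tap"
    if DRAG.fullmatch(s):
        return "drag"
    return None

print(recognize(["D", "M", "U"]))                 # → tap
print(recognize(["D", "M", "M", "M", "M", "U"]))  # → drag
```

Expressing gestures declaratively is what enables the conflict reporting the abstract mentions: two gestures conflict when their regular expressions can match a common prefix of the same event stream, which is decidable without running the application.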
resource research Media and Technology
This article introduces a new interaction model called Instrumental Interaction that extends and generalizes the principles of direct manipulation. It covers existing interaction styles, including traditional WIMP interfaces, as well as new interaction styles such as two-handed input and augmented reality. It defines a design space for new interaction techniques and a set of properties for comparing them.
TEAM MEMBERS: Michael Beaudouin-Lafon