Finger-based touch input has become a major interaction modality for mobile user interfaces. However, due to the low precision of finger input, small user interface components are often difficult to acquire and operate on a mobile device. This is even harder when the user is on the go and unable to pay close attention to the interface. In this paper, we present Gesture Avatar, a novel interaction technique that allows users to operate arbitrary existing user interfaces using gestures. It leverages the visibility of graphical user interfaces and the casual interaction of gestures. Gesture Avatar
Collaborative Information Retrieval (CIR) is the process by which people working together can collaboratively search for, share and navigate through information. Computer support for CIR currently makes use of single-user systems. CIR systems could benefit from the use of multi-user interaction to enable more than one person to collaborate using the same data sources, at the same time and in the same place. Multi-touch interaction has provided the ability for multiple users to interact simultaneously with a multi-touch surface. This paper presents a generalised architecture for multi-touch CIR
Touch-sensitive devices are becoming more and more common. Many people use touch interaction, especially on handheld devices like iPhones or other mobile phones. But the question is, do people really understand the different gestures, i.e., do they know which gesture is the correct one for the intended action, and do they know how to transfer the gestures to bigger devices and surfaces? This paper reports the results of usability tests that were carried out in semi-public space to explore people's ability to find gestures to navigate on a virtual globe. The globe is presented on a multi-touch
DATE:
TEAM MEMBERS:
Jim Spadaccini, Markus Jokisch, Thomas Bartoschek, Angela Schwering
For the past twenty years there has been a slow trickle of research disseminated through a variety of channels on the nature and use of computer interactives within museum and gallery environments. This research has yet to be consolidated into a robust and coherent evidence base for considering and understanding the continued investment in such interactives by institutions. Simultaneously, however, the technology has changed almost beyond recognition, from early kiosk-based computer exhibits featuring mostly film and audio content through to the newer generation of multi-touch interfaces being
This paper explores the interactive possibilities enabled when the barrel of a digital pen is augmented with a multi-touch sensor. We present a novel multi-touch pen (MTPen) prototype and discuss its alternate uses beyond those of a standard stylus, such as allowing new touch gestures to be performed using the index finger or thumb and detecting how users grip the device as a mechanism for mode switching. We also discuss the hardware and software implementation challenges in realizing our prototype, and showcase how one can combine different grips (tripod, relaxed tripod, sketch, wrap) and
DATE:
TEAM MEMBERS:
Jim Spadaccini, Hyunyoung Song, Hrvoje Benko, Francois Guimbretiere, Shahram Izadi, Xiang Cao, Ken Hinckley
In this paper we describe two projects that utilize reality-based interaction to advance collaborative scientific inquiry and discovery. We discuss the relation between reality-based and embodied interaction, and present findings from an experimental study that illustrate benefits of reality-based tabletop interaction for collaborative inquiry-based learning.
New mobile devices with large multi-touch displays, such as the iPad, have brought revolutionary changes to the way users interact with computers. Instead of traditional input devices such as keyboards, touchpads and mice, multi-touch gestures are used as the primary means of interacting with mobile devices. Surprisingly, body-motion gestures are evolving to become a new, natural, and effective way for game players to interact with game consoles in a very similar fashion: in Kinect for Xbox 360, a controller-free gaming experience is made possible by using body-motion gestures to play games.
This paper outlines research showing surprising agreement between users in the guessability of multi-touch gestures on tabletop surfaces. It also provides further evidence that crowdsourcing gesture mappings will lead to a more complete, intuitive gesture set and potentially converge on a standard gesture library.
DATE:
TEAM MEMBERS:
Jacob Wobbrock, Meredith Morris, Andrew Wilson
Direct-touch interaction on mobile phones revolves around screens that compete for visual attention with users' real-world tasks and activities. This paper investigates the impact of these situational impairments on touch-screen interaction. We probe several design factors for touch-screen gestures, under various levels of environmental demands on attention, in comparison to the status-quo approach of soft buttons. We find that in the presence of environmental distractions, gestures can offer significant performance gains and reduced attentional load, while performing as well as soft buttons
DATE:
TEAM MEMBERS:
Andrew Bragdon, Eugene Nelson, Yang Li, Ken Hinckley
Recent advances in touch screen technology have increased the prevalence of touch screens and have prompted a wave of new touch screen-based devices. However, touch screens are still largely inaccessible to blind users, who must adopt error-prone compensatory strategies to use them or find accessible alternatives. This inaccessibility is due to interaction techniques that require the user to visually locate objects on the screen. To address this problem, we introduce Slide Rule, a set of audio-based multi-touch interaction techniques that enable blind users to access touch screen applications
DATE:
TEAM MEMBERS:
Jim Spadaccini, Jeffrey Bigham, Jacob Wobbrock
During its first year, more than 1,500 people signed up to be a part of Open Exhibits. Participation ranged from reading blog posts, to trying a few software modules, to using Open Exhibits software to develop actual exhibition components. This report highlights findings about the emerging community and trends in Open Exhibits participation.
The NMC Horizon Report: 2011 Museum Edition is a co-production with the Marcus Institute for Digital Education in the Arts (MIDEA), and examines emerging technologies for their potential impact on and use in education and interpretation within the museum environment. The international composition of the advisory board reflects the care with which a global perspective for the report was assembled. While there are many local factors affecting the adoption and use of emerging technologies in museums, there are also issues that transcend regional boundaries and questions we all face. It was with