Videos

Attentive Office Cubicles: Mediating Visual and Auditory Interactions Between Office Co-Workers
Aadil Mamuji1, Roel Vertegaal1, Maria Danninger2, Connor Dickie1, Changuk Sohn1
1Queen's University, 2Technische Universität München
We designed an office cubicle that automatically mediates communications between co-workers by sensing whether users are candidate members of the same social group. The cubicle regulates visual interactions through the use of privacy glass, which can be rendered opaque or transparent upon detection of joint orientation. It regulates auditory interactions through noise-canceling headphones that automatically turn off upon co-orientation.

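The mediation rule described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation; the function name, the yaw-angle representation, and the 20-degree tolerance are all assumptions.

```python
# Hypothetical sketch: when two co-workers' head orientations indicate they
# are facing each other (joint orientation), render the privacy glass
# transparent and switch off active noise cancellation. The threshold is an
# illustrative assumption, not a value from the paper.

def mediate(orientation_a, orientation_b, threshold_deg=20.0):
    """Return cubicle state given two head yaw angles in degrees.

    Co-orientation is approximated as the two yaw angles differing
    by roughly 180 degrees (the workers face one another).
    """
    facing_gap = abs(abs(orientation_a - orientation_b) - 180.0)
    co_oriented = facing_gap <= threshold_deg
    return {
        "glass": "transparent" if co_oriented else "opaque",
        "headphones_anc": not co_oriented,  # ANC off during conversation
    }
```

For example, `mediate(0.0, 180.0)` reports transparent glass and disabled noise cancellation, while `mediate(0.0, 90.0)` keeps the cubicle private.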
The Context-Aware Pill Bottle and Medication Monitor
Anand Agarawala1, Saul Greenberg1 and Geoffrey Ho2
1Department of Computer Science, 2University of Calgary
The video illustrates and critiques a context-aware pill bottle/stand that reminds the elderly when it is time to take their medication. A medication monitor situated in a caregiver's home displays awareness information about the elderly user's medication compliance.

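The two halves of the system lend themselves to a minimal sketch: the bottle decides when a reminder is due, and the caregiver's monitor summarises compliance from recorded doses. All names and the eight-hour schedule are assumptions for illustration, not the authors' design.

```python
# Minimal sketch (assumed names and schedule) of the pill bottle's reminder
# logic and the monitor's compliance summary.
from datetime import datetime, timedelta

DOSE_INTERVAL = timedelta(hours=8)  # assumed three-times-daily schedule

def reminder_due(last_dose, now):
    """The bottle signals when the interval since the last dose has elapsed."""
    return now - last_dose >= DOSE_INTERVAL

def compliance(expected_doses, taken_doses):
    """Fraction of scheduled doses actually taken, shown on the monitor."""
    return len(taken_doses) / expected_doses if expected_doses else 1.0
```

A missed interval triggers the reminder; the monitor's compliance fraction gives the caregiver an at-a-glance awareness cue.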
Talking Assistant - Car Repair Shop Demo
Dirk Schnelle, Erwin Aitenbichler, Jussi Kangasharju, Max Mühlhäuser
Telecooperation, Department of Computer Science, Darmstadt University of Technology
In this video paper we present the Talking Assistant and the STAIRS project, and how the two interact. The Talking Assistant is a device for interacting in ubiquitous computing environments. It features sensors, wireless communications, and simple local speech recognition. The STAIRS project concerns the browsing of structured hypertext documents in audio. One key feature of STAIRS is that, besides speech commands, navigation can be controlled by changes in context. In this paper we show one example of how context changes can be detected with infrared tags.

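The context-driven navigation idea can be sketched as a mapping from detected tag IDs to document nodes; a tag change then drives the jump. The tag and node names below are hypothetical stand-ins for the car-repair-shop demo, not STAIRS's actual bindings.

```python
# Illustrative sketch (assumed names) of context-driven navigation in the
# spirit of STAIRS: each infrared tag ID is bound to a node of the audio
# hypertext, and a detected tag change triggers navigation to that node.

TAG_TO_NODE = {
    "tag-engine": "engine-manual",
    "tag-wheel": "wheel-manual",
}

def on_tag_detected(current_node, tag_id):
    """Navigate only when the tag maps to a different document node."""
    target = TAG_TO_NODE.get(tag_id)
    if target is not None and target != current_node:
        return target      # context change: jump to the new section
    return current_node    # unknown tag or no change: stay put
```

Speech commands would sit alongside this as a second navigation input, as the abstract notes.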
Cubic Display Device "Z-agon"
Junya Ueda, Takashi Matsumoto, Naohito Okude
Keio University
Z-agon is a device with six displays arranged as the faces of a cube. We assume that Z-agon will be used as a portable movie player. In designing a practical product, we propose a new ubiquitous interface for media design. We made a movie to show its form and to examine its future needs and uses for the design. In this paper, we show a design process for building up the concept of Z-agon using scenario-based modeling empowered by the video. The movie consists of three sections. The first section shows a Projector Prototype to examine its appearance. The second section shows the design approach. The third section shows a movie scenario to reveal its interaction.

Building Flexible Displays for Awareness and Interaction
Kathryn Elliot, Saul Greenberg
Department of Computer Science, University of Calgary
This video illustrates a set of flexible ambient devices that can be connected to any suitable information source and that provide a simple means for people to move from awareness into interaction.

Mobile Music Making with Sonic City
Lalya Gaye, Lars Erik
Future Applications Lab, Viktoria Institute
Sonic City is a wearable system in which the urban environment acts as an interface for real-time electronic music making; the user creates music dynamically by walking through a city. The system senses the user's activities and the surrounding context, and maps this information in real time to the processing of urban sounds collected with a microphone. This video shows participants in a short-term study using our prototype in everyday settings. The actual music created with the system is heard together with users' comments. The Sonic City project illustrates how ubiquitous and wearable computing can enable new types of applications that encourage everyday creativity.

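The core mapping, sensed context to sound-processing parameters, can be sketched as a pure function. The choice of inputs (walking pace, ambient light) and the specific mappings are illustrative assumptions, not Sonic City's actual design.

```python
# Hedged sketch of Sonic City's core idea: sensed user activity and
# surrounding context are mapped in real time to parameters of effects
# applied to microphone-captured urban sound. Parameter choices here are
# assumptions for illustration only.

def map_context_to_processing(pace_steps_per_min, light_level):
    """Map walking pace and ambient light (0..1) to effect parameters (0..1)."""
    filter_cutoff = min(pace_steps_per_min / 180.0, 1.0)  # faster walk, brighter sound
    reverb_mix = 1.0 - min(max(light_level, 0.0), 1.0)    # darker place, more reverb
    return {"filter_cutoff": filter_cutoff, "reverb_mix": reverb_mix}
```

Run at sensor rate, such a mapping turns a walk through the city into a continuously evolving control signal for the audio chain.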
TACT: Mobile Wireless Terminal for Digitally-Enabled Environments
Michimune Kohno, Yuji Ayatsuka, Jun Rekimoto
Sony Computer Science Laboratories, Inc.
In this paper, we introduce TACT, a general interaction tool for ubiquitous computing environments. The device is dedicated to the operation of various kinds of connections, including audio, video, and data transmission between remote and surrounding computers. With TACT, whenever a user wants to utilize a nearby computing resource, the user can open a session and manipulate its endpoints without caring about network addresses. We demonstrate a few typical scenarios to show how TACT is used in various situations.

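The "sessions without addresses" idea can be illustrated with a small registry that keeps addresses internal. This is a hypothetical sketch; the class and method names are assumptions, not TACT's API.

```python
# Hypothetical registry sketch of what the abstract describes: a user opens
# a session between nearby resources by name, while network addresses stay
# hidden inside the session layer. Names are illustrative assumptions.

class SessionManager:
    def __init__(self):
        self._registry = {}   # resource name -> network address (never exposed)
        self._sessions = []   # list of (source address, sink address) pairs

    def register(self, name, address):
        """A discovered resource announces itself under a human-meaningful name."""
        self._registry[name] = address

    def open_session(self, source, sink):
        """Connect two named endpoints; the caller never sees an address."""
        session = (self._registry[source], self._registry[sink])
        self._sessions.append(session)
        return len(self._sessions) - 1   # opaque session handle
```

A user-facing device would then only manipulate names and session handles, matching the abstract's "without caring about network addresses."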
User Assist Application with Location Sensors Integration
Udana Bandara1,2, Mikio Hasegawa1, Masugi Inoue1, Masateru Minami1,3, Hiroyuki Morikawa1,2, Tomonori Aoyama2
1National Institute of Information and Communications Technology, 2University of Tokyo, 3Shibaura Institute of Technology
In this video paper, we introduce two context-aware applications, a navigator and a communicator, which are implemented in our laboratory premises. The applications are developed on a software platform that supports the integration of several different kinds of sensors. Currently we use data from three location-sensing methods and user activity data as sensor input. The navigator is used for guiding guests in our laboratory premises with varying accuracy according to the user's position. The communicator combines voice over IP (VoIP), instant messaging (IM), and email functionalities in a single application, and switches between methods of communication according to the user's preference and context.

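The communicator's channel switching can be sketched as a simple policy over sensed context and stated preference. The specific rules below are assumptions for illustration; the paper's actual selection logic is not given in the abstract.

```python
# Sketch (assumed policy) of context-driven channel selection: pick VoIP,
# IM, or email from the recipient's sensed context and preference.

def choose_channel(in_meeting, at_desk, prefers_text):
    """Return 'voip', 'im', or 'email' for reaching the recipient."""
    if in_meeting:
        return "email"      # least intrusive while the recipient is busy
    if prefers_text or not at_desk:
        return "im"         # asynchronous but prompt
    return "voip"           # available at the desk and open to a call
```

Feeding this policy from the platform's fused location and activity sensors would give the switching behaviour the abstract describes.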
Tele-Reality in the Wild
Neil J. McCurdy, William G. Griswold
Department of Computer Science and Engineering, University of California, San Diego
We are rapidly moving toward a world where personal networked video cameras are ubiquitous. Already, camera-equipped cell phones are becoming commonplace. Imagine being able to tap into all of these real-time video feeds to explore the world remotely, live. We introduce RealityFlythrough, a tele-reality/telepresence system that makes this vision possible. By situating live 2D video feeds in a 3D model of the world, RealityFlythrough allows any space to be explored remotely. No special cameras, tripods, rigs, scaffolding, or lighting is required to create the model, and no lengthy preprocessing of images is necessary. Rather than trying to achieve photorealism at every point in space, we instead focus on providing the user with a sense of how the video streams relate to one another spatially. By providing cues in the form of dynamic transitions, we can approximate photorealistic tele-reality while harnessing cameras in the wild. This video describes the RealityFlythrough system and reports on a live tele-reality experience. We find that tele-reality can work in the wild using only commodity hardware and off-the-shelf software, and that imperfect transitions are sensible and provide a compelling user experience.

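The "dynamic transition" cue can be sketched as interpolating the virtual viewpoint between the poses of two situated feeds while cross-fading their imagery. The linear blend below is an illustrative assumption, not RealityFlythrough's actual transition algorithm.

```python
# Minimal sketch of a camera-to-camera transition: move the virtual
# viewpoint from one situated feed's pose toward another's while fading
# in the destination feed. Linear interpolation is an assumption here.

def transition_pose(pose_a, pose_b, t):
    """Linearly interpolate a position (x, y, z) between two camera poses."""
    return tuple(a + (b - a) * t for a, b in zip(pose_a, pose_b))

def crossfade_alpha(t):
    """Opacity of the destination feed during the transition, t in [0, 1]."""
    return max(0.0, min(t, 1.0))
```

Sweeping `t` from 0 to 1 over a second or two yields the spatial cue the abstract describes: the viewer sees how one feed's viewpoint relates to the next, even when the blended frames are imperfect.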
Adapting Information Through Tangible Augmented Reality Interfaces
Patrick Sinclair, Kirk Martinez
University of Southampton
Tangible augmented reality interfaces offer a hands-on approach for examining objects and exploring the associated information. We describe two tangible augmented reality interfaces that can expose the adaptation of information presented to users about objects in augmented reality environments.

Can You See Me Now?
Matt Adams1, Ju Row Farr1, Nick Tandavanitj1, Steve Benford2, Martin Flintham2, Adam Drozd2, Rob Anastasi2
1Blast Theory, 2The University of Nottingham
Can You See Me Now? is a ubiquitous artistic game that mixes street players, who use mobile, location-tracked devices, with online players, who use conventional PCs connected over the Internet. The game serves as both a professional touring artwork and a research project that has enabled emerging ubiquitous technologies to be studied in the wild, i.e., as used by the public on the streets of actual cities throughout the world.

Magic Touch
Thomas Pederson
Dept. of Computing Science, Umeå University
This video demonstration illustrates a method for tracking location changes of large sets of real-world objects unobtrusively and cost-effectively, based on the assumption that all object movements are caused by the users themselves and can therefore be tracked using wearable sensor technology placed on human hands.

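The tracking assumption lends itself to a compact sketch: if every movement passes through a hand, then grab and release events observed by hand-worn sensors are enough to maintain each object's last known location. Class and method names below are hypothetical.

```python
# Hedged sketch of the hand-centric tracking assumption: hand-worn sensors
# report grab/release events, and an object's location is updated to the
# hand's location at release time. All names are illustrative assumptions.

class ObjectTracker:
    def __init__(self):
        self.locations = {}   # object id -> last known location
        self.in_hand = {}     # hand id -> object id currently held

    def grab(self, hand, obj):
        """A hand sensor reports that it picked up a recognised object."""
        self.in_hand[hand] = obj

    def release(self, hand, location):
        """On release, the held object's location becomes the hand's location."""
        obj = self.in_hand.pop(hand, None)
        if obj is not None:
            self.locations[obj] = location
```

Between a grab and a release, the object's location is simply the hand's, so no per-object tag or fixed infrastructure is needed, which is what makes the approach unobtrusive and cheap.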
Connecting Remote Teams: Cross-Media Integration to Support Remote Informal Encounters
Thorsten Prante1, Richard Stenzel1, Carsten Röcker1, Daniel van Alphen2, Norbert A. Streitz1, Carsten Magerkurth1, Daniela A. Plewe2
1Fraunhofer IPSI, AMBIENTE - Smart Environments of the Future, 2Independent researchers
This video presents the Hello.Wall artefact in a mixed-media set-up to support spontaneous, informal encounters in two remote lounge spaces of a distributed team. The Hello.Walls are used as awareness tools that convey more about the remote team's state and, at the same time, as a tool to smooth transitions to place-based video communication among the remote team's members. This connecting-remote-teams scenario was tested in a living-lab evaluation for several weeks; it proved to foster remote informal encounters and thereby contributed to smooth and fluent project work in our Ambient-Agoras project.

SimPhony: A Voice Communication Tool for Distributed Workgroups
Vidya Lakshmipathy, Chris Schmandt
MIT Media Lab
Communication is vital in any workplace. However, as workers become less tied to their desktops and computers, the need to provide them with a mobile communication tool that adapts to their work environment becomes more pressing. This paper describes SimPhony, a mobile, voice-controlled voice communication system built on a PDA and designed specifically for distributed workgroups. SimPhony supports one-to-one or one-to-many communication with voice instant messages or synchronous audio transmitted over an 802.11b wireless network, and it transitions between different communication styles as messages become more frequent. The SimPhony interface looks much like an instant messaging client but is accessible through a voice or visual interface on the PDA, or through a voice interface from any telephone.

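The escalation behaviour, switching from voice instant messages to a synchronous audio session as traffic grows, can be sketched as a rate threshold over a sliding window. The window length and threshold are assumptions for illustration, not SimPhony's actual values.

```python
# Sketch (assumed parameters) of escalating from asynchronous voice IMs to
# a synchronous audio session once messages become frequent, as the
# abstract describes.

WINDOW_S = 60.0    # look-back window in seconds (assumed)
THRESHOLD = 3      # messages within the window that trigger escalation (assumed)

def choose_mode(message_times, now):
    """Return 'synchronous' when recent traffic is dense, else 'voice-im'.

    message_times are timestamps in seconds of recent voice IMs.
    """
    recent = [t for t in message_times if now - t <= WINDOW_S]
    return "synchronous" if len(recent) >= THRESHOLD else "voice-im"
```

A rapid back-and-forth thus naturally upgrades to a live conversation, while sparse exchanges stay lightweight and asynchronous.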