CHI 97 Electronic Publications: Demonstrations
`Eudaemonic Eye': `Personal Imaging' and wearable computing as a result of deconstructing HCI; towards greater creativity and self-determination
Steve Mann
N1NLF
MIT Media Lab,
Building E15-389,
20 Ames Street,
Cambridge, MA 02139, USA
steve@media.mit.edu
http://wearcam.org/
Now at mann@eecg.toronto.edu
ABSTRACT
The apparatus for `personal imaging'
consists of a combination of the author's
`existential computer' invention (hardware
portion also referred to as the ``wearable
computer'') with an electronic camera as the primary input device.
Personal imaging, a conceptual framework built around this simple apparatus,
is first presented as a new research area;
applications to the visual arts and to personal documentary
are then described.
Keywords
`existential computing',
wearable computing,
`personal imaging',
lightpainting,
electronic flash,
mobile multimedia,
`video orbits',
`VideoClips',
`pencigraphic imaging',
personal documentary,
augmented reality, `mediated reality'.
© 1997 Copyright on this material is held by the authors.
THE `EXISTENTIAL COMPUTER'
Existentialism, the philosophical theory that emphasizes the
existence of the individual person as a free and responsible agent
determining his own development, is proposed as a basis for
`existential media' --- technology that (1) gives the user the
deference of assumed competence, e.g. a computer that assumes
the user is able-minded rather than stupid, and (2)
synergizes with the user rather than functioning as a separate entity that
replaces the user.
SMART CLOTHING
The goal of the `Existential User Interface'
(EUI) is not increased productivity (e.g. making
individuals more useful to society), but rather the reclamation of the
personal space (prosthetic territory) lost to invasive technology. A
good example of `existential media' is clothing. Clothing affords us
a great deal of self-determination, and serves as a useful metaphor
for `existential media'. (It is no coincidence that clothing also
formed the substrate upon which the `existential computer' invention
was first realized.) The SONY ``Walkman'' is another example of
`existential media'. Its ability to reclaim personal space lost to
``muzak'' (a use of technology that has stolen much of our solitude)
affords the user a great deal of self-determination.
DECONSTRUCTING CHI
When we make reference to ``CHI'' (Computer Human Interaction) we call
attention to the boundary between humans and computers --- CHI becomes
a self-fulfilling prophecy, emphasizing this boundary. A goal of
`existential computing' is to eliminate this artificial (unnecessary)
boundary by ``becoming'' the computer, rather than merely interfacing
to it.
An example of an existential user interface is the author's
`finger mouse'[7] (based on `WearCam' as an input device).
(See Fig 1).
Figure 1:
`FingerMouse':
(a) Early prototype of author's `existential computer'
invention with camera.
(b) Back view of author and
apparatus
together with object being outlined (luxo lamp).
(c) What the wearer sees through the glasses:
an outline of the object, denoting the path of the finger
as it moves in the space between
the camera (taking the role of the eye) and the object.
The image is frozen for near-perfect registration.
(Thanks to Thad Starner and Flavia Sparacino for help in
getting XFakeEvents to work for this).
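The FingerMouse interaction (the camera as input device, the finger's path as a pointer trajectory) can be caricatured in a few lines of Python. This is a toy sketch, not the actual WearCam pipeline: the real system segments the finger in the head-mounted camera's video, whereas here the "fingertip" is simply assumed to be the brightest blob in a synthetic frame, and its centroid is accumulated into a path. The function name `fingertip` is invented for this illustration.

```python
import numpy as np

def fingertip(frame, thresh=0.8):
    """Toy fingertip locator: centroid of pixels brighter than thresh.
    (A stand-in for real finger segmentation in the WearCam video.)"""
    ys, xs = np.nonzero(frame > thresh)
    if xs.size == 0:
        return None          # no finger visible in this frame
    return xs.mean(), ys.mean()

# Synthetic video: a bright "fingertip" moving left to right across a
# dim background, tracing the outline path seen through the glasses.
path = []
for x in range(2, 7):
    frame = np.full((8, 8), 0.1)     # dim background
    frame[4, x] = 1.0                # the fingertip
    p = fingertip(frame)
    if p is not None:
        path.append(p)
# path now holds the pointer trajectory in image (camera) coordinates
```

Freezing the image, as in Fig 1(c), amounts to overlaying this accumulated path on a single stored frame rather than on live video.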
HISTORICAL BACKGROUND
The first `personal imaging' prototype, designed and built
by the author (Fig 2),
comprised a modular personal, wearable, multimedia computer system together
with one or more cameras, a head-mounted display,
and other sensors (one or more microphones, A/D converter for voltage
measurement, two wearable radar systems, etc.), connected wirelessly to a
separate base-station which later formed the gateway to the Internet.
The modular nature of the system
allowed portions to be left out or included, depending on the occasion.
Figure 2: Evolution of author's
`existential computing'
and `personal imaging' inventions.
The 1980 apparatus was somewhat cumbersome.
The bulky 1.5 inch Cathode Ray Tube (CRT)
required a helmet for
support, and provided only enough resolution for low-quality
greyscale imagery or 40 letters of text per row.
Later, a waist-mounted television was found to be less
cumbersome, but failed to provide constancy of
user-interface. Note that the electronic flash, used for lightpainting,
also houses the built-in ``keyboard'' (seven microswitches in the
handle).
With the advent of miniature CRTs in the late 1980s,
a comfortable eyeglass-based system became practical.
Most recently (late 1990s), the author has designed and built the apparatus
into ordinary eyeglasses and ordinary clothing.
In 1968, Ivan Sutherland
described a head-mounted display with half-silvered
mirrors so that the wearer could see a virtual world superimposed
on reality[1][14].
Sutherland's work, as well as more recent related
work[2][4], is characterized by its tethered nature (tethered to a
workstation, which is generally powered from an AC outlet).
In this sense it differs from the proposed `personal imaging'
paradigm, which is based on a rig that is entirely battery-operated
and tetherless (i.e. it includes wireless communications).
Other very recent work in wearable computing[3]
provides a task-specific system, in particular a repair manual
for use by soldiers. To keep use simple, and to keep the soldier
focused on the task at hand, the only input is a knob and a pushbutton,
with which menu items of a specific program may be selected.
The `personal imaging' effort
differs from this more recent research on
what might best be described as ``employer-owned technology'' ---
technology controlled by an external entity.
In particular, the proposed user-interface paradigm
is based on technology owned, operated, and controlled
by the wearer --- technology that becomes part of the wearer's day-to-day
lifestyle, and ``gets to know'' the wearer ``intimately''.
The importance of this distinction is detailed in [11].
`PERSONAL IMAGING'
The theoretical background for personal imaging is based on
regarding the camera as a measurement instrument, in particular, an
array of directional lightmeters, where the user interacts with the
scene[13][5][10][8].
VIDEO ORBITS: `PAINTING WITH LOOKS/LIGHT'
Images of the same scene may be combined to produce a single image
of greater resolution and spatial extent[8]. Within the
context of personal imaging, this provides an automated and natural process
for generating environment maps simply by looking around in a space.
The process can be relegated to a background task: environment maps are
generated by the pencigraphic imaging agent and transmitted to the
World Wide Web, so that others can share in the day-to-day experiences of the
wearer[6].
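To make the compositing step concrete, the sketch below (pure NumPy; the helper name `warp_projective` is invented here) composites two overlapping frames of a scene onto one larger canvas under a known 3x3 projective coordinate transformation. The actual `video orbits' method[8] estimates the transformation featurelessly from the images themselves; here a pure translation, the simplest member of the projective group, is supplied explicitly for a toy scene.

```python
import numpy as np

def warp_projective(img, H, out_shape):
    """Inverse-map each output pixel through H^{-1} and sample the
    source image (nearest neighbour). H maps source coordinates
    (x, y, 1) to output coordinates."""
    Hinv = np.linalg.inv(H)
    ys, xs = np.indices(out_shape)
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    sx, sy, sw = Hinv @ pts
    sx = np.round(sx / sw).astype(int)   # dehomogenize, snap to pixel
    sy = np.round(sy / sw).astype(int)
    valid = (sx >= 0) & (sx < img.shape[1]) & (sy >= 0) & (sy < img.shape[0])
    out = np.zeros(out_shape)
    mask = valid.reshape(out_shape)      # where this frame contributes
    out[mask] = img[sy[valid], sx[valid]]
    return out, mask

# Toy example: two 3x4 "frames" of the same 3x6 scene, the second frame
# shifted two pixels to the right of the first.
scene = np.arange(18, dtype=float).reshape(3, 6)
frame_a, frame_b = scene[:, :4], scene[:, 2:]
H_a = np.eye(3)
H_b = np.array([[1.0, 0, 2], [0, 1, 0], [0, 0, 1]])  # frame_b -> scene coords

canvas_a, mask_a = warp_projective(frame_a, H_a, (3, 6))
canvas_b, mask_b = warp_projective(frame_b, H_b, (3, 6))
mosaic = np.where(mask_a, canvas_a, canvas_b)  # composite of greater extent
```

The resulting mosaic reproduces the full scene, wider than either frame alone; a general homography (non-trivial bottom row of `H`) would handle camera rotation about its center of projection in the same way.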
Differently exposed images may be combined to increase
dynamic range and image
definition[15][9][10],
as well as for artistic/expressive effect.
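A minimal sketch of combining differently exposed pictures follows, assuming (unrealistically) that the camera's response is a known gamma curve with hard clipping; the actual approach[9][10] estimates the response function from the images themselves. Each image is inverted back to relative radiance, and the estimates are averaged with a certainty weight that downweights pixels near the extremes, where an exposure carries little information.

```python
import numpy as np

GAMMA = 2.2  # assumed, known camera response exponent (a toy model)

def expose(q, k):
    """Toy camera: scale radiance q by exposure k, apply a gamma-style
    response, and clip to [0, 1] (saturation)."""
    return np.clip((k * q) ** (1 / GAMMA), 0, 1)

def combine(images, ks):
    """Invert the response of each image, divide out its exposure, and
    form a certainty-weighted average of the radiance estimates."""
    est, wsum = 0.0, 0.0
    for im, k in zip(images, ks):
        q = im ** GAMMA / k          # back to (relative) radiance
        w = im * (1 - im) + 1e-6     # low certainty near 0 and near 1
        est += w * q
        wsum += w
    return est / wsum

q = np.array([0.02, 0.2, 0.9, 3.0])            # radiances, wide range
ks = (0.25, 1.0, 4.0)                          # three exposures
q_hat = combine([expose(q, k) for k in ks], ks)
# q_hat recovers q, including values clipped in some single exposure
```

No single exposure captures both the darkest and brightest values here; the weighted combination does, which is the essence of the extended-response idea[15].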
VIDEO ORBITS: MEDIATED REALITY
Illusory rigid planar patches arise directly from
projective coordinate transformations applied to individual images,
allowing messages to be left on everyday planar surfaces in the real
world, visible to those who are wearing the eyeglasses and are on the
list of recipients for a particular message (Fig 3).
Figure 3: A message left on the flat wall of a department store entrance
remains dormant until the recipient of the message happens to look
at the wall. The act of looking, being mediated by the
apparatus, enables the message to appear as an illusory
planar patch. (Thanks to Jeffrey Levine for help with the
work in this figure.)
Face recognition has been used together with `video orbits'
to create existential name tags[12] (tags that do not exist in reality
but can be seen by the wearer of the apparatus), where the name was entered
at a previous encounter or drawn from a database.
PERSONAL DOCUMENTARY and `VideoClips'
`Personal imaging'
(used in two of the author's documentaries, `Shooting Back' and `VideoClips')
makes use of the manner in which the EUI functions as a true
extension of the body and mind, giving rise to a new cinematographic
technique, exemplified in `Shooting Back' (Fig 4).
Figure 4: `Shooting Back' conveys the first-person perspective in a much
more natural way than previous point-of-view cinematography.
Here the viewer assumes the point of view of a documentary
video maker questioning video surveillance.
(a) Talking to a representative of an organization using surveillance;
the HandyCam is held at the side, pointing back. (b) Bringing the camera
up to the eudaemonic eye. (c) The eudaemonic
eye inside the eyecup (looking through the viewfinder).
REFERENCES
- 1
-
R. A. Earnshaw, M. A. Gigante, and H. Jones.
Virtual Reality Systems.
Academic Press, 1993.
- 2
-
Feiner, MacIntyre, and Seligmann.
Knowledge-based augmented reality.
Communications of the ACM, 36(7), July 1993.
- 3
-
S. Finger, M. Terk, E. Subrahmanian, C. Kasabach, F. Prinz, D.P. Siewiorek,
A. Smailagic, J. Stivorek, and L. Weiss.
Rapid design and manufacture of wearable computers.
Communications of the ACM, pages 63--70, February 1996.
Special issue on Computer Science in Manufacturing.
- 4
-
Henry Fuchs, Mike Bajura, and Ryutarou Ohbuchi.
Teaming ultrasound data with virtual reality in obstetrics.
http://www.ncsa.uiuc.edu/Pubs/MetaCenter/SciHi93/1c.Highlights-BiologyC.html.
- 5
-
S. Mann.
Compositing multiple pictures of the same scene.
In Proceedings of the 46th Annual IS&T Conference,
Cambridge, Massachusetts, May 9-14 1993. The Society of Imaging Science and
Technology.
- 6
-
S. Mann.
Wearable Wireless Webcam, 1994.
http://wearcam.org.
- 7
-
S. Mann.
``Smart clothing''.
TR 366, M.I.T. Media Lab Perceptual Computing Section, Cambridge, Ma,
February 2 1996.
- 8
-
S. Mann and R. W. Picard.
Video orbits of the projective group; a simple approach to
featureless estimation of parameters.
TR 338, M.I.T. Media Lab Perceptual Computing Section, Cambridge, Ma,
1995.
To appear, IEEE Trans. Image Proc.
- 9
-
S. Mann and R.W. Picard.
Being `undigital' with digital cameras: Extending dynamic range by
combining differently exposed pictures.
Technical Report 323, M.I.T. Media Lab Perceptual Computing Section,
Boston, Massachusetts, 1994.
Also appears in IS&T's 48th Annual Conference, pages 422--428, May
1995.
- 10
-
Steve Mann.
`pencigraphy' with AGC: Joint parameter estimation in both domain
and range of functions in same orbit of the projective-Wyckoff group.
Technical Report 384, MIT Media Lab, Cambridge, Massachusetts,
December 1994.
Also appears in IEEE International Conference on Image Processing
(ICIP 96), Lausanne, Switzerland, September 1996.
- 11
-
Steve Mann.
Smart clothing: The shift to wearable computing.
Communications of the ACM, pages 23--24, August 1996.
- 12
-
Steve Mann.
Wearable, tetherless computer--mediated reality: Wearcam as a
wearable face--recognizer, and other applications for the disabled.
TR 361, M.I.T. Media Lab Perceptual Computing Section, Cambridge, Ma,
February 2 1996.
Also appears in AAAI Fall Symposium on Developing Assistive
Technology for People with Disabilities, 9-11 November 1996, MIT.
- 13
-
Cynthia Ryals.
Lightspace: A new language of imaging.
PHOTO Electronic Imaging, 38(2):14--16, 1995.
http://www.novalink.com/pei/mann2.html.
- 14
-
I. Sutherland.
A head-mounted three dimensional display.
In Proc. Fall Joint Computer Conference, pages 757--764, 1968.
- 15
-
Charles W. Wyckoff.
An experimental extended response film.
SPIE Newsletter, June--July 1962.
- Research supported, in part, by HP Labs, Palo Alto.
Words or phrases in single quotes are those introduced
by the author here or elsewhere in the literature.