Figure 7. The raw ThinSight sensor data, shown left, and the same frame after interpolation and smoothing, right. Note that the raw image is very low resolution but contains enough data to generate the relatively rich image at right.
Figure 8. Fingertips can be sensed easily with ThinSight. Left: the user places five fingers on the display to manipulate a photo. Right: a close-up of the sensor data when fingers are positioned as shown at left. The raw sensor data is (1) scaled up with interpolation, (2) normalized, (3) thresholded to produce a binary image, and finally (4) processed using connected-components analysis to reveal the individual fingertips.
Fingertips appear as small blobs in the image as they
approach the surface, increasing in intensity as they get
closer. This raises the possibility of sensing both touch and hover. To date we have implemented only touch/no-touch differentiation, using thresholding. However, we can reliably and consistently detect touch to within a few millimeters for a variety of skin tones, so we believe that disambiguating hover from touch would also be possible.
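The four-step pipeline in the Figure 8 caption can be sketched with standard image-processing primitives. Below is a minimal illustration using NumPy/SciPy; the sensor resolution, scale factor, and threshold are illustrative assumptions, not ThinSight's actual parameters:

```python
import numpy as np
from scipy.ndimage import label, zoom

def detect_fingertips(raw, scale=10, threshold=0.6):
    """Illustrative version of the pipeline: interpolate, normalize,
    threshold, then connected-components analysis."""
    # (1) scale up the low-resolution sensor frame with bilinear interpolation
    upscaled = zoom(raw.astype(float), scale, order=1)
    # (2) normalize intensities into [0, 1]
    span = upscaled.max() - upscaled.min()
    norm = (upscaled - upscaled.min()) / span if span else upscaled
    # (3) threshold to a binary touch/no-touch image
    binary = norm > threshold
    # (4) connected components: each remaining blob is a candidate fingertip
    labels, count = label(binary)
    return labels, count

# Toy 15x7 "sensor frame" with two bright spots standing in for fingertips
frame = np.zeros((15, 7))
frame[3, 2] = 1.0
frame[10, 5] = 0.9
_, n = detect_fingertips(frame)
```

Since blobs brighten as a finger approaches, hover/touch disambiguation could in principle replace step (3) with two thresholds on the normalized intensity.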
In addition to fingers and hands, optical sensing allows
us to observe other IR reflective objects through the display.
Figure 1 illustrates how the display can distinguish the shape
of many reflective objects in front of the surface, including
an entire hand, mobile phone, remote control, and a reel
of white tape. We have found in practice that many everyday objects reflect IR and can therefore be detected.
A logical next step is to attempt to uniquely identify objects by placing visual codes underneath them. Such codes have been used effectively in tabletop systems such as the Microsoft Surface and various research prototypes [12, 28] to support tangible interaction. We have also started preliminary experiments with the use of such codes on ThinSight; see Figure 9.
Active electronic identification schemes are also feasible.
For example, cheap and small dedicated electronic units
containing an IR emitter can be stuck onto or embedded
inside objects that need to be identified. These emitters will
produce a signal directed at a small subset of the display
sensors. By emitting modulated IR it is possible to transmit
a unique identifier to the display.
94 Communications of the ACM | December 2009 | Vol. 52 | No. 12
4.3. Communicating through the ThinSight display
Beyond simple identification, an embedded IR transmitter also provides a basis for supporting richer bidirectional
communication with the display. In theory any IR modulation scheme, such as the widely adopted IrDA standard,
could be supported by ThinSight. We have implemented
a DC-balanced modulation scheme which allows retro-reflective object sensing to occur at the same time as data
transmission. This required no additions or alterations to the
sensor PCB, only changes to the microcontroller firmware.
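The text does not name the exact coding used, but Manchester coding is a classic DC-balanced scheme and serves as an illustrative sketch: each bit is sent as one high and one low half-bit "chip", so the emitter is on for exactly half of every bit period regardless of the data, keeping the average IR level constant while reflective sensing continues.

```python
def manchester_encode(bits):
    """Encode bits as half-bit chip pairs: 1 -> (high, low), 0 -> (low, high).
    Every bit period is half on and half off, so the stream is DC-balanced."""
    chips = []
    for b in bits:
        chips += [1, 0] if b else [0, 1]
    return chips

def manchester_decode(chips):
    """Recover one bit per chip pair by inspecting the first half-bit."""
    return [1 if chips[i] == 1 else 0 for i in range(0, len(chips), 2)]

data = [1, 0, 1, 1, 0]
encoded = manchester_encode(data)
# Round trip recovers the data; exactly half the chips are high
assert manchester_decode(encoded) == data
assert sum(encoded) == len(encoded) // 2
```

In firmware, the same idea amounts to toggling the IR LEDs at fixed half-bit intervals, which is why only the microcontroller firmware had to change.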
To demonstrate our prototype implementation of this, we built a small embedded IR transceiver based on a low-power MSP430 microcontroller; see Figure 10. We encode 3 bits of data in the IR transmitted from the ThinSight pixels to control an RGB LED fitted to the embedded receiver. When the user touches various soft buttons on the ThinSight display, ThinSight transmits a different 3-bit code from its pixels, causing the embedded device to show a different color.
It is theoretically possible to transmit and receive different data simultaneously using different columns on the display.
Figure 9. An example 2" diameter visual marker and the resulting ThinSight image after processing.
Figure 10. Using ThinSight to communicate with devices using IR. Top left: an embedded microcontroller/IR transceiver/RGB LED device. Bottom left: touching a soft button on the ThinSight display signals the RGB LED on the embedded device to turn red (bottom right). Top right: a remote control is used to signal the display from a distance, which in turn sends an IR command to the RGB device to turn the LED blue.
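On the receiver side, the 3-bit control code described in Section 4.3 can be modelled as a simple lookup. The bit-to-channel assignment below is an assumption for illustration only; the paper does not specify the actual encoding:

```python
def code_to_rgb(code):
    """Map a 3-bit code to RGB LED channel states (red, green, blue).
    Assumed assignment: bit 2 = red, bit 1 = green, bit 0 = blue."""
    if not 0 <= code <= 7:
        raise ValueError("code must fit in 3 bits")
    return bool(code & 0b100), bool(code & 0b010), bool(code & 0b001)

# E.g. a "red" soft button might transmit 0b100, a "blue" one 0b001
assert code_to_rgb(0b100) == (True, False, False)
assert code_to_rgb(0b001) == (False, False, True)
```

Each soft button on the display would simply transmit its assigned code, and the MSP430 firmware would drive the LED channels from the decoded bits.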