producing x–t images. As a consequence, their field of view
is reduced to a single horizontal scanline of the scene.
We solve these problems with our ultrafast imaging system, outlined in Figure 2. The light source is a femtosecond
Kerr lens mode-locked Ti:Sapphire laser, which emits 50-fs pulses
with a center wavelength of 795 nm, at a repetition rate of
75 MHz and an average power of 500 mW. In order to see ultrafast events in a scene with macro-scaled objects, we focus
the light with a lens onto a Lambertian diffuser, which then
acts as a point light source and illuminates the entire scene
with a spherically shaped pulse. Alternatively, if we want to
observe pulse propagation itself, rather than the interactions
with large objects, we direct the laser beam across the field
of view of the camera through a scattering medium (see the
bottle scene in Figure 1).
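Since the diffuser acts as a point source, the pulse reaches each scene point after a straight-line time of flight, and the scattered light then travels from that point to the camera. The following is an illustrative sketch of that two-segment path time, not the authors' code; the positions and geometry are invented for the example.

```python
# Hypothetical sketch: arrival time of light from a spherically expanding
# pulse, assuming a point source at the diffuser and straight-line paths.
# Positions are in millimeters; the scene geometry is illustrative only.
import math

C_MM_PER_PS = 0.2998  # speed of light in mm per picosecond


def arrival_time_ps(source, point, camera):
    """Total path time (ps): source -> scene point -> camera."""
    d1 = math.dist(source, point)   # source to scene point
    d2 = math.dist(point, camera)   # scene point to camera
    return (d1 + d2) / C_MM_PER_PS


# Example: diffuser at the origin, scene point 300 mm away,
# camera 400 mm from that point (a 3-4-5 layout for round numbers).
t = arrival_time_ps((0, 0, 0), (300, 0, 0), (300, 400, 0))
```

This path-length dependence is also why the raw time-lapse data must later be "time-unwarped" (Figure 1): points at equal depth do not, in general, light up in the same recorded frame.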
Because all the pulses are statistically identical, we can
record the scattered light from many of them and integrate
the measurements to average out noise. The result is a sig-
nal with a high SNR. To synchronize the illumination with
the streak sensor (Hamamatsu C5680), we split off a portion
of the beam with a glass slide and direct it onto a fast photo-
detector connected to the sensor, so that both detector and
illumination operate synchronously (Figure 2a and b).
3.2. Capturing space–time planes
The streak sensor then captures an x–t image of a certain
scanline (i.e., a line of pixels in the horizontal dimension)
of the scene with a space–time resolution of 672 × 512. The
exact time resolution depends on the amplification of an
internal sweep voltage signal applied to the streak sensor.
With our hardware, it can be adjusted from 0.30 to 5.07 ps.
Practically, we choose the fastest resolution that still allows
for capture of the entire duration of the event. In the streak
sensor, a photocathode converts incoming photons, arriving
Figure 1. What does the world look like at the speed of light? Our new computational photography technique allows us to visualize light in
ultra-slow motion, as it travels and interacts with objects in table-top scenes. We capture photons with an effective temporal resolution
of less than 2 ps per frame. Top row, left: a single false-color streak image from our sensor. Middle: time lapse visualization of the bottle
scene, as directly reconstructed from sensor data. Right: time-unwarped visualization, taking into account the fact that the speed of light
can no longer be considered infinite (see the main text for details). Bottom row: original scene through which a laser pulse propagates,
followed by different frames of the complete reconstructed video. For this and other results in the paper, we refer the reader to the videos
included in the project pages: femtocamera.info and http://giga.cps.unizar.es/~ajarabo/pubs/femtoSIG2013/.
Figure 2. (a) Photograph of our ultrafast imaging system setup. The DSLR camera takes a conventional photo for comparison. (b) In order to
capture a single 1D space–time photo, a laser beam strikes a diffuser, which converts the beam into a spherical energy front that illuminates
the scene; a beamsplitter and a synchronization detector enable synchronization between the laser and the streak sensor. (c) After
interacting with the scene, photons enter a horizontal slit in the camera and strike a photocathode, which generates electrons. These are
deflected at different angles as they pass through a microchannel plate, by means of rapidly changing the voltage between the electrodes.
The CCD records the horizontal position of each pulse and maps its arrival time to the vertical axis, depending on how much the electrons
have been deflected. (d) We focus the streak sensor on a single narrow scanline of the scene. (e) Sample image taken by the streak sensor.
The horizontal axis (672 pixels) records the photons’ spatial locations in the acquired scanline, while the vertical axis (1 ns window in our
implementation) codes their arrival time. Rotating the adjustable mirrors shown in (a) allows for scanning of the scene in the y-axis and
generation of ultrafast 2D movies such as the one visualized in Figure 1 (b–d, credit: Greg Gbur).
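The mapping from a streak image's vertical axis to arrival time follows directly from the numbers quoted above: a roughly 1 ns sweep window spread over 512 rows. A small sketch of that conversion (the helper names are ours, not the camera API):

```python
# Illustrative row-to-time conversion for a streak image, assuming the
# 672 x 512 resolution and ~1 ns sweep window quoted in the text.
NUM_ROWS = 512
WINDOW_PS = 1000.0  # ~1 ns sweep window, in picoseconds


def row_to_time_ps(row):
    """Arrival time (ps) encoded by a vertical pixel row."""
    return row * (WINDOW_PS / NUM_ROWS)


time_per_row = WINDOW_PS / NUM_ROWS  # ~1.95 ps per row
```

Under these assumptions each row spans about 1.95 ps, consistent with the "less than 2 ps per frame" effective temporal resolution stated in the Figure 1 caption.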