Researchers at Carnegie Mellon University’s Robotics Institute
have leveraged the latest browser technology to create GigaPan Time Machine, a
system that enables viewers to explore gigapixel-scale, high-resolution videos
and image sequences by panning or zooming in and out of the images while
simultaneously moving back and forth through time.
Viewers, for instance, can use the system to focus in on the
details of a booth within a panorama of a carnival midway, but also reverse
time to see how the booth was constructed. Or they can watch a group of plants
sprout, grow and flower, shifting perspective to watch some plants move wildly
as they grow while others get eaten by caterpillars. Or, they can view a
computer simulation of the early universe, watching as gravity works across 600
million light-years to condense matter into filaments and finally into stars
that can be seen by zooming in for a close up.
“With GigaPan Time Machine, you can simultaneously
explore space and time at extremely high resolutions,” said Illah
Nourbakhsh, associate professor of robotics and head of the CREATE Lab.
“Science has always been about narrowing your point of view—selecting a particular
experiment or observation that you think might provide insight. But this system
enables what we call exhaustive science, capturing huge amounts of data that
can then be explored in amazing ways.”
The system is an extension of the GigaPan technology
developed by the CREATE Lab and NASA, which can capture a mosaic of hundreds or
thousands of digital pictures and stitch those frames into a panorama that can be
interactively explored via computer. To extend GigaPan into the time dimension,
image mosaics are repeatedly captured at set intervals, and then stitched
across both space and time to create a video in which each frame can be
hundreds of millions, or even billions of pixels.
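One common way to make frames of that size streamable is to cut each one into a pyramid of fixed-size tiles, so a viewer only ever decodes small pieces at the zoom level it needs. The sketch below is illustrative only, not the CREATE Lab’s actual code; the 512-pixel tile size and the helper names are assumptions.

```python
import math

TILE_SIZE = 512  # hypothetical tile edge in pixels; an assumption, not GigaPan's value

def num_levels(width: int, height: int) -> int:
    """Pyramid depth needed so the coarsest level fits in a single tile."""
    longest = max(width, height)
    return max(0, math.ceil(math.log2(longest / TILE_SIZE))) + 1

def grid_at_level(width: int, height: int, level: int) -> tuple[int, int]:
    """Tile-grid size (cols, rows) at one pyramid level.

    Level 0 is the most zoomed-out view; each deeper level doubles the
    resolution, so a gigapixel frame becomes a grid of small tiles that
    can each be encoded as an ordinary short video.
    """
    scale = 2 ** (num_levels(width, height) - 1 - level)  # downsampling factor
    scaled_w = math.ceil(width / scale)
    scaled_h = math.ceil(height / scale)
    return math.ceil(scaled_w / TILE_SIZE), math.ceil(scaled_h / TILE_SIZE)
```

Under these assumptions, a 40,000 x 25,000-pixel (one-gigapixel) frame yields an eight-level pyramid: a single tile at level 0 and a 79 x 49 grid of tiles at the finest level.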
An enabling technology for time-lapse GigaPans is a feature
of the HTML5 language that has been incorporated into such browsers as Google’s
Chrome and Apple’s Safari. HTML5, the latest revision of the HyperText Markup
Language (HTML) standard at the core of the web, makes browsers
capable of presenting video content without plug-ins such as Adobe Flash.
Using HTML5, CREATE Lab computer scientists Randy Sargent,
Chris Bartley and Paul Dille developed algorithms and software architecture
that make it possible to shift seamlessly from one video portion to another as
viewers zoom in and out of Time Machine imagery. To keep bandwidth manageable,
the GigaPan site streams only the video fragments that correspond to the region
and time frame being viewed.
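That fragment-selection step can be illustrated with simple interval arithmetic: given the viewport rectangle in the pixel space of the current zoom level, intersect it with the tile grid and request only the tiles it touches. This is a minimal sketch under an assumed 512-pixel tile size and hypothetical names, not the site’s actual streaming logic.

```python
def visible_tiles(view_x: float, view_y: float,
                  view_w: float, view_h: float,
                  tile_size: int = 512) -> list[tuple[int, int]]:
    """(col, row) indices of the tiles intersecting the viewport.

    Coordinates are in the pixel space of the current zoom level; the
    video fragments for just these tiles (and the current time window)
    are all a bandwidth-conscious player would need to fetch.
    """
    first_col = int(view_x // tile_size)
    first_row = int(view_y // tile_size)
    last_col = int((view_x + view_w - 1) // tile_size)
    last_row = int((view_y + view_h - 1) // tile_size)
    return [(col, row)
            for row in range(first_row, last_row + 1)
            for col in range(first_col, last_col + 1)]
```

For example, a 1280 x 720 viewport panned to pixel position (1000, 200) touches only eight tiles, regardless of how many tiles the full gigapixel frame contains.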
“We were crashing the browsers early on,” Sargent
recalled, noting that the system pushes browser video technology to its limits.
Guidelines on how individuals can capture time-lapse images
using GigaPan cameras are included on the site created for hosting the new
imagery’s large data files, http://timemachine.gigapan.org. Sargent explained
that the CREATE Lab is eager to work with people who want to capture Time Machine
imagery with GigaPan, or use the visualization technology for other applications.
Once a Time Machine GigaPan has been created, viewers can
annotate and save their explorations of it in the form of video “Time Warps.”
Though the time-lapse mode is an extension of the original
GigaPan concept, scientists already are applying the visualization techniques
to other types of Big Data. Carnegie Mellon’s Bruce and Astrid McWilliams Center
for Cosmology, for instance, has used it to visualize a simulation of the early
universe performed at the Pittsburgh Supercomputing Center
by Tiziana Di Matteo, associate professor of physics.
“Simulations are a huge bunch of numbers, ugly
numbers,” Di Matteo said. “Visualizing even a portion of a simulation
requires a huge amount of computing itself.” Visualization of these large
data sets is crucial to the science, however. “Discoveries often come from
just looking at it,” she explained.
Rupert Croft, associate professor of physics, said
cosmological simulations are so massive that only a segment can be visualized
at a time with conventional techniques. Yet whatever is happening within that segment
is being affected by forces elsewhere in the simulation that cannot be readily
accessed. By converting the entire simulation into a time-lapse GigaPan,
however, Croft and his PhD student, Yu Feng, were able to create an image that
provided both the big picture of what was happening in the early universe and
the ability to look in detail at any region of interest.
Using a conventional GigaPan camera, Janet Steven, an
assistant professor of biology at Sweet Briar College,
has created time-lapse imagery of rapid-growing brassicas, known as Wisconsin
Fast Plants. “This is such an incredible tool for plant biology,” she
said. “It gives you the advantage of observing individual plants, groups
of plants and parts of plants, all at once.”
Steven, who has received GigaPan training through the Fine
Outreach for Science program, said time-lapse photography has long been used in
biology, but the GigaPan technology makes it possible to observe a number of
plants in detail without having separate cameras for each plant. Even as one
plant is studied in detail, it’s possible to also see what neighboring plants
are doing and how that might affect the subject plant, she added.
Steven said creating time-lapse GigaPans of entire
landscapes could be a powerful tool for studying seasonal change in plants and
ecosystems, an area of increasing interest for understanding climate change.
Time-lapse GigaPan imagery of biological experiments also could be an
educational tool, allowing students to make independent observations and
develop their own hypotheses.
Google Inc. supported development of GigaPan Time Machine.