Beyond the 5-billion-light-year boundary, yet another method is available for determining three-dimensional structure: redshift. In the 1920s the astronomer Edwin Hubble noted that the farther a galaxy is from Earth, the greater its redshift, or the amount by which its light is shifted toward longer, or redder, wavelengths of the electromagnetic spectrum. The same effect governs the pitch of a train whistle: to a listener standing on a platform, the wavelength of the sound becomes longer, and the pitch becomes lower, as the train speeds away. By measuring the redshift and applying Hubble’s relation between redshift and distance, one can estimate distances to the farthest reaches of the observable universe, many billions of light-years away.
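Hubble's relation can be applied with simple arithmetic: recession velocity is proportional to distance, and for modest redshifts velocity is roughly the redshift times the speed of light. A minimal sketch, assuming an illustrative Hubble constant of about 70 km/s per megaparsec (the function name and values here are for illustration only):

```python
# Sketch of Hubble's redshift-distance relation, v = H0 * d.
# For small redshifts z, recession velocity v ~ c * z, so d ~ c * z / H0.
# H0 is taken as ~70 km/s/Mpc; an assumed, illustrative value.

C_KM_S = 299_792.458        # speed of light, km/s
H0 = 70.0                   # Hubble constant, km/s per megaparsec
MPC_TO_LY = 3.2616e6        # light-years per megaparsec

def distance_from_redshift(z):
    """Approximate distance in light-years for a small redshift z."""
    velocity = C_KM_S * z            # km/s; valid only for z << 1
    distance_mpc = velocity / H0     # megaparsecs
    return distance_mpc * MPC_TO_LY  # light-years

# A galaxy at z = 0.05 comes out to roughly 700 million light-years.
d = distance_from_redshift(0.05)
```

For the billions-of-light-year distances mentioned above, this linear approximation breaks down and a full cosmological model is needed, but the proportionality is the heart of Hubble's method.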
While astronomers were radically expanding their view of the universe in the early part of the twentieth century, a new invention was also fueling a rising public enthusiasm: the planetarium. The first optical-projection planetarium made its debut in 1923, when engineers at the Carl Zeiss Company in Jena, Germany, projected stars on the interior of a hemisphere. A marvel of engineering, the “Wonder of Jena” attracted tremendous attention, and similar technology soon spread across Europe and the United States. A faithful reproduction of the night sky—pinpoint, luminous stars that rose and set, along with planets that followed their proper trajectories—formed the core of planetarium technology, enabling instructors to give dramatic demonstrations of the science of celestial motions and Newtonian physics.
As astronomical knowledge about the universe has grown, new tools have been invented or acquired to keep the planetariums up to date. In the decades from the 1920s until the early 1990s, slide, film, and video projectors were added to the mix of planetarium technology to tell the story of astronomical progress. But portraying the wonders of a universe that seemed to be expanding as knowledge grew proved to be challenging. How could planetariums convey the immensity of the universe? How could they simulate the experience of a flight to the farthest reaches of time and space?
Even astronomers had to grapple with the difficulty of representing their discoveries in a two-dimensional format. A classic example is the discovery of the large-scale structure of galaxies in the 1980s: astronomers were forced to analyze the size and shape of the local universe by viewing two-dimensional “slices” through the three-dimensional data.
When smaller, more powerful computers became available, astronomers were finally able to examine data more comprehensively—and with greater insight. All the objects in a virtual space could at last be plotted in proper perspective. Around that same time, the Evans & Sutherland (E&S) Computer Corporation in Salt Lake City, a pioneering company in computer graphics, introduced a digital projector for planetariums that could, with a single lens on a video projection tube, display a black-and-white star field—as well as other monochromatic images made up of dots and lines. Unlike earlier planetarium projectors, E&S’s device presented a three-dimensional universe: the stars had depth, the orbits of the planets in the solar system could be observed from any angle, and users could input data from many sources.
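The depth the E&S device offered rests on a simple geometric idea: each star's three-dimensional position is projected onto the two-dimensional display from the viewer's current vantage point, so moving the viewpoint changes every star's apparent position. A minimal sketch of such a perspective projection (the function, names, and focal-length value are illustrative assumptions, not the actual E&S implementation):

```python
# Sketch of perspective projection: mapping a star's 3-D position
# to 2-D screen coordinates from a movable viewpoint.
# All names and the focal length are illustrative assumptions.

def project(star, eye, focal=1.0):
    """Project a 3-D point onto a plane one focal length in front of eye.

    star, eye: (x, y, z) tuples; the viewer looks down the +z axis.
    Returns (x, y) screen coordinates, or None if the star is behind
    the viewer.
    """
    dx = star[0] - eye[0]
    dy = star[1] - eye[1]
    dz = star[2] - eye[2]
    if dz <= 0:
        return None                    # star behind the viewpoint
    # Dividing by depth makes nearer stars spread apart on screen,
    # which is what gives the star field its sense of depth.
    return (focal * dx / dz, focal * dy / dz)

# Moving the eye closer shifts the star's apparent position,
# something a fixed optical projector cannot reproduce.
p_far = project((1.0, 2.0, 4.0), (0.0, 0.0, 0.0))   # (0.25, 0.5)
p_near = project((1.0, 2.0, 4.0), (0.0, 0.0, 2.0))  # (0.5, 1.0)
```

Repeating this calculation for every star, every frame, as the viewpoint flies through the data is what made the approach computationally demanding.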
For the Hayden Planetarium, the new technology offered the potential to convey contemporary astrophysics as never before. In 1995 the American Museum of Natural History was embarking on a complete redesign of the planetarium, then sixty years old. The public viewing space was to be housed within a sphere suspended in a glass cube, in what would become the Rose Center for Earth and Space. Several years earlier, in 1991, J. Richard Gott III, an astrophysicist at Princeton University, had chaired the museum’s study on the future of planetariums. The museum, which had just completed a renovation of its dinosaur displays to reflect new understanding, was receptive to Gott’s suggestions about how to revamp the Hayden Planetarium so as to represent the current state of knowledge about the universe.
Optical-mechanical projection systems, such as the Zeiss star projectors, worked extremely well at depicting stars as they appear from Earth, because stars appear “fixed” in the night sky. The Sun, Moon, and planets seem to move relative to the stars, and their motions could readily be reproduced by small, special-purpose lens systems. But to create seamless, full-color images of the varying scales of the universe, a more flexible system was needed: a computer in which the graphics cover the entire planetarium dome. A mosaic of video projections could act as an enormous computer monitor, presenting a full hemisphere of imagery.
When the Hayden Planetarium reopened in 2000, after its extensive renovation, a virtual trip through the universe required a supercomputer. Navigating databases of thousands of celestial objects and displaying them in a series of still images at the standard video rate of thirty times a second posed a tremendous computational challenge. Fortunately, the phenomenal growth and popularity of flight simulators and electronic video games spurred the field of 3-D data visualization to grow up almost overnight. Thanks in part to the video-game industry, personal computers today incorporate graphics processors that surpass the capabilities of the supercomputer the planetarium purchased only five years ago. The new technology arrived practically ready-made for transfer into industry and academia.

[Figure caption: The "edge" of the observable universe, the most distant source of data from astronomical surveys in any direction, is depicted by the Digital Universe atlas. Because light travels at a finite speed, the image portrays the distant past, not long after the big bang. Representing the cosmic horizon is the cosmic microwave background radiation, whose minute temperature variations are rendered here in false-color blues and greens.]