The U.S. Air Force is responsible for the Navstar Global Positioning System (GPS), which enables all three military services and civilian users to determine three-dimensional position, velocity, and precise time anywhere in the world. The first GPS space vehicles were in orbit by 1978. Source: Martin Marietta Astro Space


STEERING BY THE SATELLITES

For centuries, mariners used stars and other heavenly bodies for navigation. A modern, high-tech version of this technique allowed U.S. troops in the Persian Gulf War to operate in a desert wilderness with pinpoint accuracy. Now it is stimulating an enormous variety of civilian applications--from James Bond-style personal navigators to aircraft collision-avoidance systems--and fueling a huge new commercial market.

This modern navigational miracle, known as the Global Positioning System (GPS), can instantly and automatically tell users their location and altitude to within about 30 feet anywhere on Earth. Instead of stars, the GPS system uses 24 satellites that each circle the Earth in precisely determined orbits every 12 hours. Instead of starlight, it uses radio waves that cannot be blocked by clouds. And instead of a mariner's sextant, the GPS system depends on computer chips, miniaturized radio receivers, and--especially--ultra-precise atomic clocks. Such clocks, carried on each satellite, keep time to within millionths of a second over an entire year. Such accuracy is at the heart of the GPS system, because the system works by having each satellite broadcast precisely timed signals. A receiver compares the arrival times of signals from several satellites and electronically translates the differences into a precise determination of its own position.
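The arithmetic behind that last step can be sketched in a few lines. The Python sketch below is illustrative only (the toy satellite geometry, function names, and starting guess are our assumptions, not actual receiver firmware): it solves the timing, or "pseudorange," equations for position and receiver clock error by iterative least squares, which is also why signals from at least four satellites are needed (three position coordinates plus the unknown clock offset).

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def locate(sat_pos, pseudoranges, iters=10):
    # Solve for receiver position and clock bias from four or more satellites
    # by Gauss-Newton iteration on the pseudorange equations.
    x = np.zeros(4)                                   # [x, y, z, clock bias * C]
    for _ in range(iters):
        d = np.linalg.norm(sat_pos - x[:3], axis=1)   # geometric ranges
        residual = pseudoranges - (d + x[3])
        # Jacobian: unit vectors from satellites toward the receiver,
        # plus a column of ones for the clock-bias term
        J = np.hstack([(x[:3] - sat_pos) / d[:, None], np.ones((len(d), 1))])
        x += np.linalg.lstsq(J, residual, rcond=None)[0]
    return x[:3], x[3] / C                            # position (m), clock error (s)

# Toy demonstration: six satellites at GPS orbital radius, a receiver on the
# surface, and a receiver clock that is off by a tenth of a millisecond.
rng = np.random.default_rng(1)
sats = rng.normal(size=(6, 3))
sats *= 26_600e3 / np.linalg.norm(sats, axis=1, keepdims=True)
truth, clock_err = np.array([6_371e3, 0.0, 0.0]), 1e-4
rho = np.linalg.norm(sats - truth, axis=1) + C * clock_err
pos, err = locate(sats, rho)
print(np.round(pos), err)   # recovers both the position and the clock error
```

Because the receiver's inexpensive clock is solved for along with the position, only the satellites need to carry atomic clocks.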

Atomic clocks were not, of course, invented with such an application in mind. In fact, they arose from efforts to answer fundamental questions about the nature of the universe. Testing the basic laws of physics, such as Einstein's theory of general relativity, turned out to require much more accurate clocks than were available 30 years ago. So university physicists set out to develop them, and succeeded both in verifying Einstein's predictions and in making major advances in the technology of time-keeping. Outside of physics, no great need for ultra-precise clocks was foreseen; but, as so often happens, the advance opened up unpredictable opportunities.

The Global Positioning System was initially developed by the U.S. Air Force for military navigation, for which it has already proved its worth. Many important civilian applications have already emerged--coastal navigation, emergency rescue, tracking commercial vehicles--and GPS use is expanding rapidly. Last year, foreshadowing the impact of the GPS system on aircraft collision avoidance and navigation, a Gulfstream airplane made a flawless approach to Washington's National Airport using only GPS guidance. Inexpensive receivers to guide backpackers in remote areas or to guide automotive travelers along unfamiliar routes are beginning to appear.

Over 160 manufacturers worldwide are developing GPS-based systems for a new multibillion-dollar market. The investment in atomic clocks made decades ago was a seminal part of this development and illustrates the remarkable dividends to society that fundamental research can provide. Today's investments in atomic research, such as trapping and cooling atoms in webs of laser light, have improved timekeeping precision by orders of magnitude beyond that of the GPS clocks. In the laboratory, this basic research is pushing our understanding of physical laws to new limits. In the marketplace, it will undoubtedly stimulate new technologies with surprising societal applications.


[Photo of purified DNA fluorescing orange under UV light]

Purified DNA, fluorescing orange under UV light, is extracted and used for molecular biology studies. Visualizing a single band of DNA in this way aids in its isolation and extraction. Source: Mike Mitchell


A KEY TO CANCER

Cancer has terrified patients and baffled medical scientists for a long time. Recently, however, a new level of understanding of this dread disease has begun to emerge. Scientific studies begun with widely different approaches are now unexpectedly converging to provide a picture of the molecular basis of at least some forms of cancer--including colon cancer and melanoma. The resulting insights are certain to have an important impact on the fight against cancer.

Much of this progress has come from untargeted basic research aimed at learning how the cells in all forms of life function. One group of researchers was studying the life cycle common to all cells. These scientists knew that their studies were critical to understanding a central life process that operates in all animal cells and hoped that their findings might in some way become relevant to understanding the processes that lead to human cancer. For a decade, they isolated and characterized a group of proteins that interact in complex ways to form the central growth-controlling machinery inside cells, termed the cell cycle clock. At the time, the implications of their work for human disease were totally unclear.

Two of the cell cycle proteins they studied, known only as p16 and p21, were first described in a scientific paper published in December 1993. Within just four months, another research team, working independently, found an important and totally unexpected link between the p16 protein and cancer. This team, which had set out to study hereditary melanoma (a deadly skin cancer), found that the gene that directs a cell to make p16 is often mutated (altered) in cancer cells, leading to its inactivation. The mutations can be seen not just in melanoma cells, but in cells of many other forms of human cancer as well. This indicates that p16 plays a critical role in the molecular processes controlling cell proliferation; when p16 is lost, the control of cell growth goes awry, leading to the runaway proliferation seen in cancer.

This discovery complemented a similar, earlier finding which showed that another cell cycle controller, a protein termed p53, also plays an important role in human cancer, being found in mutant form in about half of human tumors. So important was this early discovery that Science magazine named p53 as its 1993 "molecule of the year." Strikingly, insights into the cell's growth cycle, which have now become critically important for understanding human cancer, originated from studies of the life cycles of yeast, clam, sea urchin, and frog cells.

Understanding the molecular causes of cancer--the triggers that lead to disease--can lead to new weapons to fight its spread, including novel therapies, new types of drugs, and the use of gene therapy to correct defective versions of growth-controlling genes present within cancer cells. The "road map" sketched by those conducting fundamental research on the cell life cycle will be there to guide researchers who are now beginning to develop these and other innovative approaches to cancer treatment.

Yet another wholly unexpected convergence of unrelated lines of research occurred in 1993. Several groups of researchers were studying tumors from patients with a hereditary form of colon cancer. Their published papers describing certain DNA abnormalities in these colon cancers attracted the attention of other researchers who had seen similar abnormalities in the DNA of baker's yeast cells. The yeast cells showed defects in a cellular system--termed mismatch repair--that checks the yeast DNA for errors in the genetic text, enabling the cell to repair and hence erase the errors. The yeast cells carried several defective mismatch repair genes. When the human counterparts of these yeast genes were isolated, they were found to be the culprits responsible for hereditary colon cancer. The work on yeast not only showed precisely how such genes operate, but also led cancer researchers directly to otherwise elusive human genes, saving years of research time.

Finding these colon cancer genes will enable members of families at risk for the disease to take a genetic test indicating who among them should receive frequent presymptomatic screening for colon cancer. In addition, as is often the case with untargeted basic research, research on the mismatch repair system will have applications for understanding other diseases beyond colon cancer.


[Photo of C60 buckyball]

The perfectly round C60 "buckyball" cluster. Source: National Science Foundation


A NEW CHEMISTRY FOR CARBON

Until a few years ago, there were two known forms of pure carbon, graphite and diamond. Then an improbable-seeming third form of carbon was discovered: a hollow cluster of 60 carbon atoms shaped like a soccer ball. Buckminsterfullerene, or the "buckyball"--named for the American architect R. Buckminster Fuller, whose geodesic domes have a similar structure--is the roundest, most symmetrical large molecule known. It is exceedingly rugged and very stable, capable of surviving the temperature extremes of outer space.

At first, the molecule was a mystery wrapped in an enigma. But when a convenient way of making this molecule, also known as C60, was discovered, it set off an explosion of research among chemists, physicists, and materials scientists to uncover the molecule's secrets. Investigators soon discovered a whole family of related molecules, including C70, C84, and other "fullerenes"--clusters as small as C28 and as large as a postulated C240.

These unusual molecules turn out to have extraordinary chemical and physical properties. They react with elements from across the periodic table and with the chemical species known as free radicals--key to the polymerization processes widely used in industry--thus opening up the fullerenes to the manipulative magic of organic chemists. When a fullerene is "doped" by inserting just the right amount of potassium or cesium into empty spaces within the crystal, it becomes a superconductor--the best organic superconductor known. More important, because C60 is a relatively simple system, it may help physicists master the still-mysterious theory of high-temperature superconductivity.

Speculation and some hard work on potential applications began almost immediately after the discovery of buckyballs. Possible applications of interest to industry include optical devices; chemical sensors and chemical separation devices; production of diamonds and carbides as cutting tools or hardening agents; batteries and other electrochemical applications, including hydrogen storage media; drug delivery systems and other medical applications; polymers, such as new plastics; and catalysts.

Catalysts, in fact, appear to be a natural application for fullerenes, given their combination of rugged structure and high reactivity. Experiments suggest that fullerenes that incorporate alkali metals possess catalytic properties resembling those of platinum. The C60 molecule can also absorb large numbers of hydrogen atoms--almost one hydrogen for each carbon--without disrupting the buckyball structure. This property suggests that fullerenes may be a better storage medium for hydrogen than metal hydrides, the best current material, and hence possibly a key factor in the development of new batteries and even of non-polluting automobiles based on fuel cells. A thin layer of the C70 fullerene, when deposited on a silicon chip, seems to provide a vastly improved template for growing thin films of diamond.
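The storage claim is easy to check with back-of-envelope arithmetic; the metal-hydride comparison figure below is a typical literature value, added here purely for illustration.

```python
# Hydrogen capacity of a fully loaded buckyball (C60H60, i.e. nearly
# one hydrogen per carbon), expressed as a mass fraction.
m_C, m_H = 12.011, 1.008                           # atomic masses (u)
frac = 60 * m_H / (60 * m_C + 60 * m_H)
print(f"C60H60 is {frac:.1%} hydrogen by weight")  # about 7.7%
# A conventional metal hydride such as LaNi5H6 stores only about
# 1.4% hydrogen by weight, several times less.
```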

It is too early to make reliable forecasts of commercial potential, although the early indications are that buckyballs may represent a technological bonanza when their properties are fully understood. Yet it is important to note that the discovery of this curious molecule and its cousins was serendipitous, made in the course of fundamental experiments aimed at understanding how long-chain molecules are formed in outer space. It is a strong reminder that fundamental science is often the wellspring of advanced technology in ways that are completely unpredictable.


[Photo of optical fiber]

Hair-thin fibers of ultrapure glass are now transmitting voice, data, and video communications in many parts of the globe in the form of digital signals emitted by semiconductor lasers the size of a grain of salt. Source: PLG Group


ORIGINS OF THE INFORMATION SUPERHIGHWAY

Ten years ago, the information superhighway could not have been built. Many of the core technologies essential to the convergence of computing and communications--a conjunction at the heart of the information superhighway--were simply not ready. The discoveries that initiated or made these technologies possible go back even further--before anyone dared to dream of a world in which scientists could collaborate across continents, in which every school could be connected to the great libraries and museums, and in which ordinary citizens could tap a wealth of digital services and entertainment from their homes.

The true origins of the information superhighway, in fact, include fundamental research on the physics of surfaces in the late 1940s that led to transistors, obscure university work on microwave oscillators in the early 1950s that led to lasers, and a speculative suggestion in an academic journal in the mid-1960s that led to optical fibers. Such research, if proposed today, would be hard to distinguish from hundreds of similar basic research proposals. Yet it produced the seeds of a revolutionary technology that is likely to transform homes and workplaces alike.

Consider one thread in this complex story, that of optical fibers. The idea that laser light could be transmitted over long distances in a glass fiber--and hence used for communications--can be traced to a 1966 article in a scientific journal. The first fibers were relatively crude; they broke easily, and defects or impurities in the glass scattered or absorbed enough of the light signal that it couldn't travel very far. But basic research on the chemistry and thermodynamics of glass and on the scattering of light in liquids (glass can be thought of as a supercooled liquid) led to steady improvements--purer glasses that reduced losses, for example, and epoxy coatings that made the fibers more flexible and resistant to corrosion. In 1970, Corning Glass Works demonstrated a fiber that could carry a light signal a full kilometer while retaining about 1 percent of its power--a big advance at the time, but not good enough for commercial systems.

Today's fibers have losses 100-fold lower, about 0.2 decibels per kilometer, reduced almost to the theoretical limit, and the result has been an explosion of optical communications. Optical fibers now carry most U.S. long-distance telecommunications, and the total traffic over fibers is 1,000 times greater than a decade ago.
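Fiber loss compounds exponentially with distance, which is why engineers quote it in decibels per kilometer. A minimal sketch of the arithmetic, assuming the commonly cited figures of roughly 20 dB/km for the 1970 fiber and 0.2 dB/km for a modern one:

```python
def fraction_surviving(db_per_km, km):
    # Optical power decays exponentially with distance: every 10 dB of
    # total attenuation removes 90% of the remaining light.
    return 10 ** (-db_per_km * km / 10)

print(fraction_surviving(20.0, 1))    # 1970 fiber: ~1% of the light survives 1 km
print(fraction_surviving(0.2, 1))     # modern fiber: ~95% survives 1 km
print(fraction_surviving(0.2, 100))   # ...and ~1% still survives a full 100 km
```

That hundredfold improvement in attenuation is what stretches the usable span from about a kilometer to the 30-to-100-kilometer spacing between regenerating stations described below.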

But the fiber story is far from finished. Fundamental research into the properties of rare earth elements, such as erbium, has led to a new wave of developments that are transforming fibers from passive to active devices with even greater carrying capacity. When fibers are doped with erbium and powered by a semiconductor laser, they can amplify an optical signal. Spliced directly into a fiber cable, these fiber amplifiers will soon begin to replace the regenerating stations that now detect, amplify, and retransmit optical communications signals every 30 to 100 kilometers. Since the comparatively slow electronic components of regenerating stations are the principal bottlenecks in today's long-distance networks, this change to an all-optical technology will increase the capacity of long-distance communications systems by as much as 100-fold.

The process is a continuing one. Just as commercial deployment of the information superhighway is harvesting earlier investments in the creation of basic knowledge, so the technologies of tomorrow and the commercial competitiveness that goes with them will stem from the science of today.


[Computer-generated Landsat image]

This computer-generated color composite image, produced from data acquired by the Landsat 4 and 5 satellites, represents deforestation in the Brazilian Amazon region in 1978 (top) and 1988 (bottom). The deforestation represented in these figures is confined exclusively to the forest strata and has been averaged into 10-by-10 mile cells. Source: NASA


MONITORING THE EARTH

The increasing scale of human activities on the earth has brought with it increased risk of environmental damage on a global scale. Managing the earth in a responsible manner thus requires monitoring the atmosphere, the oceans, and critical terrestrial ecosystems, so that environmental degradation can be detected in time. Satellites, backed by aircraft and ground observations and by fundamental research on biogeophysical systems, are already playing a major role, and could play an even larger one in the future.

Satellite data helped to confirm the initial discovery of the Antarctic ozone hole and to show that degradation of the earth's protective ozone layer was a global phenomenon. Intensive field and laboratory research, coupled with aircraft and satellite data, soon demonstrated that the degradation was caused by human activities: the industrial chemicals known as chlorofluorocarbons (CFCs) break down in the stratosphere and release chlorine, which in turn catalyzes the destruction of ozone. This research led to the signing of the Montreal Protocol and its amendments, which committed nations to phase out production and use of CFCs. Subsequent satellite monitoring has shown continued declines in global ozone levels and the presence of high chlorine concentrations over the Arctic, possibly presaging the creation of an Arctic ozone hole as well. Such observations have led to accelerated deadlines for phasing out CFCs, with the result that global production and emissions of these chemicals are beginning to decline. What could have been a major global disaster, with sharply increased levels of ultraviolet radiation harmful to living creatures, is being averted.
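The catalytic step is a simple two-reaction cycle (standard stratospheric chemistry, spelled out here for clarity):

    Cl  + O3  ->  ClO + O2
    ClO + O   ->  Cl  + O2
    ----------------------
    net:  O3 + O  ->  2 O2

Because the chlorine atom emerges from the cycle unchanged, a single atom can destroy many thousands of ozone molecules before it is finally removed from the stratosphere.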

Pictures from space have brought global attention in recent years to another environmental problem--the destruction of tropical forests. Astronauts have described seeing the plumes of smoke from space, and nighttime infrared images have shown the Amazon region lit with hundreds of fires. The loss of tropical forests not only threatens the ecosystems that harbor the largest portion of the world's species, but also increases the risk of global warming.

Satellite observations not only call attention to environmental hazards, they also help to assess the extent and character of the problem accurately. Recently, for example, careful analysis of satellite imagery from the Amazon region showed that the total area of forest loss over the past decade was less than had been initially thought. However, it also showed that the pattern of clearing and burning had increased the fragmentation of the forest, making an area two-and-a-half times that actually cleared vulnerable to loss of species through disruption of ecosystems. The analysis technique is applicable to other tropical forests as well, and research has already begun to re-examine past satellite images covering other tropical forest regions.

As human populations and industrial activity increase, so will pressure on our environment. Both fundamental research to better understand earth systems and increased monitoring using advanced satellites will be important to detect degradation in time and thus to help preserve the earth for future generations.


[Photo of bacteria infecting a carrot cell]

Scanning electron microscopy reveals several Agrobacterium tumefaciens bacteria as they begin to infect a carrot cell. In the process, the bacteria's genetic material will enter the plant cell. Source: A. G. Matthysse, K. V. Holmes, R. H. G. Gurlitz


A VIRTUOUS INFECTION

Genetic engineering turned out to be relatively easy in animal or bacterial cells. Plants, however, initially appeared to be much harder. Fundamental research on a common soil bacterium and its interaction with the plants it infects unexpectedly showed the way. Now genes introduced into plants with the help of that bacterium are poised to bring huge economic and environmental benefits to U.S. agriculture.

The soil bacterium is called Agrobacterium. It infects nearly 10,000 species of plants, causing what is known as crown gall disease. The galls are tumorlike enlargements, and while the disease is harmful to some plants, it is not usually considered a major threat to crops and so never justified a targeted research effort. Nonetheless, it attracted the curiosity of agricultural scientists in universities and government laboratories as a possible model for cancer, and this research led to some fascinating discoveries.

What makes Agrobacterium unique is that when it infects a plant, it transfers a tiny bit of its genetic material, its DNA, into its host--the only bacterium known to do so. It is this foreign DNA that causes the galls or tumors. Further research uncovered the details and showed how to take advantage of this natural genetic engineering agent. The genetic material that causes the galls resides in a small ring, or plasmid, of bacterial DNA, a portion of which is incorporated into the plant cell's chromosomes. Scientists were able to snip out the tumor-inducing genes and replace them with genes of choice. Then, when the altered Agrobacterium infects a plant, the new genes are incorporated into the plant's genetic makeup.

Such controlled, virtuous infections have become the method of choice for genetic engineering of many important commercial crops. Genes inserted with this method have led to spoilage-resistant tomatoes, insect-resistant cotton, and a host of experimental varieties of soybeans, rapeseed, poplar trees, and roses. Fresh tomatoes constitute a $4 billion market in the United States; a spoilage-resistant variety that can be ripened longer on the vine for greater flavor is one of the first genetically engineered products in supermarkets. Cotton, the fifth largest U.S. crop, is also the largest user of pesticides, so insect-resistant cotton will save farmers money and reduce environmental risks.

Although Agrobacterium is no longer the only genetic engineering tool in the plant scientist's arsenal, the fundamental research that uncovered its secrets helped launch the agricultural biotechnology industry. Biotechnology is expected to help produce safer, more nutritious foods and other agricultural products, create crop cultivars needed to cope with changing climates and pathogens, and make feasible alternative farming techniques that can conserve or reclaim fragile soils.


[MRI cross-section of brain]

An MRI cross-section of a brain showing both hemispheres as well as the ventricles (openings). Source: National Institute of Neurological Disorders and Stroke

[Patient entering MRI machine]

Patient entering a magnetic resonance imaging (MRI) machine for medical diagnosis. Source: National Cancer Institute


SEEING INSIDE THE BODY

Over the ages, physicians have sought a means of seeing inside the human body without cutting it open. Fundamental discoveries in physics have given us first x-rays and then the more modern diagnostic methods of magnetic resonance imaging (MRI) and positron emission tomography (PET), contributing to remarkable advances in medical research.

The development of MRI is illustrative of the often complex path to major new technologies. It began as basic research in nuclear physics--in particular, the curious fact that the nuclei of most atoms behave as though they have a tiny magnet attached to them. Physicists soon learned that when they probed the properties of that magnet with a radio beam in the presence of a strong external magnetic field, they could identify which kind of atom it was. As the technique, known as nuclear magnetic resonance, improved, it became possible even to tell something about an atom's interactions with neighboring atoms. Chemists then developed the technique further as a powerful tool for analyzing the chemical structure of a material, including, eventually, biological tissues. This ability to probe the submicroscopic structure of matter--and hence to map the distribution of certain kinds of molecules in a sample or of cancer cells in a body--provided the scientific basis for MRI.
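The identification step rests on a simple proportionality: each kind of nucleus resonates at a frequency equal to its own characteristic (gyromagnetic) ratio times the strength of the applied field. A minimal illustration in Python (the 1.5-tesla field is a typical clinical value; the snippet is ours, not an instrument calculation):

```python
# Gyromagnetic ratios, in MHz per tesla, for a few common nuclei
GAMMA = {"1H": 42.577, "13C": 10.708, "31P": 17.235}

def larmor_mhz(nucleus, field_tesla):
    # Resonant frequency at which this nucleus absorbs the radio beam
    return GAMMA[nucleus] * field_tesla

for nucleus in GAMMA:
    print(f"{nucleus}: {larmor_mhz(nucleus, 1.5):.1f} MHz at 1.5 T")
# Hydrogen responds near 63.9 MHz, carbon-13 near 16.1 MHz: the frequency
# alone identifies which kind of atom is answering.
```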

Yet MRI also depends on a number of technologies that evolved separately but in parallel with the basic science, and it was the combination of these with the fundamental physics that made MRI possible. Once the idea emerged of using the nuclear magnetic resonance technique to create images, for example, a host of practical problems remained. For one thing, the technique was initially too slow for medical use. Modern electronics--especially computers-on-a-chip that could be built directly into practical instruments--helped speed it up. So did the mathematical technique known as tomography--synthesizing a composite image from many different "pictures." Superconducting magnets helped to make MRI instruments more compact and powerful.
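A toy version of the tomographic step fits in a few lines: project an image into one-dimensional profiles at many angles, then smear each profile back across the plane and sum. The sketch below is a deliberately simplified, unfiltered back-projection (real scanners add a sharpening filter), with the function names and test pattern our own:

```python
import numpy as np
from scipy.ndimage import rotate

def project(image, angles_deg):
    # One 1-D "picture" (projection profile) of the image per viewing angle
    return [rotate(image, a, reshape=False, order=1).sum(axis=0) for a in angles_deg]

def back_project(profiles, angles_deg):
    # Smear each profile back across the plane at its angle and accumulate
    n = len(profiles[0])
    recon = np.zeros((n, n))
    for profile, a in zip(profiles, angles_deg):
        recon += rotate(np.tile(profile, (n, 1)), -a, reshape=False, order=1)
    return recon / len(profiles)

# Toy demonstration: recover a bright square from 60 angular views
phantom = np.zeros((64, 64)); phantom[24:40, 24:40] = 1.0
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
recon = back_project(project(phantom, angles), angles)
```

Each individual profile is ambiguous, but sixty of them taken around the half-circle pin down the square's position and shape.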

The result is a remarkable medical diagnostic tool. MRI gives the most precise picture now available of what is happening inside the body and does so noninvasively and safely. Yet it is most unlikely that MRI could have emerged from a targeted effort to design a better imaging technique--who would have thought to begin by measuring the strengths of magnets within atomic nuclei? For MRI, as for many other important technologies, just such fundamental explorations of nature produced the knowledge that enabled a vision of a life-saving imaging technique.


[Pairs of silicon atoms]

Pairs of silicon atoms bonded together in a barbell configuration exist in up or down states. As this supercomputer simulation shows, their configuration can be changed by a thin metallic tip like those used in scanning tunneling microscopy. Source: K. Cho, J. Joannopoulos

[Oxygen in a silicon crystal]

A single oxygen atom is made visible as a blue sphere in this supercomputer-generated image of electronic charge inside a silicon crystal. The oxygen atom straddles one of the silicon bonds in the bonding lattice, represented by a warm-colored honeycomb. Source: IBM Yorktown Heights and MIT


SIMULATING REALITY

The computing revolution is dramatically transforming virtually every aspect of our society--our work, our play, even our national security. This revolution started with the discovery of the transistor, the result of fundamental research in solid state physics and the earlier development of quantum theory. The next stage, development of complex microchips incorporating many transistors, drew from fundamental work in physics, chemistry, and materials science. Now applications such as smart military weapons, consumer services such as movies on demand, and secure electronic funds transfer are incorporating new discoveries in mathematics, engineering, and computer science.

One important frontier of the computing revolution is found in today's powerful supercomputers, which have the ability to perform hundreds or thousands of calculations simultaneously (so-called massively parallel computers). Within a few years, this field is expected to cross an important threshold, when the fastest computers will be capable of a thousand billion floating-point operations per second (a teraflop). Petaflop computers, capable of a million billion operations per second, may follow only a few years later.

These advances in computing technology draw heavily on fundamental science. But science and technology are closely intertwined: the technology is also driving forward the frontiers of science--ushering in new fields of research and extending the limits of inquiry in virtually all fields--which will in turn enable new technology. For example, for the first time scientists can now begin to simulate such complex physical and biological systems as the earth's climate, the atomic structure of novel materials, and the molecular structure of living cells. Applications of this new computationally driven science will include improved microelectronic devices and rational drug design.

Computational studies of silicon--the semiconductor material on which most modern computing is based--illustrate the trend. Researchers are now beginning to simulate silicon-based materials with supercomputers, allowing them to perform "theoretical experiments" and, using new techniques for visualizing atomic-scale structure, to "see" the results. Physicists, for example, can now use supercomputers to understand how oxygen impurities influence and degrade the electrical properties of silicon wafers--a problem that has plagued semiconductor manufacturers for years. In a simulation, a researcher can introduce an oxygen atom into a silicon lattice and watch how it throws the local electrons into a tizzy--something no microscope can observe. The same approach can be used to study another important problem, the migration and diffusion of impurities within a silicon crystal. Insights from such simulations could lead to improved manufacturing processes.
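To give a flavor of what such a "theoretical experiment" involves, here is a deliberately tiny classical sketch (the real work uses quantum-mechanical methods far beyond this): it relaxes a small two-dimensional lattice around one oversized impurity atom and reports how far the distortion reaches. The lattice size, pair potential, and impurity parameters are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

N = 8
pts = np.array([(i, j) for i in range(N) for j in range(N)], float)
impurity = N * (N // 2) + (N // 2)               # a site near the middle
r0 = np.ones(len(pts)); r0[impurity] = 1.3       # impurity prefers 30% longer bonds

d0 = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
i, j = np.nonzero((d0 > 0) & (d0 < 1.5))         # nearest and diagonal neighbours
rest = d0[i, j] * (r0[i] + r0[j]) / 2            # preferred length of each bond

def energy(flat):
    p = flat.reshape(-1, 2)
    d = np.linalg.norm(p[i] - p[j], axis=1)
    x = rest / d
    return np.sum(x**12 - 2 * x**6) / 2          # Lennard-Jones-style bond energy

relaxed = minimize(energy, pts.ravel(), method="L-BFGS-B").x.reshape(-1, 2)
shift = np.linalg.norm(relaxed - pts, axis=1)
shift[impurity] = 0.0                            # look at the host atoms only
print(f"largest host-atom displacement: {shift.max():.3f} lattice units")
```

Even this toy captures the qualitative point: the impurity pushes its neighbors outward, and the strain field it creates extends well beyond the defect itself.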

Exciting results are also emerging from studies of the surface of silicon crystals. The outermost atomic layer seems to consist of pairs of atoms, bonded together in a "barbell" configuration. Theoretical experiments indicate that each barbell can exist in one of two states--up or down. This suggests the possibility of storing bits of data on an atomic scale--many thousands of times more compactly than in present computer memories. Other simulations show that a thin metallic tip, similar to those used in Scanning Tunneling Microscopy, can in principle establish the required orientation of the surface atoms. Thus the supercomputer simulations may lead to the development of revolutionary new information storage technologies.

The synergy between science and technology is crucial for developing the next generation of new technologies. Present computer designs will reach limits dictated by the laws of physics. Can faster, smarter machines be built to model the human brain? Can biological components be built into computer chips? What about using individual molecules as switches a thousand times faster than microelectronic devices? These are the kinds of breakthrough technologies realizable only through fundamental research--research that is itself supported by advanced technology.


[Photo of special indicator housings]

Three types of specially designed indicator housings. The large indicator products were designed as replacements for older incandescent lamps, and last 50 to 100 times longer. This type of indicator is now used in many applications, including mass transit, heavy equipment, and marker lights on trucks. Source: DiaLight Corporation


PLASTICS THAT GLOW

Ever since the discovery of nylon in the first half of this century, plastics have steadily found more and more uses. Now they are poised to invade one of the hottest areas of electronics--light-emitting diodes. That could create lucrative new markets for such things as computer screens, advertising displays, and even wall-sized video screens.

Intriguingly, however, these advances have not come from the electronics industry, as innovative as it is. Instead they come from fundamental university research into the properties of polymers--the long-chain organic molecules formed of repeated units that are the basis of most plastics. This long-chain structure gives polymer plastics the flexibility that makes them so valuable in dozens of applications--from stretch tights to bulletproof vests to kitchenware that doesn't shatter when dropped.

Most polymers don't conduct electricity. But in the late 1970s, researchers at the University of Pennsylvania discovered a plastic that, when "doped" with small amounts of impurities, could conduct. The technique is analogous to the doping of semiconductors that makes transistors possible. Further research led to other conducting polymers, but none found so far conducts well enough to replace metals. In exploring the basic properties of these materials, however, researchers did find something else--plastics that emit light when an electric current passes through them.

The discovery came from investigations of luminescence--the property that makes a luminous watch dial glow after it has been exposed to light--in conducting polymers. Studies of the electronic properties of these materials suggested the possibility of triggering such glows electrically, just as in the light-emitting diodes (LEDs) that form the little red on/off lights on many electronic appliances. Not only was it possible, but scientists rapidly found plastics that would emit in virtually all colors--from red to yellow to blue. Further research has improved efficiencies, boosting the first feeble glows into a source of light potentially bright enough to power a display screen. The key remaining problem is that plastic LEDs don't last long enough for most applications, but researchers are optimistic that this problem, too, will be solved.

Light-emitting plastics have become an area of intense scientific and commercial interest. Scientific articles on the subject were among the most-cited in physics in 1993. Two startup companies have already been formed, and larger electronic companies are paying close attention.

The appeal is that light-emitting plastics are thin and flexible and can even be bent around corners--it is possible to imagine low-level lighting fixtures that would fit any location, or even clothing that would glow. Plastic LEDs could also be made in large sheets, potentially as wall-size, flat display screens--long a key commercial target in consumer electronics. Such potential applications could not have been anticipated in the initial investigations into conducting polymers, but such surprises are not an uncommon outcome of fundamental research.


THE HUMAN DIMENSION

Every day, people in many diverse occupations make decisions that have potentially large consequences for human health and safety. Is this plane safe to fly? Given this patient profile, should a physician operate or recommend more conservative therapy? Will this weather pattern develop into a tornado? The answers to these questions have high stakes, and the process of arriving at a decision is often highly probabilistic, involving analysis of ambiguous and often conflicting information.

Researchers in the area of signal detection theory study how people, animals, and machines distinguish meaningful or important information from the background "noise" in their environment. From these studies, behavioral scientists are developing methods to help people make better decisions.

The results are known as decision aids: often computer programs that rely on a systematic, standard method that establishes a decision threshold and maximizes accuracy. A decision threshold is the level of evidence deemed necessary to make a decision in a specific situation--for example, at what probability of malignancy does one diagnose breast cancer? A fine balance must be struck between a lax threshold, which creates many false positives (and their attendant emotional stress and unnecessary surgery), and a strict threshold, which misses some positive cases and thereby jeopardizes lives. The decision maker must weigh whether false alarms or undiagnosed conditions are more costly and then adjust the threshold accordingly. Accuracy can be improved by enhancing the quality of the available evidence through basic research and by developing better diagnostic tests and instruments. Thus, the methods of behavioral science go hand in hand with those of the physical and biological sciences.
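A minimal sketch of how such a threshold is set, using the classic equal-variance Gaussian model from signal detection theory. Every number below (prevalence, costs, distribution parameters) is an illustrative assumption, not a clinical recommendation:

```python
import numpy as np
from scipy.stats import norm

# Toy setup: each case yields one "evidence" score; scores from healthy
# cases follow N(0, 1) and scores from disease cases follow N(1.5, 1).
mu_noise, mu_signal, sigma = 0.0, 1.5, 1.0
p_disease = 0.01            # assumed prevalence
cost_false_alarm = 1.0      # relative cost of unnecessary follow-up surgery
cost_miss = 50.0            # relative cost of an undetected cancer

# Expected-cost-minimizing rule: call "disease" when the likelihood ratio
# exceeds beta...
beta = (1 - p_disease) * cost_false_alarm / (p_disease * cost_miss)
# ...which, for equal-variance Gaussians, reduces to a simple cutoff
# on the evidence axis itself.
threshold = (mu_signal + mu_noise) / 2 + sigma**2 * np.log(beta) / (mu_signal - mu_noise)

hit_rate = 1 - norm.cdf(threshold, mu_signal, sigma)
false_alarm_rate = 1 - norm.cdf(threshold, mu_noise, sigma)
print(f"decide 'disease' above {threshold:.2f}: "
      f"hits {hit_rate:.0%}, false alarms {false_alarm_rate:.0%}")
```

Raising cost_miss pushes the threshold down (more false alarms, fewer missed cases); raising cost_false_alarm pushes it up, exactly the balance described above.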

The practical applications of decision aids are numerous. Already used experimentally in breast cancer diagnosis, HIV testing, weather forecasting, prostate cancer staging, and testing airplane wings for metal fatigue, the technique promises improved accuracy and, with it, improved public health and safety.


[Disk with Black Hole at center]

A NASA Hubble Space Telescope image of a spiral-shaped disk of hot gas in the core of the active galaxy M87. Hubble measurements of the disk's rapid rotation show that it must contain a massive black hole at its hub. Source: NASA


BRINGING THE UNIVERSE INTO FOCUS

The razor-sharp images from the repaired Hubble Space Telescope bring the incredible splendor of distant heavenly objects into human view for the first time. Recently, for example, scientists used the Hubble to gather evidence that a massive black hole really exists at the center of a neighboring galaxy. Confronting--and even solving--some of the deepest mysteries of the universe may have no obvious practical payoff, but it nonetheless appeals to the deep-seated human desire for knowledge of how we fit into the cosmos. Story Musgrave, the payload commander for the recent Hubble servicing mission, put it this way: "I have thought of that instrument as contributing to my personal ideas about what my place in the universe is, what it is to be human."

The Hubble telescope revealed a pancake-shaped disk of hot gas at the center of the giant galaxy M87. So sharp was the image that astronomers could see from the pattern of movement that the spinning disk of gas was being sucked down into and swallowed by something at the center. Measurements with the telescope determined the speed of the gas--an incredible 1.2 million miles per hour--allowing astronomers to calculate the mass of the central object. That mass is equal to 2 to 3 billion suns, so much for so small a region that the object could only be a black hole--a collapsed condition of matter whose gravitational pull is so strong that not even light can escape. Black holes are predicted by Einstein's theory of general relativity and have been suspected on the basis of other astronomical evidence, but they are such a strange and novel phenomenon that some scientists have remained skeptical. Now the new Hubble observations have provided proof--"the smoking gun," as one astronomer put it.
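The underlying estimate is Newtonian: a central mass M holds gas in orbit at speed v and radius r when v-squared equals GM/r. A back-of-envelope version in Python (the roughly 60-light-year radius is the published figure for the M87 disk, assumed here; this simple form lands within a factor of two of the quoted mass, with corrections such as the disk's tilt along our line of sight making up the difference):

```python
G = 6.674e-11                 # gravitational constant (SI units)
M_SUN = 1.989e30              # mass of the sun, kg

v = 1.2e6 * 0.44704           # 1.2 million miles per hour, in m/s
r = 60 * 9.461e15             # assumed orbital radius of ~60 light-years, in m

M = v**2 * r / G              # mass required to keep the gas in orbit
print(f"about {M / M_SUN / 1e9:.1f} billion solar masses")   # ~1.2 billion
```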

In addition to such dramatic discoveries, the Hubble is also being used to study the size and age of the universe, the evolution of galaxies from minute fluctuations in the early cosmos, and the details of star birth and star death. There are still plenty of mysteries to unravel about the vastness of space surrounding our planetary home--mysteries that are now in better focus.

The refurbished Hubble Space Telescope is a triumph of technology and human ingenuity, captured in the public mind by the sight of NASA's astronauts unfurling Hubble's new solar panels in space, against the distant backdrop of our home planet. The Hubble servicing mission demonstrated what can happen when scientists and engineers join together to solve a difficult problem. The flawed mirror was discovered soon after Hubble's 1990 launch. A team immediately gathered to examine dozens of possible "fixes." The resulting corrective device, an optical jukebox called COSTAR, was devised and built in only 26 months. Thanks to the remarkable skill of the astronauts who installed it and otherwise upgraded the space telescope, the result placed innovative technology in the service of humanity's vision and age-old quest for knowledge about our environment and our place in it.