Appendix A

Key Technologies for the Biological Sciences

Prior to the meeting, workshop participants were asked to identify and comment on one to three emerging technologies or areas of technology development that are likely to have significant impact on biology. The participants also were asked to identify non-financial issues that would pose bottlenecks in development. Summaries of those presentations follow.

Ken Barton
Agracetus Inc.

1. Transgenic Plants as Production Vessels or Bioreactors for Non-native Biomolecules. This represents the most abundant and economical source for high-volume biologicals. It has been limited in the past to materials inherent in natural or bred germplasm. These materials have included plant proteins, carbohydrates, plant oils, and various secondary metabolites.

There are characteristics of this emerging technology that should be considered as plans are made to fill in the underlying research base. For example, the timeframes in working with transgenic plants are long: it is optimally 4-6 months from the time an experiment is initiated until the first transgenic plant sets seed. This long cycle time inherently slows research progress and, because of the time value of money, limits the industrial funding available for discovery research in this area.

The development of this technology must be multidisciplinary. There must be a close interaction between biologists working with the transgenic plant systems and scientists working on the end-use, whether it be pharmaceuticals, industrial enzymes, or polymers.

Helen Berman
Biological Sciences Directorate
Advisory Committee

Dr. Berman looked at this subject from her vantage point in biological crystallography. She recommended exploiting the methodologies of computer science, mathematics, and data-oriented biology. We need the skills of statisticians to manipulate data across sets using computer technology. The expertise is all there. Education is a practical solution. In the undergraduate years there is a need to make all students aware of all disciplines. We have to get students to think more critically. In graduate training we have to allow students both to go deeply into a particular area and to go across fields. It is a sociological issue: solo versus group research; night versus day; pencil versus computer; bench versus desk; arrogance versus tolerance.

Joseph A. Berry
Carnegie Institution of Washington

1. Monitoring and Modeling. Dr. Berry used the example of atmospheric modeling combined with biological models to calculate what is going on in the atmosphere. This is a slow dynamic system, but the changes seem slow relative to our attention span. There are changes in composition as well as in the weather. The concept of closed systems was discussed.

2. Measuring Fluxes on the Landscape/Global Scale. Closed systems can be analyzed, but there is a huge gap between atmospheric closed systems and biological closed systems. Regarding emerging technologies, we need to assess quantitatively the accuracy of the models being developed. Also important are measurements of correlation and rapid time response.

Gene Block
Center for Biological Timing
University of Virginia

1. Embedded Circuit Technologies - Biosensors and Microfabricated Sensors. Specifically, the tissue/silicon interface technology for:

2. Further Development of Light-producing Reporter Genes. Reporter genes such as green fluorescent protein (GFP) and luciferase offer the opportunity to record gene activity from living cells, tissues, and even behaving animals. The continuing development of different wavelengths of these reporters offers hope of multiplexed recording of several genes simultaneously. This technique will provide great experimental leverage in studying dynamic biological systems.

Steven Briggs
Pioneer Hi-Bred International

1. Finding Function from Sequence in Plants. Whole-organism biology, focusing on the roles of specific genes and on the specific structural or enzymatic activities of proteins.

2. Finding Sequence from Function.

George Bruening
Center for Engineering Plants for Resistance Against Pathogens
University of California, Davis

1. Animals and plants as factories for the production of compounds for research and commerce. Animals, plants, and derived cell cultures can be modified for the production of proteins, nucleic acids, and small molecules either by genetic transformation or through the use of transient expression systems in which the germ line of the animal or plant is not modified.

The production of specific proteins is well established in greenhouse and field tests of transformed plants; demonstrations in animal systems have been fewer. The milk of ruminants and the seeds and fruits of plants are already well-established articles of commerce and provide good starting materials for purification of products of interest. However, expression of proteins and other products in other tissues has also proved to be useful.

Transient expression systems usually are based on animal or plant viruses. The high titers of viruses allow the corresponding transient expression vectors to mediate high-level production of proteins. For cells in culture, expression from alphavirus vectors has generated proteins from cloned sequences at 25% of the cell protein. Packaged retrovirus vectors have been introduced into the teat canals of sheep and cows; when the animals were induced to lactate, they produced proteins of interest in the milk. In another example, tobacco plants inoculated with genetically engineered tobacco mosaic virus have generated trichosanthin, a potential anti-HIV drug, at 2% of the cell protein. At least two plant viruses have been engineered to display epitopes from animal viruses on the surface of their coat protein, yielding potential veterinary vaccines.

Advantages of producing proteins, or the products of enzyme action, by transient expression systems are a high success rate, ease of modification of the producer organism, high-level production, and the ability to produce materials of a type or in amounts that may be incompatible with one or more phases of the plant or animal life cycle in a transgenic organism. Compared to injection of plasmids into animal embryos that subsequently are implanted, transient expression systems appear to be more than 100 times as efficient in terms of the number of producer animals obtained.

Transient expression systems also have been applied as a new form of "live virus" vaccine. A vaccinia virus vector expressing a rabies virus epitope has been used to inoculate wild foxes on a mass scale in Western Europe. This project has drastically reduced the number of people in the region who were required to undergo post-exposure immunization as a treatment for rabies.

Barriers to implementation:

2. Selection in vitro and in vivo of active members from combinatorial and other libraries of molecules (e.g., peptides, steroids). In vitro selection of active members from combinatorial libraries of molecules is an established technology in fundamental research and drug discovery. Usually selection is for binding to an immobilized, biologically relevant target. However, some functional selections in vitro also have been developed. A logical future development would extend selection to libraries of natural compounds, for example from tropical forest plants or sea animals.

Approaches to selection of active members from libraries expressed in vivo are likely to provide greatly expanded opportunities for selection on the basis of function, particularly function under biologically relevant conditions. In vivo selections require development of systems in which the biologically relevant reaction can be coupled to cell death or cessation of growth, to provide the necessary selection from large libraries. Transformation and culture of very large numbers of cells is another required technology.

For the described technologies, a consortium approach involving universities and biotechnology and drug companies can be imagined for carrying out large-scale screens. Presumably, partners in such a consortium will define in advance the libraries in which they have an interest. A consortium approach can be expected to increase efficiency, to spread the cost, and to create necessary critical mass. The sheer size of the U.S. research enterprise, both private and public, may give the U.S. an international advantage in developing and applying the technologies envisioned here.

Barriers to implementation:

Robert Cotter
Middle Atlantic Mass Spectrometry Lab
Johns Hopkins University School of Medicine

1. Cheaper Mass Spectrometer.

2. Array Sequencing/Multiplex Methods.

3. Need to Facilitate Technology Transfer.

We must develop improved methods for determining molecular structure while improving our technologies. We should integrate methods. There is a need for massively parallel DNA sequencing - not just to sequence the human genome, and not just to build up the databases, but to sequence everyone's genome. Mass spectrometers have been suggested for parallel sequencing. The technology is limited in achievable resolution and in the size of molecules it can handle, but it is nonetheless promising.

Ruth Schwartz Cowan
Department of History
State University of New York at Stony Brook

1. More scholarship on 20th-century technology developments. From a historian's perspective, there is little literature available to people who study the history of science and technology in the 20th century - especially after 1945.

2. Break down boundaries between sectors and disciplines. Dr. Cowan surveyed some of the 20th-century literature, especially regarding the development of electron microscopy and ultrasonography. She concludes that, to obtain the fastest possible development of a technology, the most important thing is the existence of permeable boundaries between government, academia, and industry; and also those between engineering and science as well as between scientific disciplines. This is why so many important technology development efforts succeeded during World War II. Boundaries can come down when people stop being arrogant about how they are trained and the languages they use; it is generally under stressful social conditions, such as wartime, that these boundaries can become permeable.

3. Critical role of undergraduate education. As an historian, Dr. Cowan's ability to understand what transpired during the workshop came from her undergraduate education, which included basic (not specialized) courses in science at the undergraduate level. Unless people on the cutting edge of research understand the need to drop their aversion to involvement at the undergraduate and K-12 educational levels, we will continually thwart the potential for interdisciplinary progress.

Jim Eberwine
Department of Pharmacology
University of Pennsylvania

1. Single Cell Differences

2. Gene Expression In-Situ

3. I.D. Character of Stem Cells

4. Protein Biosensors

We need to understand how neurons interact and talk to each other, and then to take this understanding down to the single-cell level. An area in which we need to make more progress is in determining multiple mRNA and protein levels and how they change in response to activity in the central nervous system. We need to develop techniques for looking at multiple proteins, whether functional or non-functional, in a detectable manner. How can multiple changes affect the system itself? Here we need a paradigm shift in the thinking of biologists so that the importance of coordinate changes in the expression of multiple genes is recognized as the underlying cause of cellular physiology. The information we receive from single-cell research is enormous and is critical to understanding how neural networks and systems develop and are regulated. Some type of analysis system must exist in order to put these changes into a biological perspective. This type of analysis is generally applicable to all tissues and complex systems.

Steven Fodor

1. Integration of Phenotype/Genotype. There must be general access to genetic information as it relates to technology development, definition of phenotype-to-genotype relationships, and information management. On large-scale efforts such as the Human Genome effort, large amounts of information are required, with a need to screen and analyze this information. This will require automating some conventional technologies, generating further development, implementing evolutionary improvements in the field, and sponsoring fundamental new approaches.

2. Databases are Key. Development of new database algorithms is needed - in many of these new fields the current vision is to multiplex, so there will be large numbers of assays run in high volume. There is a need for swift computational and interpretive capacity.

3. Integration of Technologies. There is a basic training component - i.e., people need to be familiar with the fields of chemistry, physics, biology, materials science, etc. A technical program should have structure and focus in order to integrate these disciplines. Technology should not be developed in a vacuum. It should be done in an integrated manner particular to the application.

George Garrity
Merck & Company

Natural Products Discovery (Bioprospecting) - The process of searching for pharmacologically active agents from natural sources (e.g., plants, fungi, bacteria, lower vertebrates) is a highly evolved technology rather than an emerging one. However, success in this arena is dependent upon the rapid incorporation of new technologies into the collective search strategy. The guiding principle of contemporary screening programs is that chemical diversity is a function of biological diversity. Although all of the areas detailed in this report will have some impact on the drug discovery process, there are three areas that are likely to have a major effect on natural products screening in the future.

1. Integration of systematics and ecology - There is a renewed interest in biodiversity, propelled by the general belief that human activities are driving untold numbers of species into extinction. There is also a consensus that our knowledge of what those species are, their interactions, their roles in nature, and their beneficial properties is severely limited. There are a number of major initiatives underway to address these problems, funded by various governmental and non-governmental agencies, including the NSF. However, species concepts are inconsistent among the various disciplines in biology, and ecological survey techniques differ considerably. The recent development and widespread application of phylogenetic analysis of highly conserved regions of the genome (e.g., 16S/18S rRNA) provides a rational framework for placing all organisms into a meaningful perspective. Application of this method in ecological surveys indicates that higher levels of diversity exist in most habitats than was previously expected, especially among prokaryotes. However, phylogenetic analysis alone does not accurately portray the true level of diversity that resides in the unique regions of the genome. As such, it must be used in conjunction with other taxonomic techniques (polyphasic taxonomy) to obtain a more complete picture of the true diversity of biological systems. Establishing the relationships among various taxonomic methods is likely to gain in importance to both systematists and the end-users of their taxonomies.
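The comparison step behind such rRNA-based surveys can be sketched briefly: compute pairwise p-distances over aligned sequences, then cluster. The toy aligned fragments and taxon names below are invented for illustration, and average-linkage (UPGMA-style) clustering stands in for the more sophisticated phylogenetic methods actually used in practice.

```python
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage

# Invented aligned rRNA fragments (hypothetical taxa, for illustration only).
seqs = {
    "taxon_A": "ACGTACGTACGTACGT",
    "taxon_B": "ACGTACGTACGAACGT",   # one site differs from taxon_A
    "taxon_C": "ACGAACCTACGAACTT",   # more divergent
}
names = list(seqs)

def p_distance(a, b):
    """Fraction of aligned positions at which two sequences differ."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Build the symmetric pairwise-distance matrix.
n = len(names)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        dist[i, j] = dist[j, i] = p_distance(seqs[names[i]], seqs[names[j]])

# Average-linkage clustering of the condensed distance matrix gives a
# rough dendrogram of relatedness.
tree = linkage(squareform(dist), method="average")
```

Here taxon_A and taxon_B differ at 1 of 16 sites (p-distance 0.0625), so they cluster together before joining taxon_C.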

2. Effective utilization of biodiversity - An overriding justification for most of the proposed biodiversity inventories is that at least some of the "newly discovered" species will have beneficial properties, with a heavy emphasis on pharmaceuticals. However, there are several barriers to the effective utilization, in large-scale screening programs, of the organisms that will be derived from such studies. Knowledge of the basic physiology of an organism is critical because the production of secondary metabolites is often transient, tied to a specific phase in the life cycle or to a response to an external stimulus. Expertise in physiology, especially of microorganisms, is rapidly declining in the United States. Detection of bioactive metabolites is a function of concentration, and concentration is unknown at the time samples are assayed; therefore, screening programs rely upon high-throughput assays that can be automated in order to detect the rare over-producers. Current approaches to automation will become rate-limiting as more samples become available for screening. More efficient strategies (e.g., bioassay arrays on a single silicon wafer) that would permit testing of a single sample in multiple assays, in parallel, are needed. Such a strategy could be used at the discovery stage or the sample-characterization stage and could serve as the foundation of a more robust screening model that overcomes many of the limitations of the mode-of-action paradigm in general use in drug discovery programs.

3. Bioinformatics - With the exponential gains in computational performance achieved with each new generation of hardware, our ability to accumulate, manipulate, and visualize data will be effectively unbounded in the future. Direct access to distributed databases via high-speed, high-bandwidth networks will be essential. Journals and even libraries will be replaced with electronic versions that support full-text searching as well as high-quality graphics. Intelligent agents will become vital to scanning and retrieving documents and supporting data. Bioinformatics will play a pivotal role in building and maintaining large, enterprise-wide systems that support research activities. But bioinformatics must evolve into a bona fide area of research. A rate-limiting step in systematics, ecology, and drug discovery is our ability to interpret large volumes of multivariate data. Traditional methods of classification can and must be applied in a prospective fashion to uncover novel species, novel patterns of biological activity and, ultimately, novel pharmacological agents. However, more efficient algorithms are needed that either restrict the search space or permit the efficient manipulation of data matrices that grow exponentially. Visualization techniques for exploring multivariate data are also lagging. Techniques for mapping high-dimensional, multivariate data into two- or three-dimensional space, in a meaningful way, are urgently needed.
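Principal component analysis is one long-established technique of the kind this passage calls for: it maps high-dimensional, multivariate data into two or three dimensions while preserving as much variance as possible. The sketch below uses synthetic data and a plain singular value decomposition; the matrix sizes are arbitrary.

```python
import numpy as np

# Synthetic assay matrix: 50 samples measured on 20 variables.
rng = np.random.default_rng(0)
data = rng.normal(size=(50, 20))
data[:, 0] *= 10.0                   # give one variable dominant variance

# Center each variable, then take the SVD of the centered matrix;
# the rows of vt are the principal axes, ordered by variance explained.
centered = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)

# Project every sample onto the first two principal components:
# each 20-dimensional sample becomes a 2-D point suitable for plotting.
coords_2d = centered @ vt[:2].T

# Fraction of total variance captured by each component.
explained = (s ** 2) / (s ** 2).sum()
```

Because one variable dominates the variance, the first component captures most of the structure, which is exactly what such a 2-D map is meant to reveal.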

Leroy Hood
Center for Molecular Biotechnology
University of Washington

1. Computational Biology - Complex Systems. Regarding computational biology, one thing that has changed is the ability to decipher and now to manipulate information. Much more challenging is the question: What is the biological information that is intrinsic to molecular structure? Applied computation is needed to bring its tools to bear on deciphering these languages. Some languages are easy to decipher; the challenge, however, is the recruitment and education of computer scientists and software engineers who can build the framework. There is also a need to educate biologists about what is going on. This problem can be solved through people - i.e., recruitment and education. Computational biology is the most fundamental tool we have in biology.

2. Microfabrication/Nanofabrication - Small Machines. In this area, genomics and DNA sequencing require machines that are functional and disposable. Working with engineers to increase their understanding of biology, learning their language and vice versa, will help. There is a need for facilities that will allow these things to happen naturally.

3. Protein Chemistry - Integration of Protein Function. We need to look at the protein itself and figure out how it works. The primary model organism is yeast. We can devise techniques to look at the interactions, and we can develop highly miniaturized systems that allow us to look at these proteins.

Pete Magee
Biological Sciences Directorate
Advisory Committee

1. Dynamic Studies of Single Molecules. We need to look at a single molecule in a dynamic way. (This area is structural biology-related.) We need energy transfer probes - a single molecule attached to a protein tube. This would tell us a lot about protein folding during biological reactions and the dynamics of transcription and translation in the machinery itself, where the reactions are taking place.

2. Detecting and Recognizing Unculturable Organisms. There are a lot of emerging infectious diseases with questionable etiology; this is an important medical problem. We may have to develop probes that will tell us not only that microbial organisms are there, but what kinds of organisms and what kinds of genes are there.

Barry Marrs
Industrial BioCatalysis

The fundamental barrier in microbiology today is the need to culture organisms in order to learn about them. There are three emerging technologies that address this barrier:

1. Cloning and Expressing Genes. Cloning and expression of genes, isolated from uncultured or difficult-to-culture organisms, enable enzymes or whole pathways of enzymes to be moved into "domesticated" hosts in which they may be studied or produced in volume.

2. Capillary Zone Electrophoresis. Developed by Dr. Richard Ebersole at DuPont Central Research and Development, this technology achieves the separation of living, uncultured microorganisms from environmental samples. Samples can then be studied directly or used as starter cultures from which "weed" species have been removed.

3. Optical Tweezers. "Optical tweezers" describes a micromanipulation technology in which a laser beam is used to transport individual microbial cells, during microscopic observation, into a capillary or similar container. This enables culturing of the isolate in the absence of "weeds".

Also, how can the cross-over from physics to biology be facilitated?

John Pierce
Dupont Stein Research Center

1. Metabolic Engineering

2. Intercellular Probes

3. Transgenesis in Plants

4. Cell Walls/Membranes

Robert Robbins
U.S. Department of Energy

1. Information Technology

2. Computational Power

3. Information Technology - Linked databases

David Schimel
National Center for Atmospheric Research

1. Land/Air Fluxes of Specific Molecules. The exchange of biological compounds between soils, vegetation, and the atmosphere is an emerging field relevant both to the biological effects of changing atmospheric composition (including local-regional air pollution and global change) and to the effects of the biota on the atmosphere. Recently, technology for measuring the physical fluxes of specific molecules using aerodynamic techniques has been developed in micrometeorology and biometeorology. There are currently several issues regarding the injection of this technology into biology: the lack of commercial instruments and software for key aspects of the micrometeorology, the lack of appropriate field sensors for many molecules and isotopes of interest to biologists and atmospheric chemists, and many environmental biologists' unfamiliarity with the underlying principles.
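One widely used aerodynamic technique of this kind is eddy covariance, in which the vertical flux of a gas is estimated as the time-averaged covariance of fluctuations in vertical wind speed (w') and gas concentration (c'). The sketch below uses synthetic numbers purely to illustrate the computation, not any real measurement.

```python
import numpy as np

# Synthetic fast-response time series: e.g., 10 Hz sampling for 1 hour.
rng = np.random.default_rng(1)
n = 36000
w = rng.normal(0.0, 0.3, n)                     # vertical wind speed, m/s
# Concentration weakly correlated with updrafts (an upward flux), plus noise.
c = 400.0 + 0.5 * w + rng.normal(0.0, 0.1, n)

# Eddy covariance: remove the means, then average the product of the
# fluctuations.  A positive value indicates net upward transport.
w_prime = w - w.mean()
c_prime = c - c.mean()
flux = np.mean(w_prime * c_prime)               # concentration units * m/s
```

With the correlation built in above, the estimated flux is positive and close to 0.5 times the variance of w, as expected from the construction.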

2. Isotopic Measurements. Building on major accomplishments in isotope ecology over the past decade, isotopic measurements of light isotopes (hydrogen, oxygen, nitrogen, carbon, and sulfur) have become a critical tool for ecology. Currently, mass spectrometric technology, including sample introduction via gas chromatography, has made major strides; but direct measurement of isotopes in the field, continuous measurements, and increased automation are still required to connect isotopic information to biological fluxes (including vegetation-soil-atmosphere fluxes). More robust instrumentation and supplementing traditional mass spectrometric instruments with other techniques (e.g., tunable diode lasers) could allow major breakthroughs. NSF currently supports significant infrastructure for isotopic analyses of environmental samples, but most centers are separate and there is no program linking the various isotope centers together or allowing them to pool expertise to make major advances. Linking biological, geochemical, and atmospheric groups would be advantageous.
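The quantity such isotope-ratio instruments report is conventionally expressed in delta notation: the deviation of a sample's isotope ratio from a reference standard, in parts per thousand (per mil). A minimal sketch follows; the sample ratio is illustrative, not a measured value.

```python
# Commonly quoted 13C/12C ratio of the VPDB reference standard.
R_VPDB = 0.0112372

def delta_permil(r_sample, r_standard=R_VPDB):
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Illustrative sample: a ratio depleted by 27 per mil relative to VPDB,
# roughly the depletion typical of C3 plant material.
r_sample = R_VPDB * (1.0 - 27.0 / 1000.0)
```

By construction, delta_permil(r_sample) returns about -27 per mil, and the standard itself is 0 per mil by definition.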

3. Remote Sensing, Spatial Data and Related Informatics. A major aspect of biological complexity at the ecosystem scale is associated with spatial distributions of organisms and the resources they require (heat, water, light, nutrients, etc.). While theory linking spatial relationships of biophysical and geological factors to population and ecosystem dynamics has been prominent in ecology for decades, the ability to capture, manipulate, and analyze spatial data except at intensively studied sites is in its infancy. Progress is being made in remote-sensing technology for biological studies, and tremendous opportunities exist for rapid advances. Increasing the availability of spatial data, including satellite observations, to biologists could result in major breakthroughs; reciprocally, NSF sponsorship of biologists to do the required theoretical and engineering design for future missions could benefit the nation. Given the rapid penetration of remote sensing and spatial data management (geographic information systems) into the private sector, this is an area that is ripe for partnerships between public, private, and academic entities.

Stephen J. Smith
Department of Molecular & Cellular Physiology
Stanford Medical School
Stanford, California

1. Neural Imaging.

2. Imaging Cytometry.

(A) A microscope redesigned from the ground up for electronic/digital imaging.

(B) Specimen stage automated to scan large specimen areas.

Together, these transform the microscope from a qualitative into a quantitative tool for cell biology.

3. Molecular Arrays (Bio Chips).

Joan Steitz
Molecular Biophysics and Biochemistry
Yale University School of Medicine

Dr. Steitz advocates technologies that will result from small, individual-investigator grants. She urges NSF to continue to consider these important vehicles for obtaining high return on limited funds. This stance is prompted by the realization that it is often simple technologies that have had profound impact on the ability of scientists to expand understanding of fundamental biological processes. Examples from the area of gene expression, a field which underlies many of the more sophisticated technologies discussed at the workshop, are:

In transcription:

In RNA processing:

In nuclear trafficking:

In translation:

In addition, high-resolution gel fractionation and sensitive detection methods, such as PCR, have added to progress in all the above areas.

Underlying all recent progress in gene expression are, of course, the powerful methodologies of cloning and sequencing, which again arose from investment in small individual investigator grants. Dr. Steitz urges that these lessons not be forgotten as the growing appeal of more complex modern technologies threatens to consume a larger and larger fraction of a limited funding pot.

D. Lansing Taylor
Center for Light Microscope Imaging & Biotechnology
Carnegie Mellon University

1. Functional Imaging of the Chemical and Molecular Dynamics of Life. Light optical and nuclear magnetic resonance (NMR) imaging of living systems, from cells to whole organisms, are rapidly emerging as powerful tools to explore the temporal-spatial dynamics of the chemistry of life. This technology requires the integration of reagents that can sense chemical and molecular changes in cells, tissues, organs, and organisms with instrumentation that can detect and measure the changes in time and space. The development of reagents will require the use of biochemistry, organic chemistry, and molecular biology to create the necessary tools.

One special reagent technology, Protein Biosensors of Cell, Tissue and Organ Function, is now rapidly evolving. With this technology, proteins will be genetically engineered to express a fluorescent chromophore in a specific domain that will report ligand binding, including protein-protein interactions, post-translational modifications, and drug binding in vivo. The most exciting chromophores are short sequences of amino acids in proteins from a variety of bioluminescent organisms, such as the prototype, green fluorescent protein from jellyfish. The proteins autocatalytically modify the amino acids to create a chromophore during expression of the protein (cyclization of serine-dehydrotyrosine-glycine within a hexapeptide). Protein chimeras of the target protein and the chromophore protein, designed as biosensors, will be constructed and expressed in single cells and organisms ranging from bacteria to the mouse. In vivo fluorescence spectroscopy will then be used to measure the chemical and molecular changes detected by the biosensor.

Examples of applications of this technology are:

This emerging technology is based on the integration and extension of some existing technologies:

1) Protein and DNA sequencing to explore a range of potential chromophores in nature

2) NMR and X-Ray crystallography to define protein structures

3) Protein engineering to create chimeras

4) Fluorescence spectroscopy of labeled molecules in living systems

5) Optical imaging techniques to detect and analyze signals.

Barriers to the optimal harnessing of this technology include:



2. High Throughput, High-resolution Solid-state Cameras with On-chip Processing. Live cell, tissue, organ, and organism-based light optical imaging modalities require high dynamic range (ca. 10-14 bit), high resolution (ca. 1K x 1K), large spectral range (ca. 400-900 nm), and high throughput (ca. 30 frames/sec) to quantify dynamic chemical and molecular changes. Development of these cameras was originally driven by the military in the infrared range and, to a lesser extent, by astronomy. Development of this type of camera (cooled CCD or CMOS) has been slow due to the high cost of producing a relatively small number for the scientific community. The goal would be to incorporate on-chip processing steps so that the camera is part of the computer, much as in the human eye-brain design.
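A back-of-envelope calculation shows why sustained throughput is demanding at these specifications; the assumption here is that each 14-bit sample is stored in a 16-bit (2-byte) word.

```python
# Data rate implied by the specs quoted above:
# ~1K x 1K pixels, up to 14-bit depth, 30 frames/sec.
pixels = 1024 * 1024
bytes_per_pixel = 2            # a 14-bit sample stored in a 16-bit word
frames_per_sec = 30

rate_bytes = pixels * bytes_per_pixel * frames_per_sec
rate_mb = rate_bytes / 1e6     # roughly 63 MB/s sustained off the sensor
```

A sustained stream of this size is why on-chip processing, which reduces the data before it ever leaves the camera, is attractive.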

Barriers to realizing the camera technology:

3. Micropositioning and Micromanipulation for Nanotechnologies Applied to Molecules and Cells. Positioning and moving molecules and cells with micrometer to nanometer precision will be valuable for high-speed selection of individual cells that have been manipulated (transfected, stimulated, etc.). Requires combining technologies of:


James Tiedje
Center for Microbial Ecology
Michigan State University

1. Speed of Analysis.

2. Measurement of Microbial Functions in Nature. Very large numbers of measurements over time and space are needed, using non-invasive technologies to measure gas fluxes, such as CH4, H2O, soluble products, or pollutant transformations. New measuring devices and automatic collection of remote data are needed. If non-invasive methods are not possible, micro-penetration technologies are needed to place hundreds of sensors in nature.

3. Artificial Intelligence/Knowledge Based Systems to understand what microbes can do, e.g., to predict how chemicals will be metabolized. AI/KBS offers a systematic means to more rapidly understand the complex potentials of biological systems.

Kristiina Vogt
Greeley Laboratory
Yale University

1. Trajectories of Natural Systems. There is a need for probes for studying ecosystems. Ecosystems do not necessarily follow the "rules." Biologists need to go into the field to gather data, produce a framework, and produce the tools. We need to develop a framework that will enable scientists to look at things on a larger scale and then reduce down to a smaller scale.

2. Scaling: Time/Space. Modeling tools are crucial in biology; but currently they are not being used effectively in ecosystem ecology. We need to: be able to look at sites that have been degraded and determine how far they are from natural; develop tools that allow us to index the state of an ecosystem and to identify the trajectory of change after a disturbance; develop early detectors of ecosystem change so that we are not always dealing with a clearly degraded system; and develop tools to analyze heterogeneous information across temporal and spatial scales.

3. Stable Isotopes. Important here are non-invasive isotopes that enable scientists to study a system without changing it. Some isotopes are very site-specific. Stable isotopes can effectively integrate temporal scales and facilitate integration of physiological measurements with ecosystem-scale analyses.

4. Access to Technology Resources. Models work, up to a point; but they are not the ultimate end product. We need to identify feedback in the systems. Some things are sensitive. We must develop people and expertise. We need to access training and move across disciplines for equipment, training, etc.

Watt Webb
School of Applied Engineering and Physics
Cornell University

1. Non-Linear Optics is providing powerful new tools for biological research. Advances in molecular genetics have extended the demands on our experimental methods of studying molecular mechanisms in living cells. Physical and chemical technologies that have advanced within the past decade have improved the capability of optical methods for research in cell biology. To work with living cells, benign remote sensing methods are required. Two-photon excitation of fluorescence and photochemistry now provides submicron three-dimensional resolution and the sensitivity needed to detect individual molecules.

2. Technology Transfer. Accommodation of the transfer of emerging technologies to potential users and industrial providers is a crucial step.