Technology Development

Background Information

GLOBEC is founded in part on the belief that fundamental progress can be made on a long-standing problem in the population ecology of marine organisms: what determines population fluctuations of marine animals. The central charge of the GLOBEC working group on technology is (1) to identify those available technologies that have not been adequately applied to solving the population variability problem, and (2) to suggest if and how new technologies could be developed to overcome present barriers to our understanding of potentially critical processes.

Planning for the large multi-disciplinary research programs currently underway or being considered (e.g., the Joint Global Ocean Flux Study - JGOFS; Global Ocean Ecosystems Dynamics - GLOBEC; the Accelerated Research Initiatives of the Office of Naval Research, such as Flow Over Abrupt Topography and Marine Light-Mixed Layer) has recognized the need for data collection on a continuum of time and space scales. Ideally, biological data should be obtained, processed, and analyzed on scales commensurate with physical data. To accomplish this as yet unachieved objective, emphasis has been placed on rapid discrete sampling, continuous in situ measurement, and remote sensing. The shipboard handling, processing, and analysis of biological samples, an important component in the physical-biological mismatch, has received little attention. For example, over eleven years ago the symposium volume "Advanced Concepts in Ocean Measurements for Marine Biology" (Diemer et al., 1980) contained a short chapter by T.T. Packard (1980) which emphasized the development of shipboard techniques to measure rate processes. Most of the remainder of the book, however, dealt with data collection, not shipboard analysis. Similarly, the recent Zooplankton Colloquium held at Lake Arrowhead (Marine Zooplankton Colloquium 1, 1989) focused almost entirely on "over-the-side" issues of data collection.

Progress in understanding how marine ecosystems interact with, and are affected by, physical processes requires further development of shipborne plankton sampling and processing techniques. The objective is to sample populations on appropriate time scales and with sufficient spatial resolution to compare with the concomitant data from the physical field. This linkage has been limited in the past (and present) by slow microscopic examination of the collected organisms. There is a strong feeling that there is insufficient information concerning key biological rates (especially growth). In order to study population dynamics, it is crucial to know rates of change and not just the levels of the population. One method to obtain rates is to take the difference of point measurements. It is also desirable to have measurements of rates that can be referenced to a single point in time, e.g., to measure the size and growth rate of a zooplankter rather than measure its size at two different times. Several systems for at-sea collection and analysis have been devised, developed, or proposed. Zooplankton identification and counting possibilities include immunological tagging technology, image analysis, size spectrum analysis, and acoustic information processing. Individual growth rates may be inferred from RNA:DNA ratios, lipid concentrations, gut contents (fullness), or some other ecological or physiological parameter.
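As a simple illustration of the distinction, the difference method requires assuming some form for the change between two samplings; if abundance (or individual size) N is taken to change exponentially between times t1 and t2, the intervening rate is estimated as

$$ r = \frac{\ln N(t_2) - \ln N(t_1)}{t_2 - t_1}, $$

whereas an index such as the RNA:DNA ratio is intended to estimate the rate for an individual referenced to a single point in time.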

GLOBEC proposes focusing resources on new technologies, and on new applications of existing technologies, that speed the processing of zooplankton samples or provide information on important biological rates, and hence yield accurate biological data that can be used to link biological processes to the physical environment.

To achieve our goal of increased and improved instrumentation and methods, GLOBEC sponsored a special session at the AGU/ASLO meeting in February of 1990. Fourteen papers were presented: five on rate measurements, eight on population sampling systems (both acoustic and optical), and one on immunochemical identification of ciliate protozoa. The papers on rate measurements were heartening because the committee had hoped to emphasize the importance of such measurements to GLOBEC's process-oriented approach.

At this stage in GLOBEC planning, the most important criteria to establish are the scientific ones, that is, deciding on the specific, well-defined problems that such a system, in its broadest concept, could help solve. Many relevant problems in zooplankton ecology have already been identified (see references above); the challenge will be to identify those most amenable to solution using shipboard technology. This challenge must include consideration of the following criteria:

  1. Assuming the need for a technology is established, what will be required to validate the data obtained, not only to confirm that the technology is appropriate, but also to act in a near-real-time mode as the data are acquired (e.g., calibration-type problems)?

  2. Can the masses of data obtained be analyzed meaningfully? For example, are the available models and analyses adequate to delineate and study the processes involved? Can models be developed to use the potential data as input for predictive purposes or for comparison?

  3. Are the data relevant or appropriate to other areas of research? For example, while the data may not be directly applicable to a particular scientific problem, there may be relationships to other variables (i.e., transfer functions) that will make them so. Also, are the data suitable for incorporation in models?

  4. What is needed to manage all the data, given they can be collected? This must include serious consideration of the problem of data use; Gifford Ewing (1969) clearly described the hazards of the "turn it on and let it run" syndrome:

    "In the long run of time, data must be reduced at least as fast as it is acquired ... the necessity of adjusting sampling rates to ... the rate of consumption is already far from an academic question as can be seen from the volume of data presently being archived in various centers without serious perusal by anyone."

GLOBEC held a workshop to prepare a conceptual description for three population sampling instruments in conjunction with a meeting to develop a concept for an experiment in the northwest Atlantic. Workshops on technology development for biological rate measurements, and on biotechnology and immunological methods, were held late in 1990. As the reports from these workshops are produced, they will be reviewed and used to encourage the appropriate funding agencies to support the technology development.

The three instrument systems that have been initially identified as warranting development are: 1) a shipboard automated plankton analyzer to operate from oceanographic research vessels and perform "real time" analysis of planktonic samples; 2) an instrument to map zooplankton density in the upper 200 m of the water column over a 2 km square region within 4 hours; and 3) a zooplankton "CTD" type profiler that can assess the 3-dimensional distribution of zooplankton, with the possibility that such a device could be towed or moored.

Proposed Instrumentation

A Shipboard Automatic Plankton Analyzer System

Much of the important work on biological samples collected at sea, whether living or dead, is accomplished in shore-based laboratories. The recognized importance of rapid analysis of samples after collection offers a chance to direct new efforts toward the development of shipboard analytical technology. The planning for the GLOBEC initiative, through the GLOBEC technology working group, has included discussions of rapid shipboard analyses of zooplankton. Other foci have been rapid in situ profiling of biological and physical variables (a "plankton CTD"), Lagrangian trackers, autonomous floats (e.g., "Slocums" - Stommel, 1989), and some type of rapid survey instrument (or instruments) to locate areas for more intensive studies. This discussion is an overview of the concepts that must be defined in order to implement, over the next five years, new and improved shipboard systems for rapid analysis of zooplankton samples. The overall concept has come to be called the "Shipboard Automatic Plankton Analyzer System" (SAPAS). Neither specifics of hardware definition nor details of the potential for biotechnological approaches are discussed. The major concern is with rapid analytical techniques, not those that involve extended sample preparation. The critical role of computers in equipment control, analysis, and interpretation will not be discussed. Attention is restricted to the needs for analysis of zooplankton in the generic sense; while "zooplankton" comprise a diverse range of sizes, morphologies, chemical compositions, and behaviors, it is inappropriate at this stage to go into such details. Also not included is the important issue of designing a rational sampling program to optimize the use of such technology. Obviously, this is of critical importance; as Gifford Ewing (1969) wrote some 20 years ago:

"In a science such as oceanography where the total population is enormous and the cost of sampling is high, the efficiency of the sampling plan is often crucial if the limited facilities are to be fully exploited. All too commonly, however, the sampling plan appears to be extemporized to fit intuition, custom or convenience rather than to satisfy a clearly formulated specification of the particular requirements of the experiment."

Whatever methods (continuous or discrete) are used to bring zooplankton on board a vessel, there would be a need for three categories of instruments to accomplish or facilitate the analysis of the samples:

  1. Tagging devices - These would mark organisms with such things as stains, isotopes or antibodies that, after a suitable time delay, could be used as the basis for separating, counting and/or later analyses.

  2. Sorting devices - On the basis of tag attributes, sample volume, particle counts/size/volume/density, or the judgment of the optically or mechanically augmented human eye, the sample would be separated or aliquoted for further processing, laboratory experiments/analyses, or preservation.

  3. Data gathering devices - Three existing technologies have potential for new or augmented uses in shipboard analyses.

    One advantage of using several different techniques to analyze the same samples is the potential for merging results. While any one method alone might not provide a specific type of data, together with others it may be possible to deduce otherwise unobservable properties. Some attention should be given during the development stages of a SAPAS to the possibility of such synergistic interactions of components.

Figure 1 presents in block form the basic components that might be integrated to comprise a SAPAS. The input to the system is assumed to be a profiling plankton pump operating in conjunction with environmental sensors such as acoustic backscattering transducers, fluorometers, spectral radiometers, turbulence probes, and temperature/conductivity sensors. The output would be estimates of: biomass and abundance in various size/shape and taxonomic categories; behavioral characteristics; biochemical parameters (e.g., enzymes, lipids, genetic traits); and direct or indirect measurements of rates (e.g., growth, respiration, excretion). The output is shown coming from the adjacent column of five data source blocks surrounded by the dashed line; the input to these blocks comes from the physical components of the SAPAS. These components are shaded according to their readiness for incorporation into a system. Fully shaded means the technology should be feasible now, requiring only minor adaptation for shipboard use. Half-shaded indicates either that portions of the technology are feasible now, or that it would take perhaps five years to develop the technique. Unshaded means it may take longer than five years to provide the methodology, depending on the priority given to development. The dashed lines surrounding some of these blocks indicate that the input and output arrows are not necessarily tied to the specific block adjacent to the base or tip of the arrow.

Figure 1. Basic components that would probably be integrated to comprise a shipboard automatic plankton analyzer system (SAPAS). See text for detailed description of figure.

Rather than describe each component block of the SAPAS separately, the five data source blocks will be explained by referring to their input sources. Proceeding from the bottom block up the column, these are:

  1. Rough taxonomy--this level would provide size-frequency information on all particulate components of the tagged or untagged and sorted or unsorted sample stream in a continuous mode using optical or electromagnetic counting techniques. In low diversity ecosystems, some taxonomic information could be inferred.

  2. Finer taxonomy--identification of organisms to levels above species (e.g., copepods, calanoid copepods, chaetognaths, gelatinous organisms) using available techniques of image analysis. It would be hoped that the categories could be assigned ecological meaning, such as trophic level, feeding mode, etc. With present capabilities, this mode would most likely operate on tagged or untagged discrete samples analyzed using acoustic or optical image formation techniques.

  3. Finest taxonomy--the acquisition of information at the individual species level or below; for example, variety or form of a species, stage, sex, reproductive status, etc. With present capabilities, this information would come from human analysis of living or preserved aliquots of the sample stream. Hopefully this tedious process could be accelerated by automated sample handling and separation techniques. Some detailed taxonomic information could be obtained from taxa sufficiently unique for image analysis systems to identify or from tags with sufficient discriminating power.

  4. Destructive laboratory analysis--in the near future, most of these analyses are likely to be carried out manually. Automation could provide a source of animals or sample fractions with desired attributes. Development of automated analyses themselves would make obtaining data on enzymatic, lipid, or genetic attributes much more efficient.

  5. Live laboratory experiments--some types of information, particularly rate data, must come from shipboard experiments done on living animals. To increase the efficiency of these experiments, the labor of sorting out and pretreating experimental animals could be reduced by utilizing automatic or augmented human sorting and manipulation. Automated culturing systems like those developed by Cabell Davis at WHOI should be adapted for shipboard use and integrated into other analytical systems to provide data on rate processes.

Other components and comments: Because of the wide range of abundances of planktonic organisms, concentration or dilution of the sample stream at various stages of processing will be necessary for optimum operation. Feedback-controlled devices, driven primarily by the continuous flow analysis systems, will have to be incorporated to accomplish this efficiently.

Mapping Plankton Patch Morphology

An important question in marine ecosystems research concerns the gross morphology of plankton patches. What are the sizes of these patches, how do they evolve in time, and what are their relationships to the driving physical processes? Current instrumentation does not permit the measurement of these phenomena, although an examination of the technical possibilities suggests that these goals are attainable.

Acoustics is the natural method for obtaining the gross morphology of these plankton patches, as the ocean is too opaque to light to permit successful use of optical techniques. There are many possible designs for this type of system; however, they should all share a common feature: they should be able to resolve the mean acoustic backscatter within a 3-dimensional volume of the ocean. These mapping systems are, in effect, 3-dimensional imaging systems.

In designing a 3-dimensional imaging system there are many issues which need to be considered. In sonar imaging a distinction is usually made between range resolution and cross-range, or azimuth, resolution. The range resolution is usually quite high because it is a function of the pulse length, which can be made quite short. The cross-range or azimuth resolution, on the other hand, is usually much worse, since diffraction requires quite large arrays for equivalent spatial resolution. There are also many options for the array configurations themselves, ranging from large-area imaging systems, which would necessarily have coarse resolution, to small-area systems, which would have finer resolution. These trade-offs follow from the physical relationships among the frequency of insonification and the spatial and temporal resolution. It is probably true that we will need to envision a smaller system which would provide finer resolution and could be towed to provide data over large spatial areas.
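The scale of these trade-offs can be illustrated with two standard relations: range resolution is set by the pulse length (roughly half the pulse length in water), while the diffraction-limited beamwidth of an array is roughly the wavelength divided by the aperture. The short sketch below uses assumed, illustrative values rather than a proposed design:

```python
# Illustrative sonar resolution trade-offs (assumed values, not a design).
C = 1500.0  # nominal sound speed in seawater, m/s

def range_resolution(pulse_length_s):
    """Range resolution ~ c*tau/2 for a pulse of duration tau."""
    return C * pulse_length_s / 2.0

def beamwidth_deg(frequency_hz, aperture_m):
    """Diffraction-limited beamwidth ~ wavelength/aperture, in degrees."""
    wavelength = C / frequency_hz
    return (wavelength / aperture_m) * 57.2958

# A 100-microsecond pulse gives ~7.5 cm range resolution regardless of array size.
print(range_resolution(100e-6))      # -> 0.075 m

# Cross-range resolution depends on frequency and aperture:
print(beamwidth_deg(200e3, 0.5))     # ~0.86 deg from a 0.5 m array at 200 kHz
print(beamwidth_deg(200e3, 0.05))    # ~8.6 deg from a 5 cm array at 200 kHz
```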

One example of an instrument that could be developed is a side scan sonar system for mapping patch morphology using acoustic backscatter information as mentioned above. This sonar would have an omni-directional beam pattern in one plane and a narrow beam pattern in the other. It would obtain range information from time delay data and do beam forming to obtain azimuthal resolution. Towing would result in mapping a tube of ocean. Gridded transects would result in mapping a larger volume. Periodic deployment would result in a set of high resolution environmental and biological data within a large volume of water sampled at the repetition rate of the survey.

More specifically, it is envisioned that this system would have a center frequency between 100 and 500 kHz. This seems a natural frequency band when considering spatial resolution, temporal resolution, and acoustic attenuation. System range would then be on the order of hundreds of meters, and range resolution of 2-10 centimeters should be possible. Azimuth resolution could be 1 degree, giving resolution cells 8 cm wide at 5 meters range and 1.7 meters wide at 100 meters range. The system would provide acoustic backscatter in real time as a function of 3-dimensional position. This would then be related to biomass to compute the distribution of biomass in three dimensions within a large volume of sea water at periodic intervals. Finally, a ship towing this type of sonar at 5 knots could map a volume of water approximately 2 km x 2 km and 300 meters deep every 2 hours at a resolution of roughly 0.3 cubic meters.
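The figures quoted above follow from simple geometry; a rough check, with the assumed parameters labeled in the comments, is sketched below:

```python
import math

BEAMWIDTH_DEG = 1.0    # assumed azimuth beamwidth
TOW_SPEED_KT = 5.0     # assumed tow speed
SURVEY_HOURS = 2.0

# Cross-range cell width grows linearly with range for a fixed beamwidth.
for rng in (5.0, 100.0):   # metres
    width = rng * math.radians(BEAMWIDTH_DEG)
    print(f"{width:.2f} m wide cell at {rng:.0f} m range")
# -> roughly 0.09 m at 5 m and 1.75 m at 100 m, close to the 8 cm and 1.7 m cited.

# Track length available in one survey period at 5 knots.
track_km = TOW_SPEED_KT * 1.852 * SURVEY_HOURS
print(f"{track_km:.1f} km of track in {SURVEY_HOURS:.0f} h")
# ~18.5 km of track, i.e. about nine 2-km transects spaced ~250 m apart,
# which a sonar ranging to hundreds of metres on either side could cover.
```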

It is also important to realize that various studies will need to be performed in order to make sense of this information. In particular, more in situ studies correlating acoustic returns and organism density will be necessary, including measurements of target strengths for volume scattering. These can be performed concurrently, as the instrumentation becomes available. The relationship between organism densities and acoustic reverberation can then be inverted to obtain the former from the latter. Other issues concerning navigation, communication, signal design, and processing seem more straightforward, with digital techniques being an inexpensive and high-fidelity option.

Acoustic Plankton CTD Profiler

Since 1979, the University of Southern California and Tracor have been conducting an interdisciplinary research program titled "Dynamics of small-scale spatial distributions of zooplankton". During the last 5 years, this program has been jointly sponsored by the National Science Foundation and the Office of Naval Research. This research program has resulted in the development of new technology to quantitatively measure small-scale plankton distributions on scales of meters in depth, over 10's of kilometers horizontally. Zooplankton distributions determined acoustically, by size (0.1 to 10 mm), have been compared to phytoplankton distributions measured fluorometrically and to the physical environment. The technology developed in this research is embodied in the Multi-frequency Acoustic Profiling System (MAPS), a 21-frequency acoustic echo-ranging and echo-integration system operating in an acoustic frequency band (100 kHz to 10 MHz) appropriate for detecting and quantifying zooplankton biovolume (biomass). The MAPS has been employed in both a cast mode and in a towed (sawtooth in depth) mode at speeds up to 10 knots. Acoustic data collected with this system are transformed from acoustic volume scattering strengths to plots of zooplankton abundance versus size and depth for individual casts or oblique tows along a transect. In one operating mode, acoustic estimates of abundance versus size for individual casts are combined to illustrate two-dimensional spatial distribution. In a second mode, sequential casts at a drogue location are combined to illustrate temporal variations. In both of these operating modes, the observed zooplankton distributions are compared with data collected at the same time for temperature, salinity, and chlorophyll fluorescence.
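The essence of such multi-frequency processing is an inverse problem: the measured volume scattering at each frequency is modeled as the sum of contributions from several size classes, and the size-abundance vector is recovered by constrained least squares. The sketch below uses a deliberately simplified scattering model and invented numbers purely to illustrate the structure of the calculation; it is not the MAPS algorithm.

```python
import numpy as np
from scipy.optimize import nnls

C = 1500.0  # m/s, nominal sound speed

def backscatter_xsec(freq_hz, radius_m):
    """Toy high-pass scattering model: a Rayleigh-like (ka)^4 rise that levels
    off near ka ~ 1. Stands in for a proper zooplankton scattering model."""
    ka = 2.0 * np.pi * freq_hz / C * radius_m
    return radius_m**2 * ka**4 / (1.0 + ka**4)

freqs = np.logspace(5, 7, 21)    # 21 frequencies, 100 kHz - 10 MHz
radii = np.logspace(-4, -2, 8)   # 8 size classes, 0.1 - 10 mm

# Forward-model matrix: scattering per animal of each size at each frequency.
A = np.array([[backscatter_xsec(f, a) for a in radii] for f in freqs])

true_n = np.array([0, 50, 200, 80, 10, 2, 0, 0], float)  # invented densities
sv = A @ true_n                                           # "measured" scattering

# Estimate abundance per size class with non-negative least squares.
est_n, _ = nnls(A, sv)
print(np.round(est_n, 1))
```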

Successful use of this system has occurred in the Southern California Bight, in the Coastal Transition Zone off northern California, in the Gulf Stream (across the western wall of the Gulf Stream off Cape Hatteras and off Cape Canaveral), in the western Atlantic slope water, and in the Irish Sea.

Analyses of the various MAPS data sets have shown that zooplankton distributions were usually related to some aspect of the physical (oceanographic) system in which the animals live. The patterns of distribution, however, were different in the various oceanographic systems that we have investigated. Off southern California there was a trend towards a general coherence among the zooplankton distributions of all size classes, the chlorophyll maximum, and the permanent thermocline on vertical scales of 10's of meters. This pattern broke down on smaller vertical scales. In the Gulf Stream, measurements showed that different size classes of zooplankton had similar distributions. These patterns varied with physical boundaries and water mass intrusions, and were not coherent with chlorophyll peaks.

Distributions in the Coastal Transition Zone (CTZ) off northern California and in the Irish Sea revealed quite different results. In these areas, different sizes of zooplankton exhibited different distributional patterns. Distributions in the CTZ region appeared to be controlled by the various current systems. Some organisms were most abundant in the cooler filament water, while other abundance peaks were on the filament boundary or outside of it. As in the Gulf Stream, patterns of distribution appeared to be related more to physical events, or to behavioral activities in response to physical events, than to chlorophyll distributions. Oceanographic complexity in the waters of the Irish Sea again seemed to control distributions, with peaks of different size classes in different oceanographic regimes. For example, some organisms were associated with the stratified water off the Irish coast, while others showed peaks near the bottom (90 m) just above an undersea ridge. Other zooplankters were more concentrated in the mixed waters off the English coast, and yet another group was near the boundary between the stratified and mixed water-column areas.

The above data sets represent snapshots of one brief period with the MAPS in each of the different areas. Each area was unique and illustrated the complexity of the various oceanographic systems. Our results indicated the importance of a systems approach to studying any area. This includes concurrent measurements of the various physical and biological parameters and, hopefully in future studies, measurements of how these vary over time. Outgrowths of this work are the functional designs for some 2-frequency moored instruments, a 5-frequency mini-MAPS, a dual beam/envelope statistics instrument, a towed multi-frequency system, and even some expendables for use in places like Antarctica.

