CWQMC Wetland Monitoring Workgroup

Elements of Wetland and Riparian Area Monitoring Plan (WRAMP)

Key Definitions

What are Monitoring and Assessment?

WRAMP can be used for assessment and monitoring. An assessment is an observation or report of condition for one area and time period based on monitoring results. An initial assessment based on an initial monitoring effort establishes a baseline measure of condition. Monitoring can also mean a series of repeated assessments, as needed to assess changes in condition over time. Repeated monitoring is essential to assess temporal trends in the condition of wetlands and streams.

What is the Relationship of Monitoring and Assessment to Research?

WRAMP distinguishes monitoring and assessment from research. Monitoring and assessment reveal patterns of change in condition that are the basis for formulating hypotheses of causal relationships that are tested by research. In short, monitoring and assessment reveal how conditions change, whereas research explains why. WRAMP can be adapted to research by incorporating the element of experimental design.

What are the Kinds of Monitoring Supported by WRAMP?

WRAMP is intended to be used for at least three basic kinds of monitoring, as described below.

What is a Project?

Projects are on-the-ground efforts to improve or protect the abundance, diversity, or condition of wetlands or streams. The WRAMP toolset can be applied to four kinds of projects:

In the context of regulatory review, projects are often defined according to the definition of project in the California Environmental Quality Act.

What is the Watershed or Landscape Approach?

WRAMP is designed to support monitoring and assessment of wetlands and streams, including projects, in a watershed or landscape context.

A landscape is defined as a heterogeneous land area characterized by self-similar, persistent mosaics of interacting land uses or habitat types. In other words, landscapes tend to be visually self-evident. Examples of landscapes include large deltas (e.g., the Sacramento-San Joaquin Delta), large valleys (e.g., Round Valley, Sierra Valley, and self-similar areas of the Sacramento Valley), and large plains (e.g., the Santa Rosa Plain, Vina Plain, Carrizo Plain). The size of a landscape is determined by the dimensions of its repeating mosaics of land use or habitats.

Watersheds are defined as areas draining to a common place, as evident in the USGS Watershed Boundary Dataset, or as demarcated using either the USGS StreamStats tool or the Landscape Profile Tool of the California EcoAtlas.

WRAMP broadly supports the watershed or landscape approach to aquatic resource monitoring and assessment, as well as the watershed approach to mitigation planning called for by the Regional Compensatory Mitigation and Monitoring Guidelines of the US Army Corps of Engineers (USACE), developed in coordination with USEPA. A video produced by USACE that helps explain these new guidelines is available online.

WRAMP incorporates tools designed to implement the watershed or landscape approach to project siting and design, project tracking, project assessment, aquatic resource mapping, ambient monitoring design, and synthesis and reporting of aquatic resource condition.

WRAMP Diagram and Guidance

The WRAMP Diagram lays out the major elements of WRAMP as 10 steps that should be considered in sequence. The diagram applies equally well to impact assessment, compliance monitoring, effectiveness monitoring, and ambient monitoring. It can be adapted to research by incorporating the element of experimental design (Step 3). Key definitions are described above, and a discussion of the watershed or landscape approach to monitoring aquatic resources is provided in the WRAMP Overview on the CWMW website.

Users of WRAMP should read through the text for each step of the WRAMP Diagram. While users might choose to focus on one element of WRAMP, they will benefit from knowing how the elements relate to each other. The WRAMP Diagram can also be used as a checklist to make sure all the elements of a monitoring plan have been adequately considered.

WRAMP Flowchart

Step 1: Driving Concerns

The first step in the WRAMP Framework is to understand exactly why monitoring and assessment are needed.  The WRAMP Framework therefore begins with one or more clearly-stated programmatic, regulatory, or management questions or decisions that cannot be addressed without new information from assessing the aquatic resources at one or more points in time. There is commonly a need to repeatedly revisit the concerns until they are defined well enough to be translated into Monitoring Questions (Step 2). Some typical concerns are presented below:

Step 2: Monitoring Focus

Monitoring and assessment gain focus by translating the driving concerns into monitoring questions, determining the timeline for answering the questions, and defining the geographic scope of the monitoring effort.

Step 3: Conceptual Modeling

For the purposes of WRAMP, conceptual models are tools for identifying the factors and processes that must be monitored to address the driving questions or decisions. The models should focus on cause-and-effect relationships that can strongly affect the monitoring results. Box and arrow models are recommended, where the boxes represent factors and the arrows represent their interrelationships. The models should reflect what is known as scientific fact, what can be extrapolated from the facts, and what is likely based on consensus professional judgment. The models should indicate which of these three levels of scientific certainty applies to each interrelationship.  All the major assumptions of the models should be documented.
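Because the models must document both the interrelationships and the level of scientific certainty behind each one, it can help to record them in a simple, machine-readable form. The sketch below is a minimal illustration, assuming hypothetical factors and certainty labels that are not part of WRAMP; it stores each arrow of a box and arrow model and flags the arrows that rest only on professional judgment, since those assumptions most need documentation.

```python
# Minimal sketch of a box and arrow conceptual model recorded as data.
# The factors (boxes) and relationships (arrows) below are hypothetical.
# Each arrow carries its level of scientific certainty:
# "fact", "extrapolation", or "professional_judgment".

arrows = [
    ("impervious cover", "peak stream flow", "fact"),
    ("peak stream flow", "channel incision", "extrapolation"),
    ("channel incision", "riparian plant cover", "professional_judgment"),
]

# Flag the relationships that rest on the weakest level of certainty,
# since these assumptions most need documentation (and are candidates
# for special studies).
for cause, effect, certainty in arrows:
    if certainty == "professional_judgment":
        print(f"Document assumption: {cause} -> {effect} ({certainty})")
```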

Research to explain monitoring results can be added to the WRAMP framework at this step. The conceptualized relationships among the various factors included in the models can be translated into testable hypotheses. Additional conceptual modeling would then be needed to design the test. The hypotheses might also be developed at Step 1; the driving question could be about the causes of reported monitoring results. In this case, the conceptual modeling would serve only to identify the factors and processes to measure when testing the hypothesis, rather than to support monitoring.

Step 4: Data Needs

Needed data are identified based on the timeline (Step 2) and the conceptual modeling (Step 3). In essence, the needed data represent factors that the models suggest are most directly related to the monitoring questions and hence the Driving Concerns (Step 1). The different kinds of needed data can be termed indicators. An indicator might consist of one variable, such as stream flow or plant cover, or it might be an index, such as CRAM, that consists of multiple variables.
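To illustrate the difference between a single-variable indicator and a multi-variable index, the sketch below rescales several hypothetical field variables to a common 0-1 scale and averages them into one score. The variable names, ranges, and equal weighting are illustrative assumptions; this is not the CRAM scoring procedure.

```python
# Hypothetical multi-variable index: rescale each variable to 0-1 against
# an assumed plausible range, then average the rescaled values.
# Variables, ranges, and equal weights are illustrative assumptions only.

observations = {"plant_cover_pct": 62.0, "buffer_width_m": 140.0, "channel_stability": 7.0}
ranges = {"plant_cover_pct": (0.0, 100.0), "buffer_width_m": (0.0, 250.0), "channel_stability": (1.0, 10.0)}

def rescale(value, low, high):
    """Clamp the value to the assumed range and rescale it to 0-1."""
    return min(max((value - low) / (high - low), 0.0), 1.0)

index_score = sum(rescale(observations[name], *ranges[name]) for name in observations) / len(observations)
print(f"Illustrative index score: {index_score:.2f}")
```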

An indicator can represent one or more aspects of condition or stress. A stressor is defined as any physical, chemical, or biological factor that can negatively impact the abundance, diversity, or condition of an aquatic resource (including stream and wetland plants, animals, habitats, and ecosystems). In general, stressors are monitored to help explain condition. The relationships between stressors and condition are seldom well understood, however, and including stressors in a monitoring effort can greatly increase its costs. For these reasons, monitoring should focus on condition. If monitoring reveals that conditions are declining, stressors can be added to the monitoring plan to help understand the declines. Special studies to experimentally test conceptualized relationships between stress and condition are an alternative to monitoring stressors.

Lagging indicators are used to assess existing conditions or stress, whereas leading or predictive indicators are used to assess likely future conditions. Leading indicators are usually based on well-known cause-and-effect relationships represented by the conceptual models (Step 3).  Some indicators can be lagging indicators in some regards, and predictive indicators in other regards. For example, the hydrograph of a stream or the hydroperiod of a wetland may serve to indicate existing hydrological conditions, and to predict future conditions for related factors such as stream stability or wetland plant community structure. The selected indicators should be classified as lagging or leading, based on the conceptual modeling (Step 3).

Analytics refers to the graphic and statistical methods of data analysis that will be used to summarize the monitoring results and prepare them for interpretation (Step 8). It’s important to select the analytics during the identification of data needs to make sure that all the data needed for the analyses are collected during monitoring. The analytics should include procedures for data quality assurance and quality control (QAQC). QAQC procedures have been prepared for data collection methods adopted by the Surface Water Ambient Monitoring Program (SWAMP), and are available online.
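Choosing the analytics early also makes it possible to script routine QAQC screening before analysis begins. The sketch below shows one simple form such screening might take, flagging missing values and values outside an expected range; the field names and acceptance ranges are illustrative assumptions, not SWAMP criteria.

```python
# Minimal QAQC screening sketch: flag missing or out-of-range values before
# analysis. Field names and acceptance ranges are illustrative assumptions,
# not SWAMP-prescribed criteria.

records = [
    {"site": "S-01", "ph": 7.4, "temp_c": 18.2},
    {"site": "S-02", "ph": None, "temp_c": 19.0},
    {"site": "S-03", "ph": 11.6, "temp_c": 17.5},
]
acceptance = {"ph": (4.0, 10.0), "temp_c": (0.0, 35.0)}

for record in records:
    for field, (low, high) in acceptance.items():
        value = record[field]
        if value is None:
            print(f"{record['site']}: {field} is missing")
        elif not low <= value <= high:
            print(f"{record['site']}: {field}={value} is outside {low}-{high}")
```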

Step 5: Classification of Data

Every kind of data, indicator, and method of data collection used to assess wetlands and streams can be classified into one of three categories or levels, based on the three-level classification system developed by the USEPA.

Step 5 is accomplished by asking the following question: how can the driving question or decision be addressed using Level 1 (L1) methods, Level 2 (L2) methods, and/or Level 3 (L3) methods?  Level 1-3 data are often integral components of a monitoring plan. For example, L1 maps of the aquatic resources or project(s) to be assessed can serve as the sample frame for data collection using L2 or L3 methods. In some cases, strong positive correlation between L2 and L3 data can justify using less expensive L2 methods as proxies for L3 methods.
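Whether an L2 method can serve as a proxy for an L3 method is an empirical question that can be checked with paired data from the same sites. The sketch below computes the Pearson correlation between hypothetical paired L2 and L3 scores; a strong positive correlation would support, though not by itself prove, the use of the less expensive L2 method.

```python
# Correlation check between paired L2 and L3 results for the same sites.
# The paired scores below are hypothetical values for illustration.
import numpy as np

l2_scores = np.array([62, 55, 78, 81, 49, 70, 66, 58])                  # rapid assessment scores
l3_scores = np.array([0.58, 0.47, 0.74, 0.80, 0.41, 0.69, 0.61, 0.52])  # intensive field metric

r = np.corrcoef(l2_scores, l3_scores)[0, 1]
print(f"Pearson r between L2 and L3 scores: {r:.2f}")
```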

In general, monitoring costs increase with the level of monitoring data and methods. It is essential to explore how L1-L3 methods can be used to meet the data needs identified in Step 4, with an emphasis on maximizing the use of existing data and new L1 and L2 data. Each requirement for new L3 data should be carefully rationalized to account for the relatively high cost of such data. The differences between L1-L3 methods and data are explained below. WRAMP can incorporate additional L1-L3 SOPs, provided their development involves statewide technical advisory committees, rigorous field testing, and vetting with the intended user communities.

Level 1 (L1). L1 includes maps and other inventories and databases of environmental information, plus the data and indicators provided by these sources, as well as the methods used to create them. L1 methods are necessary to answer driving questions about the location, distribution, abundance, and diversity of aquatic resources and related projects in the watershed or landscape context. Some existing L1 tools for assessing wetlands and streams are listed below:

Level 2 (L2). L2 includes data, indicators, and methods for rapid field assessments of wetlands and streams. Rapid assessments typically require less than a day per application, and do not rely on the collection of field materials or any laboratory analysis. Most L2 methods are qualitative or semi-quantitative. Examples of L2 methods are described below.

Level 3 (L3). L3 includes field data to quantify one or more aspects of aquatic resource condition or stress, relative to other aspects, or per unit time or space. L3 data may include any measures of specific ecosystem parameters, including physical, chemical, and biological data. WRAMP requires that L3 data be collected using appropriate procedures and methods, such as the standardized protocols used by state and federal wildlife agencies to monitor and assess fish and wildlife habitats and populations, plant community composition, noxious weeds, and similar subjects.

Step 6: Sampling Plan

Once the monitoring methods have been selected, a plan of data collection must be developed. As stated under Step 5, every monitoring plan should maximize the use of existing data, and the collection of new data should focus on L1 and L2. The collection of relatively expensive L3 data should be carefully rationalized.

The WRAMP toolset supports targeted designs, which use fixed sampling sites; random designs, which draw sampling sites at random from the population of possible sites within the geographic scope of the monitoring and assessment effort; and probabilistic designs, which account for the probability of any candidate site being included in a random sample. The WRAMP toolset also supports exhaustive surveys of wetland and stream condition, which consist of assessments of every wetland or stream within a prescribed survey area. The best choice of sampling design can depend on the driving concerns and geographic scope. USEPA provides online help in choosing sampling designs.
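The difference between a simple random draw and a weighted (probabilistic) draw of sampling sites can be sketched in a few lines. In the example below the candidate sites and selection weights are hypothetical, and the normalized weights are only a simplified stand-in for true inclusion probabilities, which for without-replacement designs require more careful calculation (for example, with GRTS-style tools).

```python
# Sketch of random vs. weighted (probabilistic) selection of sampling sites.
# Candidate sites and selection weights are hypothetical; normalized weights
# are a simplification of true inclusion probabilities.
import numpy as np

rng = np.random.default_rng(seed=1)
sites = np.array(["W-01", "W-02", "W-03", "W-04", "W-05", "W-06"])
weights = np.array([1.0, 1.0, 2.0, 2.0, 4.0, 4.0])  # e.g., larger wetlands weighted more heavily

simple_draw = rng.choice(sites, size=3, replace=False)
weighted_draw = rng.choice(sites, size=3, replace=False, p=weights / weights.sum())

print("Simple random draw:   ", simple_draw)
print("Weighted (prob.) draw:", weighted_draw)
```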

For driving questions and decisions that can be addressed with L1 data, an exhaustive survey is often most appropriate. Budget constraints usually preclude exhaustive surveys using L2 or L3 methods, unless the scope of the survey is small. A targeted design is required to track changes in condition for fixed sites, such as reference sites, over time. Concerns about the overall condition of aquatic resources for large areas are usually best addressed using a probabilistic sampling design. USEPA provides online help with probabilistic designs for aquatic resources.

Sampling designs should consider the need to calibrate data collection methods and to train monitoring personnel. Careful control of systematic error is a hallmark of successful monitoring.

Step 7: Information Development

This is the most technically demanding and expensive element of the WRAMP Framework. It involves archiving maps of projects and sampling sites, collecting data, and analyzing and interpreting the data. Each of these activities requires careful attention to many details that differ from one monitoring and assessment effort to another. Only the most basic aspects of the activities are noted below.

Step 8: Results and Assessment

The monitoring results will consist of the finalized project and sample site maps, the finalized data, and a report detailing the other outputs of WRAMP Steps 1-7. All of these outputs for one monitoring period or cycle comprise an assessment.
The results should be formatted to directly address the driving regulatory or management question or decision. If the purpose of the assessment is to answer a question, a finding of yes or no is ideal, with a clear presentation of the supporting evidence. If the answer is uncertain, the likelihood of yes or no should be explained, based on the monitoring results. Equivocal findings should be accompanied by recommendations to revise the monitoring plan to increase the certainty of its findings. This can involve revising any of the outputs from WRAMP Steps 1-7. If the purpose of the assessment is to support a decision, its findings must be formatted to fit neatly into the decision process. For example, the findings could be a narrative, one or more graphs or formulas, or a table of numerical values. Formatting the assessment to fit the decision process can involve input from the decision makers.

Step 9: Storage and Delivery

The finalized data sets and assessment report should be uploaded into the interactive repositories in California EcoAtlas that were created by uploading the project map or ambient sample site maps into Project Tracker (Step 7). Water quality data collected using protocols provided by the Surface Water Ambient Monitoring Program (SWAMP) can be uploaded into CEDEN, the California Environmental Data Exchange Network. Other datasets should be uploaded as digital flat files from a word processor program, such as Word, or from a spreadsheet program, such as Excel. This makes the monitoring data and assessment report readily accessible to the public through simple spatial queries using EcoAtlas, or by querying the online Project Tracker database. Project Tracker can also be used to deliver the finalized datasets and assessment reports to clients, sponsors, and other interests.

An alternative to uploading datasets and reports directly into Project Tracker is to enable Project Tracker to access them from a separate internet server using web services. Web services provide a standard means of interoperating between software applications running on a variety of platforms and frameworks. One advantage of using web services is that the datasets and reports can be delivered through Project Tracker without having to exist within the Project Tracker database. The Landscape Profile Tool of the California EcoAtlas uses web services to access data from a variety of sources, including CEDEN. For delivering monitoring data and assessment reports through Project Tracker using web services, contact the EcoAtlas development team.
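As a generic illustration of the web-services pattern described above, the sketch below retrieves a dataset from a separate server over HTTP so that another application can present it without storing a copy. The endpoint URL, query parameter, and response structure are hypothetical placeholders; the actual service address and format would come from the EcoAtlas development team, so this is a pattern sketch rather than a working Project Tracker integration.

```python
# Generic web-services pattern: fetch a dataset from a separate server so
# another application can deliver it without storing a copy in its own
# database. The URL, parameter, and JSON structure are hypothetical.
import requests

ENDPOINT = "https://example.org/api/monitoring-results"  # placeholder URL

response = requests.get(ENDPOINT, params={"project_id": "DEMO-001"}, timeout=30)
response.raise_for_status()
payload = response.json()  # assumes the service returns JSON

for record in payload.get("results", []):
    print(record)
```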

Step 10: Driving Concerns

The final step in the WRAMP Diagram is to consider how well the assessment has addressed the Driving Concerns (Step 1), whether those concerns need to be revised, and whether there are new concerns that should be addressed through further monitoring and assessment using WRAMP.

 

(Updated 4/29/16)

 
