Alaska Resources Library & Information Services
Susitna‐Watana Hydroelectric Project Document
ARLIS Uniform Cover Page
Title:
SuWa 296
Comments on Revised Study Plan for Susitna-Watana Hydroelectric
Project, FERC Project No. 14241
Author(s) – Personal:
Tom Crafford (writer of cover letter)
Author(s) – Corporate:
Alaska Department of Natural Resources. Office of Project Management and Permitting
Alaska Department of Environmental Conservation
Alaska Department of Fish and Game
AEA‐identified category, if specified:
AEA‐identified series, if specified:
Series (ARLIS-assigned report number): SuWa 296
Existing numbers on document: Susitna-Watana Hydroelectric Project document number 296
Published by: [Anchorage, Alaska : Alaska Energy Authority, 2013]
Date published: January 17, 2013
Published for: United States. Federal Energy Regulatory Commission
Date or date range of report:
Volume and/or Part numbers:
Final or Draft status, as indicated:
Document type: letter with attachments
Pagination: 68 pages in various pagings
Related work(s): Comments to: Initial Study Report. (SuWa 223)
Pages added/changed by ARLIS:
Notes:
Attachment 2:
Guidance document for writing a Tier 2 Water Quality Monitoring Quality Assurance Project Plan
(QAPP). -- Draft, Rev. 0.
All reports in the Susitna‐Watana Hydroelectric Project Document series include an ARLIS‐
produced cover page and an ARLIS‐assigned number for uniformity and citability. All reports
are posted online at http://www.arlis.org/susitnadocfinder/
DEPARTMENT OF NATURAL RESOURCES
OFFICE OF PROJECT MANAGEMENT AND PERMITTING
SEAN PARNELL, GOVERNOR
17 January 2013
Ms. Kimberly Bose, Secretary
Federal Energy Regulatory Commission
888 First Street, NE
Washington, DC 20426
Subject: Comments on Revised Study Plan for Susitna-Watana Hydroelectric Project, FERC
No. 14241
Dear Ms. Bose:
Please find below the consolidated comments from the Alaska Resource Agencies on the Revised Study Plan for the Susitna-Watana Hydroelectric Project (FERC Project No. 14241).
The State of Alaska is committed to working with the Alaska Energy Authority (AEA) and other stakeholders throughout the Federal Energy Regulatory Commission's (FERC) Integrated Licensing Process (ILP). As such, the State's agency staff is available to work collaboratively with FERC, the Project Proponent, and other agencies and stakeholders to achieve quick resolution of any remaining identified information needs prior to this season's fieldwork. Please do not hesitate to contact my office if I can be of service in facilitating resolution of any outstanding issues prior to this field season.
The State remains a strong proponent of timely decision-making and looks forward to working collaboratively with FERC and all stakeholders through this process, as well as through any subsequent permitting of the proposed project.
Sincerely,
Tom Crafford, Director
Office of Project Management and Permitting
550 W. 7TH AVENUE, SUITE 1400
ANCHORAGE, ALASKA 99501
PH: (907) 269-8431 / FAX: (907) 334-8918
tom.crafford@alaska.gov
cc:
Daniel Sullivan, Commissioner, Department of Natural Resources
Cora Campbell, Commissioner, Department of Fish and Game
Larry Hartig, Commissioner, Department of Environmental Conservation
Ed Fogels, Deputy Commissioner, Department of Natural Resources
Joseph Balash, Deputy Commissioner, Department of Natural Resources
Kelly Hepler, Special Projects Coordinator, Department of Fish and Game
The Alaska Departments of Environmental Conservation (ADEC) and Fish and Game (ADF&G)
provide the following comments on the Revised Study Plan (RSP) for the Susitna-Watana
Hydroelectric Project (FERC No. 14241).
I. ALASKA DEPARTMENT OF ENVIRONMENTAL CONSERVATION
The Baseline Water Quality Monitoring Sampling and Analysis Plan (SAP)/Quality Assurance Project Plan (QAPP) for the Susitna Hydro Project provided for review would not be acceptable for approval by ADEC at this time. Attached please find ADEC's Quality Assurance Plan Review Checklist with specific comments indicating which elements require revision before the SAP/QAPP can be considered for approval by the ADEC Division of Water (DOW). A draft guidance document for writing a Tier 2 Water Quality Monitoring Quality Assurance Project Plan (QAPP) is also attached to guide the project proponent in submitting the SAP/QAPP to ADEC.
A sample of the additional information needed for ADEC approval of the QAPP is summarized
below. Please refer to the attached QAPP Review Checklist for the Susitna Hydro Project Baseline
WQ Monitoring Sampling and Analysis Plan QAPP for the complete list of comments providing
details and further guidance.
QA Management
It is important to note that the submitted QAPP is not clear as to which single individual is ultimately responsible for the QAPP. The information as presented does not clearly characterize the responsibilities of project personnel; it also appears that QA management is not independent from project management. QA management should be completely independent from project management in order to maintain the integrity of the process. Clear lines of management authority must be defined, including: 1) the line of management authority, 2) the line of data reporting responsibility (this includes relevant sampling and/or lab contractors/subcontractors), and 3) an independent line of quality assurance authority. See the example in the attached "Guidance for Tier 2 Water Quality Monitoring QAPP Rev 0. Section A.4."
Data Management Process
Since this is a complex project with multiple individuals responsible for various components, it is
also critical that the data management process be described in sufficient detail to ensure all
responsible individuals are fully knowledgeable of their individual duties and responsibilities and
how they integrate with the overall project data management scheme. The QAPP needs to
characterize in detail the project’s data management process tracing the path of the data from
generation to their final use or storage [e.g., from field measurements and sample
collection/recording through transfer of data to computers (laptops, data acquisition systems, etc.),
laboratory analysis, data validation/verification, QA assessments and reporting of data of known
quality to the respective ADEC Division of Water Program Office]. The data management discussion must also address the control mechanisms for detecting and correcting errors.
Data Acquisition
The QAPP lacks a strong justification for the very limited temporal data set proposed for the analytes to be measured in the various sample matrices (water, groundwater, soil/sediment, and fish tissue). The QAPP should provide a clear rationale for the monitoring project design and the assumptions used to develop the design. Please provide sufficient justification that the data set is adequate to reliably characterize the Susitna drainage for development of a model of dam construction and post-dam construction impacts.
Site Selection
Site selection rationale is generally addressed, but better clarity is needed in characterizing the specific rationale for each type of sample matrix/analyte. A table format would be easier to follow than a narrative description of the sites, analytes, and sample matrices to be measured.
Sample Frequency
Sample frequency is also addressed, but it is unclear how many total samples are planned beyond statements such as "monthly, each sampling event, one survey-summer," etc. This is confusing, as it does not identify the number of samples scheduled for collection per site/analyte or whether the number of planned samples is adequate to characterize the watershed sufficiently for reliable model development. Please revise accordingly before submitting the QAPP for approval.
Criteria for Measurement
The QAPP should state and characterize the Measurement Quality Objectives (MQOs) as to the applicable action levels or criteria for each parameter measured (precision, bias, comparability, detectability (MDL and PQL), and data completeness) and provide an appropriate definition and algorithm for each. Some project MQOs, as well as the applicable most restrictive Alaska Water Quality Standards (AWQS) for each analyte/sample matrix, are missing or not adequately defined.
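For illustration only, common formulations of these algorithms are sketched below; these are standard textbook definitions rather than ADEC-prescribed forms, and the exact definitions and acceptance limits should be confirmed against the attached checklist and guidance document.

\[
\text{Precision (RPD)} = \frac{|x_1 - x_2|}{(x_1 + x_2)/2} \times 100\% \quad \text{for replicate results } x_1 \text{ and } x_2
\]
\[
\text{Accuracy (\% recovery)} = \frac{C_{\text{spiked result}} - C_{\text{unspiked result}}}{C_{\text{spike added}}} \times 100\%
\]
\[
\text{Completeness} = \frac{\text{number of valid results obtained}}{\text{number of results planned}} \times 100\%
\]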
Include the measurement method for each parameter (note: methods must be EPA Clean Water Act (CWA) approved for water/wastewater work for all water quality methods, unless the applicable drinking water method has a more restrictive AWQS than the applicable water/wastewater AWQS). Some of the methods are specified in the QAPP's Quality Control section; however, a number of the proposed methods are not acceptable for water/wastewater analysis under the EPA CWA and ADEC AWQS regulations. Examples of proposed methods that would not be acceptable are given below; please refer to the attached QAPP Review Checklist for more detailed comments and guidance.
• Proposed metals analytical methods 6010B and 6020A are not acceptable for EPA CWA water/wastewater work. Select only EPA CWA water/wastewater methods of analysis with adequate sensitivity.
• Fecal coliform method EPA 1604 is drinking water (DW) approved but not EPA CWA approved for water/wastewater. Clarify which method is applicable and for what compliance purpose.
• Mercury in water – Method 7470A is not acceptable for EPA CWA water/wastewater analysis.
• Specific analysis methods are missing for DO, pH, temperature, turbidity, redox potential, color, and residues. Provide a specific EPA CWA approved method of analysis for each of these parameters.
• Radionuclides – specify which method applies to which specific radionuclide.
Before submitting the QAPP to ADEC for approval, it is recommended that all proposed methods of analysis be reviewed to ensure they are appropriate for the applicable sample matrix and have adequate measurement sensitivity. Ensure that the appropriate precision and accuracy acceptance criteria are specified.
Additionally, no specific numeric regulatory or guidance standards are specified for each pollutant and sample matrix against which sample results will be compared to assess compliance, or against which future measurement results will be assessed during and/or after dam construction. These must be specified for all analytes/sample matrices.
Please clarify that project precision is to be assessed via replicate sample measurements, not sample duplicate measurements, and revise the precision acceptance criteria limits as applicable.
Historical Data/Nondirect Measurements
The QAPP mentions that some USGS data from stations along the Susitna drainage will be used. The QAPP needs to define how the reliability of these data will be assessed for use in the project.
It appears that some historical data may be used for qualitative assessment only. The terms and conditions of how, when, where, and why must be defined if the data are to be used, especially if the data are of unknown or questionable reliability.
The issue of historical data, and how reliable those data are, needs to be adequately addressed. Section 2.0 of the QAPP provides an overview of the project and mentions that large amounts of data were collected in the 1980s, as well as the availability of other data (USGS, etc.) that will or may be used to augment the proposed project monitoring data to develop a model. However, no summary data is provided in the QAPP. The QAPP states, "A comprehensive data set for the Susitna and tributaries is not available." If historical data are intended to be used, it would appear critical that development of a comprehensive data set be a key component of this QAPP, along with a critical data quality assessment of the reliability of the historical data for use toward the project's goals.
Assessments and Oversight
Frequency and occurrence of all assessments must be specified in the QAPP. Responsibility for scheduling and conducting audits, issuing report findings, and monitoring corrective actions lies with the Project QA Officer (QAO). The Project QAO must have sole responsibility for all assessments performed. The TT Technical Lead may neither perform nor direct audits. The QAO must be completely independent from direct management of project monitoring operations and from the TT Technical Lead and TT PM. The Project QAO may delegate specific QA duties to other staff; however, such staff work only under his or her direction. Please revise the QAPP accordingly.
The QAPP mentions audits but provides no specifics. This section must identify assessment types,
frequency and acceptance criteria. Please refer to the attached QAPP Review Checklist for
assessment requirements.
Additional Comments
Please refer to the attached QAPP Review Checklist for the Susitna Hydro Project Baseline WQ
Monitoring Sampling and Analysis Plan QAPP for the complete list of ADEC comments providing
details and further guidance.
II. ALASKA DEPARTMENT OF FISH AND GAME
The Alaska Department of Fish and Game (ADF&G) reviewed the Revised Study Plan (RSP)
provided by the Alaska Energy Authority (AEA) on December 18, 2012. On December 31, 2012,
the Federal Energy Regulatory Commission (FERC) stated that 13 of the 58 RSPs needed
additional information and issued a modified schedule for completing the requested information.
The following comments are submitted on all the RSPs pertinent to the mandate of ADF&G. We
look forward to reviewing the additional information on the 13 RSPs when it becomes available.
7.6 Ice Processes in the Susitna River Study
Overall, we agree with the general approach.
7.6.4.7. The study indicates “The model will also predict ice cover stability, including potential for
jamming, under load-following fluctuations.” It is not clear if the results will describe the depth
of ice under project operations compared to baseline conditions or the potential to induce ice scour
(and where) under load-following fluctuations. These effects may have profound impacts on
available fish habitat and successful incubation.
From the study description, it appears the model will primarily predict physical conditions (e.g., ice decay, ice cover formation, potential for breakup jams), which may indirectly provide information
on potential impacts to fish and aquatic resources but may lack a direct causal relationship. For
example, how will winter load-following fluctuations impact burbot which have fairly specific
winter spawning and over-wintering habitat needs?
8.5. Fish and Aquatics Instream Flow Study
8.5.1.2. Sufficient time and discussion should be planned to select study areas and sampling
procedures with the Technical Working Groups (TWG) due to the large study areas, number of
affected resources and variety of sampling methods to be evaluated.
In addition to the metric of water velocity within study area subdivisions over a range of flows during seasonal conditions, we also recommend collecting water depth information for the same areas.
The Decision Support System-type framework that is proposed to conduct a variety of post-
processing comparative analyses should also include information/linkages to other pertinent
ecological data, such as water temperatures and turbidity. This would enable comparison of
different project operation scenarios to baseline conditions.
8.5.2.1. This section provides a good summary of existing instream flow, fish habitat, and aquatic
resource information for the Susitna River basin.
8.5.4.1. While Stalnaker (1995) is an excellent resource on instream flow assessments, it is a primer that provides an introduction to the science. A more thorough report, better suited as a reference guide, is Bovee et al. (1998), "Stream Habitat Analysis Using the Instream Flow Incremental Methodology," USGS/BRD Information and Technology Report 1998-0004.
It will be important for the Instream Flow System (IFS) framework to have the ability to compile information for cumulative evaluations as well as to provide for independent habitat-specific evaluations (e.g., for a specific location, target species, etc.).
8.5.4.2.1.2. We support the selection and location of the ten intensive study areas, also called focus areas, for the evaluation of multiple resource disciplines, with the intent to further evaluate them for appropriateness based on the results of the habitat mapping. We also support the selection of additional transects outside of the focus areas for evaluating flow-habitat response characteristics.
8.5.4.4.1.1. We concur with proposed methods and techniques following accepted USGS
guidelines including the use of a Standard Operating Procedure (SOP) to provide uniform survey
methods across all controls, maintenance of USGS local datum offsets to enable incorporation of
USGS gage data, and stream discharge measurements.
8.5.4.5.1.1. We support the use of site-specific habitat suitability criteria (HSC) collected for
identified target species and life stages. We encourage discussion on appropriate sample sizes,
contingencies, and other factors important to criteria development. Bootstrap analysis should be used only in situations where there is confidence that the sample is representative of the true population. One measure of this is whether the samples collected are representative of habitats across the surveyed area.
8.5.4.5.1.2. A summary of the pilot 2012-2013 winter habitat sampling results, along with recommendations for future winter sampling methods, is needed for review and discussion.
Further discussion is needed on the stranding and trapping study and analyses to clarify procedures
and expected results. We support the general approach but we are not clear on how the referenced
equation would be used to analyze affected resources.
We agree with the approach to develop fish species periodicity tables. After completion of the field studies, it is likely that new information on fish species and life stage timing will be learned that will need to be incorporated into the tables prior to final analyses.
8.5.4.7.1.2. We support the spatial analysis approach outlined as a starting point. Further
discussion will be needed on the details and how the data from multi-thread channels will be
compiled and aggregated.
8.5.4.7.1.3. Sensitivity analysis of the habitat modeling efforts will be a key to understanding
habitat response parameters and uncertainty and we support a thorough analysis. We look forward
to working cooperatively on the development of a Decision Support System.
9.5 Study of Fish Distribution and Abundance in the Upper Susitna River
In general, additional details regarding statistical design and analytical methods would strengthen
this study plan and enhance the ability to review and provide high-quality feedback. Plans should
identify the specific questions driving each main objective and detail how and to what level of
accuracy and precision these investigations are expected to inform questions of fish periodicity,
distribution, and abundance. Lacking the details of how field data will be reviewed and analyzed, it
is difficult to have confidence that the final results will provide reliable information. Adequately
detailed study plans will increase the likelihood that data collected will provide robust information
to predict potential project impacts and understand baseline conditions.
9.5.4.3.1 Objective 1: Fish Distribution, Relative Abundance, and Habitat Associations, Page 9-12
How will relative abundance data be used (what usable information will it provide above
presence/absence)? Given the current sampling scheme, how will the extent of variability in CPUE
be assessed in order to determine the effort required to detect differences in relative abundance by
species, habitat, season, etc.? At what precision are differences in relative abundance between
sites, habitats, and species and life stages likely to be detectable?
9.5.4.3.1 Task B: Relative Abundance, Page 9-13
Capture efficiency varies by species/life stage, habitat and gear type. Comparisons of CPUE
between gear types will not provide reliable information. Collecting CPUE using multiple gear
types will make comparisons between habitat types (or species, sites or life stages) unrealistic, if
each habitat type (or other factor) is sampled with different gear.
If relative abundance efforts are unlikely to provide robust information, perhaps resources should
be reallocated to increase radio tagging efforts, which are likely to result in high quality
information.
9.5.4.3.1 Task C: Fish-Habitat Associations, Page 9-13
What statistical methods are proposed for the "… analysis of fish presence, distribution, and density by mesohabitat type by season"? Will the current sampling scheme provide adequate
sample sizes for meaningful comparisons, appropriate statistical power and accurate results?
9.5.4.3.2 Task B: Describe seasonal movements using biotelemetry, Page 9-14
“…Up to 30 radio transmitters will be implanted in selected species…” Please clarify if this is 30
per species or 30 total.
ADF&G suggests directing as much effort as possible toward radio tagging, as it is likely to provide more usable information on fish habitat use than PIT tagging and relative abundance estimates.
9.5.4.3.2 Task C: Describe juvenile Chinook salmon movements, Page 9-14
“…All juvenile Chinook salmon of taggable size need to be tagged to obtain sufficient sample
size.” What is the sample size goal? For what analytical method? Are there alternative analytical
methods if sample size goals are not met?
9.5.4.4.5 Trot Lines, Page 9-17 states:
“Trot lines are typically… with a multitude of baited hooks… anchored at both ends.”
Trotlines are lethal. Hoop traps are a preferable method of fish capture where they can be used
effectively. For clarification and details on trotline methods, please visit ADF&G’s website at:
http://www.adfg.alaska.gov/index.cfm?adfg=anglereducation.burbot
9.5.4.4.6 Snorkel Surveys, Page 9-18
The use of snorkel surveys to develop accurate, reliable calibration factors for comparison between
capture methods is likely to require large sample sizes and long term datasets. How will
meaningful calibration factors be developed in this study?
9.5.4.4.7 Fyke/Hoop Nets, Page 9-18
3-4 foot diameter fyke nets are routinely used for juvenile salmon, even in small tributaries. One-
foot diameter seems small.
9.5.4.4.10 Out-Migrant Trap, Page 9-19
Why 48 hours on, 72 hours off? What are the information and sample size goals for out-migrant
capture? Is this to be used for timing of out-migration only or will abundance estimates be
generated as well (mark/recapture)? Might attendance of the trap be altered as the out-migration
progresses in order to maximize sampling during peak out-migration?
9.5.4.4.11 Fish Handling, Page 9-20
This section states that five fish per species/age class per sampling site will be sampled for
stomach contents, and refers readers to 9.8.4.7 for details. 9.8.4.7 contains no information.
9.8.4.11, Page 9-120 in the river productivity plan states that a total of eight fish per species/age
class will be sampled for stomach contents. Will stomach contents be sampled in five fish per
species/age class in each of the 18 river productivity sites (144 total stomachs per species/age
class)? Or will five fish per species/age class be sampled in each of the 27 fish distribution sites (135 total stomachs per species/age class)? Clarification is needed.
9.5.4.4.12 Remote Fish Telemetry, Page 9-22
We suggest the use of long-life tags where possible, in order to maximize the information return per tag.
9.6 Study of Fish Distribution and Abundance in the Middle and Lower Susitna River
In general, additional details regarding statistical design and analytical methods would strengthen
this study plan and enhance the ability to review and provide high-quality feedback. Plans should
identify the specific questions driving each main objective and detail how and to what level of
accuracy and precision these investigations are expected to inform questions of fish periodicity,
distribution, and abundance. Lacking the details of how field data will be reviewed and analyzed, it
is difficult to have confidence that the final results will provide reliable information. Adequately
detailed study plans will increase the likelihood that data collected will provide robust information
to predict potential project impacts and understand baseline conditions.
9.6.1 Study Goals and Objectives 1), Page 9-39
How will relative abundance data be used (what usable information will it provide above
presence/absence)? Given the current sampling scheme, how will the extent of variability in CPUE
be assessed in order to determine the effort required to detect differences in relative abundance by
species, habitat, season, etc.? At what precision are differences in relative abundance between
sites, habitats, species and life stages likely to be detectable?
9.6.4.3.1 Task B: Relative Abundance, Page 9-45
Capture efficiency varies by species/ life stage, habitat and gear type. Comparisons of CPUE
between gear types will not provide reliable information. Collecting CPUE using multiple gear
types will make comparisons between habitat types (or species, sites or life stages) unrealistic, if
each habitat type (or other factor) is sampled with different gear.
If relative abundance efforts are unlikely to provide robust information, perhaps resources should
be reallocated to increase radio tagging efforts, which are likely to result in high quality
information.
9.6.4.3.1 Task C: Fish Habitat Associations, Page 9-46
What statistical methods are proposed for the "… analysis of fish presence, distribution, and density by mesohabitat type by season"? Will the current sampling scheme provide adequate
sample sizes for meaningful comparisons, appropriate statistical power and accurate results?
9.6.4.3.2 Task B: Describe seasonal movements using biotelemetry, Page 9-46 & 47
This section indicates that up to 1,000 PIT tags per species will be deployed (8 resident spp. + 3-5 salmon spp. = 11,000-13,000 tags total). How were the tagging goals developed?
ADF&G suggests directing as much effort as possible toward radio tagging, as it is likely to provide more usable information on fish habitat use than PIT tagging and relative abundance estimates.
9.6.4.3.3 Task C: Determine juvenile salmonid diurnal behavior by season. Page 9-47
Working in open leads is likely to be much more dangerous than working through holes drilled in
stable ice. Extreme caution should be used when planning for any work in open leads.
9.6.4.3.4 Objective 4: Document Winter Movements and Timing and Location of Spawning for
Burbot, Humpback Whitefish, and Round Whitefish, Page 9-48
How many fish per species per site will be targeted for capture to determine gonadal development?
9.6.4.4.4 Trot Lines, Page 9-51
Trotlines are lethal. Hoop traps are a preferable method of fish capture where they can be used
effectively. For clarification and details on trotline methods, please visit ADF&G’s website at:
http://www.adfg.alaska.gov/index.cfm?adfg=anglereducation.burbot
9.6.4.4.6 Snorkel Surveys, Page 9-52
The use of snorkel surveys to develop accurate, reliable calibration factors for comparison between
capture methods is likely to require large sample sizes and long term datasets. How will
meaningful calibration factors be developed in this study?
9.6.4.4.7 Fyke/Hoop Nets, Page 9-52
3-4 foot diameter fyke nets are routinely used for juvenile salmon, even in small tributaries. One-
foot diameter seems small.
9.6.4.4.10 Out-Migrant Traps, Page 9-53
Why 48 hours on, 72 hours off? What are the information and sample size goals for out-migrant
capture? Is this to be used for timing of out-migration only or will abundance estimates be
generated as well (mark/recapture)? Might attendance of the trap be altered as the out-migration
progresses?
9.6.4.4.14 Fish Handling, Page 9-58
This section states that five fish per species/age class per sampling site will be sampled for
stomach contents. Section 9.8.4.11, Page 9-120 in the river productivity plan states that a total of
eight fish per species/age class will be sampled for stomach contents. Will stomach contents be
sampled in five fish per species/age class in each of the 18 river productivity sites (144 total
stomachs per species/age class)? Or will five fish per species/age class be sampled in each of the 27 fish distribution sites (135 total stomachs per species/age class)? Clarification is needed.
9.6.4.5 Minnow Traps, Page 9-61
Species/age classes targeted by minnow trapping are likely to occupy different habitats than those
targeted by trot lines. Co-locating minnow traps and trot lines in the same hole is likely to be less
effective than locating each method separately in targeted locations.
9.7 Salmon Escapement Study
9.7.4.1 Fish Capture, Page 9-86
Removing fishwheels at Curry in early September likely misses a substantial portion of the coho and chum runs. Consideration should be given to operating the fishwheels through September.
9.7.4.1.1 Fish Capture, Page 9-87
Regarding the newly planned fishwheel(s) to be located in Devils Canyon (RM 150-151): What are the tagging and other goals at this location? Are the tagging goals listed for all middle river fishwheels (Curry + Devils Canyon) combined?
What hours will the fishwheels be operated daily?
9.7.4.2.6 Boat and Ground Surveys, Page 9-95
What is the purpose of obtaining 2-meter resolution for locations of individual salmon "suspected" to be spawning?
9.7.4.4 Objective 4: Use available technology to document salmon spawning locations in turbid
water in 2013 and 2014, Page 9-96
How will net sampling salmon to determine the degree of sexual maturation reduce confusion
between holding sites and spawning locations? Holding salmon could still be ripe. Is pumping eggs
from gravel to confirm spawning necessary? If so, why? Is it worth disturbing spawning salmon,
potentially influencing the outcome of their spawning, to obtain this information?
9.8 River Productivity Study
9.8.4.3 Benthic Macroinvertebrate Sampling, Page 9-112
How were the number of replicates and total sample size (5 reps x 18 sites) determined and at what
level are differences over time within and among sites likely to be detectable?
9.8.4.7 Conduct a trophic analysis, using trophic modeling and stable isotope analysis to describe
the food web relationships in the current riverine community within the middle and upper Susitna
River, Page 9-116
This section is missing.
9.8.4.11 Characterize the invertebrate compositions in the diets of representative fish species in
relationship to their source, Page 9-119 & 120
Clarification on sampling strategy for stomach contents is needed (see comments above).
Additionally, methods for obtaining and preserving stomach contents are not described, but will
likely determine the attainable level of taxonomic resolution for prey items.
9.9 Characterization and Mapping of Aquatic Habitats
In general, we agree with the approach but would like to further discuss some details of the protocols. For example, it is not clear how runs would be distinguished from riffles or glides. What is the definition of the active channel surface?
Regarding the following statement: "In addition, Susitna River mean daily discharge will be
obtained from the nearest downstream USGS stream gauge and entered onto each day’s survey
forms." We recommend inclusion of the Susitna River mean daily discharge at the Gold Creek
streamgage to provide a means for comparison across different sampling areas and days.
For the tiered data collection classification protocol, because Tier I and Tier III are described but Tier II is not, we are unclear whether there are two or three categories. How will it be determined whether a site will be selected for Tier I versus Tier III? Will Tier I data also be collected under the Tier III approach? If not, further discussion on these protocols will be needed; for example, gradient was not included in the Tier III protocol and we recommend that it be included.
It is also unclear what is meant by the following description: "To check the general replicability of the habitat type identification, an independent reviewer conducted video mapping of randomly selected ground-verified segments representing 20 percent or more of three PHABSIM reaches." Also, we were not aware that PHABSIM reaches had been identified.
9.11 Study of Fish Passage Feasibility at Watana Dam
9.11.4 Task 4: Develop Concepts, Page 9-188
Explain “fatal flaw analysis” and list the “basic criteria” for fish passage concepts.
10.5. Moose Distribution, Abundance, Productivity, and Survival
10.5.4. Study Methods
Continuous amendments of the study plan as this project was underway resulted in inconsistent
wording and errors in tense (i.e. planned future work vs. actual completed work). Although
awkward, these errors are not considered to be significant.
As the principal investigator for this project, DWC has found it necessary to modify the methods
as follows:
Page 10-7: “aerial surveys will be conducted weekly” – Please correct. Surveys will be
conducted every two weeks during this time as shown in Table 10.5-1. Strike “weekly” and
replace it with “every two weeks”.
Page 10-10: "daily monitoring during calving (May 15-31) each year" – Strike the dates May 15-31. We are still working to determine the specific dates of peak calving.
10.17 Population Ecology of Willow Ptarmigan in Game Management Unit 13
10.17.4. Study Methods
As the principal investigator for this project, DWC has found it necessary to modify the methods
as follows:
Page 10-146: Use 4-6 capture sites versus the 3 mentioned in the first paragraph of this
section.
Page 10-147: Will not use mist nets to capture ptarmigan. Will use the Coda net gun as
listed in addition to noose carpets.
Page 10-147: Strike the sentence that reads: "Radios will transmit in the frequency range of 148.000 MHz." As it turns out, ADF&G will be using a different frequency range. Since
the Department is statutorily required to keep telemetry radio frequencies of monitored
species confidential, simply striking the entire sentence is preferred.
14.5.4.4 Subsistence Mapping
RSP Section 14.5.4.4 lists eight communities to be included in subsistence mapping efforts: Cantwell, Chase, Healy, Talkeetna, Lake Louise, McKinley Park, Trapper Creek, and Petersville. However, as a component of the baseline subsistence harvest survey, the ADF&G Division of Subsistence will map one year of subsistence activities in the communities of Cantwell, Chase, Chitina, Gakona, Kenny Lake, McCarthy, Skwentna, Susitna, Talkeetna, and Trapper Creek (2013) and Copperville, Glennallen, Gakona, Lake Louise, Nelchina, Mendeltna, Paxson, Tazlina, Tolsona, and Tonsina (2014).
When conducting baseline subsistence harvest surveys, it is standard practice for the ADF&G Division of Subsistence to map all subsistence activities that occurred during the study year. Division of Subsistence baseline surveys map search areas by month and map harvest locations for all subsistence resources. As a component of its mapping activities for Susitna-Watana, the Division of Subsistence will also map access routes for subsistence activities.
RSP Table 14.5.5, Communities Selected for Traditional Knowledge, Subsistence Mapping, and Household Survey, should be revised to acknowledge the mapping component of the baseline harvest surveys in the ADF&G-identified study communities. Mapping done as a component of the baseline subsistence harvest surveys should be labeled as "one-year mapping" to differentiate it from the historical mapping being done in the communities already listed.
It should also be noted that the Census Designated Place of Susitna North, located between Talkeetna and Trapper Creek with a population of 1,260 as of the 2010 Census, has not been included in the study.
14.5.4.5 Traditional and Local Knowledge Interviews
RSP Section 14.5.4.5 lists eight communities to be included in traditional and local knowledge interview efforts: Cantwell, Chickaloon, Chitina, Copper Center, Eklutna, Gakona, Gulkana, and Tyonek. However, as a component of the baseline subsistence harvest survey, the ADF&G Division of Subsistence will conduct local traditional knowledge (LTK) interviews in the communities of Cantwell, Chase, Chitina, Gakona, Kenny Lake, McCarthy, Skwentna, Susitna, Talkeetna, and Trapper Creek (2013) and Copperville, Glennallen, Gakona, Lake Louise, Nelchina, Mendeltna, Paxson, Tazlina, Tolsona, and Tonsina (2014).
When conducting baseline subsistence harvest surveys, it is standard practice for the ADF&G Division of Subsistence to select approximately five households in each community for participation in LTK interviews. These interviews are necessary to provide deeper context for the harvest survey results.
RSP Table 14.5.5, Communities Selected for Traditional Knowledge, Subsistence Mapping, and Household Survey, should be revised to acknowledge the LTK interview component of the baseline harvest surveys in the ADF&G-identified study communities.
It should also be noted that the Census Designated Place of Susitna North, located between Talkeetna and Trapper Creek with a population of 1,260 as of the 2010 Census, has not been included in the study.
Appendix 2, Public Comment Letters Part 2
Regarding the Glennallen Field Office of the Bureau of Land Management comments on subsistence (Comments SUB-04, SUB-05, and SUB-06):
RSP Attachment 14-3 Household Harvest Survey Key Informant Interview Protocol (Draft) will be
modified to include questions about perceived impacts of added users as a result of any increased
access opportunities that may occur. A question will be added to the Cantwell interview protocol
to address community population growth and its perceived impacts on subsistence hunting.
Regarding The Center for Water Advocacy comments SUB-01 and SUB-02:
Attachment 14-3 Household Harvest Survey Key Informant Interview Protocol (Draft) will be
modified to include a question that inquires about local knowledge of in-stream water flows.
General Response to Public Comments Regarding the Role of Baseline Subsistence Harvest
Surveys in Facilitating Impact Analysis
Data obtained from ADF&G Division of Subsistence baseline subsistence harvest surveys
establishes baseline indicators to help facilitate impact analysis. Tools to facilitate impact analysis
include subsistence use area mapping, assessment questions, and the community comments and
concerns questions in the Household Harvest Survey Instrument (RSP Attachment 14-2).
Appendix 3, Informal Comment Response Table, Section 14 Subsistence Resources
ADF&G Division of Subsistence has no additional comments.
Health Impact Study
The RSP should note that ADF&G Division of Subsistence baseline subsistence harvest surveys will
also include a Health Impact Component.
Attachments:
Attachment 1: ADEC Water Quality Monitoring Quality Assurance Project Plan (QAPP)
Review Checklist for the Susitna Hydro Project Baseline WQ Monitoring
Sampling and Analysis Plan QAPP
Provides additional comments, detail and guidance from the ADEC review of the Susitna QAPP
for Water Quality Monitoring.
Attachment 2: ADEC Draft Guidance for a Tier 2 Water Quality Monitoring QAPP, Rev. 0.
Provides formal guidance for submitting a QAPP to ADEC for approval.
ATTACHMENT 1
ADEC Water Quality Monitoring Quality Assurance Project Plan (QAPP) Review Checklist
For
Susitna Hydro Project Baseline WQ Monitoring Sampling and Analysis Plan QAPP
(13 pages)
STATE OF ALASKA DEPARTMENT OF ENVIRONMENTAL CONSERVATION
Division of Water WQSAR Program
DOW QAPP Checklist Tier 2 Page 1 of 13
January 15, 2009
ADEC Water Quality Monitoring
Quality Assurance Project Plan (QAPP) Review Checklist
The applicant must develop a QAPP for use in a proposed monitoring project. The QAPP will be used by all parties involved in the monitoring project as a road map for collecting valid monitoring data. Failure to follow the provisions in the QAPP will likely result in the invalidation of monitoring data and may result in a requirement for additional monitoring. Responsibility for conducting field monitoring, laboratory analysis, and data analysis in compliance with the QAPP rests with the respective project managers for sampling, laboratory analysis, and data analysis (note: this responsibility extends to any contracted field monitoring, lab, or data analysis vendor). Responsibility for diligent project oversight rests with the lead project manager/organization.
Project Title: Susitna Hydro Project Baseline WQ Monitoring Sampling and Analysis Plan QAPP
Date: November 7, 2012
Reviewed By: Richard Heffern, DEC WQ QA Officer Date: January 8, 2013
QA Summary Review Comment: This QAPP addresses each of the 24 EPA QAPP elements but follows its own format in providing the required project plan information. At times this can be confusing to review, since different critical elements are addressed under different headings. Some categories are described in depth; however, some key critical categories are minimally defined or not defined at all. This QAPP requires some significant revisions before it can be considered for regulatory approval. Specific comments are provided in the table below.
ELEMENT STATUS COMMENTS
A. Project Management Elements
Each page of document numbered and includes revision date and
document title
1. Title and Approval Sheet
Title
Organization’s name(s) implementing project
Effective date of plan: November 7, 2012
Printed name and dated signature of Organization's Overall Project Manager ? Appears there are multiple individuals of authority, but it is not clear which single individual is responsible for overall fiscal management and project management. Clarify. Page not signed/dated
Printed name and dated signature of Organization’s Project QA
Officer/Manager ? Page not signed/dated
Printed name and dated signature of ADEC DOW QA Officer ? Missing
Printed names and dated signatures of Regulatory Agency/s Project
Managers ? Missing
2. Table of Contents
Table of contents follows 24-element format – Follows EPA recommended format. However, some topics are addressed in different sections, which made QA review more complicated, as it required constant cross-referencing throughout the document (i.e., MQO elements are addressed in different parts of Section B instead of in Section A7, etc.). When the QAPP is revised, it would be helpful to follow the DEC guidance document attached to this QA review, "Guidance for Tier 2 Water Quality Monitoring QAPP Rev 0."
3. Distribution List
In table format list name, person’s job title, organization, email, and
phone # of all who receive the approved QAPP and subsequent
revisions (e.g., Project Manager, Project QA Officer, DEC Project
Manager, DEC QA Officer, Laboratory Project Manager or contact,
lead field sampler(s), and others involved with the sampling as
needed)
Missing the following name, title, and contact information:
All Laboratories involved and primary contacts
Regulatory review agencies primary
contacts/project managers (state, local and
federal)
All lead sampling staff
All QC lead staff
All QA managers
Single QA manager with ultimate QA authority
over project and with sole responsibility for QA
project management in charge of all QA
managers
DEC Water QA Manager/Officer
End data users
4. Project/Task Organization
In table format, identify key individuals and their responsibilities:
(data users, decision-makers, project manager, project QA officer,
laboratory manager, lead sampling supervisor, contractor/s,
subcontractor/s, etc.)
? The information presented is confusing with respect to overall project management: who reports to whom, who is ultimately in charge, how the various responsible staff identified are managed, and whom they manage. QA appears to be managed for different project parts by different staff; however, it is unclear to whom they report and who is ultimately responsible for QA. This person must be independent of any direct project management responsibilities except project QA. Clarify.
Organizational chart showing: 1) line of management authority, 2)
line of data reporting responsibility (this includes relevant sampling
and/or lab contractors/subcontractors), and 3) independent line of
quality assurance authority
Org chart missing. The description of project management is confusing. It is unclear who is ultimately responsible fiscally and for project management. QA management is confusing; it appears QA management is not independent from project management. It must be completely independent. Clear lines of management authority must be defined (who reports to whom), and likewise for QA as well as data reporting. See the example org chart in the attached "Guidance for Tier 2 Water Quality Monitoring QAPP Rev 0. Section A.4."
5. Problem Definition/Background and Project Objective/s
Clearly states problem(s) and/or decision(s) to be resolved
Provides sufficient historical, background and regulatory perspective relevant to the proposed monitoring project. If previous monitoring data exists, results are summarized and made relevant to the proposed monitoring project. ? Section 2.0 of the QAPP provides an overview of the project and mentions that large amounts of data were collected in the 1980s, as well as the availability of other data (USGS, etc.) that will or may be used to augment the proposed project monitoring data to develop a model. However, no summary data is provided in the QAPP. The QAPP states, "A comprehensive data set for the Susitna and tributaries is not available." It would appear critical that, if historical data is intended to be used, development of a comprehensive data set be a key component of this QAPP, as well as a critical data quality assessment of the reliability of the historical data for use toward the project's goals. The issue of historical data and how reliable the data is needs to be adequately addressed. Suggest including a table summarizing the historical data. See the attached document, "Guidance for Tier 2 Water Quality Monitoring QAPP Rev 0. Section A.5.2."
Provides overall objective(s) for study
6. Project/Task Description (SUMMARY ONLY)
Lists measurements to be made (in Table format) ? Would be helpful to include table of measurements
to be made identifying both field and lab
measurements and in what type of sample matrices.
See attached document, “Guidance for Tier 2 Water
Quality Monitoring QAPP Rev 0. Section A.6, Table
4.”
Briefly describe monitoring location/s Monitoring locations adequately described
Provide large scale introductory map showing relevant region of AK
and overall monitoring/sampling locations.
Lists sampling locations/frequency (in Table format) ? Sample locations and frequency are addressed in table format (QAPP Tables B1-1 and B1-2). Sample frequency is also addressed but is confusing, since it is unclear how many total samples are planned beyond statements such as "monthly, each sampling event, one survey-summer," etc. This is confusing as it does not identify the number of samples scheduled for collection per site/analyte or whether the number of planned samples is adequate to characterize the watershed sufficiently for reliable model development. Revise accordingly.
Site selection rationale is generally addressed, but better clarity is needed in characterizing the specific rationale for each type of sample matrix/analyte. A table format would be easier to follow than just a narrative description of the sites, analytes and sample matrices to be measured.
Are special personnel or equipment requirements necessary? ? Would be helpful to clarify the specialty equipment needed and the specialized personnel education/training needed, e.g., QA and QC specialists, special types of sampling equipment, etc.
Provides work schedule for implementation of project tasks (in
Table format)
Summarizes required project & QA records/reports (in Table format) ? Would be helpful to provide better clarity in
describing specific types of QA project
records/reports in table format.
7. Quality Objectives and Criteria for Measurement (in table
format as possible)
States overall Data Quality Objectives (DQOs). References
applicable regulatory/guidance documents (Alaska Water Quality
Standards, etc.) governing DQOs.
? This section needs clarification. For each analyte/sample matrix to be measured, provide the applicable most restrictive AWQS against which the sample analytes will be measured. This defines what the DQOs are for the project. This is generally discussed, but it is unclear what specific AWQS applies for each sample matrix analyte.
States and characterizes Measurement Quality Objectives (MQOs) as
to applicable action levels or criteria for each parameter measured
(precision, bias, comparability, detectability (mdl and pql) and data
completeness) in Table format. Provides appropriate definition and
algorithms for each. Note: Representativeness to be fully
characterized in section B1, Sampling Process Design. (Note: See
Guidance for Tier 2 Water Quality Monitoring QAPP Rev 0. Section
A.7).
?
Some project MQOs, as well as the applicable most restrictive AWQS for each analyte/sample matrix, are missing or not adequately defined, making it difficult to assess whether the chosen measurement method has sufficient detectability to measure reliably below the applicable most restrictive AWQS.
Revise MQO Table A4-1 to include the following:
Most restrictive/controlling AWQS for each
analyte/sample matrix
Specify detectability for each analyte/sample
matrix method of analysis [both method
detection limit (MDL) and practical quantitation
limit (PQL)].
Precision – Clarify that project precision is to be assessed via replicate sample measurements, not sample duplicate measurements, and revise precision acceptance criteria limits as applicable.
Project accuracy acceptance criteria limits should be defined by matrix spike and matrix spike duplicate (MS/MSD) results. In lieu of MS/MSD, lab control standards (LCS) may be used to define analyte/method accuracy acceptance criteria limits, but only for those analytes where MS/MSD analysis is not practical. Revise accordingly.
Missing algorithm for calculation of accuracy.
Revise accordingly.
Precision criteria for temperature should be stated not as ± 10% but as an absolute numerical value, e.g., ± 0.2°C. Revise accordingly.
Accuracy criteria for turbidity (5 NTU) are not acceptable at the lower measurement range. Specify numerical acceptance criteria (e.g., ± ? NTU) based upon the measurement range; for example, from 0–10 NTU, an acceptance criterion of ± 1.0 NTU. Revise accordingly for the different measurement ranges expected in the field.
Detectability for turbidity of 5 NTU is not acceptable. Since the AWQS is 5 NTU above background, the method must be capable of measuring turbidity reliably below 5 NTU, especially since this study is to document baseline water quality conditions and, depending upon the time of year when glacial meltwater is minimal, turbidity would likewise be low. Revise accordingly. Most portable turbidimeters (Hach, etc.) reliably measure turbidity at 1 NTU and below.
Include the measurement method (note: methods must be EPA CWA approved for water/wastewater work for all water quality methods, unless the applicable drinking water method has a more restrictive AWQS than the applicable water/wastewater AWQS). Some of these methods are specified in the QAPP's Quality Control section. However, a number of the proposed methods are not acceptable for water/wastewater analysis under the EPA CWA and ADEC AWQS regulations. Revise accordingly.
Some parameters listed are missing one or more of the required MQO criteria for the analyte matrix and measurement method. Additionally, no specific numeric regulatory or guidance standards are specified for each pollutant and sample matrix against which sample results will be compared to assess compliance, or against which future measurement results will be assessed during and/or after dam construction. These must be specified for all analytes/sample matrices.
The proposed TAH and TAqH methods of analysis should be only methods 624 and 625. Remove methods 602 and 610; these cannot adequately speciate/quantify TAH and TAqH analytes. A detectability of 31 µg/L does not have adequate sensitivity (the TAH AWQS is 10 µg/L and the TAqH AWQS is 15 µg/L, with some specific analytes having lower regulatory limits). Must list the individual components for both TAH and TAqH, as well as detectability, precision and accuracy for all specific TAH and TAqH analytes. Revise accordingly.
Proposed metals analytical methods 6010B and 6020A are not acceptable for EPA CWA water/wastewater work. Select only EPA CWA water/wastewater methods of analysis with adequate sensitivity. Revise accordingly.
Missing specific analysis methods for DO, pH,
temperature, turbidity, redox potential, color,
residues. Provide specific EPA CWA approved
method of analysis for each of these parameters.
Fecal coliform method EPA 1604 is DW approved but not EPA CWA approved for water/wastewater. Clarify which method is applicable and for what compliance purpose. Must use the most restrictive applicable AWQS standard for the appropriate method selection. Revise accordingly.
Radionuclides – specify which method applies to which specific radionuclide. Specify all MQO criteria and the applicable regulatory standards to be compared against.
Recommend reviewing all selected methods of analysis to ensure they are appropriate for the applicable sample matrix and have adequate measurement sensitivity. Ensure appropriate precision and accuracy acceptance criteria are specified. Revise accordingly.
Mercury in water – Method 7470A is not acceptable for EPA CWA water/wastewater analysis. Suggest using Method 1631E or another method of analysis with adequate sensitivity.
Project data completeness – the proposed data completeness is 95%. This needs clarification that the 95% is per analyte per project. Since the proposed number of samples to be collected is very limited, the question should be posed whether 95% is adequate. Better clarity is needed in the rationale/justification for such a limited temporal data set and why what is proposed is adequate to reliably characterize existing/pre-project water quality, sediment, and fish bioaccumulation conditions.
Included in MQO table for each measurement parameter the
applicable numeric Alaska Water Quality Standard (e.g.,
recreational/drinking water, aquatic life fresh water, etc).
Missing. Include in MQO table as mentioned above.
8. Special Training Requirements/Certification Listed
In table format identifies specific training and/or certifications for
key personnel and how/when it will be provided, documented, and
assured. Identifies location where records will be maintained.
? Would be helpful to clarify what specific training is required and will be provided, and for whom. Suggest using a table format similar to "Guidance for Tier 2 Water Quality Monitoring QAPP Rev 0, Section A.8, Table 7." In the proposed QAPP only a few people are included in the distribution list. All key lead personnel, sample team leads, QA staff, QC staff, managers, labs, etc. must have a copy available at all times, as the QAPP lays out the requirements for sample collection, sample analysis, data analysis, etc. Revise the QAPP accordingly.
9. Documentation and Records (in table format as possible)
Itemizes all documents and records to be produced (interim progress reports, final reports, audits, QAPP revisions, etc.). Provides a general description of the types of information to be documented and retained.
Lists information to be included in specific types of reports (e.g., field reports, lab reports, QA reports, DMR (permitted facilities only), etc.). Examples are:
Final report – summary, field reports, lab reports.
Field reports – field logs, field equipment calibrations, QC checks.
Lab reports – sample receipt log, sample prep and lab analysis logs, instrument printouts, sample results, results of QC checks, sample result summary, etc.
QA reports – Performance Evaluation (PE) Sample Reports, DMRQA, field audit reports, lab audit reports, data audit reports.
? This section does not address lab reports and the required content of all lab reports.
Need to include a requirement that lab reports provide a summary QA data page, all lab
results, data validation flags and explanations, all QC sample results for each sample
analysis batch, and their analyte-specific QC acceptance criteria limits, etc.
States requested lab turnaround time, if applicable Not addressed
Identifies written and electronic (CD/DVD/email) data reports to be
provided to ADEC ? Not addressed. Provide specifics.
Gives retention time and storage location of records and reports ? Retention time and location
is 5 years in the TetraTech Seattle office central file following expiration of the contract.
Revise to also specify document retention time/location for project records with the primary
company/agency commissioning the study, the Alaska Energy Authority (AEA). Since the
project will be used to determine background pre-dam construction conditions and the data
will be used to model projected impacts in years to come, the retention time should be
significantly longer. Revise accordingly.
B. Measurement and Data Acquisition
1. Sampling Process Design (in table format when possible)
Provides a clear rationale for the monitoring project design and the assumptions used to
develop the design. ? Missing a strong justification that the proposed, very limited temporal
data set of analytes to be measured in various sample matrices (water, groundwater,
soil/sediment, and fish tissue) is adequate to reliably characterize the Susitna drainage for
development of a model of dam construction and post-dam construction impacts. Sample
data sets vary from 1 sample/site for some analytes (fish and sediment) to 6 to 8 samples/site
for surface waters, with a more limited data set for groundwater monitoring.
Defines the parameters to be measured ? The QAPP needs to clarify which specific
radionuclides will be measured and by which applicable method. TAH and TAqH samples
must also be speciated to show applicability to the regulated AWQS.
Defines the type and number of samples required ? Would be helpful to clarify in table
format all sample analytes, the frequency of measurement, and the total samples per analyte
required for the project. Much of this information is already presented, but the total number
of samples for each site/sample analyte/sample matrix is missing.
Defines when, where, and how samples will be collected ? Generally addressed, and some
information is given in great detail. However, better clarity is needed in addressing specific
sample temporal and spatial frequency, time of day, time of month, under what type of flow
conditions sampling occurs for each sample analyte/matrix, and whether samples are
composite, grab, etc. A table summarizing all of this information, rather than extensive
narrative, would help in review and approval of this QAPP.
Identifies sampling locations and frequency Sample locations identified and characterized.
Uses photos to characterize sampling locations (photos should be included either in the
QAPP, if known prior, and/or in the final report – 4 cardinal directions or others as appropriate).
Characterizes sampling locations (include detailed map(s) of the local project area
identifying sample sites, a topographic/bathymetric map of the area if available,
site-specific latitude and longitude, GPS coordinates, etc.).
Provides site specific GPS coordinates, latitude and longitude,
altitude.
Defines appropriate validation study for non-standard situations
2. Sampling Methods Requirements (in table format)
Identifies specific sample collection procedures and methods. (Includes equipment
preparation and decontamination, sample containers, and sample volumes). Demonstrates
compliance with the appropriate referenced method(s). For each parameter/method,
describe applicable sample preservation methods, maximum holding times, and temperatures.
? Missing required sample bottle types and preservation criteria for all sample analytes.
Some proposed sample hold times are not appropriate. Refer to 40 CFR 136.3 for required
sample analyte containers, preservation criteria, and holding times. Specific sample
collection procedures, such as field filtration for dissolved metals, are not addressed, and
specific sample collection procedures for metals where the WQS is near method detection
limits (Hg, Cu, etc.) are not addressed. Revise accordingly. Recommend providing this in
table format. Refer to the example table in the attached document, "Guidance for Tier 2
Water Quality Monitoring QAPP Rev 0," Section B.2.2, Table 12.
Specifies calibration procedures for field measurements. ? Generally described; however,
better clarity is needed. Needs to address:
Standards used for calibrations and QC checks to bracket the expected range of measurements.
Frequency of temperature calibrations against an in-certification NIST-traceable reference
over a temperature range that brackets expected field measurements.
Frequency of DO meter calibrations and by what calibration method. Clarify that DO
meters will be pressure corrected for atmospheric pressure changes due to weather and/or
altitude at each site.
Frequency of calibration of the conductivity meter, over what measurement range, with
what type of certified traceable standards, and with what calibration acceptance tolerances.
The same type of information (as above) for the calibration criteria of pH meters and
turbidimeters.
Applicable field measurement SOPs and operator Manuals are
referenced and located in QAPP appendices. Missing. Also provide in the QAPP appendices
the SOPs, any sample collection/field measurement forms, calibration forms, etc., to be used.
3. Sample Handling and Custody Requirements
Describes sample handling, labeling, collection and transportation
requirements. ? Not adequately addressed. Missing all sample
collection container types, sample preservation
criteria, sample hold time criteria.
Notes chain-of-custody procedures, if required. Appropriate chain-of-custody forms are
referenced in the QAPP appendices. ? It appears that chain of custody will be followed, but
it is unclear if it is required. Need to clarify whether COC will be followed and include the
applicable COC forms in the QAPP appendices.
4. Analytical Methods Requirements (in table format)
Identifies specific analytical methods to be followed. Identifies required equipment and
compliance with the appropriate method name and reference number (e.g., fecal coliform,
9222D, Standard Methods 20th edition). This section provides more detail than the
Section A7 MQOs.
This item is addressed in Element A.7 above. Analytical method references are missing for
some analytes. Some other referenced analytical methods are not acceptable for EPA CWA
and ADEC water/wastewater monitoring. Must be revised with appropriate methods having
adequate measurement sensitivity.
Lists method detection limits (MDL) and practical quantification limits (PQL) for each
analytical method and provides the procedure/algorithm for how the PQL is determined.
? Detectability minimally addressed. Need better clarity and also include both the MDL
and PQL for measurement methods. This is already addressed in Element A.7 above.
Revise accordingly.
Specifies calibration and maintenance procedures. Identifies
performance requirements. For laboratories, a current signed
approved QAPP can be referenced if on file with ADEC DOW.
? Laboratory certifications not addressed. Lab QA
manuals not addressed. Specific calibration
procedures for field measurements minimally
addressed. See Element B.2 above for more
specific comments. Revise accordingly.
Applicable SOPs and QA Manuals are referenced and located in
QAPP appendices. Missing.
5. Quality Control Requirements (in table format)
Lists Quality Control requirements for field measurements. Identifies QC procedures and
frequency, acceptance criteria limits, corrective actions, and standards traceability for each
measurement technique. Examples of QC sample measurements and criteria are:
duplicate/replicate precision measurements, field blanks, and QC "calibration" check
standards. This information is to be provided as much as possible in table format. See the
example table and information in "Guidance for Tier 2 Water Quality Monitoring QAPP
Rev 0," Section B.5.1, Field Quality Control Measures.
? QC procedures not adequately addressed. QAPP
does not adequately address required QC types
(temp blanks, field blanks, calibration checks,
sample replicates, etc), frequency of analysis and
acceptance criteria limits for each field
measurement of interest. These are critical criteria that
must be included in QAPP so that all project staff
with responsibilities for analysis, data validation,
data verification and QA assessments have the
required information to reliably evaluate the quality
of project data. Revise accordingly.
Lists Quality Control requirements for field sample collection with
subsequent laboratory analysis. Identifies QC procedures and
frequency, acceptance criteria limits, corrective actions, and
standards traceability for each sample analysis technique. Examples
of QC samples and criteria are: field duplicate/replicate sample
analysis, laboratory duplicate/replicate sample analysis, matrix spike
duplicates, field blank samples, lab blanks, 3rd party QC samples
(commercially prepared QC samples as verification for lab
calibration standards, etc), calibration verification standards and
continuing calibration verification standards. This information is to be provided as much as
possible in table format. See the example table and information in "Guidance for Tier 2
Water Quality Monitoring QAPP Rev 0," Section B.5.2, Laboratory Quality Control Measures.
? The same comments apply as above but for all field/lab quality control measures (temp
blanks, field blanks, lab blanks, sample replicates, lab duplicates, lab fortified blanks,
internal standards, continuing calibration standards, MS/MSD, etc.). Revise accordingly.
6. Instrument/Equipment Testing and Inspection and
Maintenance Requirements (in table format). For laboratories, a
current signed/approved QAPP can be referenced if on file with
ADEC DOW (provide reference location).
Identifies acceptance testing of the sampling process and of field and lab measurement
equipment/standards. Describes instrument testing, inspection, and maintenance for field
instruments. Missing the lab portion of instrument/equipment testing, inspection, and
maintenance. May reference the applicable approved Lab QA Manual section. Revise
accordingly.
Describes equipment preventive and corrective maintenance Same as above. Field equipment preventive and
corrective maintenance described. Missing lab
portion. Revise accordingly.
Checklists and worksheets documenting testing, inspection, and
maintenance are included in the QAPP appendices.
No checklists provided.
7. Instrument Calibration and Frequency (in table format when
possible). For laboratories, a current signed/approved QAPP can be
referenced if on file with ADEC DOW.
Please summarize as much of the information below in table format:
Specifies calibration (frequency, range, control criteria, etc.) for each instrument or piece of
equipment needing calibration. ? Generally described. See Element B.2 above, which
addresses concerns for calibration of field measurement instruments.
Specifies calibration/certification/traceability (certification date, expiration date, range,
accuracy, etc.) for calibration standards used and shows compliance with the appropriate
method. ?
Specifies calibration standards and/or equipment Generally described. See element B.2 above that
addresses concerns for calibration of field
measurement instruments.
Cites calibration records and manner traceable to
equipment/instrumentation
Calibration forms Calibration forms are only provided for temperature. Provide forms for
DO, pH, conductivity, and turbidity.
8. Inspection/Acceptance Requirements for Supplies and
Consumables (presented in table format). For laboratories, a
current signed/approved QAPP can be referenced if on file with
ADEC DOW.
States acceptance procedure and criteria for supplies & consumables
States how and where records are kept ? Not addressed. Revise accordingly.
Notes responsible individual(s)
9. Data Acquisition Requirements for Nondirect Measurements
(presented in table)
Identifies the type of data needed from non-measurement sources (e.g., computer databases,
literature files, historical databases, NOAA weather data, etc.), along with acceptance
criteria for their use.
? QAPP mentions it will be using some USGS data
from stations along the Susitna drainage. QAPP
needs to define how it will assess the reliability of
this data for use in the project.
It appears that some historical data may be used for
qualitative assessment only. The terms and
conditions of how, when, where and why must be
defined if data is to be used, especially if data is of
unknown or questionable reliability.
Describes any limitations on use of such data Not addressed other than stating, “Assessment of
applicability for historical data is outside the scope
of this document and is not addressed further in this
data collection QAPP.” If these data are to be used, then their reliability must be evaluated
and applicable limitations/restrictions of use applied, depending upon the quality of the data.
10. Data Management (presented in table format when possible)
Describes project data management process and traces path from
sample collection and field measurements, lab analysis, data
validation/verification, QA assessments and reporting of data of
known quality to the respective ADEC Division of Water Program
Office. It also shows and describes control mechanisms for detecting
and correcting errors. Include flow chart. See, “Guidance for Tier
2 Water Quality Monitoring QAPP Rev 0. Section B.10,” for specific
types of info to include in this section as well as an example Data
Management Flow Chart.
The QAPP provides a general description of data
management, but it does not adequately describe the
overall process from data collection through to
reporting of data to the intended data users.
Since this is a complex project with multiple
individuals responsible for various components, it is
critical this section be described in sufficient detail
to ensure all responsible individuals are fully
knowledgeable of their individual duties and
responsibilities and how they integrate with the
overall project data management scheme. This
section needs to characterize in detail the project’s
data management process tracing the path of the
data from generation to their final use or storage
[e.g., from field measurements and sample
collection/recording through transfer of data to
computers (laptops, data acquisition systems, etc.),
laboratory analysis, data validation/verification, QA
assessments and reporting of data of known quality
to the respective ADEC Division of Water Program
Office]. Additionally, data management must also
discuss the control mechanisms for detecting and
correcting errors.
Missing Data Management Flow Chart. Must
provide.
Describes standard record-keeping, including data storage and
retrieval requirements ?
Generally described, missing details. Data retrieval
requirements not described. Revise accordingly.
Checklists or standard forms are included in QAPP appendices ? Not provided. Provide.
Describes data handling equipment and procedures used to process,
compile, & analyze data ?
Only partially described. Revise to provide specifics.
C. Assessments and Oversight
1. Types of Project Assessments & Response Actions (in table
format). Indicate which types of assessment to be performed, at
what frequency and number and the criteria used to ensure
performance or effectiveness.
The Project QAO must have sole responsibility for all assessments performed. The TT
Technical Lead may neither perform nor direct audits. The QAO must be completely
independent from direct management of project monitoring operations and from the TT
Technical Lead and TT PM. The Project QAO may delegate specific QA duties to other
staff; however, such staff work only under his/her direction. Revise accordingly.
Specify Assessment types, frequency and acceptance criteria –
Note: Frequency and occurrence of all assessments must be specified in the
QAPP. Responsibility for scheduling and conducting audits, issuing
report findings and monitoring corrective actions lies with Project
QA Officer.
QAPP mentions audits but provides no specifics.
This section must identify:
Field Assessments (each pollutant)
Precision (replicate) sample measurements. The project should have a minimum of three
paired measurements per project per analyte or 15% of project samples, whichever is
greater (a sketch of this calculation follows this list). Replicate measurements should be
evenly spaced over the project timeline. Precision criteria are specified in the project's
Measurement Quality Objectives (MQO) table; see Section A7.
Field samples collected for subsequent laboratory
analysis (each pollutant)
Blind replicate samples for each pollutant to be measured. The project should have a
minimum of three paired measurements per project per analyte or 15% of project samples,
whichever is greater. Replicate samples should be evenly spaced over the project timeline.
Precision criteria are specified in the project's MQO table; see Section A7.
Matrix spike duplicates (MSD) (assesses total
measurement bias for project – both precision
and accuracy). Frequency of MSDs is usually
specified by the analytical method. Accuracy
and precision criteria for each pollutant and analytical method are specified in the project's
MQO table; see Section A7.
Third party performance evaluation samples (PE
samples also called performance test (PT)
samples) for wastewater analytes of interest. PT
water/wastewater sample participation is at a
frequency of 1/year from a NELAC certified
vendor (http://www.nelac-
institute.org/PT.php#pab1_4). For APDES
permit monitoring, these are called DMRQA
samples.
On-Site Assessments
Inspection of field monitoring operations for
compliance with QAPP requirements.
Laboratory Audit (if concerns arise regarding
laboratory data quality)
Audit of project field measurement data results.
Project Data Assessments
Audits of Monitoring Data for reproducibility of
results from recalculation/reconstruction of
field/lab unprocessed data.
Calculation of monitoring project’s overall
achieved precision, accuracy and data
completeness compared to QAPP defined
precision, accuracy and data completeness goals.
Corrective Action Report(s) and Corrective Action Response(s)
QAPP Revisions – describes process to revise QAPP (if monitoring
methods, criteria, or other elements change).
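As an illustrative aid only (not part of the checklist itself), the following minimal Python sketch applies the replicate-frequency rule referenced in the list above: the greater of three paired measurements per analyte or 15% of project samples. The function name and the use of rounding up are assumptions.

```python
import math

def minimum_replicate_pairs(total_project_samples, minimum_pairs=3, fraction=0.15):
    """Minimum number of paired (replicate) measurements per analyte:
    the greater of a fixed minimum (3 pairs) or 15% of project samples,
    per the assessment frequency described in the list above."""
    return max(minimum_pairs, math.ceil(fraction * total_project_samples))

# Examples: a 12-sample analyte needs 3 pairs; a 40-sample analyte needs 6
print(minimum_replicate_pairs(12))  # 3
print(minimum_replicate_pairs(40))  # 6
```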
2. Quality Assurance Reports to Management (in Table
format)
For the following QA reports describe the frequency, content,
responsible position or individual for issuing each report and
distribution of each to management and others (summarize in table
format):
?
General QA reports are mentioned in QAPP Section C 2.0. However, some key information
needs clarification for each type of assessment report (e.g., on-site field assessment, on-site
lab assessment, 3rd-party PT/DMRQA data quality assessments, corrective action report,
and the annual/end-of-project QA summary report, including an overall assessment of
project precision, accuracy, and data completeness, problems encountered and how they
were resolved, and whether the project achieved its DQO and MQO goals/requirements):
Description of Assessment report content
Presentation method, and
Position responsible for issuance of report and
frequency of reporting.
D. Data Validation and Usability
1. Data Review, Validation, and Verification Requirements (in
table format if possible)
States method-specific criteria for accepting, rejecting, or qualifying
data. Data Validation Tables summarizing these criteria should be
referenced and may be located in QAPP appendices.
?
Generally addressed. Some key performance information is missing from the QAPP that has
been addressed in the sections above. These performance criteria must also be included in
the data review, verification, and validation process.
Includes project-specific calculations or algorithms ?
Except for precision and data completeness, no
other project specific calculations/algorithms
provided. Missing calculation for assessing
accuracy. If other project specific
calculations/algorithms will be used, revise as
appropriate. Provide accuracy calculation.
2. Validation and Verification Methods
Describes process for data validation and how criteria will be used to
validate, qualify and/or invalidate data. Include validation
forms/checklists in the QAPP appendices.
?
No data validation forms/checklists provided. If
forms will be used, provide in QAPP appendices.
Describes process for data verification and how conclusions can be
correctly drawn from the validated data. Include verification
forms/checklists in the QAPP appendices.
?
No data verification forms/checklists provided. If
forms will be used, provide in QAPP appendices.
Identifies issue resolution procedure and responsible individual(s)
Identifies method for conveying results to data users Addressed in an earlier section.
3. Reconciliation with User Requirements
Describes process for reconciling project results with project
objectives and reporting any limitations on use of data
These elements, when adequately completed, meet the State and Federal QAPP requirements.
For further guidance see EPA QA/R-5 (http://www.epa.gov/r10earth/offices/oea/epaqar5.pdf), EPA QA/G-5
(http://www.epa.gov/r10earth/offices/oea/epaqag5.pdf) and Elements of a Water Quality Monitoring QAPP rev 1
Acceptable- no other information needed.
Information must be changed or fixed.
Not acceptable: major additions or changes required.
Information is provided for benefit of applicant.
? Information is incomplete: some clarification is necessary.
ATTACHMENT 2
ADEC Draft Guidance for a Tier 2 Water Quality Monitoring QAPP, Rev. 0.
(37 Pages)
DRAFT
GUIDANCE DOCUMENT FOR WRITING A
TIER 2 WATER QUALITY MONITORING
QUALITY ASSURANCE PROJECT PLAN (QAPP)
June 2012
Alaska Department of Environmental Conservation
Division of Water
Suitability: This document is to be used as guidance for writing a project-specific Quality
Assurance Project Plan (QAPP), along with the Template for a Water Quality Monitoring
Tier 2 Quality Assurance Project Plan, June 2012, for: Alaska's Clean Water Actions (ACWA)
grants, Total Maximum Daily Loads (TMDLs), domestic wastewater, and Alaska Pollutant
Discharge Elimination System (APDES) and compliance permits. Tier 2 water quality
monitoring QAPPs are to be designed with the level of rigor necessary to demonstrate
compliance with Alaska Water Quality Standards (AWQS). Providing the requested
information and following the format defined here will assure that sufficient quality assurance
and quality control procedures are designed into the project to produce reliable and defensible
monitoring data sufficient for showing compliance with AWQS.
Note: Red font is used throughout this document to provide direction on information to include
in specific areas and sections.
A PROJECT MANAGEMENT ELEMENTS
A.1 TITLE AND APPROVALS:
In this section include title of the plan, the name of the organization(s) implementing the project,
and the effective date of the plan. It must have printed name, signature and date lines for the
following individuals: overall Project Manager and Project QA Officer, ADEC Project Manager,
and the ADEC Division of Water QA Officer.
Title: Date:
Name: Project Manager Phone:
Organization Name: email:
Signature: ______________________________ Date: ______________
Name: Project QA Officer Phone:
Organization Name: email:
Signature: ______________________________ Date: ______________
Name: ADEC DOW Project Manager Phone:
ADEC DOW Program Name: email:
Signature: ______________________________ Date: ______________
Name: ADEC DOW QA Officer Phone:
ADEC DOW WQSAR Program email:
Signature: ______________________________ Date: ______________
A.2 TABLE OF CONTENTS
In this section include the table of contents following the prescribed detailed format. Table of Tables,
Table of Figures and abbreviations may be modified to be consistent with QAPP contents. If QAPP
contains pictures, include Table of Pictures.
A PROJECT MANAGEMENT ELEMENTS ........................................................................................................... 2
A.1 TITLE AND APPROVALS: ............................................................................................................................. 2
A.2 TABLE OF CONTENTS .................................................................................................................................. 3
A.3 DISTRIBUTION LIST ..................................................................................................................................... 6
A.4 PROJECT TASK/ORGANIZATION ............................................................................................................... 7
A.5 PROBLEM DEFINITION/BACKGROUND AND PROJECT OBJECTIVES ................................................ 9
A.5.1 Problem Definition ............................................................................................................................................ 9
A.5.2 Project Background .......................................................................................................................................... 9
A.5.3 Project Objective(s) ........................................................................................................................................ 10
A.6 PROJECT/TASK DESCRIPTION AND SCHEDULE .................................................................................... 10
A.6.1 Project Description ......................................................................................................................................... 10
A.6.2 Project Implementation Schedule.................................................................................................................... 11
A.7 DATA QUALITY OBJECTIVES AND CRITERIA FOR MEASUREMENT DATA ...................................... 12
A.7.1 Data Quality Objectives (DQOs) .................................................................................................................... 12
A.7.2 Measurement Quality Objectives (MQOs) ...................................................................................................... 12
A.8 SPECIAL TRAINING REQUIREMENTS/CERTIFICATION ...................................................................... 17
A.9 DOCUMENTS AND RECORDS ................................................................................................................... 18
B. DATA GENERATION AND ACQUISITION .................................................................................................... 19
B.1 SAMPLING PROCESS DESIGN (EXPERIMENTAL DESIGN).......................................................................... 19
B.1.1 Define Monitoring Objectives(s) and Appropriate Data Quality Objectives .................................................. 19
B.1.2 Characterize the General Monitoring Location/s ........................................................................................... 19
B.1.3 Identify the Site-Specific Sample Collection Location(s), Parameters to be Measured and Frequencies of
Collection ........................................................................................................................................................ 20
B.2 SAMPLING METHOD REQUIREMENTS ................................................................................................... 21
B.2.1 Sample Types .................................................................................................................................................. 21
B.2.2 Sample Containers and Equipment ................................................................................................................. 21
B.2.3 Sampling Methods ........................................................................................................................................... 23
B.3 SAMPLE HANDLING AND CHAIN OF CUSTODY REQUIREMENTS ................................................... 24
B.3.1 Sampling Procedures ...................................................................................................................................... 24
B.3.2 Sample Custody Procedures ........................................................................................................................... 24
B.3.3 Shipping Requirements.................................................................................................................................... 24
B.4 ANALYTICAL METHODS AND REQUIREMENTS .................................................................................. 24
B.5 QUALITY CONTROL REQUIREMENTS .................................................................................................... 25
B.5.1 Field Quality Control (QC) Measures ............................................................................................................ 25
B.5.2 Laboratory Quality Control (QC) Measures ................................................................................................... 26
B.6 INSTRUMENT/EQUIPMENT TESTING, INSPECTION AND MAINTENANCE REQUIREMENTS ....... 27
B.7 INSTRUMENT CALIBRATION AND FREQUENCY ................................................................................. 27
B.8 INSPECTION/ACCEPTANCE OF SUPPLIES AND CONSUMABLES ...................................................... 28
B.9 DATA ACQUISITION REQUIREMENTS (NON-DIRECT MEASUREMENTS) ....................................... 28
B.10 DATA MANAGEMENT ................................................................................................................................ 28
B.10.1 Data Storage and Retention ............................................................................................................................ 30
C. ASSESSMENTS .................................................................................................................................................... 31
C.1 ASSESSMENTS AND RESPONSE ACTIONS ............................................................................................. 31
C.1.1 High Quality End-Use Tier 2 Monitoring Data .............................................................................................. 32
C.1.2 Lower Quality End-Use Tier 2 Monitoring Data ............................................................................................ 33
C.2 REVISIONS TO QAPP ................................................................................................................................... 34
C.3 QA REPORTS TO MANAGEMENT ............................................................................................................. 34
D. DATA VALIDATION AND USABILITY ........................................................................................................... 35
D.1 DATA REVIEW, VERIFICATION AND VALIDATION REQUIREMENTS .............................................. 35
D.1.1 Data validation ............................................................................................................................................... 35
D.1.2 Data Verification............................................................................................................................................. 36
D.1.3 Data Review .................................................................................................................................................... 36
D.2 VERIFICATION AND VALIDATION METHODS ...................................................................................... 36
D.2.1 Validation Methods ......................................................................................................................................... 36
D.2.2 Verification Methods ....................................................................................................................................... 37
D.3 RECONCILIATION WITH USER REQUIREMENTS ................................................................................. 37
TABLE OF TABLES
Table 1: Distribution List.......................................................................................................................... 6
Table 2: Project Organizational Responsibilities ..................................................................................... 7
Table 3: Example Summary Table of Previous Project Relevant Monitoring Data............................... 10
Table 4: Parameters to be Measured ....................................................................................................... 10
Table 5: Example Project Implementation Schedule .............................................................................. 11
Table 6: Project Measurement Quality Objectives (MQOs) .................................................................. 16
Table 7: Project Training/Certification ................................................................................................... 17
Table 8: Project Documents and Records ............................................................................................... 18
Table 9: Site Location and Rationale ...................................................................................................... 20
Table 10: Criteria for Establishing Site Representativeness ................................................................... 20
Table 11: Sample Schedule (Parameters, Sample Type, Frequency) ..................................................... 21
Table 12: Preservation and Holding Times for the Analysis of Samples ............................................... 22
Table 13: Field Quality Control Samples ............................................................................................... 25
Table 14: Field/Laboratory Quality Control Samples ............................................................................ 26
Table 15: Project Assessments ............................................................................................................... 34
Table 16: QA Reports to Management ................................................................................................... 35
TABLE OF FIGURES
Figure 1: Example Project Organizational Structure ................................................................................ 9
Figure 2: Example Data Management Flow Chart ................................................................................. 31
LIST OF ABBREVIATIONS
ACWA Alaska’s Clean Water Actions
ADEC Alaska Department of Environmental Conservation
APDES Alaska Pollutant Discharge Elimination System
ASTM American Society for Testing and Materials
AWQMS Ambient Water Quality Monitoring System
BETX Benzene, Ethylbenzene, Toluene, Xylenes (m, p, o)
CWA Clean Water Act
COC Chain of Custody
cfu/100mL colony forming units/100 milliliters
DMR Discharge Monitoring Report
DMRQA sample Discharge Monitoring Report Quality Assurance sample
DQO Data Quality Objective
DO Dissolved Oxygen
DOW Division of Water
DROPS Discharge Reporting and Online Permitting System
EPA Environmental Protection Agency
GPS Global Positioning System
ICIS-NPDES Integrated Compliance Information System – National Pollutant Discharge
Elimination System
IDL Instrument Detection Limit
MQO Measurement Quality Objective
MDL Method Detection Limit
MSDS Material Safety Data Sheet
mS/cm millisiemens/centimeter
mg/L milligrams/liter
μg/L micrograms/liter
ND Non Detect
NELAC National Environmental Laboratory Accreditation Conference
PE Sample Performance Evaluation Sample
PT Sample Performance Test Sample
PQL Practical Quantification Limit
QA Quality Assurance
QAP Quality Assurance Plan
QAPP Quality Assurance Project Plan
QC Quality Control
QMP Quality Management Plan
RL Reporting Limit
RPD Relative Percent Difference
RSD Relative Standard Deviation
SPAR Spill Prevention and Response
SOP Standard Operating Procedure
STORET Storage and Retrieval System
TAH Total aromatic hydrocarbons
TMDL Total Maximum Daily Load
VOC Volatile Organic Compounds
WA DOE Washington State Department of Ecology
WQS Water Quality Standards
A.3 DISTRIBUTION LIST
List the names and addresses of those who receive copies of the approved QAPP and subsequent
revisions in Table 1. Distribution list at a minimum must include all those involved with management
direction, QAPP approvals, data management, and senior staff directing monitoring operations in the
field, key laboratory staff, key data management staff and the end data users. Modify Table 1 as
appropriate for the project.
Table 1: Distribution List
NAME POSITION AGENCY/
Company
DIVISION/
BRANCH/SECTION
CONTACT
INFORMATION
Project
Manager
Phone:
Email:
Project
Quality
Assurance
Officer
Phone:
Email:
Sampling
Manager
Phone:
Email:
Lab Manager Phone:
Email:
Data Manager Phone:
Email:
Lab QA
Manager
Phone:
Email:
Project
Manager
ADEC Division of Water/ Phone:
Email:
QA Officer ADEC Division of Water/
WQSAR/QA
Phone:
Email:
Phone:
Email:
A.4 PROJECT TASK/ORGANIZATION
List the duties and responsibilities of key individuals and organizations participating in the monitoring
project in Table 2: Modify Table 2 as appropriate for the project.
Table 2: Project Organizational Responsibilities
Position Title Agency or
Company
Division
Branch/Section
Responsibilities
Project Manager Add project
info
Add project
info
Revise as appropriate
Responsible for overall technical, financial
and contractual management of the project
and subsequent reporting of QA reviewed
(validated and verified) data to DEC.
Project QA Officer Add project
info
Add project
info
Revise as appropriate
Responsible for QA review and approval of
the plan and for ensuring all monitoring complies
with the QAPP-specified criteria. This is
accomplished through routine technical
assessments of the sample collection,
analysis and data reporting process.
Assessments may include, but are not
limited to: on-site field audits, data audits,
QA review of blind lab performance
evaluation samples, lab audits, etc. These
assessments are performed independent of
overall project management.
Sampling &
Analysis Manager
Add project
info
Add project
info
Add project responsibilities
Field Sampling
staff
Add project
info
Add project
info
Add project responsibilities
Laboratory
Manager
Add project
info
Add project
info
Responsible for the overall review and
approval of contracted laboratory analytical
work, responding to sample result inquiries
and method specific details. Responsible
for QA/QC of laboratory analysis as
specified in the QAPP and reviews and
verifies the validity of sample data results as
specified in the QAPP and appropriate EPA
approved analytical methods.
Laboratory Quality
Assurance
Manager/Officer
Add project
info
Add project
info
Laboratory Quality Assurance
Manager/Officer – Responsible for QA/QC
of water quality laboratory analyses as
specified in the QAPP. Along with
Laboratory Manager, the Lab QA Officer
reviews and verifies the validity of sample
data results as specified in the QAPP and
appropriate EPA approved analytical
methods.
Project Manager ADEC Division of
Water
Responsible for overall technical and
contractual management of the project. For
Permit related monitoring projects,
responsible for ensuring permittee complies
with permit required water quality
monitoring as specified in the approved
QAPP
Water Quality
Assurance Officer
ADEC Division of
Water
Responsible for QA review and approval of
plan and oversight of QA activities ensuring
collected data meets project’s stated data
quality goals
Revise Figure 1, Project Organizational Structure, as appropriate for the monitoring project. Be sure to
use separate identifying lines to distinguish the following from one another: management direction,
data reporting, and QA assessment/reporting.
A.5 PROBLEM DEFINITION/BACKGROUND AND PROJECT
OBJECTIVES
A.5.1 Problem Definition
In this section clearly state the specific problem to be solved, decision to be made, or outcome to be
achieved.
A.5.2 Project Background
Provide a brief background summary for the purpose of the monitoring project. Include sufficient
information to provide historical, scientific, and regulatory perspective. If previous monitoring data
exist and are relevant to the proposed monitoring project, provide a summary of results in Table 3 along
with the appropriate numeric ADEC water quality standard(s) (pollutant concentration: e.g., ground
water, surface water, aquatic life freshwater, aquatic life marine water, etc.). Explain how these data
were used to justify the proposed monitoring plan.
Revise Table 3 as appropriate for the monitoring project.
Figure 1: Example Project Organizational Structure
[Example organizational chart, with separate line types for management direction, data reporting, and
QA assessment/reporting, linking the Project Manager, Project QA Officer, Sampling & Analysis
Manager, Field Sampling, Laboratory, ADEC DOW Project Manager, ADEC DOW QA Officer, and the
DEC DOW database (AWQMS, DROPS).]
Table 3: Example Summary Table of Previous Project Relevant Monitoring Data
Columns: Site Location | Date | Measurement Parameter (Analyte, Conc., Meas. units) | Alaska WQS (Aquatic Life; Recreational/Drinking Water)
A.5.3 Project Objective(s)
In this section define the overall objectives for this monitoring project. Clearly state the purpose for
collecting monitoring data, why it is being collected, and how the data will be used to support the
project's purpose. If there are regulatory requirements governing the reason(s) for collecting monitoring
data, cite the specific federal and/or state statute(s). State how the proposed monitoring plan fulfills this
requirement.
A.6 PROJECT/TASK DESCRIPTION and SCHEDULE
A.6.1 Project Description
In this section provide a summary paragraph describing the work to be performed.
In Table 4, list the parameters to be measured and recorded. Use the appropriate column to list
samples analyzed in the field and samples analyzed in the laboratory.
Table 4: Parameters to be Measured
Field Measurements Laboratory Measurements
The following information is provided as a guide to selecting laboratories for the analysis of project
samples. Before selecting a laboratory, consider the following:
Note 1: ADEC certifies laboratories for drinking water and contaminated sites analysis only. At
the present time, ADEC does not certify laboratories for water/wastewater analyses.
However, an ADEC drinking water-approved laboratory lends credibility to a laboratory’s
quality assurance and quality control processes. A list of ADEC-approved microbiological
laboratories is available at: http://www.state.ak.us/dec/deh/water/labs.htm and for
laboratories providing chemical analysis at:
http://www.state.ak.us/dec/deh/water/chemlabs.htm.
Note 2: For microbiological analyses, only a laboratory with current ADEC drinking water
certification that resides within Alaska may be used. Due to the short sample holding time
(< 8 hours), labs outside of Alaska would not reasonably be able to receive and start the
analysis as specified by the EPA water/wastewater approved microbiological method.
Note 3: For labs contracted outside of Alaska, it is strongly recommended that the contracted
laboratory have either NELAC and/or State certification (e.g., Washington State
Department of Ecology, http://www.ecy.wa.gov/programs/eap/labs/lab-accreditation.html)
for the respective water/waste water analytical methods.
In this section insert a large scale map showing the overall geographic location/s of field tasks. (Note
in section B1, Sampling Process Design, include larger scale topographic map(s) identifying specific
geographic location(s) of sampling sites).
A.6.2 Project Implementation Schedule
Revise Table 5 as appropriate to describe the project implementation schedule.
Table 5: Example Project Implementation Schedule
Product Measurement/
Parameter(s)
Sampling Site Sampling
Frequency
Time
Frame
QAPP
Preparation
Field
Sampling
DO, pH, Temp, Cond.
Turbidity, Fecal
Coliforms
River Road Mile 3 Site #1,
upstream side of culvert,
above outfall
Weekly June – Sept
DO, pH, Temp, Cond.,
Turbidity, Fecal
Coliforms, TAHs
River Road Mile 3 Site #2,
downstream side of culvert
below outfall
Weekly
randomized
sample timeframe
June – Sept
DO, pH, Temp, Cond.,
Turbidity, Fecal
Coliforms, TAHs
Site # 3, Mile 3 River
Road, Downstream of
bridge
Weekly,
randomized
sample timeframe
June – Sept
Lab Analysis Fecal Coliforms All sites Analyses within
sample holding
time requirements
June - Sept
Field Audit Audit of field
monitoring operations
All sites < 30 days of
project start-up
1/project
Data
Analysis
Data Review
Data Report
A.7 DATA QUALITY OBJECTIVES AND CRITERIA FOR MEASUREMENT DATA
A.7.1 Data Quality Objectives (DQOs)
Data Quality Objectives (DQOs, EPA QA/G-4). DQOs are qualitative and quantitative statements derived from
the DQO Process that:
Clarify the monitoring objectives (i.e., determine water/wastewater pollutant concentrations of
interest and how these values compare to water quality standards regulatory limits).
Define the appropriate type of data needed. In order to accomplish the monitoring objectives,
the appropriate type of data needed is defined by the respective AWQS. For pollutants,
compliance with the AWQS is determined by specific measurement requirements. The
measurement system is designed to produce water pollutant concentration data that are of the
appropriate quantity and quality to assess compliance.
In this section define the project’s DQOs. Include a brief paragraph stating what the project’s data
quality objectives are. For most Tier 2 QAPPs, the DQOs may be to capture data of sufficient quality
to demonstrate compliance with Alaska’s Water Quality Standards.
A.7.2 Measurement Quality Objectives (MQOs)
Measurement Quality Objectives (MQOs) are a subset of DQOs. MQOs are derived from the
monitoring project’s DQOs. MQOs are designed to evaluate and control various phases (sampling,
preparation, and analysis) of the measurement process to ensure that total measurement uncertainty is
within the range prescribed by the project’s DQOs. MQOs define the acceptable quality (data
validity) of field and laboratory data for the project. MQOs are defined in terms of the following data
quality indicators:
Detectability
Precision
Bias/Accuracy
Completeness
Representativeness
Comparability
Detectability is the ability of the method to reliably measure a pollutant concentration above
background. DEC DOW uses two components to define detectability: method detection limit (MDL)
and practical quantification limit (PQL) or reporting limit (RL).
The MDL is the minimum value that the instrument can discern above background, but with no certainty
as to the accuracy of the measured value. For field measurements, the manufacturer's listed instrument
detection limit (IDL) can be used.
The PQL or RL is the minimum value that can be reported with confidence (usually some multiple of the
MDL).
Note: The measurement method of choice should at a minimum have a practical quantification
limit or reporting limit 3 times more sensitive than the respective DEC WQS and/or
permitted pollutant level (for permitted facilities).
Sample data measured below the MDL is reported as ND or non-detect. Sample data measured ≥
MDL but ≤ PQL or RL is reported as estimated data. Sample data measured above the PQL or RL is
reported as reliable data unless otherwise qualified per the specific sample analysis.
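As an illustrative aid only (not part of the guidance text), the following minimal Python sketch applies the MDL/PQL reporting convention and the sensitivity note above. The function names are hypothetical; the example MDL, PQL, and WQS values are taken from the benzene row of Table 6.

```python
def classify_result(value_ug_per_l, mdl, pql):
    """Classify a measured concentration per the MDL/PQL convention above:
    below MDL -> non-detect; between MDL and PQL/RL -> estimated; above PQL/RL -> reportable."""
    if value_ug_per_l < mdl:
        return "ND (non-detect)"
    if value_ug_per_l <= pql:
        return "estimated (>= MDL but <= PQL/RL)"
    return "reportable (> PQL/RL)"

def method_sensitive_enough(pql, wqs_limit):
    """Check the note above: the PQL/RL should be at least 3x more sensitive
    (i.e., 3x lower) than the applicable WQS or permitted pollutant level."""
    return pql <= wqs_limit / 3.0

# Example using the benzene values from Table 6 (MDL 0.33 ug/L, PQL 1.0 ug/L, TAH WQS 10 ug/L)
print(classify_result(0.5, mdl=0.33, pql=1.0))           # estimated
print(method_sensitive_enough(pql=1.0, wqs_limit=10.0))  # True
```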
Precision is the degree of agreement among repeated measurements of the same parameter and
provides information about the consistency of methods. Precision is expressed in terms of the relative
percent difference (RPD) between two measurements (A and B).
For field measurements, precision is assessed by measuring replicate (paired) samples at the same
locations and as soon as possible to limit temporal variance in sample results. Overall project
precision is measured by collecting blind (to the laboratory) field replicate samples. Laboratory
precision is determined similarly via analysis of laboratory duplicate samples. For paired and small
data sets, project precision is calculated using the following formula:
RPD = 100 × |A − B| / ((A + B) / 2)
Where: RPD = relative percent difference
A = primary sample
B = replicate field sample or laboratory duplicate sample
For larger paired precision data sets (e.g. overall project precision) or multiple replicate precision data,
use the following formula:
RSD = 100 × σ / mean
σ = √( Σ d² / (2k) )
Where: RSD = relative standard deviation
σ = standard deviation
k = number of paired replicate samples (A and B)
d = A - B
A = primary sample
B = replicate field sample or laboratory duplicate sample
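A minimal Python sketch (illustrative only) of the paired-sample precision calculations above. The pooled standard deviation is computed as sqrt(Σd²/(2k)), a reconstruction consistent with the definitions of k and d given above; the function names and example replicate values are hypothetical.

```python
import math

def rpd(a, b):
    """Relative percent difference between a primary (A) and a replicate or
    laboratory duplicate (B) measurement, per the formula above."""
    return 100.0 * abs(a - b) / ((a + b) / 2.0)

def pooled_rsd(pairs):
    """Relative standard deviation for k replicate pairs (A, B), using the
    pooled standard deviation sigma = sqrt(sum(d^2) / (2k)) with d = A - B."""
    k = len(pairs)
    sigma = math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * k))
    mean = sum(a + b for a, b in pairs) / (2 * k)
    return 100.0 * sigma / mean

# Example with hypothetical field replicate results (ug/L)
print(round(rpd(12.0, 13.0), 1))                                   # 8.0
print(round(pooled_rsd([(12.0, 13.0), (7.5, 7.0), (20.1, 19.5)]), 1))
```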
Bias (Accuracy) is a measure of confidence that describes how close a measurement is to its "true"
value. Methods to determine and assess the accuracy of field and laboratory measurements include
instrument calibrations; various types of QC checks (e.g., sample split measurements, sample spike
recoveries, matrix spike duplicates, continuing calibration verification checks, internal standards,
sample blank measurements (field and lab blanks), and external standards); and performance audit
samples (DMRQA samples, blind Water Supply or Water Pollution PE samples from an American
Association for Laboratory Accreditation (A2LA) certified provider, etc.). Bias/Accuracy is usually
assessed using the following formula:
formula:
Accuracy (%) = (Measured Value / True Value) × 100
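A minimal sketch (illustrative only) of the accuracy/percent-recovery formula above; the function name and example values are hypothetical.

```python
def percent_accuracy(measured_value, true_value):
    """Percent accuracy (recovery) as defined above: (measured / true) x 100."""
    return 100.0 * measured_value / true_value

# Example: a QC check standard with a true value of 50 ug/L measured at 47 ug/L
print(percent_accuracy(47.0, 50.0))  # 94.0
```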
Completeness is a measure of the percentage of valid samples collected and analyzed to yield
sufficient information to make informed decisions with statistical confidence. As with
representativeness, data completeness is determined during project development and specified in the
QAPP. Project completeness is determined for each pollutant parameter using the following formula:
Completeness (%) = [T – (I + NC)] / T × 100
Where T = Total number of expected sample measurements.
I = Number of invalid sample measured results.
NC = Number of sample measurements not completed (e.g. spilled sample, etc).
Project % Data Completeness Goal = Insert numeric % here /analyte for all project analytes
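A minimal sketch (illustrative only) of the completeness formula above; the function name and example counts are hypothetical. With 40 planned measurements, one invalid result and one missed sample yield exactly 95% completeness, the goal discussed in the checklist comments.

```python
def percent_completeness(total_expected, invalid, not_collected):
    """Data completeness per the formula above:
    Completeness (%) = (T - (I + NC)) / T x 100."""
    t = total_expected
    return (t - (invalid + not_collected)) / t * 100.0

# Example: 40 planned measurements for an analyte, 1 invalid result, 1 missed sample
print(percent_completeness(40, 1, 1))  # 95.0
```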
Representativeness is determined during project development and specified in the QAPP.
Representativeness assigns what parameters to sample for, where to sample, type of sample (grab,
continuous, composite, etc.) and frequency of sample collection.
Comparability is a measure that shows how data can be compared to other data collected by using
standardized methods of sampling and analysis. Comparability is shown by referencing the
appropriate EPA CWA approved measurement method as specified in federal and/or state
regulatory guidance documents for the parameter/s to be sampled and analyzed (e.g., Alaska Water
Quality Standards (http://www.dec.state.ak.us/water/wqsar/wqs/index.htm), EPA Guidelines
Establishing Test Procedures for the Analysis of Pollutants Under the Clean Water Act; National
Primary Drinking Water Regulations; National Secondary Drinking Water Regulations; and Analysis
and Sampling Procedures (http://www.access.gpo.gov/nara/cfr/waisidx_05/40cfr136_05.html), etc).
As with representativeness and completeness, comparability is determined during project development
and must be specified in the QAPP.
For each parameter to be sampled/measured, list the measurement method to be used and the MQOs to
meet the overall data quality objectives. This applies to both direct field measurements (e.g., field pH
meters, DO meters, etc.) as well as samples collected for subsequent laboratory analyses.
Use Table 6 below to present the MQO information along with the appropriate WQS numerical values.
Revise Table 6 as appropriate for the monitoring project.
Table 6: Project Measurement Quality Objectives (MQOs)
Columns: Group | Analyte | Method | MDL (µg/L) | PQL (µg/L) | Alaska WQS: Aquatic Life | Alaska WQS: Recreation/Drinking Water | Precision (RPD) | Accuracy (% Recovered)
VOCs
Benzene EPA 602 (a) 0.33 1.0 10 µg/L (b) 10 86-126
Toluene EPA 602 (a) 0.46 1.5 15 52-148
Ethylbenzene EPA 602 (a) 0.35 1.2 20 60-140
Xylene, total EPA 602 (a) 0.82 3.0 20 60-140
Settleable
Solids
Settleable
Solids
EPA
160.5
0.2
ml/L/hr
0.2
ml/L/hr
No measurable increase above natural condition
<5% increase in 0.1 mm to 0.4 mm fine sediment for waters with anadromous fish; <30% by weight of fines in gravel beds
NA NA
Water
Quality
DO
(dissolved
oxygen)
In situ (electronic
probe)
EPA 360.1
NA 0.01
mg/L
>4.0 mg/L
>7 mg/l for anadromous fish;
>5 mg/l for non-anadromous
fish; < 17 mg/L
±20% NA
pH
In situ
(electronic
probe)
EPA 150.1
NA ±0.01 pH
units
6.5 - 8.5; not vary by 0.5
from natural condition
6.5 - 8.5 ±0.1 pH
units
±0.1 pH
units
Temperature
In situ
(electronic
probe)
EPA 170.1
NA 0.1°C
<20°C Migration routes <
15°C
Spawning areas < 13°C
Rearing areas < 15°C Egg /fry
incubation < 13°C
<30°C ±0.2°C
±0.2°C
Conductivity
In situ
(electronic
probe)
EPA 120.1
NA
0-1: 0.001
1-10: 0.01
10-100:
0.1
(mS/cm)
NA NA ± 10% ± 10%
Total
Recoverable
Inorganics
Aluminum EPA 200.8 0.33 1.0 750 µg/L Acute; 87 µg/L chronic NA 20 80-120
Iron EPA 200.7 2.7 50 NA Acute; 1000 µg/L chronic NA 20 80-120
Dissolved Inorganics
Arsenic EPA 200.8 0.044 0.15 340 µg/L Acute; 150 µg/L chronic 0.018 µg/L 20 80-120
Cadmium EPA 200.8 0.062 0.20 Hardness Dependent (c) NA 20 80-120
Copper EPA 200.8 0.034 0.10 Hardness Dependent (c) 1300 µg/L 20 80-120
Lead EPA 200.8 0.030 0.10 Hardness Dependent (c) NA 20 80-120
Mercury EPA 245.1 0.05 0.2 1.4 µg/L Acute; 0.77 µg/L Chronic NA 20 80-120
Zinc EPA 200.8 0.08 0.25 Hardness Dependent (c) 7400 µg/L 20 80-120
Hardness Hardness 2340B 1000 1000 NA NA 5 100
Nutrients
Nitrogen,
Total
Kjeldahl
4500-
NH3C 112 400 NA NA 30 80 - 120
Total
Phosphorous
4500
PE/4500-PB 25.7 51.4 NA NA 8 80 - 120
Fecal
Coliforms
Fecal
Coliforms EPA1604 1cfu/100mL 1cfu/100mL NA 100 FC/100 mL 5 95 - 105
NA = None available.
a EPA Method 602 is for screening BTEX. If BTEX is measured, confirm with EPA Method 624 (GC/MS).
b Total Aromatic Hydrocarbons (TAH) are BTEX (Benzene, Toluene, Ethylbenzene, and Xylene) only.
c Metal standards for the protection of aquatic life are hardness dependent; the formulas for calculating the appropriate standard are:
Cadmium: Acute = e^(1.0166[ln hardness] - 3.924); Chronic = e^(0.7409[ln hardness] - 4.179); Total-to-dissolved conversion factor = 1.136672 - [(ln hardness)(0.041838)] for acute, and 1.101672 - [(ln hardness)(0.041838)] for chronic.
Copper: Acute = e^(0.9422[ln hardness] - 1.700); Chronic = e^(0.8545[ln hardness] - 1.702); Total-to-dissolved conversion factor = 0.960 for acute and chronic.
Lead: Acute = e^(1.273[ln hardness] - 1.460); Chronic = e^(1.273[ln hardness] - 4.705); Total-to-dissolved conversion factor = 1.46203 - [(ln hardness)(0.145712)] for acute and chronic.
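Because the cadmium, copper, and lead criteria in Table 6 are hardness dependent, it can be convenient to script footnote c rather than compute the values by hand. The Python sketch below simply encodes the formulas above (hardness is assumed to be in mg/L as CaCO3, the conventional basis for these equations); it is an illustration, not an approved calculation tool, and the coefficients should be verified against the current Alaska WQS before any compliance use.

import math

# (slope_acute, intercept_acute, slope_chronic, intercept_chronic) from footnote c
COEFFICIENTS = {
    "cadmium": (1.0166, -3.924, 0.7409, -4.179),
    "copper":  (0.9422, -1.700, 0.8545, -1.702),
    "lead":    (1.273,  -1.460, 1.273,  -4.705),
}

def total_criteria(metal, hardness):
    """Return (acute, chronic) hardness-dependent criteria, total basis, in ug/L."""
    ma, ba, mc, bc = COEFFICIENTS[metal]
    ln_h = math.log(hardness)
    return math.exp(ma * ln_h + ba), math.exp(mc * ln_h + bc)

def conversion_factors(metal, hardness):
    """Return (acute, chronic) total-to-dissolved conversion factors from footnote c."""
    ln_h = math.log(hardness)
    if metal == "cadmium":
        return 1.136672 - ln_h * 0.041838, 1.101672 - ln_h * 0.041838
    if metal == "copper":
        return 0.960, 0.960
    if metal == "lead":
        cf = 1.46203 - ln_h * 0.145712
        return cf, cf
    raise ValueError("no conversion factor defined for " + metal)

hardness = 50.0  # example hardness, mg/L as CaCO3
for metal in COEFFICIENTS:
    acute, chronic = total_criteria(metal, hardness)
    cf_acute, cf_chronic = conversion_factors(metal, hardness)
    print(metal, round(acute * cf_acute, 2), round(chronic * cf_chronic, 2))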
A.8 SPECIAL TRAINING REQUIREMENTS/CERTIFICATION
In this section, describe any specialized training or certifications needed by personnel in order to
successfully complete the project. Describe how training is to be provided and how the necessary
skills are assured and documented, as well as how the organization implementing the data collection is
qualified and competent. Training may be formal or obtained by “mentoring” provided by senior staff,
and by coordination with the sub-contracted laboratory. Revise Table 7 as appropriate to summarize
project training.
Contracted and sub-contracted laboratories performing analytical work must have the requisite
knowledge and skills in execution of the analytical methods being requested. Information on
laboratory staff competence is usually provided in each lab’s Quality Assurance Plan (QAP). The
agency and/or organization implementing the monitoring project is responsible to ensure that the
contracted lab maintains on file with the Project QA Officer and the ADEC DOW QA Officer a
current copy (electronic preferred) of the laboratory’s QAP.
Table 7: Project Training/Certification
(Columns: Specialized Training/Certification; Field Staff; Lab Staff; Monitoring Supervisor; Lab Supervisor; Project QA Officer. An "X" marks the staff to whom the training/certification applies.)
Safety training: X X X X X
Water sampling techniques: X X X
Instrument calibration and QC activities for field measurements: X X X
Instrument calibration and QC activities for laboratory measurements: X X X
QA principles: X X X
QA for water monitoring systems: X X
Chain of Custody procedures for samples and data: X X X X X
Handling and Shipping of Hazardous Goods: X X X X X
Specific Field Measurement Methods Training: X X X
ADEC Microbiological Drinking Water Certification: Certification for microbiological analysis is limited to the individually certified analyst.
Lab Analytical Methods Training: X X X
A.9 DOCUMENTS AND RECORDS
In this section, list all the project specific documents and records that will be produced, such as interim
progress reports, final reports, audits, and Quality Assurance Project Plan revisions, etc. Records
should include field logs, sample preparation and analysis logs, laboratory analysis, instrument
printouts, model inputs and outputs, data from other sources such as databases or literature, the results
of calibration and QC checks. Copies of example data sheets should be included in the appendix.
Revise Table 8 as appropriate, including records disposition (location and retention time). Use the
following categories to list appropriate documents and records. Record and document types are
examples only.
Table 8: Project Documents and Records
(Columns: Categories; Record/Document Types; Location; Retention Time)
Site Information: Network Description; Site characterization file; Site maps; Site pictures
Environmental Data Operations: QA Project Plan; Field Method SOPs; Field Notebooks; Sample collection/measurement records; Sample Handling & Custody Records; Chemical labels, MSDS sheets; Inspection/Maintenance Records
Raw Data: Lab data (sample, QC and calibration) including data entry forms
Data Reporting: Discharge Monitoring Reports (DMRs) for permitted facility; Progress reports; Project data/summary reports; Lab analysis reports; Inspection Report
Data Management: Data management plans/flowcharts; Data algorithms
Quality Assurance: Control charts; Data quality assessments; DMRQA and PE samples; Site audits; Lab audits; QA reports/corrective action reports/response; Performance Evaluation Samples
In addition to any written report, data collected for a project will be submitted electronically to ADEC via CD-ROM, ZIP disk, or an emailed ZIP file. All dates are to be formatted as "MM-DD-YYYY".
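As a small illustration of the required date format for electronic submittals (the variable name below is hypothetical), a Python strftime call produces the "MM-DD-YYYY" form:

from datetime import date

sample_collection_date = date(2012, 6, 14)           # hypothetical collection date
print(sample_collection_date.strftime("%m-%d-%Y"))   # 06-14-2012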
B. DATA GENERATION AND ACQUISITION
B.1 SAMPLING PROCESS DESIGN (Experimental Design)
In this section provide a thorough description of the following three major activities:
Define the monitoring objective(s) and appropriate data quality objectives.
Characterize the general monitoring location(s).
Identify the site specific sample collection location/s, parameters to be measured and frequency of
collection.
B.1.1 Define Monitoring Objectives(s) and Appropriate Data Quality Objectives
In this section, describe the project in sufficient detail that a person knowledgeable about water quality monitoring, but unfamiliar with the monitoring site and its history, clearly understands the project's breadth, scope, underlying rationale, and monitoring plan design assumptions. Describe how these monitoring objectives relate to the appropriate data quality objectives.
Note: If the proposed project plan is a result of previous monitoring efforts, the previous data are to be summarized in table format, including the parameters and concentrations measured, the methods employed, and how the results relate to the Alaska water quality standards criteria. Provide a reference to the previous data report if available, or attach it as an appendix.
B.1.2 Characterize the General Monitoring Location/s
In this section provide a description of the monitoring locations and the rationale for their selection. Be
sure to include a map providing an overview of all monitoring locations. Use Table 9 to identify sample
sites and to describe the rationale for their selection.
Table 9: Site Location and Rationale
Site ID Latitude Longitude Site Description and Rationale for Selection
B.1.3 Identify the Site-Specific Sample Collection Location(s), Parameters to be Measured and
Frequencies of Collection
In this section describe the site-specific sampling locations, the specific parameters to be measured, the type of sample(s) to be collected, the frequency of collection, and the representativeness of scale. Be sure to include topographic map(s) showing each monitoring site with sufficient gradient relief detail to characterize the watershed and how each sample site is representative of the monitoring project's stated goals. Identify any structures or obstructions affecting sample collection and potential sources of pollutant contamination.
Note 1: Consider in the design plan how samples are to be collected to best represent
environmental conditions of concern (e.g., consider how the temporal and spatial variables
of sample collection may provide differing results based upon sample collection times,
sample depth and location within water (stream, lake, etc.) boundaries).
Note 2: In baseline monitoring, sample site locations should be determined to ensure both
temporal and spatial representativeness. If possible, samples should be taken directly
from the water body, rather than from a container filled from the water body.
Note 3: When water samples are taken in response to water pollution complaints, care should be taken to ensure the sampling sites are both representative of the pollution event and characterize its extent; e.g., collect samples at the suspect pollution site, and above and below it.
Note 4: When a sample is taken at a wastewater facility discharge outfall, a volume of water equal
to at least ten times the volume of the sample discharge line will first be discharged into a
bucket or similar container to clear the line of standing water and possible contamination.
Use Table 10 to clarify key “Site Representativeness” criteria for each site selection.
Table 10: Criteria for Establishing Site Representativeness
Site ID Monitoring Purpose Criteria for Site Selection
Use Table 11 to define the key parameters to be measured, types of samples (in situ measurements, grab,
composite, etc), numbers of samples and collection frequency.
Table 11: Sample Schedule (Parameters, Sample Type, Frequency)
Site ID | Parameters to be measured | Sample Type (I, G, C, etc.) | Sampling Frequency | Sample Time | Total number of measurements
I ≡ In Situ Measurement   G ≡ Grab Sample   C ≡ Composite Sample
Insert detailed map(s) (topographic, bathymetric, etc.) identifying the location of all monitoring sites. Map(s) should be of sufficient clarity and resolution of scale to represent each individual sampling site along with buildings, structures, and topographic features (water bodies, elevation change, etc.), and point sources of pollution that could possibly influence the quality of the water bodies to be monitored.
B.2 SAMPLING METHOD REQUIREMENTS
Project sampling staff should wear disposable gloves and safety eyewear, if needed, and observe
precautions while collecting samples. Sampling staff need to be aware of the potential chemical and
biological hazards present. The Project Sampling Staff collecting samples must take care not to touch
the insides of bottles or lids/caps during sampling.
B.2.1 Sample Types
In this section describe the sample types to be collected/measured. Samples will be listed as "composite" or "grab" on the Chain-of-Custody or Transmission Form and in the field logbook or field data sheets.
B.2.2 Sample Containers and Equipment
In this section describe specific sample handling and custody requirements (If the results of a sampling
program may be used as evidence, a strict written record (Chain of Custody) must be documented
tracking location and possession of the sample/data at all times).
All sampling equipment and sample containers must be cleaned according to the equipment specifications and/or the analytical laboratory's requirements. Bottles supplied by a laboratory are pre-cleaned, must never be rinsed, and will be filled only once with a sample.
For samples requiring cooling preservation, a temperature blank shall accompany each cooler
(min/max thermometer preferred). Any min/max thermometer used shall be readable to at least 0.2°C.
Use Table 12 to list specific analyte/method criteria for required parameter holding times and
preservation methods. Revise Table 12 as appropriate for the monitoring project. For parameters not
listed in this table, see 40 CFR 136 Table II-Required Containers, Preservation Techniques, and
Holding Times (http://ecfr.gpoaccess.gov/cgi/t/text/text-
idx?c=ecfr&sid=50e6d452bc564b99d249b2212375f89f&rgn=div8&view=text&node=40:23.0.1.1.1.0.
1.3&idno=40 ).
Table 12: Preservation and Holding Times for the Analysis of Samples
Analyte | Matrix | Container | Necessary Volume | Preservation and Filtration | Maximum Holding Time
Residue (settleable solids) | Surface Water | P, FP, G | 1 L | Cool <6°C, do not freeze | 48 hours
BTEX | Surface Water | G with FP-lined septum | 120 mL (3 x 40 mL) | HCl to pH <2; <6°C, do not freeze | 14 days
Cu, Cd, As, Pb (Dissolved) | Surface Water | P, FP, G | 250 mL | Filtered within 15 minutes of collection using a 0.45 µm filter; HNO3 to pH <2 | 6 months
Cu, Cd, As, Al, Pb (Total Recoverable) | Surface Water | P, FP, G | 250 mL | HNO3 to pH <2 | 6 months
Nitrate-Nitrite | Surface Water | P, FP, G | 1 L | Cool <6°C; H2SO4 to pH <2, do not freeze | 28 days
Total Phosphorous | Surface Water | P, FP, G | 1 L | Cool <6°C; H2SO4 to pH <2, do not freeze | 28 days
Fecal Coliform | Surface Water | G, PA | 250 mL | Cool <10°C; do not freeze; 0.0008% Na2S2O3 | 6 hours; 2 hrs lab prep (note: times not additive)
Hardness | Surface Water | P, FP, G | 100 mL | HNO3 to pH <2; <4°C, do not freeze | 6 months
P = polyethylene, FP = fluoropolymer, G = glass, PA = autoclavable plastic
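Holding-time compliance against the limits in Table 12 can be screened programmatically during data verification. The Python sketch below is illustrative only; the analyte keys and record structure are assumptions, and the 6-month limits are approximated as 182 days.

from datetime import datetime, timedelta

# Maximum holding times from Table 12 (6 months approximated as 182 days)
HOLDING_TIME = {
    "settleable solids": timedelta(hours=48),
    "btex": timedelta(days=14),
    "dissolved metals": timedelta(days=182),
    "total recoverable metals": timedelta(days=182),
    "nitrate-nitrite": timedelta(days=28),
    "total phosphorous": timedelta(days=28),
    "fecal coliform": timedelta(hours=6),
    "hardness": timedelta(days=182),
}

def holding_time_met(analyte, collected, analyzed):
    """True if elapsed time from collection to analysis is within the Table 12 limit."""
    return analyzed - collected <= HOLDING_TIME[analyte]

# Example: a fecal coliform sample collected at 09:15 and analyzed at 14:30 the
# same day (5 h 15 min elapsed) is within the 6-hour holding time.
print(holding_time_met("fecal coliform",
                       datetime(2012, 6, 14, 9, 15),
                       datetime(2012, 6, 14, 14, 30)))  # True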
B.2.3 Sampling Methods
This section provides general guidance on how to collect different types of samples. Delete those
sections not appropriate for the type of samples to be collected. If specific sample collection methods
will be followed, cite the appropriate source/method or else include in this section a detailed
description of the sampling method to be followed.
Surface Water Samples, Streams -
Sampling stations should always be located in the main stream channel. Since stream waters are
usually well mixed vertically, often subsurface sampling at a convenient depth is adequate for
collection of representative samples at a given point. Subsurface samples are taken within the upper
meter or may be a composite of two or more strata. The sampler should be aware of thermal
stratification due to discharges or tributaries.
Lakes, Ponds, Reservoirs-
A sufficient number of stations should be established in random locations to define adequately the
parameters of concern. Usually the deepest part of the lake should be included as one of the stations.
Where concentrations of chemical or physical parameters can vary with depth, samples should be
collected from all major depth zones, or water masses. In shallow waters (2 to 3 m), samples shall be
collected at 0.5 to 1 m. In deeper water (> 3 m), samples should be collected at regular depth intervals.
Groundwater Wells-
Only grab samples may be obtained. The well should be purged of at least three casing volumes of
water before sample collection, and the purged well should be allowed sufficient time to equilibrate
and fines to settle. If a bailer is used, it should be slowly lowered and raised to minimize disturbances.
Samples should be taken as close as possible to the water level, unless analysis indicates that
contamination is at a different depth. All sampling equipment must be certified clean by the laboratory
providing it. An equipment blank should be collected into a separate container and analyzed along with the other groundwater samples.
All previously used sampling equipment must be properly decontaminated before sampling and
between sampling locations to prevent introduction of cross-contamination. Washwater and rinsate
solutions must be collected in appropriate containers and disposed of properly in accordance with
federal, state, and local regulations. Bailing strings and wires and other disposable sampling tools
must be properly disposed of after use. For more information on groundwater monitoring and
monitoring wells, see the ADEC SPAR Underground Storage Tank Procedures Manual, Section 4,
Sampling Procedures November 7, 2002 at: http://dec.alaska.gov/spar/ipp/docs/ust_man02_10_07.pdf
Note 1: Bailers should not be used for collecting metal samples due to potential introduction
of metal contaminants to the sample.
Note 2: Peristaltic pumps should not be used for collection of volatile organic compounds
(VOC) samples due to potential loss of volatile components.
Grab Samples – Sample bottles will be filled sequentially, to the shoulder of the bottle, leaving a small space for expansion and mixing. Note that some sample types, such as VOCs and fecal coliform bacteria, have specific bottle-filling requirements. The laboratory will provide sampling instructions with the sample bottles. If necessary, samplers will consult with the laboratory regarding sampling procedures.
Composite Samples – Samples will be composited directly into the sample container. Between composite subsets, bottles will be kept in a cooler with ice to reach and maintain a sample temperature of 4 ± 2°C. The time of the initial portion of the composite, the compositing intervals, and the final compositing time must be noted in the field logbook or data sheets. The sample time listed on the Chain of Custody (COC) or Transmission Form and on the sample bottle must be the time of the final composite portion.
Note: Composite samples must be in accordance with analyte specific EPA CWA prescribed
preservation and holding time criteria found in 40 CFR 136 Table II-Required
Containers, Preservation Techniques, and Holding Times.
B.3 SAMPLE HANDLING AND CHAIN OF CUSTODY REQUIREMENTS
B.3.1 Sampling Procedures
See Section B.2 of this QAPP – Sampling Method Requirements
B.3.2 Sample Custody Procedures
In this section describe any chain of custody (COC) procedures, if required. Include an example COC form and the COC SOP as an appendix to the QAPP.
B.3.3 Shipping Requirements
Packaging, marking, labeling, and shipping of samples will comply with all regulations promulgated
by the U. S. Department of Transportation in 49 CFR 171-177. Staff should receive the necessary
training for shipping samples or consult with the laboratory for shipping instructions.
Temperature preservation method and holding time limitations must be considered when decisions are
made regarding sampling and shipping times for time and temperature sensitive sample analytes.
Describe any analyte/method specific shipping requirements in this section and how project is
designed to meet these requirements.
B.4 ANALYTICAL METHODS AND REQUIREMENTS
In this section reference the laboratory’s Quality Assurance Plan (QAP) and applicable SOPs for each
method analyte to be measured. If the lab has a current QAP and relevant SOPs on file with ADEC
DOW QA Officer, these can be specifically referenced in this section. If not, it is the responsibility of the
monitoring project manager to ensure the lab’s QAP and relevant SOPs are included (as attachments)
to the monitoring project’s QAPP.
Monitoring shall be conducted in accordance with EPA-approved analytical procedures and in
compliance with 40 CFR Part 136, Guidelines Establishing Test Procedures for Analysis of Pollutants.
Reference the Project’s MQO table (section A7) of this QAPP for list of parameters of concern,
approved analytical methods, method-specific detection and reporting limits, accuracy and precision
values applicable to this project.
Under the direction of the Project Manager, project staff will ensure that all equipment and sampling kits used in the field and laboratories use EPA CWA approved methods. The project's QA officer will verify that only EPA CWA approved methods (or, in specific instances, ADEC DOW pre-approved methods) are used.
B.5 QUALITY CONTROL REQUIREMENTS
Quality Control (QC) is the overall system of technical activities that measures the attributes and
performance of a process, item, or service against defined standards to verify that they meet the
monitoring project’s data quality objectives.
In this section define the QC activities that will be used to control the monitoring process and validate sample data. Use separate tables to define field QC measurements and lab QC measurements and their criteria for accepting/rejecting project-specific water quality measurement data.
B.5.1 Field Quality Control (QC) Measures
QC measures in the field include but are not limited to:
Proper cleaning of sample containers and sampling equipment.
Maintenance, cleaning and calibration of field equipment/kits per the manufacturer’s and/or
laboratory’s specification, and field SOPs.
Chemical reagents and standard reference materials used prior to expiration dates.
Proper field sample collection and analysis techniques.
Correct sample labeling and data entry.
Proper sample handling and shipping/transport techniques.
Field replicate samples (blind to the laboratory), e.g., 1 replicate per 10 samples.
Field replicate measurements, e.g., 1 replicate measurement per 10 field measurements.
Field replicate samples and field replicate measurements should generally equal at least 15% of total field and/or lab measurements, or at least one per sampling event, whichever is greater. Use Table 13 and revise as appropriate to define all project field QC types, frequencies, and acceptance criteria limits.
Table 13: Field Quality Control Samples
Field Quality Control Sample | Measurement Parameter | Frequency | QC Acceptance Criteria Limits | Frequency of Occurrence | Total # of QC Type Samples
Field Blank | | | | |
Trip Blank | | | | |
Field Replicate (Blind to Lab) | | | | |
Field Replicate Measurement | | | | |
Calibration Verification Check Standard | | | | |
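The replicate-frequency rule stated above (at least 15% of total field and/or lab measurements, or at least one per sampling event, whichever is greater) can be expressed as a short calculation. The Python sketch below is illustrative only and assumes that "one per sampling event" means one replicate for each event.

import math

def required_field_replicates(total_measurements, sampling_events):
    """Minimum number of field replicates: the greater of 15% of measurements or 1 per event."""
    return max(math.ceil(0.15 * total_measurements), sampling_events)

# Example: 40 planned measurements over 4 sampling events -> 6 replicates
# (15% of 40), rather than the 4 implied by one per event.
print(required_field_replicates(40, 4))  # 6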
B.5.2 Laboratory Quality Control (QC) Measures
In this section detail the Laboratory Quality Control Measures including QC samples collected in the
field for subsequent laboratory analysis as well as method-specific laboratory QC activities prescribed
in each analytical method’s SOP and in the monitoring project’s QAPP. Modify Table 14 as
appropriate for the project.
Laboratory QC includes the following:
Laboratory instrumentation calibrated with the analytical procedure.
Laboratory instrumentation maintained in accordance with the instrument manufacturer’s
specifications, the laboratory’s QAP and Standard Operating Procedures (SOPs).
Matrix spike/matrix spike duplicates, sample duplicates, calibration verification checks, surrogate
standards, external standards, etc. per the laboratory’s QAP and SOPs.
Specific QC activities prescribed in the project’s QAPP.
Laboratory data verification and validation prior to sending data results to ADEC and/or permitted
facility.
Contracted laboratories will provide analytical results after verification and validation by the
laboratory QA Officer. The laboratory must provide all relevant QC information with its summary of
data results so that the project manager and project QA officer can perform field data verification and
validation and review the laboratory reports. The Project Manager reviews these data to ensure that
the required QC measurement criteria have been met. If a QC concern is identified in the review
process, the Project Manager and Project QA Officer will seek additional information from the
contracted laboratory to resolve the issue and take appropriate corrective action.
Table 14: Field/Laboratory Quality Control Samples
Field/Lab Quality Control Sample | Measurement Parameter | Frequency | QC Acceptance Criteria Limits | Frequency of Occurrence | Total # of QC Type Samples
Field Blank | | | | |
Trip Blank | | | | |
Field Replicate | | | | |
Lab Blank | | | | |
Lab Fortified Blank | | | | |
Calibration Verification Check Standard | | | | |
Continuing Calibration Verification Check Standard | | | | |
Matrix Spike/Matrix Spike Duplicate | | | | |
Lab Duplicate Sample | | | | |
External QC Check Standard | | | | |
Surrogate Standard | | | | |
B.6 INSTRUMENT/EQUIPMENT TESTING, INSPECTION AND MAINTENANCE REQUIREMENTS
In this section describe the procedures and criteria used to verify that all instruments and equipment are
acceptable for use.
Prior to a sampling event, all sampling instruments and equipment are to be tested and inspected in
accordance with the manufacturers’ specifications. All equipment standards (thermometers,
barometers, etc) are calibrated appropriately and within stated certification periods prior to use.
Monitoring staff should document that required acceptance testing, inspection and maintenance have
been performed. Records of this documentation should be kept with the instrument/equipment kit in
bound logbooks or data sheets.
Contracted and sub-contracted laboratories will follow the testing, inspection and maintenance
procedures required by EPA Clean Water Act approved methods and as stated in the respective
laboratory’s QAP and SOPs.
B.7 INSTRUMENT CALIBRATION AND FREQUENCY
Field instruments must be calibrated where appropriate prior to using the instruments. Calibrations
must be in accordance with the respective EPA CWA approved method against standards of known
traceability and within stated certification (expiration) dates. If equipment and/or kits require
calibration immediately prior to the sampling event, the calibration date will be recorded in the
operator’s field logbook or field data sheets. When field instruments require only periodic calibration,
the record of this calibration should be kept with the instrument. The project manager will delegate a
field project team member to ensure that instruments are calibrated correctly and appropriate
documents recorded and retained.
In this section specify instrument calibration procedures and their frequency for field measurement
methods. Reference applicable instrument/method SOPs in QAPP appendices.
Contracted and sub-contracted laboratories will follow the calibration procedures found in their QAPs and SOPs. Specific calibration procedures for regulated pollutants will be in agreement
with the respective EPA Approved CWA method of analysis. Field and/or laboratory calibration
records will be made available to ADEC upon request.
B.8 INSPECTION/ACCEPTANCE OF SUPPLIES AND CONSUMABLES
In this section describe how and by whom supplies and consumables (e.g., standard materials and
solutions, filters, pumps, tubing, sample bottles, glassware, reagents, calibration standards, electronic
data storage media, etc.) are inspected and accepted for use in the monitoring project.
All reagents, calibration standards, and kit chemicals are to be inspected to ensure that expiration dates
are not exceeded prior to use in the monitoring project.
All sample collection devices and equipment will be appropriately cleaned prior to use in the
monitoring project.
All sample containers, tubing, filters, etc. provided by a laboratory or by commercial vendor will be
certified clean for the analyses of interest. The sampling team will take note of the information on the
certificate of analysis that accompanies sample containers to ensure that they meet the specifications
and guidance for contaminant-free sample containers for the analyses of interest.
No standard solutions, buffers, or other chemical additives shall be used if the expiration date has
passed. The sampling manager or his/her designee is responsible to maintain appropriate records (e.g.
logbook entries, checklists, etc.) to verify inspection/acceptance of supplies and consumables, and
restock these supplies and consumables when necessary.
Contracted and sub-contracted laboratories will follow procedures in their laboratory’s QAP and SOPs
for inspection/acceptance of supplies and consumables.
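A simple expiration check before each sampling event helps document this requirement. The Python sketch below is illustrative; the record fields are hypothetical, and the logic only compares each item's expiration date with the planned event date.

from datetime import date

supplies = [
    {"item": "pH 7.00 buffer", "expires": date(2012, 9, 30)},
    {"item": "conductivity standard", "expires": date(2012, 5, 1)},
]

event_date = date(2012, 6, 14)  # planned sampling event (example)
expired = [s["item"] for s in supplies if s["expires"] < event_date]
if expired:
    print("Do not use (expired before event):", ", ".join(expired))
else:
    print("All standards and reagents are within their expiration dates.")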
B.9 DATA ACQUISITION REQUIREMENTS (NON-DIRECT
MEASUREMENTS)
In this section identify the type of data needed for project implementation or decision-making obtained
from non-measurement sources such as maps, charts, GPS latitude/longitude measurements, computer
data bases, programs, literature files and historical data bases. Describe the acceptance criteria for the
use of such data and specify any limitations to the use of the data. If data of known and accepted
quality is to be modeled to predict water quality impacts, the specific model of use is to be identified,
referenced and justified.
B.10 DATA MANAGEMENT
The success of a monitoring project relies on data and their interpretation. It is critical that data be
available to users and that these data are:
Of known quality,
Reliable,
Aggregated in a manner consistent with their prime use, and
Accessible to a variety of users.
Quality Assurance/Quality Control (QA/QC) of data management begins with the raw data and ends
with a defensible report, preferably through the computerized messaging of raw data.
Data management encompasses and traces the path of the data from their generation to their final use
or storage [e.g., from field measurements and sample collection/recording through transfer of data to
computers (laptops, data acquisition systems, etc.), laboratory analysis, data validation/verification,
QA assessments and reporting of data of known quality to the respective ADEC Division of Water
Program Office]. Data management also includes/discusses the control mechanism for detecting and
correcting errors.
In this section include a flow chart as well as a detailed narrative of the monitoring project’s data
management process. An example Data Management Flow Chart (Figure 2) at the end of this section
provides a visual summary description of the data flow/management process for environmental data
collected in support of ADEC’s Division of Water. Revise Figure 2 as appropriate for the specific
monitoring project.
Various people are responsible for separate or discrete parts of the data management process:
The sampling team is responsible for field measurements/sample collection and recording of data
and subsequent shipment of samples to laboratories for analyses. They assemble data files, which include raw data, calibration information and certificates, QC checks (routine checks), data flags,
sampler comments and meta data where available. These files are assembled and forwarded for
secondary data review by the sampling manager or supervisor.
Laboratories are responsible to comply with the data quality objectives specified in the QAPP and
as specified in the laboratory QAP and method specific SOPs. Validated sample laboratory data
results with respective analytical method QA/QC results and acceptance criteria are reported to the
sampling manager or project supervisor.
Secondary reviewers (sampling coordinator/supervisor/project supervisor) are responsible for
QA/QC review, verification and validation of field and laboratory data and data reformatting as
appropriate for reporting to STORET, AQMS, ICIS-NPDES, DROPS (if necessary), and reporting
validated data to the project manager.
The project QA officer is responsible for performing routine independent reviews of data to ensure the monitoring project's data quality objectives are being met. Findings and recommended corrective actions (as appropriate) are reported directly to project management.
The project manager is responsible for final data certification.
DEC DOW Project Manager/WQAO conducts a final review (tertiary review) and submits the
validated data to STORET, AQMS, ICIS-NPDES, DROPS as appropriate.
B.10.1 Data Storage and Retention
Data management files will be stored on a secure computer or on a removable hard drive that can be
secured. Laboratory records must be retained by the contract laboratory for a minimum of five years.
Project records must be retained by the lead organization conducting the monitoring operations for a
minimum of five years, preferably longer. The storage location and retention period for the stored data will be specified in Section A9, Documents and Records, Table 8.
Figure 2: Example Data Management Flow Chart
[Figure 2 summary: Field Staff Operator: maintains all logbooks, field data sheets, and QC forms; calculates concentrations as needed; conducts preventative maintenance, calibrations, and QC checks; ensures all test equipment is in certification and all SOPs are followed. Field Data: data are collected and recorded on forms, logbooks, and computer files, and concentrations are calculated. Field Staff Supervisor: 100% check of all data, logbooks, field data sheets, and initial data flags, providing flag rationale. Analytical Laboratory: 100% check of all field sample request data sheets and sample integrity (preservation, temperature, and holding times met); samples analyzed according to QAPP-approved methods; sample analysis and relevant QC results reported. Project Supervisor: data review and 10% check of all field and laboratory data (field notes, sample field and lab results, QC data verification/validation, and appropriate use of data flags). Project QA Officer: minimum 10% random check of all data and 100% check of all elevated and outlier values; verifies QAPP and SOP compliance; verifies and validates flags; recommends SOP procedural adjustments; assesses attainment of overall project MQOs. Project Manager: reviews data and reports sample data results per QAPP requirements. DEC Division of Water Project Manager/QA Officer: reviews data for acceptability and submits validated data to STORET, DROPS, ICIS-NPDES, and AWQMS. Legend: data reporting; QA assessments; data not okay or needs more information.]
C. ASSESSMENTS
C.1 ASSESSMENTS AND RESPONSE ACTIONS
In this section describe in detail the type, number, frequency, and acceptance criteria for each type of assessment scheduled for the monitoring project. Revise Table 15 as appropriate to summarize the scheduled project assessment types, numbers, frequencies, and acceptance criteria limits.
Use the following guidance to design the appropriate QA assessment activities for a Tier 2 Water
Quality Monitoring QAPP. Each monitoring project is different, with different intended data uses,
different parameters to be measured and different project budgets. The key is to design an appropriate
strategy to evaluate the overall monitoring system (data collection, analysis and reporting) with some
level of confidence to independently substantiate the end-use quality data required by the monitoring
project.
Assessments are independent (of management) evaluations of the monitoring project that are
performed by the Project’s QA Officer or his/her designee. For Tier 2 QAPPs, assessments may
include, but are not limited to, any of the following: on-site field surveillance, on-site laboratory
audits, performance evaluation samples, DMRQA samples, blind sample replicates (precision
samples), field split samples, data quality audits, and data reviews. The number and types of
assessments are dependent upon the monitoring project’s intended data uses.
C.1.1 High Quality End-Use Tier 2 Monitoring Data
Generally, monitoring projects requiring high end-use quality data results for comparison to Alaska’s
water quality standards (e.g., compliance monitoring, listing/de-listing of impaired waters, etc.) need
more frequent and varied assessments to provide a more thorough and independent validation that the
monitoring project did actually capture high end-use quality data. Monitoring projects collecting
samples for subsequent laboratory analysis need more types of assessments than just project field
measurements to independently evaluate the overall monitoring system. Example QA Assessments are:
Field Assessments (each pollutant)
Precision (replicate) sample measurements. Project should have minimum of three paired
measurements/project or 15% of project samples, whichever is greater. Replicate
measurements should be evenly spaced over project timeline. Precision criteria are specified in
the project’s Measurement Quality Objectives (MQO) table, see section A7.
Field samples collected for subsequent laboratory analysis (each pollutant)
Blind replicate samples for each pollutant to be measured. Project should have minimum of
three paired measurements/project or 15% of project samples, whichever is greater. Replicate
samples should be evenly spaced over project timeline. Precision criteria are specified in
project’s MQO table, see section A7.
Sample splits (one split sent to lab analyzing project samples, other split sent to a reference
lab).
Matrix spike duplicates (MSD) (assesses total measurement bias for project – both precision
and accuracy). Frequency of MSDs is usually specified by the analytical method. Accuracy
and precision of criteria for each pollutant and analytical method are specified in the project’s
MQO table, see section A7.
Third party performance evaluation samples (PE samples also called performance test (PT)
samples) for wastewater analytes of interest. PT water/wastewater sample participation is at a
frequency of 1/year from a NELAC certified vendor (http://www.nelac-
institute.org/PT.php#pab1_4). For APDES permit monitoring, these are called DMRQA
samples.
Microbiological samples should be analyzed by a current DEC Division of Environmental
Health Drinking Water certified lab (http://www.dec.state.ak.us/eh/lab/certmicrolabs.aspx) for
the methods of interest. For those microbiological methods not covered under the DEC EH
Lab DW certification program, the microbiological lab will enroll in an approved PT study for
the microbiological method of interest (see above link for approved NELAC PT vendors).
Laboratory third party microbiological PT samples results will be submitted directly to the
DEC Water QA Officer and the Monitoring Project’s QA Officer.
Note 1: It is the laboratory’s responsibility to enroll itself in these blind PT studies with the
results mailed/emailed directly to the DEC DOW Water QA Officer and the
Monitoring Project’s QA Officer. Routine laboratory performance in the blind PT
sample studies will be used to assess overall laboratory data quality, as well as
monitoring project data quality.
Note 2: It is the responsibility of the Project Manager and project QA Officer to ensure the
selected laboratory is annually self-enrolled in a NELAC certified PT
water/wastewater study for those analytes required in the monitoring project.
On-Site Assessments
Inspection of field monitoring operations for compliance with QAPP requirements.
Laboratory Audit (if concerns arise regarding laboratory data quality)
Audit of project field measurement data results.
Project Data Assessments
Audits of Monitoring Data for reproducibility of results from recalculation/reconstruction of
field/lab unprocessed data.
Calculation of monitoring project’s overall achieved precision, accuracy and data completeness
compared to QAPP defined precision, accuracy and data completeness goals.
C.1.2 Lower Quality End-Use Tier 2 Monitoring Data
Generally, Tier 2 monitoring projects with lower quality end-use data, i.e., projects that are not structured for making determinations of compliance with Alaska's WQS or that require only field measurements (with no subsequent laboratory analysis), need minimal QA oversight. Example projects include field measurements of DO, pH, conductivity, turbidity, TSS (Imhoff cones), and stream flow measurements. Example QA assessments are:
Field Assessments (each pollutant)
Precision (duplicate/replicate) sample measurements. Project should have minimum of three
paired measurements/project or 10% of project samples, whichever is greater. Replicate
measurements should be evenly spaced over project timeline. Precision criteria are specified in
MQO table, see section A7.
On-Site Assessments
Inspection of field measurement activities for compliance with QAPP requirements.
Project Data Assessments
QA review of project field measurement data results.
Calculation of monitoring project’s overall achieved precision, accuracy and data completeness
compared to QAPP defined precision, accuracy and data completeness goals.
Table 15: Project Assessments
Assessment Type | Measurement Parameters: Analyte | Measurement Parameters: Method | Frequency | Acceptance Criteria Limits
On-site Field Audit/Inspection | XXXX | XXXX | 1/site/monitoring season | Site technicians in compliance with QAPP sampling protocols; sample sites meet sample design criteria
3rd Party Blind PT/DMRQA Sample (Lab) | XXXX | XXXX | Annually | Analytes within PT study limits
Field Split Sample (sent to different labs for comparison analysis) | | | |
On-site Technical System Lab Audit | | | |
Independent Data Review Audit | XXXX | XXXX | 10% of reported data | XXXX
Project Precision, Accuracy and Data Completeness Assessment | XXXX | XXXX | End of project and at least 1/year | Defined in Section A7 and Table 6
C.2 REVISIONS TO QAPP
Annually, the QAPP will be reviewed and revised as needed by the project manager and the project QA officer. Minor revisions may be made without formal comment. Such minor revisions may include changes to identified project staff (but not lead project staff: QA project officer, project manager, sampling manager, contracted laboratories), the QAPP distribution list, and/or minor editorial changes.
Revisions to the QAPP that affect stated monitoring Data Quality Objectives, Measurement Quality Objectives, method-specific data validation "critical" criteria, and/or the inclusion of new monitoring methods must be reviewed and pre-approved by the DEC DOW QA Officer/DEC Project Management before being implemented.
C.3 QA REPORTS TO MANAGEMENT
Use Table 16 to describe assessment types, frequency, content, responsible individual/s, and
distribution of assessment reports to management and other recipients and actions to be taken. Revise
as appropriate to list project QA assessments.
Table 16: QA Reports to Management
QA Report Type | Contents | Presentation Method | Report Issued by | Reporting Frequency (As Required / Year)
On-site Field Inspection Audit Report | Description of audit results, audit methods and standards/equipment used, and any recommendations | Written text and tables, charts, graphs displaying results | Project QA Officer/auditor |
Field Split Sample Report | Evaluation/comparison of split sample results from different laboratories; audit method | Written text and tables, charts, graphs displaying results | Project QA Officer/auditor |
On-site Laboratory Audit Report | Description of audit results, audit methods and standards/equipment used, and any recommendations | Written text and tables, charts, graphs displaying results | Project QA Officer/auditor |
3rd Party PT (DMRQA, etc.) Audit Report | Description of audit results, methods of analysis, and any recommendations | Written text and charts, graphs displaying results | Project QA Officer/auditor |
Corrective Action Recommendation | Description of problem(s), recommended corrective action(s), and time frame for feedback on resolution of problem(s) | Written text/table | QA Officer/auditor |
Response to Corrective Action Report | Description of problem(s), and description/date of corrective action(s) implemented and/or scheduled to be implemented | Written text/table | Project Manager overseeing sampling and analysis |
Data Quality Audit | Independent review and recalculation of sample collection/analysis (including calculations, etc.) to determine sample result; summary of data audit results, findings, and any recommendations | Written text and charts, graphs displaying results | Project QA Officer |
Quality Assurance Report to Management | Project executive summary: data completeness, precision, bias/accuracy | Written text and charts, graphs displaying results | Project QA Officer |
D. DATA VALIDATION AND USABILITY
D.1 DATA REVIEW, VERIFICATION AND VALIDATION REQUIREMENTS
The purpose of this section is to define the criteria used to review and validate monitoring data, that is, to accept, reject, or qualify data in an objective and consistent manner. Data review, verification, and validation are the means of determining the degree to which each data item has met its quality specifications (i.e., analyte-specific QC criteria and overall project measurement quality objectives).
D.1.1 Data validation
Data validation means determining if data satisfy QAPP-defined user requirements, that is, that the
data refer back to the overall data quality objectives. Data validation is an analyte and sample-specific
process that extends the evaluation of data beyond method, procedural, or contractual compliance (i.e.,
data verification) to determine the analytical quality of a specific data set to ensure that the reported
data values meet the quality goals of the environmental data operations (analyte and method specific
data validation criteria).
D.1.2 Data Verification
Data verification is the process of evaluating the completeness, correctness, and
conformance/compliance of a specific data set against the method, procedural, or contractual
requirements.
D.1.3 Data Review
Data review is the process of evaluating the overall data package to ensure procedures were followed and that reported data are reasonable and consistent with associated QA/QC results.
D.2 VERIFICATION AND VALIDATION METHODS
In this section describe the project's specific procedures for validating and verifying data. Discuss how issues are resolved and identify the authorities for resolving such issues. Describe how the results are to be conveyed to the data users. This section should reference examples of QAPP forms and checklists, which may be provided in the appendices. Any project-specific calculations are identified in this section.
D.2.1 Validation Methods
Data validation determines whether the data sets meet the project-specific requirements described in the QAPP; that is, were the data results of the right type, quality, and quantity to support their intended use. Data validation also attempts to explain sampling and analysis anomalies and the effect that these anomalies have on the overall value of the data.
All data generated shall be validated in accordance with the QA/QC requirements specified in the
methods and the technical specifications outlined in this QAPP. Raw sample data will be maintained
by the agency or company responsible for the monitoring project. Raw laboratory data shall be
maintained by the laboratory. The laboratory may archive the analytical data into their laboratory data
management system. All data will be kept a minimum of seven years.
The summary of all laboratory analytical results will be reported to the project manager. Data
validation will be performed by the laboratory for all analyses prior to the release of data. All
laboratory data will be validated according to the laboratory’s QAP and SOPs and, as specified in the
Monitoring Project’s QAPP. The rationale for any anomalies in the QA/QC of the laboratory data will
be provided to the Project Manager with the data results. Completed COC or transmission forms (if
required) will be sent back from the laboratory to the Project Manager.
Data will be qualified as necessary. Sampling may need to be repeated. Unacceptable data (i.e., data that do not meet the QA measurement criteria of precision, accuracy, representativeness, comparability, and completeness) will not be used or, if used, the problems with the data will be clearly defined, flagged appropriately, and the data use clearly delimited and justified. Any actions taken to correct QA/QC problems in sampling, sample handling, and analysis must be noted. Under the direction of the Project Manager, project staff will document any QA/QC problems and the respective QA/QC corrective actions taken.
The Project Manager/monitoring supervisor or his/her designee is responsible for reviewing field log
notebooks and field data sheets for accuracy and completeness within 48 hours of each sample
collection activity, if possible. Sample results provided by the laboratory will be verified and validated
by the laboratory QA Officer prior to issuing the laboratory report. Laboratory reports will include all QA/QC results as part of the sample data report. The laboratory report will become part of
the permanent file for the monitoring project. The Project Manager or his/her designee will compare
the sample information in the field log notebooks and/or data field sheets with the laboratory analytical
results to ensure that no transcription errors have occurred and to verify project QA/QC criteria have
been met (e.g., relative percent difference (RPD) results for blind sample duplicates, percent analyte
recovery results for matrix spike and matrix spike duplicate (MS/MSD) results, etc).
The Project QA Officer or his/her designee will calculate the RPD between field replicate samples.
Laboratories calculate and report the RPD and percent analyte recovery of analytical duplicate samples
and MS/MSD samples.
Analyte-specific precision, accuracy, and data completeness results that exceed project MQOs will be noted by the Project Manager and justified in the final data report. The Project Manager, along with supervisors and/or the Project QA Officer, if necessary, will decide whether any QA/QC corrective action is needed if the precision, accuracy (bias), or data completeness values exceed the project's MQO goals.
D.2.2 Verification Methods
The primary goal of verification is to document that applicable method, procedural and contractual
requirements were met in field sampling and laboratory analysis. Verification checks to see if the data
is complete, if sampling and analysis matched QAPP requirements, and if Standard Operating
Procedures (SOPs) were followed.
Verification of data is the responsibility of the Project QA Officer. The Project QA Officer should
verify at least 10% of generated project data in addition to all sample data anomalies and sample
results approaching or exceeding AWQS and permit limits.
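The 10-percent verification rule can be implemented as a simple selection step. The Python sketch below is an illustration only: the record fields, the 90-percent-of-limit screen used to flag results "approaching" an AWQS or permit limit, and the fixed random seed are assumptions, not requirements of this guidance.

import random

def select_for_verification(records, approach_fraction=0.9, sample_fraction=0.10, seed=1):
    """records: list of dicts with 'value', 'limit' (applicable AWQS/permit limit),
    and 'flagged' (True for anomalies). Returns the records the QA Officer should verify."""
    must_check = [r for r in records
                  if r["flagged"] or r["value"] >= approach_fraction * r["limit"]]
    remaining = [r for r in records if r not in must_check]
    random.seed(seed)
    n_random = max(1, round(sample_fraction * len(records)))
    return must_check + random.sample(remaining, min(n_random, len(remaining)))

# Example with hypothetical results: the flagged result and the result at 95% of
# its limit are selected automatically, plus a 10% random draw of the remainder.
results = [
    {"value": 1.2, "limit": 10.0, "flagged": False},
    {"value": 9.5, "limit": 10.0, "flagged": False},
    {"value": 0.4, "limit": 10.0, "flagged": True},
] + [{"value": 0.5, "limit": 10.0, "flagged": False} for _ in range(17)]
print(len(select_for_verification(results)))  # 4 (2 must-check + 2 random)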
D.3 RECONCILIATION WITH USER REQUIREMENTS
The Project Manager and the Project QA Officer will review and validate data against the Project’s
defined MQOs prior to final reporting stages. If there are any problems with the quality of sampling and analysis, these issues will be addressed immediately and methods will be modified to ensure that data
quality objectives are being met. Modifications to monitoring that affect the quality of reported data
will require notification to and pre-approval by ADEC as well as subsequent edits to the approved
QAPP.
Only data that have been validated, verified and qualified, as necessary, shall be submitted to ADEC
Division of Water and entered into the applicable database (STORET, AQMS, ICIS-NPDES, DROPS).