EPA's Elements of Systematic Planning for Data Quality Assurance

EPA's elements of systematic planning are stated in Chapter 3 of the EPA Quality Manual for Environmental Programs and include:

  • Identification and involvement of the project manager, sponsoring organization and responsible official, project personnel, stakeholders, experts, and so on (e.g., all customers and suppliers). This element ensures that the study will be designed to address the needs of all vested parties (for example, data users, data generators, data analysts, and other stakeholders). Consulting cross-disciplinary experts familiar with the different technical aspects of the problem ensures that important details of the study are not overlooked or ignored and that technical challenges are addressed appropriately. It is also important to assign responsibilities for the project so that conflicts can be resolved and progress can be tracked. For some projects, it may be most effective to create a formal "planning team," while for others, one individual may be responsible for the project and involve other individuals when necessary.
     
  • Description of the project goals, objectives, and questions and issues to be addressed. This element ensures that the participants formulate a clear statement of the project's goals and objectives and therefore understand the purpose of the project and its expected results. The objectives reflect a general statement of the intent of a project and how that project is linked to addressing the environmental problem (or contributing to the field of science). The project's questions define what data or information is needed to address the project's goals and objectives. The transition from project goals, to a statement of objectives, to specific and appropriate questions is one of the most important steps in systematic planning.
     
  • Identification of project schedule, resources (including budget), milestones, and any applicable requirements (e.g., regulatory or contractual requirements). Identifying the available resources and deadlines at the beginning of a project helps ensure the project is feasible and timely. A clear statement of the project's resources, constraints, and deadlines helps prevent potential issues and/or conflicts by establishing practical bounds on the project as early as possible. Regulatory, statutory, contractual, and other constraints that might affect the project schedule should also be considered.
     
  • Identification of the type of data needed and how the data will be used to support the project's objectives. This element focuses on identifying the specific type of data or information needed to complete the project. The types of information needed to address the study questions, their sources, and how to obtain them should be listed. Sources may include literature, existing databases, and/or new data collection. Developing a list of the information needed to address the project questions clearly defines the project requirements. In addition, the list may identify other information that will be helpful, or that can be economically collected to facilitate the use of the project results for other purposes.
     
  • Determination of the quantity of data needed and specification of performance criteria for measuring quality. This element focuses on establishing criteria to ensure that the information and products generated meet the objectives of the project. These quality specifications are established both at the product level and at the level of the components of that product, such as the quality of individual measurements. Examples of product-level criteria include the components of EPA's information quality guidelines: objectivity, utility, integrity, and reproducibility.
    Examples of component-level criteria are quality criteria for individual measurements, such as:
    • precision
    • bias
    • accuracy
    • representativeness
    • comparability
    • completeness
    • sensitivity
    • criteria for decisions or estimates [for example, a stated desired confidence that results will fall within a specified window, such as Type I and Type II error rates (false rejection and false acceptance error rates), uncertainty intervals, etc.]
      After the information, data, or product is generated, these criteria are used to determine whether the project's objectives have been met (a brief sample-size sketch illustrating these decision error rates follows below).
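As an illustration of how the quantity of data relates to the stated decision error rates, the sketch below applies a common one-sample sample-size approximation. It is a minimal sketch only; the action level, gray-region bound, standard deviation, and error rates are hypothetical placeholders, not values prescribed by the EPA Quality Manual.

```python
# Minimal sketch: approximate sample size for a one-sample mean test,
# balancing the Type I (false rejection) and Type II (false acceptance)
# decision error rates named in the performance criteria above.
# All numeric inputs below are hypothetical.
from math import ceil
from statistics import NormalDist

def sample_size(action_level, gray_region_bound, std_dev, alpha=0.05, beta=0.20):
    """Approximate number of samples for a one-sample test of the mean."""
    delta = abs(action_level - gray_region_bound)   # smallest difference to detect
    z_alpha = NormalDist().inv_cdf(1 - alpha)       # quantile for false rejection rate
    z_beta = NormalDist().inv_cdf(1 - beta)         # quantile for false acceptance rate
    return ceil(((z_alpha + z_beta) * std_dev / delta) ** 2)

# Example: action level 50 mg/kg, gray region down to 40 mg/kg,
# estimated standard deviation of 15 mg/kg.
print(sample_size(action_level=50, gray_region_bound=40, std_dev=15))  # -> 14
```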
  • Description of how and where the data will be obtained (including existing data) and identification of any constraints on data collection. This element focuses on how to amass the data or information needed for a project by collecting new data, using existing data, citing information from other resources, etc. When collecting new data or information, consider where to collect samples (sampling design); when to collect them; how to best acquire physical specimens of adequate size and dimension (sample support) to represent the variable of interest within the sampling unit; questionnaires and survey instruments; sampling technologies; analytical methods; representativeness; etc. When existing data or information (e.g., from models, databases, literature, etc.) is used, consider sources and methods for assembling it. Also consider how the data will be inspected to ensure compatibility with the project's goals, and how the information or data will be handled, whether through physical custody of samples or entry of specific information into a database or spreadsheet.
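As a small illustration of documenting "where to collect samples," the sketch below draws a fixed number of locations at random from a gridded study area. This is a minimal example of one possible sampling design; the grid dimensions, number of samples, and fixed seed are assumptions for the illustration, not requirements of the Quality Manual.

```python
# Minimal sketch: a simple random sampling design over a gridded study area.
# Grid size, sample count, and the seed are illustrative assumptions.
import random

def simple_random_design(n_rows, n_cols, n_samples, seed=42):
    """Return n_samples distinct (row, col) grid cells chosen at random."""
    rng = random.Random(seed)  # fixed seed so the design can be reproduced and documented
    cells = [(r, c) for r in range(n_rows) for c in range(n_cols)]
    return sorted(rng.sample(cells, n_samples))

# Example: a 10 x 10 grid over the study area, 8 sampling locations.
print(simple_random_design(n_rows=10, n_cols=10, n_samples=8))
```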
     
  • Specification of QA and QC activities to assess the quality performance criteria (e.g., QC samples for both the field and laboratory, audits, technical assessments, performance evaluations, etc.). It is often necessary to plan ahead for QA and QC activities to ensure that a process, item, or service is of the type and quality needed and expected by the customer. QA and QC activities measure the attributes and performance of a process, item, or service against defined standards to verify that it meets the stated requirements.
    Examples of these activities include the following (a brief QC-check sketch follows this list):
    • assessments/audits of field sampling and laboratory activities
    • QC samples (blanks, duplicates, etc.)
    • project reports and the inspection, testing, and maintenance of equipment, supplies, and consumables
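To illustrate how planned QC checks can be applied once results arrive, the sketch below flags method-blank detections above a reporting limit and field-duplicate pairs whose relative percent difference (RPD) exceeds an acceptance limit. The limits, sample identifiers, and results are hypothetical and would normally come from the project's own quality criteria.

```python
# Minimal sketch: routine QC checks on blanks and field duplicates.
# All limits and results below are hypothetical.
def rpd(x, y):
    """Relative percent difference between a sample and its duplicate."""
    return abs(x - y) / ((x + y) / 2) * 100 if (x + y) else 0.0

blank_results = {"MB-01": 0.2, "MB-02": 1.4}                  # blank concentrations, ug/L
duplicate_pairs = {"S-10": (12.0, 13.1), "S-22": (5.0, 7.9)}  # primary/duplicate results, ug/L

REPORTING_LIMIT = 1.0   # ug/L, assumed acceptance criterion for blanks
RPD_LIMIT = 30.0        # percent, assumed acceptance criterion for duplicates

for blank_id, conc in blank_results.items():
    if conc > REPORTING_LIMIT:
        print(f"{blank_id}: possible contamination ({conc} ug/L > {REPORTING_LIMIT} ug/L)")

for sample_id, (primary, duplicate) in duplicate_pairs.items():
    value = rpd(primary, duplicate)
    if value > RPD_LIMIT:
        print(f"{sample_id}: duplicate RPD {value:.1f}% exceeds {RPD_LIMIT}%")
```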
  • Description of how the acquired data will be analyzed (either in the field or the laboratory), evaluated (i.e., QA review, verification, validation), and assessed against its intended use and the quality performance criteria. This element focuses on the reviews of both the information (such as verification and validation) and the project (peer reviews, clearance procedures, etc.). It is important to determine up front how data and information will be summarized, displayed and communicated; how uncertainty in the information will be determined and accounted for in the final product; and how the information will be used to achieve the project's goals.
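As one small example of this assessment step, the sketch below summarizes a verified set of measurements with a mean and a two-sided confidence interval so that the reported result carries a stated uncertainty. The measurements, units, and the 95% t critical value (for 5 degrees of freedom) are illustrative assumptions.

```python
# Minimal sketch: report a mean with a confidence half-width so the summary
# carries its uncertainty. Measurements and the t critical value are hypothetical.
from math import sqrt
from statistics import mean, stdev

def mean_with_ci(values, t_critical):
    """Mean and confidence half-width using a caller-supplied t critical value."""
    m = mean(values)
    half_width = t_critical * stdev(values) / sqrt(len(values))
    return m, half_width

results = [4.1, 3.8, 4.6, 5.0, 4.3, 3.9]         # e.g., concentrations in mg/L
m, hw = mean_with_ci(results, t_critical=2.571)  # two-sided 95% t value, 5 degrees of freedom
print(f"mean = {m:.2f} +/- {hw:.2f} mg/L (95% confidence interval)")
```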