
Quality Control

OOI Data Quality Control procedures were designed with the goal of meeting the IOOS Quality Assurance of Real-Time Oceanographic Data (QARTOD) standards. As data streams are collected, data products are run through up to six automated QC algorithms, in addition to daily human-in-the-loop QC tests, and QC reports are created on a biweekly or monthly basis.

Specific details of QA/QC methods for OOI data, data products, and physical samples, including calibration and field verification procedures, can be found in the Protocols and Procedures for OOI Data Products document.

Automated QC Algorithms

Data products are run through up to six automated QC algorithms, which were coded to specifications created by OOI Project Scientists and informed by the experience of other observatory programs. The six algorithms currently implemented are listed below, with a simplified sketch of the range and spike tests following the list:

  • Global Range Test (pdf)
    • Generates flags for data points according to whether they fall within universally valid world ocean ranges or instrument limits (whichever is more restrictive)
  • Local Range Test (pdf)
    • Generates flags for data points according to whether they fall within locally valid site-specific or depth ranges
  • Spike Test (pdf)
    • Generates a flag for individual data values that deviate significantly from surrounding data values
  • Gradient Test (pdf)
    • Generates QC flags indicating whether the change between successive data points is anomalously large relative to a baseline of ‘good’ data points
  • Trend Test (pdf)
    • Tests whether a time series contains a significant polynomial trend, as a check for sensor drift
  • Stuck Value Test (pdf)
    • Generates a flag for the repeated occurrence of a single value in a time series
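
The production implementations live in the ion-functions repository referenced below. As a rough illustration of the logic only (not the OOI code, and with made-up thresholds and window sizes), the range and spike tests can be sketched in a few lines of NumPy:

```python
import numpy as np

def global_range_test(data, valid_min, valid_max):
    """Return 1 (pass) where values fall within [valid_min, valid_max], else 0 (suspect).

    Illustrative sketch only; the OOI implementation also handles fill
    values and instrument-specific limits.
    """
    data = np.asarray(data, dtype=float)
    return np.where((data >= valid_min) & (data <= valid_max), 1, 0)


def spike_test(data, window=5, threshold=3.0):
    """Flag points that deviate from the median of their neighbors by more
    than `threshold` (window size and threshold are illustrative defaults,
    not OOI parameters)."""
    data = np.asarray(data, dtype=float)
    flags = np.ones(data.size, dtype=int)
    half = window // 2
    for i in range(data.size):
        lo, hi = max(0, i - half), min(data.size, i + half + 1)
        neighbors = np.delete(data[lo:hi], i - lo)   # window around i, excluding i
        if neighbors.size and abs(data[i] - np.median(neighbors)) > threshold:
            flags[i] = 0                             # suspect
    return flags


# Example: a temperature series with one obvious spike
temps = [10.1, 10.2, 10.3, 25.0, 10.2, 10.1, 10.0, 10.1]
print(global_range_test(temps, valid_min=-2.5, valid_max=35.0))  # all pass
print(spike_test(temps))                                         # only the 25.0 value is flagged
```

In the OOI system, the limits, windows, and thresholds come from the data product specifications and lookup tables rather than from code defaults.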

A seventh test (Density Inversion) may also be implemented, given its utility in other observatory programs (e.g., Argo). It would generate a flag when density does not increase with increasing depth (or decrease with decreasing depth).
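
As a minimal sketch of that idea (an illustration under assumed parameters, not the OOI or Argo implementation), assuming a profile of density samples and a small tolerance:

```python
import numpy as np

def density_inversion_test(density, depth, tolerance=0.03):
    """Flag samples involved in a density inversion, i.e. where density
    decreases by more than `tolerance` (kg/m^3) as depth increases.
    The tolerance value and formulation are illustrative assumptions."""
    density = np.asarray(density, dtype=float)
    depth = np.asarray(depth, dtype=float)
    order = np.argsort(depth)                  # shallow -> deep
    flags = np.ones(density.size, dtype=int)   # 1 = pass
    drops = np.diff(density[order]) < -tolerance
    bad = np.zeros(density.size, dtype=bool)
    bad[:-1] |= drops                          # upper point of the inversion
    bad[1:] |= drops                           # lower point of the inversion
    flags[order[bad]] = 0                      # 0 = suspect
    return flags

# Example: density drops between 20 m and 30 m, so both samples are flagged
print(density_inversion_test([1024.0, 1024.5, 1024.1, 1025.0],
                             depth=[10, 20, 30, 40]))
```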

QC algorithms are run only on science data products, not on auxiliary or engineering parameters. The automated QC algorithms do not screen out or delete any data, nor do they prevent data from being downloaded; they only flag “suspect” data points in the plotting tools and deliver those flags as additional attributes in downloaded data.
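
In a downloaded NetCDF file, for instance, the flags typically travel as companion variables alongside each science parameter. A hypothetical inspection with xarray (the file and variable names here are invented for illustration, not actual OOI stream names):

```python
import xarray as xr

# File and variable names are hypothetical, for illustration only.
ds = xr.open_dataset("deployment0001_ctd_example.nc")

temperature = ds["sea_water_temperature"]             # science data product
qc_results = ds["sea_water_temperature_qc_results"]   # companion QC flag variable

# Nothing is removed: suspect points remain in the record, just flagged.
n_suspect = int((qc_results == 0).sum())
print(f"{n_suspect} of {temperature.size} points flagged as suspect")
```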

The algorithm code used to process the QC algorithms in the OOI Cyberinfrastructure system can be found in the ion-functions GitHub repository.

QC lookup tables that describe these limits can be found in the ooi-integration GitHub repository.
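
As an illustration of the kind of information such a table carries (the column names, reference designator, and limits below are assumptions made for the example, not the actual file schema), a global-range lookup might map an instrument and parameter to its valid limits:

```python
import csv, io

# Illustrative rows; not the actual ooi-integration table layout.
lookup_csv = io.StringIO(
    "reference_designator,parameter,global_range_min,global_range_max\n"
    "CP01CNSM-RID27-03-CTDBPC000,sea_water_temperature,-2.5,35.0\n"
    "CP01CNSM-RID27-03-CTDBPC000,sea_water_practical_salinity,0.0,42.0\n"
)

limits = {
    (row["reference_designator"], row["parameter"]):
        (float(row["global_range_min"]), float(row["global_range_max"]))
    for row in csv.DictReader(lookup_csv)
}
print(limits[("CP01CNSM-RID27-03-CTDBPC000", "sea_water_temperature")])  # (-2.5, 35.0)
```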

Manual QC Tests

After automated QC flags are generated by the algorithms and saved, daily human-in-the-loop QC tests are performed. Manual QC tests are conducted by a team of four data evaluators, led by the data manager. Tests include both Quick Look tests (a first pass by evaluators using automated tools) and Deep Dives (a closer look at data flagged as suspect, drawing in Subject Matter Experts). The data team will clearly annotate any data stream that triggers QC-related alerts, as well as any data flagged as suspect during manual inspection.

Users who identify an issue with an OOI data product, or who have questions about QC procedures, should contact the Data Team through the OOI Helpdesk.

Quality Control Goals

OOI instrument deployment and data quality control procedures were designed with the goal of meeting QARTOD quality control standards:

  • Every real-time observation must be accompanied by a quality descriptor (the QARTOD flag convention is sketched after this list)
  • All observations should be subject to automated real-time quality tests
  • Quality flags and test descriptions must be described in the metadata
  • Observers should verify / calibrate sensors before deployment
  • Observers should describe methods / calibration in real-time metadata
  • Observers should quantify level of calibration accuracy and expected error
  • Manual checks on automated procedures, real-time data collected, and status of observing system must be provided on an appropriate timescale
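
For reference, QARTOD's primary flag convention expresses each quality descriptor as a small integer code; a minimal encoding of those values in Python:

```python
from enum import IntEnum

class QartodFlag(IntEnum):
    """QARTOD primary-level quality flag values."""
    PASS = 1           # data passed all tests
    NOT_EVALUATED = 2  # tests not yet run or not applicable
    SUSPECT = 3        # suspect or of high interest
    FAIL = 4           # data failed one or more tests
    MISSING = 9        # data are missing

print(QartodFlag.SUSPECT, int(QartodFlag.SUSPECT))  # QartodFlag.SUSPECT 3
```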