The observing infrastructure of the Ocean Observatories Initiative (OOI) was designed in response to community inputs and fields many diverse sensors, instrument packages, and platforms. The OOI cyberinfrastructure (CI) collects and serves data from the sensors and instruments. The CI team is engaged in evaluating the data collected to date and in verifying the fidelity of the end-to-end processes that result in calibrated records, in engineering units, served to users on the Data Portal. A Scientific Oversight Committee (SOC) within the OOI is working with the CI to further validate the data by selecting periods for which SOC members, many of whom served as Chief Scientists on the deployment cruises, either know of available intercomparison data or are aware of the context in which the observations were taken. This has been called the First Article verification.
However, because of the great diversity of the sensor types deployed by the OOI, it is understood that the familiarity and expertise of the SOC and the data team members will not be sufficient to fully investigate and evaluate all types of data being collected. The SOC members have identified the instruments for which assistance is sought. In order to recruit the expertise needed to evaluate and validate the data from these instruments, OOI issued a request for assistance from Subject Matter Experts (SMEs).
The task for the SMEs is to take a segment of raw data from a specific instrument and to apply their knowledge and processing tools, together with the requisite calibration, sampling, and metadata, to produce a calibrated version in engineering units and assess its validity. Assessing validity means answering whether or not the instruments are working properly and yielding data that are realistic. It also means examining whether or not the sampling protocol implemented for the deployment is appropriate to achieve the scientific goals of deploying that instrument. For example, does the sampling avoid problems such as aliasing, capture the variability of the signal, and avoid false biases and undersampling errors? In most cases, the segment of data will be taken from an early deployment time interval and/or a segment already under scrutiny by the CI Data Evaluators and SOC.
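One sampling-adequacy question mentioned above, aliasing, can be framed as a simple Nyquist check: the sampling rate must be at least twice the frequency of the fastest signal of interest. The sketch below is illustrative only; the tidal period and sampling intervals are example values, not actual OOI deployment parameters.

```python
# Illustrative sketch of a Nyquist/aliasing check for a sampling protocol.
# The signal periods and sampling intervals below are examples, not OOI values.

def min_sampling_rate_hz(signal_period_s: float) -> float:
    """Nyquist criterion: sample at least twice per cycle of the fastest signal."""
    return 2.0 / signal_period_s

def is_aliased(sampling_interval_s: float, signal_period_s: float) -> bool:
    """True if the sampling rate is too slow to resolve the given signal."""
    return (1.0 / sampling_interval_s) < min_sampling_rate_hz(signal_period_s)

# Example: a semidiurnal tidal signal (~12.42 h period).
tidal_period_s = 12.42 * 3600
print(is_aliased(3600.0, tidal_period_s))      # hourly sampling resolves it
print(is_aliased(8 * 3600.0, tidal_period_s))  # 8-hourly sampling does not
```

An SME reviewing a sampling protocol would apply the same comparison with the actual deployment interval and the shortest variability timescale the science goals require.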
The term “raw data” implies that the most basic form of the data will be provided to the SME, as recorded in instrument memory or supplied via a cable. In other words, no transformations or processing have yet been applied. These raw data may come from one of the Marine Implementing Organizations (MIOs) that deployed the specific instrument or from the CI, which captured the raw file.
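The conversion from such raw data to engineering units often amounts to applying vendor calibration coefficients, for example as a polynomial in the raw counts. The function and coefficients below are purely illustrative assumptions; real OOI calibrations come from instrument-specific vendor and deployment metadata.

```python
# Illustrative sketch: convert raw instrument counts to engineering units
# using polynomial calibration coefficients. The coefficients and the
# counts-to-temperature interpretation are hypothetical examples.

def counts_to_engineering(raw_counts, coeffs):
    """Apply a calibration polynomial c0 + c1*x + c2*x**2 + ... to each raw count."""
    return [sum(c * x**i for i, c in enumerate(coeffs)) for x in raw_counts]

# Hypothetical linear calibration: offset -5.0, gain 0.01 (counts -> deg C).
raw = [1500, 1523, 1547]
calibrated = counts_to_engineering(raw, [-5.0, 0.01])
print(calibrated)  # roughly [10.0, 10.23, 10.47]
```

In practice the SME would substitute the coefficients recorded in the calibration sheet for the specific deployment, which is exactly the metadata linkage the task is meant to verify.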
The SME will be asked to fill out an evaluation form, providing specifics of the processing tools they used, any problems encountered, and any questions or comments about data quality. The SME will also be asked to provide a file of their processed data.
Closure for the task will come by linking the SME’s feedback form and file to the combined SOC and CI data assessment teams, which exist for each of the Pioneer, Endurance, Global, and Cabled components of the OOI. The data produced by the SME will be compared to the data produced by the CI for the same time, place, instrument, and sensor. Improvements to sampling protocols, data quality algorithms, and calibration procedures will be implemented by OOI and added to the OOI metadata as appropriate.
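The SME-versus-CI comparison described above could be summarized with simple agreement statistics, such as the RMS difference between the two records and the fraction of samples agreeing within a tolerance. The sketch below is a minimal illustration; the variable names, values, and tolerance are assumptions, not an OOI-specified procedure.

```python
# Illustrative sketch: summarize agreement between an SME's processed record
# and the CI-produced record for the same time/instrument/sensor.
# The sample values and tolerance are hypothetical.
import math

def compare_records(sme_values, ci_values, tolerance):
    """Return (RMS difference, fraction of samples agreeing within tolerance)."""
    diffs = [s - c for s, c in zip(sme_values, ci_values)]
    rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    within = sum(1 for d in diffs if abs(d) <= tolerance) / len(diffs)
    return rms, within

rms, frac = compare_records([10.00, 10.23, 10.47],
                            [10.01, 10.22, 10.47],
                            tolerance=0.05)
print(rms, frac)
```

Systematic offsets or a low agreement fraction would point the assessment teams toward calibration or algorithm discrepancies worth investigating.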