Quality Control and Pre-Analysis Treatment of the Environmental Datasets Collected by an Internet Operated Deep-Sea Crawler during Its Entire 7-Year Long Deployment (2009–2016)

Authors: Laurenz Thomsen, Martin Scherwath, Damianos Chatzievangelou, Jacopo Aguzzi
Contributors: University of Victoria, Transport Canada, Province of British Columbia, Helmholtz-Zentrum Dresden-Rossendorf, Tecnoterra, Helmholtz Association, Ministerio de Ciencia, Innovación y Universidades (España), Fisheries and Oceans Canada, Canada Foundation for Innovation, Agencia Estatal de Investigación (España)
Year of publication: 2020
Source: Sensors (Basel, Switzerland), Vol. 20, Iss. 10, Art. no. 2991, p. 2991 (2020)
Repository: Digital.CSIC, Repositorio Institucional del CSIC
ISSN: 1424-8220; 2017-8786
DOI: 10.3390/s20102991
Description: Special issue "Selected Papers from the 2019 IMEKO TC-19 International Workshop on Metrology for the Sea". 20 pages, 8 figures, 2 tables, 1 appendix; supplementary materials at http://www.mdpi.com/1424-8220/20/10/2991/s1
Deep-sea environmental datasets are ever-increasing in size and diversity, as technological advances lead monitoring studies towards long-term, high-frequency data acquisition protocols. This study presents examples of pre-analysis data treatment steps applied to the environmental time series collected by the Internet Operated Deep-sea Crawler "Wally" during a 7-year deployment (2009–2016) at the Barkley Canyon methane hydrates site, off Vancouver Island (BC, Canada). Pressure, temperature, electrical conductivity, flow, turbidity, and chlorophyll data were subjected to different standardizing, normalizing, and de-trending methods on a case-by-case basis, depending on the nature of the treated variable and on the range and scale of the values provided by each of the different sensors. The final pressure, temperature, and electrical conductivity (transformed to practical salinity) datasets are ready for use. In contrast, the flow, turbidity, and chlorophyll data require further in-depth processing, in tandem with data describing the movement and position of the crawler, in order to filter out all possible effects of the latter. Our work highlights challenges and solutions in multiparametric data acquisition and quality control, and constitutes a substantial step towards ensuring that the available environmental data meet high quality standards and support the production of reliable scientific results.
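As an illustration of the kind of pre-analysis treatment described in the abstract, the following minimal Python sketch converts conductivity to practical salinity and applies simple range/spike flagging, z-score standardization, and linear de-trending to a synthetic time series. It assumes the pandas, scipy, and gsw (TEOS-10) packages; the variable names, thresholds, and synthetic data are illustrative only and do not reproduce the authors' actual processing pipeline.

# Illustrative pre-analysis treatment of crawler-style time series
# (hypothetical data and thresholds; not the authors' actual pipeline).
import numpy as np
import pandas as pd
import gsw  # TEOS-10 Gibbs SeaWater toolbox (pip install gsw)
from scipy.signal import detrend

# Hypothetical raw records: pressure (dbar), temperature (deg C, ITS-90),
# conductivity (mS/cm), sampled at 1-min intervals.
rng = np.random.default_rng(0)
idx = pd.date_range("2012-01-01", periods=1440, freq="1min")
df = pd.DataFrame(
    {
        "pressure_dbar": 890 + rng.normal(0, 0.5, idx.size),
        "temp_C": 3.5
        + 0.02 * np.sin(np.linspace(0, 4 * np.pi, idx.size))
        + rng.normal(0, 0.005, idx.size),
        "cond_mS_cm": 32.0 + rng.normal(0, 0.02, idx.size),
    },
    index=idx,
)

# 1) Conductivity -> practical salinity (PSS-78) from C, t, p.
df["sal_psu"] = gsw.SP_from_C(df["cond_mS_cm"], df["temp_C"], df["pressure_dbar"])

# 2) Simple range/spike QC: flag values outside plausible bounds, or values
#    jumping more than an illustrative threshold between consecutive samples.
bounds = {"temp_C": (1.0, 6.0), "sal_psu": (30.0, 36.0)}
for col, (lo, hi) in bounds.items():
    bad = (df[col] < lo) | (df[col] > hi) | (df[col].diff().abs() > 0.5)
    df.loc[bad, col] = np.nan

# 3) Standardize (z-score) and linearly de-trend one series, e.g. temperature,
#    so records from different sensors and scales become comparable.
temp = df["temp_C"].interpolate(limit_direction="both")  # fill flagged gaps first
df["temp_z"] = (temp - temp.mean()) / temp.std()
df["temp_detrended"] = detrend(temp.to_numpy())

print(df[["sal_psu", "temp_z", "temp_detrended"]].describe())

Here gsw.SP_from_C implements the standard PSS-78 conversion from conductivity, temperature, and pressure to practical salinity; the flagging thresholds and the order of the steps are assumptions made for the sake of a compact example.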
This research was developed within the framework of Ocean Networks Canada and NEPTUNE Canada, an initiative of the University of Victoria, and primarily funded by the Canada Foundation for Innovation, Transport Canada, Fisheries and Oceans Canada, and the Canadian Province of British Columbia; the Helmholtz Alliance and Tecnoterra (ICM-CSIC/UPC); and the following project activities: ROBEX (HA-304); ARIM (Autonomous Robotic sea-floor Infrastructure for benthopelagic Monitoring; MarTERA ERA-NET Cofund); ARCHES (Autonomous Robotic Networks to Help Modern Societies; German Helmholtz Association); and RESBIO (TEC2017-87861-R; Ministerio de Ciencia, Innovación y Universidades).
With the funding support of the 'Severo Ochoa Centre of Excellence' accreditation (CEX2019-000928-S) of the Spanish Research Agency (AEI).
Database: OpenAIRE