Popis: |
In the era of cloud computing, big data and the Internet of Things, research is very often data-driven: it is based on the analysis of data that are increasingly available in large quantities and collected through experiments, observations or simulations. These data are often dynamic in space and time and either continuously expanding (monitoring) or changing (data quality management or surveying). Modern Spatial Data Infrastructures (e.g. swisstopo or INSPIRE) are based on interoperable Web services that expose and serve large quantities of data on the Internet using widely adopted open standards defined by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO). These standards largely comply with the FAIR principles, but they offer no capability to retrieve a dataset as it was at a given instant, to refer to its status at that specific instant, or to guarantee its immutability. These three shortcomings hinder the replicability of research based on such services. We discuss the issue and the state of the art, and we propose a possible solution to fill this gap by using, and extending where needed, the existing standards and by adopting best practices from the fields of sensor data, satellite data and vector data.