Balancing performance and preservation: lessons learned with HDF5

Authors: Mike Folk, Elena Pourmal
Year: 2010
Source: Proceedings of the 2010 Roadmap for Digital Preservation Interoperability Framework Workshop.
DOI: 10.1145/2039274.2039285
Abstract: Fifteen years ago, The HDF Group set out to re-invent the HDF format and software suite to address two conflicting challenges. The first was to enable exceptionally scalable, extensible storage and access for every kind of scientific and engineering data. The second was to facilitate access to data stored in HDF long into the future.

This challenge grew out of necessity. Some of the most ambitious scientific projects, such as NASA's Earth Observing System, need scalable solutions to their data generation and data gathering activities. At the same time, data consumers in these projects need assurances that their data will retain its value and accessibility for decades to centuries into the future. The HDF Group has worked to discover and pursue technological and institutional strategies that address these requirements for the broadest possible range of data applications.

To achieve this objective, care and resources must be applied in the design, development, and maintenance of the technologies, and attention must be paid to integration with complementary technologies. This technical rigor must be complemented by an institutional model that provides resources for current activities and sustainability for the long term, as well as active involvement with data producers and consumers to understand and respond to their needs.

The paper describes how The HDF Group balances its commitment to providing the best solutions to today's data challenges against the need to meet data preservation requirements.
Database: OpenAIRE