HPC and the Big Data challenge
Author: | Matthew Newall, Violeta Holmes |
Year of publication: | 2016 |
Subject: | business and industry; computer science; dynamic data; big data; data science; HPC Challenge Benchmark; computer cluster; computational problems; computer technology; energy-efficient computing; artificial intelligence and image processing; human factors |
Source: | Safety and Reliability. 36:213-224 |
ISSN: | 2469-4126 0961-7353 |
Description: | High performance computing (HPC) and Big Data are technologies vital for advances in science, business and industry. HPC combines the computing power of supercomputers and computer clusters with parallel and distributed processing techniques to solve complex computational problems. The term Big Data refers to the fact that more data are being produced, consumed and stored than ever before, resulting in datasets that are too large, complex and/or dynamic to be managed and analysed by traditional methods. Access to HPC systems, and the ability to model, simulate and manipulate massive and dynamic data, is now critical for research, business and innovation. This paper presents an overview of HPC and Big Data technology. It outlines the advances in computer technology enabling peta- and exascale, energy-efficient computing, and the Big Data challenge of extracting meaning and new information from data. As an example of HPC and Big Data synergy in risk analysis, a case study... |
Database: | OpenAIRE |
External link: |