Author:
P. Senthil Kumari, Nurul Aisha Binti Aboo Bucker
Year of publication:
2022
Subject:

Source:
Materials Today: Proceedings. 60:1329-1333
ISSN:
2214-7853
DOI:
10.1016/j.matpr.2021.09.435
Description:
A distributed computing platform is preferred to meet the current challenges of the cloud computing paradigm. Hadoop manages the distribution of large volumes of big data, and the data is stored using cluster analysis. The Hadoop Distributed File System (HDFS) is used to manage the problems of handling big data and can store many terabytes of data on commodity servers. The proposed system provides an efficient data integrity verification service for big data management based on HDFS. Because it handles large volumes of data, HDFS makes a data integrity commitment: the Hadoop HDFS framework uses CRC-32C checksum verification of content blocks on each data node to identify data corruption. A data-aware module implemented in Hadoop provides additional clustering and reduces the computational load on the server. A broad range of data types is analyzed and clustered. The proposed research work uses a balanced and proxy encryption technique with the Cloud me tool and achieves optimized query time and resource usage.
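The CRC-32C checksum mentioned above can be illustrated with a minimal sketch. This is a generic table-driven CRC-32C (Castagnoli polynomial, reflected form 0x82F63B78) in Python, shown only to clarify the kind of per-block checksum HDFS computes; it is not the Hadoop implementation, and the function names here are illustrative.

```python
# Minimal CRC-32C (Castagnoli) sketch -- the checksum family HDFS uses
# for block-level integrity verification. Illustrative only; Hadoop's
# own implementation is hardware-accelerated where possible.

def _make_crc32c_table():
    # Precompute the 256-entry lookup table for the reflected
    # polynomial 0x82F63B78 (CRC-32C).
    table = []
    for i in range(256):
        crc = i
        for _ in range(8):
            crc = (crc >> 1) ^ 0x82F63B78 if crc & 1 else crc >> 1
        table.append(crc)
    return table

_TABLE = _make_crc32c_table()

def crc32c(data: bytes) -> int:
    # Standard reflected CRC: initial value 0xFFFFFFFF, final XOR-out.
    crc = 0xFFFFFFFF
    for b in data:
        crc = _TABLE[(crc ^ b) & 0xFF] ^ (crc >> 8)
    return crc ^ 0xFFFFFFFF

# A verifier recomputes the checksum of a stored block and compares it
# with the checksum recorded at write time; a mismatch signals corruption.
# The well-known check value: crc32c(b"123456789") == 0xE3069283
```

In HDFS terms, the recorded checksum travels with each chunk of a block; a data node (or reading client) recomputes it on access and reports corruption on mismatch.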
Database:
OpenAIRE
External link:
