2021
DOI: 10.1155/2021/3839800
Block Storage Optimization and Parallel Data Processing and Analysis of Product Big Data Based on the Hadoop Platform

Abstract: The traditional distributed database storage architecture suffers from low efficiency and limited storage capacity when managing data resources of seafood products. We review various storage and retrieval technologies for big data resources. A block storage layout optimization method based on the Hadoop platform and a parallel data processing and analysis method based on the MapReduce model are proposed. A multireplica consistent hashing algorithm based on data correlation and spatial and temporal properties…
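The abstract names a multireplica consistent hashing algorithm for block placement on Hadoop. The Java sketch below is a minimal, generic illustration of that idea only: a consistent hashing ring with virtual nodes that picks a set of distinct DataNodes for a block's replicas. The data-correlation and spatiotemporal weighting described in the paper are not reproduced, and all names (MultiReplicaRing, datanode-1, the sample block key) are hypothetical.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;
import java.util.SortedMap;
import java.util.TreeMap;

/** Minimal consistent hashing ring with virtual nodes and multi-replica placement. */
public class MultiReplicaRing {
    private final TreeMap<Long, String> ring = new TreeMap<>();
    private final int virtualNodes;

    public MultiReplicaRing(List<String> dataNodes, int virtualNodes) {
        this.virtualNodes = virtualNodes;
        for (String node : dataNodes) {
            addNode(node);
        }
    }

    /** Map a node onto the ring at several positions to smooth the key distribution. */
    public void addNode(String node) {
        for (int i = 0; i < virtualNodes; i++) {
            ring.put(hash(node + "#" + i), node);
        }
    }

    public void removeNode(String node) {
        for (int i = 0; i < virtualNodes; i++) {
            ring.remove(hash(node + "#" + i));
        }
    }

    /** Return the distinct physical nodes that should hold the block's replicas. */
    public List<String> locateReplicas(String blockKey, int replicaCount) {
        Set<String> chosen = new LinkedHashSet<>();
        // Walk clockwise from the key's position; virtual nodes belonging to an
        // already-selected physical node are skipped by the Set semantics.
        SortedMap<Long, String> tail = ring.tailMap(hash(blockKey));
        for (String node : tail.values()) {
            if (chosen.size() == replicaCount) break;
            chosen.add(node);
        }
        for (String node : ring.values()) {         // wrap around the ring if needed
            if (chosen.size() == replicaCount) break;
            chosen.add(node);
        }
        return new ArrayList<>(chosen);
    }

    private long hash(String key) {
        try {
            byte[] d = MessageDigest.getInstance("MD5")
                    .digest(key.getBytes(StandardCharsets.UTF_8));
            long h = 0;
            // Fold the first 8 bytes of the MD5 digest into a long ring position.
            for (int i = 0; i < 8; i++) {
                h = (h << 8) | (d[i] & 0xFF);
            }
            return h;
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        MultiReplicaRing ring = new MultiReplicaRing(
                List.of("datanode-1", "datanode-2", "datanode-3", "datanode-4"), 100);
        // Place three replicas of one HDFS-style block. Correlated blocks that share
        // a key prefix would tend to land on the same nodes, which is the intuition
        // behind correlation-aware placement.
        System.out.println(ring.locateReplicas("product-2021-batch-042/block-0007", 3));
    }
}
```

Virtual nodes smooth the load when DataNodes join or leave the cluster, and walking clockwise past already-chosen physical nodes keeps the replica set distinct, which is the standard way a consistent hashing ring yields multiple replica locations.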

Cited by 82 publications (3 citation statements) · References 29 publications
“…Traditional computer paradigms are challenged by data growth and real-time processing. In particular, data storage and retrieval have grown exponentially (Wang, Cheng, Zhang, Leng, & Liu, 2021). Using several computer resources simultaneously has become common to overcome these difficulties.…”
Section: Parallel Processing: A Comprehensive Insight
confidence: 99%
“…Reliability and persistency of data storage: 1) Effective and valid data collection. The process of obtaining raw data from an external system or network is referred to as data collection [36,37]. Effective and valid data collection is necessary because inefficient and improper data collection will negatively impact subsequent processing.…”
Section: Efficient Data Transmission
confidence: 99%
“…The data volume of a single flight test reaches tens of GB. At the same time, with the gradual development of civil aircraft flight testing, represented by the C919 aircraft, the total number of civil aircraft test parameters has increased dramatically, and the total amount of iNET-X [2] data collected and recorded in a single networked flight test has reached hundreds of GB [3]. The surge of flight test data puts great pressure on data processing.…”
Section: Introduction
confidence: 99%