2022
DOI: 10.3390/sym14071392

Decentralized and Privacy Sensitive Data De-Duplication Framework for Convenient Big Data Management in Cloud Backup Systems

Abstract: The number of customers transferring information to cloud storage has grown significantly with the rising prevalence of cloud computing. On one side, the rapidly rising data volume in the cloud is accompanied by large-scale replication of data. On the other hand, if only a single copy of stored symmetrical information is kept in the de-duplicated cloud backup, manipulation or loss of that single copy may cause untold failure. Thus, file de-duplication and integrity auditing are extremely…
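
The abstract refers to keeping only a single stored copy of duplicate data in the cloud backup. As an illustration of that idea, below is a minimal sketch of hash-based, chunk-level de-duplication in Python; the fixed 4 KB chunk size and SHA-256 fingerprints are assumptions made for the sketch, not details taken from the paper.

    # Minimal sketch of chunk-level de-duplication (illustrative only;
    # chunk size and SHA-256 fingerprints are assumptions, not the paper's design).
    import hashlib

    CHUNK_SIZE = 4096  # fixed-size chunking for simplicity

    def dedup_store(data: bytes, store: dict[str, bytes]) -> list[str]:
        """Split data into chunks, keep a single stored copy per unique chunk,
        and return the list of chunk fingerprints that reconstructs the data."""
        recipe = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            fp = hashlib.sha256(chunk).hexdigest()
            if fp not in store:          # store only the first (single) copy
                store[fp] = chunk
            recipe.append(fp)
        return recipe

    def restore(recipe: list[str], store: dict[str, bytes]) -> bytes:
        """Rebuild the original data from its chunk recipe."""
        return b"".join(store[fp] for fp in recipe)

    if __name__ == "__main__":
        store: dict[str, bytes] = {}
        payload = b"A" * 8192 + b"B" * 4096      # two identical "A" chunks
        recipe = dedup_store(payload, store)
        assert restore(recipe, store) == payload
        print(f"{len(recipe)} chunks referenced, {len(store)} unique chunks stored")

In this sketch each unique chunk is stored exactly once and every file is reconstructed from a per-file recipe of fingerprints, which is the storage-saving effect the abstract describes.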

Cited by 5 publications (1 citation statement)
References: 38 publications
“…In [11], the researchers proposed a decentralized block-level data de-duplication framework for big data management in cloud systems. In order to improve de-duplication efficacy and reduce workload, the proposed approach employed a two-level routing decision for directing files from clients based on data similarity and locality.…”
Section: Related Work
confidence: 99%
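
The statement above mentions a two-level routing decision that directs files based on data similarity and locality. Below is a minimal illustrative sketch of such a decision; the representative-fingerprint heuristic and the per-client locality rule are assumptions made for the sketch, not the cited framework's actual algorithm.

    # Illustrative two-level routing sketch (similarity, then locality).
    # The concrete heuristics here are assumptions, not the cited framework's design.
    import hashlib

    NUM_NODES = 4

    def chunk_fingerprints(data: bytes, chunk_size: int = 4096) -> list[str]:
        """SHA-256 fingerprints of fixed-size chunks of the file."""
        return [hashlib.sha256(data[i:i + chunk_size]).hexdigest()
                for i in range(0, len(data), chunk_size)]

    def route_file(data: bytes, rep_index: dict[str, int],
                   last_node: dict[str, int], client_id: str) -> int:
        """Level 1 (similarity): if another file with the same representative
        fingerprint has been seen, send this file to that node so duplicate
        chunks meet. Level 2 (locality): otherwise keep the client's data on
        the node it used last, falling back to a stateless hash assignment."""
        rep = min(chunk_fingerprints(data))        # representative fingerprint
        if rep in rep_index:                       # level 1: similarity match
            node = rep_index[rep]
        elif client_id in last_node:               # level 2: client locality
            node = last_node[client_id]
        else:
            node = int(rep, 16) % NUM_NODES        # fallback assignment
        rep_index[rep] = node
        last_node[client_id] = node
        return node

    if __name__ == "__main__":
        rep_index: dict[str, int] = {}
        last_node: dict[str, int] = {}
        a = route_file(b"x" * 10000, rep_index, last_node, "client-1")
        b = route_file(b"x" * 10000 + b"y" * 50, rep_index, last_node, "client-1")
        print(a, b)  # the similar second file lands on the same node

The point of the two levels is that similar files converge on the same node (so duplicates can actually be detected there), while a client's unrelated uploads still stay together for locality.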