Deduplicator em client

3/5/2023

Cloud computing is an important buzzword in today's computing world. The cloud is a global platform that allows digital information to be stored and distributed at very low cost and accessed quickly, and since data keeps growing in size, many users are interested in storing their valuable data in the cloud. In cloud computing, application software and databases are moved to large centralized data centers, where the management of the data and services may not be fully trustworthy.

Deduplication technology is familiar in cloud-based services. It is used to reduce space and bandwidth requirements: the provider removes redundancy and stores only one original copy of each piece of data. Deduplication is most useful when a group of users stores the same data with a cloud storage service, but it raises ownership and security concerns, because many users first encrypt their data and only then hand it to the cloud storage so that little or no privacy is lost.

Recently, many models have been proposed to solve this problem. They typically work by sharing a single encrypted copy of identical data among all of its owners, but several flaws have been found in this approach. We present a novel server-side deduplication model for encrypted data in this paper. Under this model there is proper control over the delegated data even as ownership changes from person to person, using randomized convergent encryption and stable ownership key distribution. Through this method, data theft can be prevented not only against a revoked user but also against a user who previously owned that data. A minimal sketch of the convergent-encryption idea behind this kind of deduplication is given below.
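For readers who have not seen how a cloud provider can deduplicate data it cannot read, the following is a minimal Python sketch of convergent (message-locked) encryption, the idea that DupLESS-style schemes build on. The choice of SHA-256 for key derivation, AES-256-CTR with a key-derived counter block, and the in-memory store dictionary are illustrative assumptions, not details taken from either paper.

```python
# Minimal sketch of convergent (message-locked) encryption for deduplication.
# Assumptions: SHA-256 as the key-derivation hash and AES-256-CTR with a
# deterministic, key-derived counter block; neither paper fixes these choices.
import hashlib
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def convergent_encrypt(plaintext: bytes) -> tuple[bytes, bytes, str]:
    """Derive the key from the content itself, so identical plaintexts always
    produce identical ciphertexts that the server can deduplicate."""
    key = hashlib.sha256(plaintext).digest()               # content-derived key
    nonce = hashlib.sha256(b"nonce" + key).digest()[:16]   # deterministic counter block
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ciphertext = encryptor.update(plaintext) + encryptor.finalize()
    tag = hashlib.sha256(ciphertext).hexdigest()           # deduplication index
    return key, ciphertext, tag

# The storage server keeps one ciphertext per tag and never sees plaintext or key.
store: dict[str, bytes] = {}

def upload(plaintext: bytes) -> str:
    key, ciphertext, tag = convergent_encrypt(plaintext)   # client keeps the key
    store.setdefault(tag, ciphertext)      # a duplicate upload reuses the stored copy
    return tag

t1 = upload(b"same file contents")
t2 = upload(b"same file contents")
assert t1 == t2 and len(store) == 1       # two uploads, one stored ciphertext
```

Because the key is derived from the data itself, anyone who already knows the file can recover it, which is why the schemes discussed here add a key server (DupLESS) or ownership key distribution on top of this basic construction.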
Cloud storage providers generally maintain a single copy of identical data received from multiple sources in order to optimize space, but they cannot deduplicate identical data when clients upload it in encrypted form. To address this problem, the Duplicateless Encryption for Simple Storage (DupLESS) scheme was recently introduced in the literature. However, data stored in the cloud can also be lost in remote storage environments, and DupLESS keeps both the key and the data on a single storage server, which is unreliable if that server goes down. In essence, the existing related works handle either secure deduplication or reliability, and the reliability they offer is limited to either the key or the data. Hence, there is a need for a secure deduplication mechanism that is not vulnerable to malicious activity, semantically secures both the data and the key, and achieves reliability. To address these problems, this paper proposes the dualDup framework, which (a) optimizes storage by eliminating duplicate encrypted data from multiple users by extending the DupLESS concept, and (b) securely distributes the data and key fragments to achieve privacy and reliability using an erasure-coding scheme. The proposed approach is implemented in Python on top of the Dropbox data center, experiments are conducted in a realistic environment, and the corresponding results are reported. The results demonstrate that the proposed framework achieves reliability with an average storage overhead of 66.66%, corresponding to Reed–Solomon(3,2) codes; a worked example of this figure follows below. We validated through security analysis that the proposed framework is secure against insider and outsider adversaries. Moreover, the dualDup framework provides deduplication, attack mitigation, key security and management, reliability, and QoS features as compared to other state-of-the-art deduplication techniques.
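To make the 66.66% figure concrete, here is a short Python sketch of the arithmetic behind a Reed–Solomon(3,2) layout, together with a hypothetical round-robin placement of the five fragments across five servers. The server names and the placement policy are illustrative assumptions, not details from the dualDup paper.

```python
# Worked example of the storage overhead of Reed-Solomon(3,2): a file is split
# into k = 3 data fragments plus m = 2 parity fragments, and any 3 of the 5
# fragments are enough to rebuild it.
k, m = 3, 2                                  # Reed-Solomon(3,2)
overhead = m / k                             # extra space relative to the original data
print(f"storage overhead: {overhead:.2%}")   # ~66.67%, the ~66.66% quoted above

# Hypothetical fragment placement: spread the k + m fragments over distinct
# servers so that losing any m servers still leaves k fragments to rebuild from.
servers = ["server-A", "server-B", "server-C", "server-D", "server-E"]

def place_fragments(fragment_ids: list[str]) -> dict[str, str]:
    return {frag: servers[i % len(servers)] for i, frag in enumerate(fragment_ids)}

fragments = [f"data-{i}" for i in range(k)] + [f"parity-{j}" for j in range(m)]
print(place_fragments(fragments))
```

Losing any two of the five servers still leaves three fragments, exactly the number a (3,2) code needs to reconstruct the data; the abstract above applies the same idea to the key fragments, so neither the data nor the key depends on a single server surviving.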