Paper Compression

Published on January 2017
On-the-fly data compression will again be a topic: the data flowing through the network pipe
between the data owner and the data processor should be compressed and encrypted. Database
accesses are known to be great candidates for data compression because of the sparsity of the
data retrieved.
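The compressibility of sparse query results can be sketched with a standard stream compressor. This is a minimal illustration, assuming zlib as the codec and a hypothetical result set with many empty fields; the row layout is invented for the example.

```python
import zlib

# Hypothetical sparse query result: many empty or repeated fields,
# which is exactly why retrieved database rows compress so well.
rows = [("id%05d" % i, "", "", "ACTIVE", "") for i in range(1000)]
payload = "\n".join(",".join(r) for r in rows).encode("utf-8")

# Compress the result set before sending it through the network pipe.
compressed = zlib.compress(payload, 6)
ratio = len(payload) / len(compressed)
print(f"{len(payload)} -> {len(compressed)} bytes (ratio {ratio:.1f}:1)")
```

In a real deployment the compressed stream would additionally be encrypted before transmission, as the text notes; encryption is applied after compression, since encrypted data no longer compresses.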
Splitting data storage from data processing is key to the success of cloud computing in the
enterprise world.
Deduplication and compression in cloud computing aim at reducing storage
space and bandwidth usage during file transfers. Only one copy of each set of duplicate
files is retained while the others are deleted; the existence of duplicate files is
determined from the metadata. The files are clustered into bins depending on their size,
and are then segmented, deduplicated, compressed and stored. When the user requests
a file, the compressed segments of the file are sent over the network along with the
file-to-segment mapping. These are then uncompressed and combined to recreate the
complete file, hence minimizing bandwidth requirements.
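The segment–deduplicate–compress–store pipeline above can be sketched as follows. This is a minimal sketch, not the paper's actual scheme: it assumes fixed-size segmentation, SHA-256 as the segment fingerprint, and zlib as the compressor, all of which are choices made for the illustration.

```python
import hashlib
import zlib

SEGMENT_SIZE = 4096  # assumed fixed-size segmentation for this sketch

# Deduplicated store: segment fingerprint -> compressed segment.
store = {}

def put_file(data: bytes) -> list:
    """Segment, deduplicate and compress; return the file-to-segment mapping."""
    mapping = []
    for i in range(0, len(data), SEGMENT_SIZE):
        seg = data[i:i + SEGMENT_SIZE]
        h = hashlib.sha256(seg).hexdigest()
        if h not in store:            # only one copy of each segment is retained
            store[h] = zlib.compress(seg)
        mapping.append(h)
    return mapping

def get_file(mapping: list) -> bytes:
    """Fetch compressed segments, then uncompress and combine into the full file."""
    return b"".join(zlib.decompress(store[h]) for h in mapping)
```

On a request, only the compressed segments and the mapping cross the network; identical segments shared by several files are transmitted and stored once.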
In enterprise databases, a large number of redundant copies of data exist because
of full system backups, shared documents, and so on. Data deduplication aims at reducing
these redundant copies. When a backup application creates a backup, which
is scheduled fortnightly or weekly depending on the criticality of the data, it creates
one big file or a series of individual files, and every such backup creates a redundant
copy of the data. Transmission of these huge volumes of data over the network also
consumes a lot of network resources.
So if a .jpeg image is inserted into both a Word document and a PowerPoint
presentation, only one copy of the image is stored rather than three (the original plus the two
embedded copies). A reader on the user's PC, server or NAS head
eliminates any noticeable read latency penalty. Application-aware data reduction ratios typically range
from 4:1 to 10:1, which is usually two to five times greater than other data reduction technologies when
used on primary data storage.
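The single-copy behaviour in the image example can be sketched with a content-addressed object store. This is a toy illustration, assuming SHA-256 fingerprints and an in-memory dictionary as the store; the file names and the placeholder image bytes are invented for the example.

```python
import hashlib

# Content-addressed store: content hash -> the single stored copy.
object_store = {}

def store_object(name: str, data: bytes):
    """Record a reference; keep the bytes only if no identical copy exists."""
    h = hashlib.sha256(data).hexdigest()
    if h not in object_store:
        object_store[h] = data      # first copy is kept, duplicates are not
    return (name, h)                # metadata records only the fingerprint

# The same image embedded in three places (placeholder bytes, not a real JPEG).
image = b"\xff\xd8\xff placeholder jpeg payload"
refs = [
    store_object("report.docx/img1", image),
    store_object("slides.pptx/img1", image),
    store_object("photos/cat.jpeg", image),
]
print(len(refs), len(object_store))  # 3 1
```

Three references exist in the metadata, but the image bytes are stored exactly once.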
