by Hewlett-Packard Company
Published on: June 29, 2012
Type of content: WHITE PAPER
Length: 8 pages
Despite the many benefits of deduplication, first-generation dedupe technologies also have significant drawbacks.
Older dedupe technologies use an inefficient matching process: to decide whether an incoming chunk is a duplicate, they read each stored data chunk on disk in full and compare it against the new one. This laborious process taxes the CPU and slows down the hardware and other applications. It can degrade the performance of backup or application servers to the point where deduplication makes them virtually unusable, or prevents them from scaling to back up large volumes of data.
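To make the cost of that full-chunk comparison concrete, here is a minimal sketch (not taken from the paper, and not HP's implementation) of the alternative most modern dedupe systems use: indexing chunks by a cryptographic fingerprint, so a duplicate is detected with one hash lookup instead of re-reading stored chunks from disk. The `DedupeStore` class and its names are illustrative assumptions.

```python
import hashlib

# Illustrative sketch: hash-indexed deduplication. Instead of reading
# every stored chunk to test for a match, index chunks by a SHA-256
# fingerprint and compare digests only.

class DedupeStore:
    def __init__(self):
        self.chunks = {}          # digest -> chunk bytes (unique chunks only)
        self.bytes_stored = 0     # unique bytes actually kept on "disk"

    def write(self, chunk: bytes) -> str:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in self.chunks:     # O(1) lookup, no chunk reads
            self.chunks[digest] = chunk
            self.bytes_stored += len(chunk)
        return digest                     # reference kept in place of the data

store = DedupeStore()
refs = [store.write(b"block-A"),
        store.write(b"block-B"),
        store.write(b"block-A")]   # duplicate: stored once, digest reused
```

The duplicate `block-A` costs one dictionary lookup rather than a byte-for-byte scan of every chunk already on disk, which is why fingerprint indexing avoids the CPU and I/O tax described above.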
This short white paper explores how you can easily solve the challenges of traditional deduplication with a federated dedupe approach. A federated approach supports the notion that dedupe should be performed only once, anywhere, with efficient data movement, and all managed through a single pane of glass. Read now to learn more.