Published By: Unitrends
Published Date: Apr 12, 2010
The purpose of deduplication is to provide more storage, particularly backup storage, for less money, right? Then wouldn't it be ridiculous if deduplication vendors were demanding that their customers pay more per terabyte of storage? Or if they were simply pushing the task of integrating, monitoring, and managing deduplication back onto their users?
Fast-growing data and widespread virtualization have ratcheted up data protection challenges for businesses of every size. This paper describes those challenges, explains why deduplication is critical to meeting them, shows how HP is achieving its vision of federated dedupe with StoreOnce, and examines what HP’s StoreOnce VSA announcement and achievement mean for backup service providers, enterprises with remote or branch offices, and small and medium businesses.
Deduplication is not new—and HP was certainly not the first to market it. Instead, by watching how deduplication was introduced and listening to the evolving demands of customers who struggle with storage and backup issues, HP built StoreOnce as a next-generation or “Dedupe 2.0” architecture that is available now. With its formidable enterprise experience, server and storage product lines, and broad partner ecosystem, HP intends to catch up with and, in fact, surpass the status quo to bring better deduplication at a lower cost.
In these days of rampant data growth, a technology that can increase the effective capacity of a disk-based backup system by a ratio of up to 50:1 is big news. Data deduplication allows you to store up to 50 times more backup data in the same disk footprint — giving you a better chance of restoring your users' lost data from exactly the point they need it. This white paper explains how this data deduplication solution works and how a 50:1 deduplication ratio may be achieved.
Data deduplication technology represents one of the most significant storage enhancements in recent years, promising to reshape future data protection and disaster recovery solutions. Data deduplication offers the ability to store more on a given amount of storage and replicate data using lower bandwidth links at a significantly reduced cost. This white paper explains how deduplication technologies work in practice, the pros and cons of each approach, and when to choose a particular type.
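The abstracts above describe block-level deduplication at a high level: repeated blocks are stored once, and later backups reference existing blocks instead of writing new copies. As a hedged illustration only — a minimal fixed-block sketch using SHA-256 fingerprints, not HP's or any vendor's actual variable-chunking implementation, with all function and variable names hypothetical — the core mechanics look like this:

```python
import hashlib

def dedupe(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks and keep one copy of each unique block."""
    store = {}   # block fingerprint -> block bytes (the deduplicated store)
    recipe = []  # ordered fingerprints needed to rebuild the original stream
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # physically store each block only once
        recipe.append(digest)
    return store, recipe

def restore(store, recipe):
    """Rebuild the original stream from the store and the recipe."""
    return b"".join(store[h] for h in recipe)

# Ten simulated "nightly backups" of the same 40 KiB dataset,
# each differing only in its first 512 bytes.
backups = b"".join(bytes([n]) * 512 + b"x" * 40_448 for n in range(10))
store, recipe = dedupe(backups)

logical = len(backups)                          # raw size of all backups
physical = sum(len(b) for b in store.values())  # size of the dedupe store
print(f"ratio {logical / physical:.1f}:1")      # prints "ratio 9.1:1"
```

Even this toy example shows why deduplication ratios depend heavily on data redundancy: ten near-identical backup generations shrink to roughly one-ninth of their raw size, and highly repetitive backup streams over longer retention periods are what make ratios such as 50:1 plausible.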
Published By: Arcserve
Published Date: May 29, 2015
Today, data volumes are growing exponentially, and organizations of every size are struggling to manage what has become a very expensive and complex problem. Common symptoms include:
• Overprovisioned backup infrastructure, bought to anticipate rapid future growth.
• Legacy systems that can’t cope, so backups take too long or are incomplete.
• Missed recovery point objectives and recovery time targets.
• Backups that overload infrastructure and network bandwidth.
• Reluctance to adopt new technologies, such as cloud backup, because there is too much data to transfer over wide area networks.
DatacenterDynamics is a brand of DCD Group, a global B2B media and publishing company that develops products to help senior professionals in the world's most ICT dependent organizations make risk-based infrastructure and capacity decisions.
Our portfolio of live events, online and print publishing, business intelligence and professional development brands is centred on the complexities of technology convergence. Operating in 42 countries, we have developed a unique global knowledge and networking platform that is trusted by over 30,000 ICT, engineering and technology professionals.
Data Centre Dynamics Ltd.
102-108 Clifton Street
London EC2A 4HW