In recent years, data deduplication has made its way from the storage community to the networking community. More specifically, it has become an important tool for optimizing application performance across the WAN. By eliminating the transfer of repetitive IP traffic, deduplication significantly improves WAN utilization and accelerates data transfers between geographically dispersed locations. This reduces bandwidth costs and helps overcome many of the obstacles of communicating across a WAN.
This paper discusses how WAN deduplication works, and how it can be effectively deployed as a complement to existing storage deduplication solutions.
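As a rough illustration of the mechanism the paper covers (a hypothetical Python sketch, not any vendor's actual protocol), a WAN deduplication device can split traffic into chunks and replace chunks the far end has already seen with short hash references:

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking for simplicity; products often use variable chunks

def dedupe_stream(data: bytes, peer_index: set) -> list:
    """Split data into chunks; emit a short hash reference for chunks the
    peer already holds, and the raw bytes otherwise."""
    messages = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in peer_index:
            messages.append(("ref", digest))   # already cached at the far end
        else:
            messages.append(("data", chunk))   # first sighting: ship the bytes
            peer_index.add(digest)
    return messages
```

Only the `("data", ...)` messages consume meaningful WAN bandwidth; repeated traffic shrinks to the size of its hash references.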
Between increased use of server virtualization, the proliferation of private clouds, the ever-growing pool of unstructured data, and the advent of “big data,” organizational data will continue to grow. Learn more about what HP has in store.
This paper looks at the benefits to be realized in the combined scenario of HP StoreOnce appliances being used with HP’s own Data Protector 7 backup software while considering the topic of “where” the deduplication occurs.
The early days of the de-duplication target market could be characterized as the Wild West, with a slew of start-ups shooting it out in an emerging market with tremendous potential. HP has a good chance of joining the ranks of IBM and EMC.
In this white paper, learn the difference between application-based deduplication vs. appliance-based deduplication, and discover how each method fares when it comes to cost of deployment, flexibility in operating with other platforms and rehydration of data during restores.
Published By: Unitrends
Published Date: Aug 18, 2015
The purpose of deduplication is to provide more storage, particularly backup storage, for less money, right? Then wouldn't it be ridiculous if deduplication vendors were demanding that their customers pay more per terabyte of storage? Or if they were simply pushing the task of integrating, monitoring, and managing deduplication back onto their users? This white paper helps you understand the various approaches to deduplication and the strengths and weaknesses of each, and introduces a different approach: Adaptive Deduplication.
Published By: Arcserve
Published Date: Feb 26, 2015
In typical organizations, most data resides outside the data center, so it is important that the protection of desktop and laptop computers is given the same priority as file servers and application servers. Have you deployed the right data protection strategy for endpoints?
We’re here to help! Arcserve UDP offers a FREE Workstation edition product that specifically focuses on backing up data on endpoints.
Not only can desktops and laptops be protected for FREE with award-winning technology that minimizes bandwidth and storage requirements, but they can also participate in the global deduplication schema offered by UDP (for 30 days), have their data protected in public and private clouds, and more!
This is too good to pass up! Get your FREE Arcserve UDP Workstation edition now.
Published By: Arcserve
Published Date: May 29, 2015
The traditional taxonomy of data protection appliances – also known as Purpose Built Backup Appliances (PBBAs) – covers two primary categories:
• Backup Appliances
• Deduplication Appliances
Let’s explore this taxonomy and market dynamics.
This paper outlines some of the important considerations for encrypting data on tape, and is intended to help IT administrators make the right decision for their environment and their company's data security needs.
In this competitive whitepaper, Edison Group provides an independent, third-party evaluation of HP's new B6200 StoreOnce Backup System versus EMC Data Domain. Evaluation criteria include scalability (both capacity and performance), high availability, architectural approach, pricing, and licensing.
Enterprise Strategy Group explains why client-side deduplication is the strongest approach. Dedupe 2.0 leverages intelligence and awareness at the source, the backup server, and the storage device. In this model, the knowledge of which data already resides in deduplicated storage, and the decision whether to send new data, lives on the production server rather than on the backup server or the deduplicated storage. As a result, network savings begin at the production server, and backups are significantly faster because only changed data is transmitted from the production server to the storage solution.
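The source-side decision described above can be sketched as follows. This is a simplified Python model with invented names (`DedupStore`, `client_side_backup`); real products negotiate this exchange over a backup protocol rather than in-process calls:

```python
import hashlib

class DedupStore:
    """Toy deduplicated backup target: keeps one copy of each unique chunk."""
    def __init__(self):
        self.chunks = {}

    def missing(self, digests):
        # The production server asks which chunks the store lacks
        # before transmitting anything.
        return [d for d in digests if d not in self.chunks]

    def put(self, digest, chunk):
        self.chunks[digest] = chunk

def client_side_backup(files: dict, store: DedupStore) -> int:
    """Hash chunks on the production server; transmit only chunks the
    store reports as missing. Returns bytes actually sent over the wire."""
    sent = 0
    for name, data in files.items():
        chunks = [data[i:i + 1024] for i in range(0, len(data), 1024)]
        digests = [hashlib.sha256(c).hexdigest() for c in chunks]
        need = set(store.missing(digests))
        for d, c in zip(digests, chunks):
            if d in need:
                store.put(d, c)
                sent += len(c)
                need.discard(d)  # don't resend duplicates within this backup
    return sent
```

A second backup of unchanged data transmits zero payload bytes, which is the network saving the abstract attributes to source-side awareness.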
This report describes how, by improving the efficiency of data storage, deduplication solutions have enabled organizations to cost-justify the increased use of disk for backup and recovery. However, the changing demands on IT storage infrastructures have begun to strain the capabilities of first-generation deduplication products. To meet these demands, a new generation of deduplication solutions is emerging that scales easily, offers improved performance and availability, and simplifies management and integration within the IT storage infrastructure. HP refers to this new generation as "Deduplication 2.0."
This report covers the challenges of first-generation deduplication technology and the advantages of next-generation deduplication products. Next-generation Dedupe 2.0 systems use a common deduplication algorithm across all storage systems, whether they're smaller systems in branch offices or large data center storage facilities. That means no more reconstituting data as it traverses different storage systems, which saves bandwidth and improves performance.
HP 3PAR StoreServ thin deduplication is the latest feature in a line of thin technologies, including thin provisioning, thin persistence, and thin reclaim, that deliver value and cost savings to the customer. Each of these technologies is fully built into the 3PAR StoreServ architecture. In this study, HP 3PAR StoreServ was compared with competing all-flash offerings from SolidFire, Pure Storage, and EMC.
Hardware accelerated deduplication with no performance penalty is now available with the HP 3PAR StoreServ 7450 All Flash Array. In this ChalkTalk, HPStorageGuy Calvin Zito shows how HP now has the most innovative all-flash array in the industry.
For remote office protection, you can reliably get a backup and a DR copy of data at any location without the need for onsite expertise. Read this paper to learn more about the next generation of deduplication.
The early days of the de-duplication target market could be characterized as the Wild West according to 451 Research, with a slew of startups shooting it out in an emerging market with tremendous potential.
Published By: Unitrends
Published Date: Apr 12, 2010
The purpose of deduplication is to provide more storage, particularly backup storage, for less money, right? Then wouldn't it be ridiculous if deduplication vendors were demanding that their customers pay more per terabyte of storage? Or if they were simply pushing the task of integrating, monitoring, and managing deduplication back onto their users?
Published By: Unitrends
Published Date: May 12, 2011
In this paper, we'll compare and contrast the advantages and disadvantages of each of these techniques and explain why incremental forever when combined with byte-level deduplication is the superior methodology for reducing redundant data in the most efficient manner possible.
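To make the "incremental forever" idea concrete, here is a hypothetical Python sketch (using fixed 512-byte blocks for brevity, where the paper discusses byte-level deduplication): after the first full backup, each subsequent backup adds only blocks whose content is new to the repository, yet any point in time restores as a synthetic full:

```python
import hashlib

BLOCK = 512  # illustrative block size; byte-level schemes track finer deltas

def snapshot(data: bytes, repo: dict) -> list:
    """Record a backup: store only blocks the repository hasn't seen,
    and return a manifest of block digests for this point in time."""
    manifest = []
    for i in range(0, len(data), BLOCK):
        block = data[i:i + BLOCK]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in repo:   # unchanged blocks cost no new storage
            repo[digest] = block
        manifest.append(digest)
    return manifest

def restore(manifest: list, repo: dict) -> bytes:
    """Synthetic full restore: no chain of incrementals to replay."""
    return b"".join(repo[d] for d in manifest)
```

Because each manifest is self-contained, there is never a need to take another full backup, which is the efficiency argument the paper makes for this combination.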