

petabyte

Results 1 - 24 of 24. Sort results by: Published Date | Title | Company Name
Published By: MarkLogic     Published Date: Jun 09, 2017
Today, data is big, fast, varied and constantly changing. As a result, organizations are managing hundreds of systems and petabytes of data. However, many organizations are unable to get the most value from their data because they’re using RDBMSs to solve problems those systems weren’t designed to solve. Why change? In this white paper, we dive into the details of why relational databases are ill-suited to handle the massive volumes of disparate, varied, and changing data that organizations have in their data centers. It is for this reason that leading organizations are going beyond relational to embrace new kinds of databases. And when they do, the results can be dramatic.
Tags : 
    
MarkLogic
Published By: Dell EMC     Published Date: Aug 22, 2017
To identify the benefits, costs, and risks associated with an Isilon implementation, Forrester interviewed several customers with experience using Isilon. Dell EMC Isilon is a scale-out NAS platform that enables organizations to store, manage, and analyze unstructured data. Isilon clusters are composed of different node types that can scale up to 68 petabytes (PB) in a single cluster while maintaining management simplicity. Isilon clusters can also scale to edge locations and the cloud.
Tags : 
    
Dell EMC
Published By: Pure Storage     Published Date: Nov 02, 2017
In the new age of big data, applications are leveraging large farms of powerful servers and extremely fast networks to access petabytes of data served for everything from data analytics to scientific discovery to movie rendering. These new applications demand fast and efficient storage, which legacy solutions are no longer capable of providing.
Tags : 
big data analytics, genomics, medicine, digital science, engineering, design, software, development, next-generation
    
Pure Storage
Published By: MarkLogic     Published Date: Nov 08, 2016
Today, data is big, fast, varied and constantly changing. As a result, organizations are managing hundreds of systems and petabytes of data. However, many organizations are unable to get the most value from their data because they’re using RDBMSs to solve problems those systems weren’t designed to solve. Download now to learn more.
Tags : 
    
MarkLogic
Published By: MarkLogic     Published Date: Nov 07, 2017
Today, data is big, fast, varied and constantly changing. As a result, organizations are managing hundreds of systems and petabytes of data. However, many organizations are unable to get the most value from their data because they’re using RDBMSs to solve problems those systems weren’t designed to solve. Why change? In this white paper, we dive into the details of why relational databases are ill-suited to handle the massive volumes of disparate, varied, and changing data that organizations have in their data centers. It is for this reason that leading organizations are going beyond relational to embrace new kinds of databases. And when they do, the results can be dramatic.
Tags : 
    
MarkLogic
Published By: BlueArc Corporation     Published Date: Mar 31, 2008
BlueArc’s Titan 3000 Series is designed to meet the requirements of today’s sophisticated enterprise data centers and vertical applications with new levels of storage performance, scalability and reliability. Titan is the first storage solution that consolidates and manages up to 4 petabytes of data in a single storage pool.
Tags : 
titan, tiered storage, storage, bluearc, storage system, network management
    
BlueArc Corporation
Published By: BlueArc Corporation     Published Date: Apr 24, 2008
BlueArc’s Titan Architecture represents a revolutionary step in file servers, creating a hardware-based file system that can scale throughput, IOPS, and capacity well beyond conventional software-based file servers. With its ability to virtualize a massive storage pool of up to 4 petabytes of usable, tiered storage, Titan can scale with growing data requirements and provide a competitive advantage for business processes by consolidating applications while ensuring optimal performance.
Tags : 
bluearc, storage, titan, network architecture
    
BlueArc Corporation
Published By: Oracle     Published Date: Apr 03, 2012
The 2011 IOUG Database Growth Survey, conducted in partnership with Oracle Corporation, includes input from 611 IOUG members. The survey results explore issues and solutions around managing fast-growing database environments.
Tags : 
ioug, petabyte, database, security, growth, database environments, solutions
    
Oracle
Published By: Dell EMC     Published Date: Jun 29, 2016
EMC Isilon scale-out network-attached storage (NAS) is a simple and scalable platform for building a scale-out data lake and persisting enterprise files of all sizes, scaling from terabytes to petabytes in a single cluster. It enables you to consolidate storage silos, improve storage utilization, and reduce costs, while providing a future-proofed platform to run today's and tomorrow's workloads.
Tags : 
network, storage, data, best practices
    
Dell EMC
Published By: Dell EMC     Published Date: Mar 18, 2016
EMC Isilon scale-out network-attached storage (NAS) is a simple and scalable platform for building a scale-out data lake and persisting enterprise files of all sizes, scaling from terabytes to petabytes in a single cluster.
Tags : 
emc, data lake, emc isilon, network, storage, enterprise
    
Dell EMC
Published By: HP     Published Date: Aug 03, 2009
The emergence of genomics and advanced gene sequencing techniques has made the collection and storage of data a centerpiece of biomedical research. As the data generated in biomedical research becomes richer and richer, having the infrastructure in place to deal with data growth efficiently is going to be a cornerstone of biomedical data management. This white paper examines a joint solution that features data reduction technologies combined with a network-attached storage system that offers storage optimization capabilities along with an affordable, manageable, and scalable petabyte-ready storage platform.
Tags : 
data, storage, genomics
    
HP
Published By: HP     Published Date: Aug 03, 2009
Medical research is surging into the 21st century with the dawn of personalized medicine. The realization of personalized medicine is being driven by the increasing speed and dropping costs of gene sequencing. These new technologies for rapid sequencing have created a dramatic need for storage technologies that radically increase speed while reducing costs for research storage. Read this white paper to learn how the exponential growth in genome-mapping data has spawned the growth in affordable, petabyte-capacity storage solutions that can scale as quickly as the data is produced.
Tags : 
hp storageworks, data storage, petabyte-scale storage, network-attached storage (nas)
    
HP
Published By: HP     Published Date: Aug 03, 2009
In the world of biomedical research, particularly in the fields of genomics and personalized medicine, the exponential volumes of data created are rapidly overtaking conventional data storage technologies. In answer to this challenge, the next generation of petabyte-scale data storage technologies is here. Watch this archived Webinar to learn about affordable and easily scalable data storage solutions that are ready to partner with and facilitate the future of biomedical research breakthroughs.
Tags : 
storage, petabyte-scale, data challenge
    
HP
Published By: Microsoft     Published Date: Jul 20, 2018
If you’re like most developers, you spend about two-thirds of your time building and the rest of your time learning. Sometimes your research is proactive, based on personal interest or a desire to maintain your skills, but more likely it’s out of necessity—learning what you need to tackle your next big project. Not long ago, new projects assumed an on-premises or hosted approach, forcing you to spend considerable time on essential requirements like scalability, availability, and security. Today, you still must deliver on those essentials, but the apps you’re building need to do a lot more: support millions of users, span the globe, make sense of petabytes of data, and delight users in new ways. Cloud services not only deliver “built-in” scalability and availability, but also provide an unprecedented range of prebuilt functionality for driving innovation into your apps—ready to evaluate for your next project, without your having to worry about physical infrastructure and the related overhead.
Tags : 
    
Microsoft
Published By: Microsoft     Published Date: Jul 20, 2018
When you’re in the software business, your resources can be spread pretty thin. You like to stay focused on building great software, but there are always distractions— from supporting sales opportunities to assisting with customer deployments and troubleshooting existing installations. With so much time spent maintaining the status quo, few cycles remain for modernizing your technology, streamlining internal operations, delivering new customer value, and broadening your customer base. More likely than not, your offerings are based on an on-premises approach, forcing you and your customers to spend considerable time on essential requirements like infrastructure, scalability, availability, and security. Today, you still must deliver on those essentials, but the solutions you’re building—or, more accurately, the solutions you want to be building—need to do a lot more: support millions of users, span the globe, make sense of petabytes of data, and wow users in new ways. At the same time, y
Tags : 
    
Microsoft
Published By: CDW     Published Date: Apr 04, 2016
The EMC VNX family delivers industry-leading innovation and enterprise capabilities for file and block storage in a scalable, easy-to-use unified storage solution. VNX storage combines powerful and flexible hardware with advanced efficiency, management, and protection software to meet your economic and performance needs. With the VNX you can consolidate multiple workloads and still enjoy the best economics for your virtual and physical applications without compromise. The VNX allows you to enjoy the performance of flash at the cost-effectiveness of disk for all your mixed workloads. With industry-leading application integration, you can start small and scale to multi-petabyte environments, and you can utilize automated storage tiering, encryption, and local and remote protection, all while knowing that VNX is trusted by tens of thousands of customers.
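The automated storage tiering mentioned above can be illustrated with a simple, hypothetical age-based policy sketch (this is not VNX's actual tiering algorithm; the function name and day thresholds are assumptions for illustration): recently accessed data stays on flash, while colder data is demoted to disk or archive tiers.

```python
from datetime import datetime, timedelta

def choose_tier(last_access: datetime, now: datetime,
                hot_days: int = 30, warm_days: int = 180) -> str:
    """Hypothetical age-based tiering rule.

    Data read within `hot_days` stays on flash ("hot"); data read
    within `warm_days` sits on disk ("warm"); anything older is a
    candidate for archive ("cold").
    """
    age = now - last_access
    if age <= timedelta(days=hot_days):
        return "hot"
    if age <= timedelta(days=warm_days):
        return "warm"
    return "cold"
```

A real tiering engine would also weigh access frequency and I/O type, but an age cutoff like this captures the basic idea of matching data temperature to media cost.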
Tags : 
storage management, performance management, best practices, application integration, software protection, business intelligence, network management, network performance
    
CDW
Published By: CDW - EMC     Published Date: Apr 07, 2016
The EMC VNX family delivers industry-leading innovation and enterprise capabilities for file and block storage in a scalable, easy-to-use unified storage solution. VNX storage combines powerful and flexible hardware with advanced efficiency, management, and protection software to meet your economic and performance needs. With the VNX you can consolidate multiple workloads and still enjoy the best economics for your virtual and physical applications without compromise. The VNX allows you to enjoy the performance of flash at the cost-effectiveness of disk for all your mixed workloads. With industry-leading application integration, you can start small and scale to multi-petabyte environments, and you can utilize automated storage tiering, encryption, and local and remote protection, all while knowing that VNX is trusted by tens of thousands of customers.
Tags : 
storage management, performance management, best practices, application integration, software protection, business intelligence, it management, enterprise applications, infrastructure, network management, network performance
    
CDW - EMC
Published By: CrowdStrike     Published Date: May 10, 2018
One of the biggest challenges to effectively stopping breaches lies in sifting through vast amounts of data to find the proverbial “needle in the haystack” – the subtle clues that indicate an attack is imminent or underway. As modern computer systems generate billions of events daily, the amount of data to analyze can reach petabytes. Compounding the problem, the data is often unstructured, discrete, and disconnected. As a result, organizations struggle to determine how individual events may be connected to signal an impending attack. In this context, detecting attacks is often difficult, and sometimes impossible. This white paper describes how CrowdStrike solved this challenge by building its own graph data model – the CrowdStrike Threat Graph™ – to collect and analyze extremely large volumes of security-related data, and ultimately, to stop breaches. This revolutionary approach applies massive graph-based technologies, similar to the ones developed by Facebook and Google, to detect known and unknown threats.
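The idea of connecting discrete events through a graph model can be sketched in miniature. The union-find grouping below is an illustrative sketch, not CrowdStrike's actual Threat Graph implementation, and all names in it are hypothetical: events that share an entity such as a host or process land in the same connected component, which an analyst could then examine as one potential incident instead of thousands of disconnected log lines.

```python
from collections import defaultdict

def correlate_events(events):
    """Group discrete security events into incidents by shared entities.

    Builds an implicit graph in which each event is linked to the
    entities it mentions (host, process); events sharing an entity
    fall into the same connected component (one "incident").
    """
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    # Link each event node to the entity nodes it references.
    for i, ev in enumerate(events):
        for key in ("host", "process"):
            if key in ev:
                union(("event", i), (key, ev[key]))

    # Collect event indices per component, in a deterministic order.
    incidents = defaultdict(list)
    for i in range(len(events)):
        incidents[find(("event", i))].append(i)
    return sorted(sorted(group) for group in incidents.values())
```

For example, two events on the same host collapse into one incident while an unrelated event on another host stays separate; a production system would do this over streaming data with far richer edge types (network connections, file hashes, users).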
Tags : 
    
CrowdStrike
Published By: AstuteIT_ABM_EMEA     Published Date: Feb 28, 2018
Many companies embarking on a move to the cloud begin by shifting secondary on-premises data into cloud storage. At a fraction of a cent per gigabyte, the economics are compelling: who wouldn't want to benefit from drastically reduced storage costs? The decision to send petabytes of data to the cloud, however, should not be taken lightly. At Veritas, we know that sound data management in the cloud depends above all on global data visibility. Once you understand and can visualize your data, you can: • CLASSIFY IT • MIGRATE IT • IDENTIFY DATA ANOMALIES Below are key tips to ensure that your move to the cloud delivers significant cost savings while avoiding data risks.
Tags : 
regulations, cloud, astute, storage, enterprise
    
AstuteIT_ABM_EMEA
Published By: Astute IT DACH     Published Date: Feb 28, 2018
Many companies embarking on a move to the cloud begin by shifting secondary on-premises data into cloud storage. At a fraction of a cent per gigabyte, the economics are compelling: who wouldn't want to benefit from drastically reduced storage costs? The decision to send petabytes of data to the cloud, however, should not be taken lightly. At Veritas, we know that sound data management in the cloud depends above all on global data visibility. Once you understand and can visualize your data, you can: • CLASSIFY IT • MIGRATE IT • IDENTIFY DATA ANOMALIES Below are key tips to ensure that your move to the cloud delivers significant cost savings while avoiding data risks.
Tags : 
regulations, cloud, astute, storage, enterprise
    
Astute IT DACH
Published By: SAS     Published Date: Sep 13, 2013
Insights from a webinar in the Applying Business Analytics webinar series.
Tags : 
sas, big data, big data quality, data, terabytes, petabytes, exabytes, software development
    
SAS
Published By: Exablox     Published Date: Jan 27, 2015
Object-based storage (referred to as OBS throughout this document) platforms continue to permeate cloud and enterprise IT infrastructure. As businesses move toward petabyte-scale data storage, OBS solutions are proving to be the right choice for balancing scale, complexity, and costs. By way of their core design principles, OBS platforms deliver unprecedented scale at reduced complexity and reduced cost over the long term. Early OBS platforms suffered from a "necessity crisis": they were too cumbersome to deploy and, in some cases, caused platform lock-in because of their proprietary access mechanisms. Their from-the-ground-up design, a departure from how traditional SAN and NAS arrays are deployed, and, more importantly, a lack of standard interfaces made it difficult for IT organizations to deploy OBS solutions in the infrastructure. Thanks to Amazon S3 and OpenStack Swift becoming de facto access interfaces, this situation has changed.
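The flat, key-addressed access model that S3-style interfaces standardized can be sketched with a minimal in-memory stand-in. This is illustrative only, not the Amazon S3 or OpenStack Swift API: objects live in a flat namespace per bucket, and "directories" are merely key prefixes. The method names mirror S3 conventions but are assumptions of this sketch.

```python
class ObjectStore:
    """Toy in-memory sketch of S3-style object storage (illustrative only).

    Unlike a hierarchical NAS file system, objects sit in a flat
    namespace addressed by (bucket, key); listing by prefix is how
    clients simulate folders.
    """

    def __init__(self):
        self._objects = {}  # (bucket, key) -> bytes

    def put_object(self, bucket: str, key: str, body: bytes) -> None:
        self._objects[(bucket, key)] = body

    def get_object(self, bucket: str, key: str) -> bytes:
        return self._objects[(bucket, key)]

    def list_objects(self, bucket: str, prefix: str = ""):
        return sorted(k for (b, k) in self._objects
                      if b == bucket and k.startswith(prefix))
```

The flat keyspace is what lets real OBS platforms shard objects across many nodes without coordinating a directory tree, which is central to their petabyte-scale claims.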
Tags : 
network attached storage, nas arrays, storage efficiency, idc storage marketscape, storage data resiliency, storage platform scalability, exablox, oneblox, it management
    
Exablox
Published By: Seagate Technology     Published Date: Apr 14, 2015
You can have all the latest features of iOS 8 on your iPhone, but if you’re still using an iPhone 4s, you won’t be able to take advantage of most of them. That is an apt metaphor for data storage: many of today’s storage solutions have been around since the 1970s, and just as in the world of mobile, today’s data storage needs are revealing how limited those older solutions are. Even if a company has updated almost everything in its IT ecosystem, it takes only one unchanged layer to prevent it from realizing the benefits of the latest technology. With petabyte scale, reliability, performance, and efficiency the order of the day, we are fast approaching that point with storage. There is, without question, a need for change in the current storage paradigm. In this webinar you will learn about current IT storage challenges and gain useful insights into next-generation storage solutions.
Tags : 
storage solutions, mobile data storage, it ecosystem, petabyte scale, network architecture
    
Seagate Technology
Published By: CrowdStrike     Published Date: Feb 01, 2017
One of the biggest challenges to effectively stopping breaches lies in sifting through vast amounts of data to find the subtle clues that indicate an attack is imminent or underway. As modern computer systems generate billions of events daily, the amount of data to analyze can reach petabytes. Compounding the problem, the data is often unstructured, discrete and disconnected. As a result, organizations struggle to determine how individual events may be connected to signal an impending attack. Download the white paper to learn: • How to detect known and unknown threats by applying high-volume graph-based technology, similar to the ones developed by Facebook and Google • How CrowdStrike solved this challenge by building its own proprietary graph data model • How CrowdStrike Threat Graph™ collects and analyzes massive volumes of security-related data to stop breaches
Tags : 
    
CrowdStrike
