
data capacity

Results 1 - 25 of 204. Sort results by: Published Date | Title | Company Name
Published By: Panduit     Published Date: Sep 03, 2015
Capacity planning is one of the most difficult aspects of building a data center today, given the complexity and number of variables to consider. This white paper from solutions provider Panduit brings fresh insight into the central issues facing data center planners, both for the initial build and for future growth, with real-life examples drawn from customer case studies. Download now to understand how best to:
• Avoid capacity fragmentation: identify areas of “lost” or “stranded” capacity
• Establish “best fits” for capacity in your center
• Maintain equilibrium between the four critical elements of any effective data center management plan
• Implement proactive planning to speed decision-making and deployment
• Optimize existing resources
• Improve data center operations and efficiency
• Reduce OPEX
• Avoid unnecessary CAPEX by extending the life of your center
Tags : 
    
Panduit
Published By: Anixter     Published Date: Sep 23, 2015
This report covers the complex nature of powering the data center. The drive for improved efficiencies and managing variable IT capacity requirements are top priorities for all data center operators and managers. The intelligent power chain achieved via the right hardware configuration coupled with intelligence and analytics can provide robust gains in five key areas.
Tags : 
    
Anixter
Published By: Schneider Electric     Published Date: May 05, 2016
This paper proposes standard terminology for categorizing the types of prefabricated modular data centers, defines and compares their key attributes, and provides a framework for choosing the best approach(es) based on business requirements.
Tags : 
datacenters, tco, capacity, modular design, load, cost, architecture, networking
    
Schneider Electric
Published By: Schneider Electric     Published Date: May 05, 2016
This white paper provides a quantitative TCO analysis of the two architectures, and illustrates the key drivers of both the capex and opex savings of the improved architecture.
Tags : 
datacenters, tco, capacity, modular design, load, cost, architecture, networking
    
Schneider Electric
Published By: Schneider Electric     Published Date: May 05, 2016
In this paper, we focus on quantifying the capital cost differences of a prefabricated vs. traditional 440 kW data center, both built with the same power and cooling architecture, in order to highlight the key cost drivers, and to demonstrate that prefabrication does not come at a capex premium.
Tags : 
datacenters, tco, capacity, modular design, load, cost, architecture, networking
    
Schneider Electric
Published By: Server Technology     Published Date: Oct 26, 2016
A brand new white paper from DCD and Server Technology.
Tags : 
data center, density, capacity, dcd, server technology, pdu, networking, it management
    
Server Technology
Published By: Raritan     Published Date: Sep 05, 2016
Sensors are an easy-to-install, cost-effective way to reduce energy costs, improve reliability, and increase capacity for future data center growth. See why you should be using them.
Tags : 
data center
    
Raritan
Published By: Nlyte     Published Date: Nov 16, 2017
The IT professional’s guide to moving beyond spreadsheets and diagrams.
Tags : 
cooling, power, energy, data center, network, infrastructure, capacity, space, series_of_nlyte_nam_white_papers
    
Nlyte
Published By: Infinidat EMEA     Published Date: May 14, 2019
Digital transformation is a business enabler, one that also translates to an increase in the demand for greater storage capacity and performance. This increasing demand requires IT organizations to re-examine their data storage strategy as the growth in capacity doesn’t align with a shrinking IT budget. To support the growth of the business and remain competitive in a global digital market, CIOs are asked more than ever to “do more with less,” while improving performance and availability at the same time.
Tags : 
    
Infinidat EMEA
Published By: Infinidat EMEA     Published Date: May 14, 2019
Infinidat® enterprise storage solutions are based upon the unique and patented Infinidat storage architecture: a fully abstracted set of Software-Defined Storage (SDS) functions integrated with best-of-breed, off-the-shelf commodity hardware. Infinidat’s software-focused architecture, an evolution and revolution in data management design over 30 years in the making, solves the conflicting requirements of bigger, faster, and less expensive. This paper discusses the technology that enables Infinidat to be the only enterprise storage provider that achieves multi-petabyte capacity with faster-than-all-flash performance (over 1.3M IOPS at microsecond latency) and an unprecedented seven-nines availability, all at the lowest Total Cost of Ownership (TCO).
Tags : 
    
Infinidat EMEA
Published By: Upsite Technologies     Published Date: Sep 18, 2013
The average computer room today has cooling capacity that is nearly four times the IT heat load. Using data from 45 sites reviewed by Upsite Technologies, this white paper will show how you can calculate, benchmark, interpret, and benefit from a simple and practical metric called the Cooling Capacity Factor (CCF). Calculating the CCF is the quickest and easiest way to determine cooling infrastructure utilization and potential gains to be realized by AFM improvements.
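The CCF described above is, at its core, a simple ratio of installed cooling to IT heat load. The sketch below illustrates that ratio under a simplified assumption; Upsite's published formula also applies an uplift for non-IT heat sources (lights, people, building envelope), so consult the white paper for the exact definition. The function name and signature here are illustrative, not from the paper.

```python
def cooling_capacity_factor(rated_cooling_kw: float, it_load_kw: float) -> float:
    """Simplified CCF: total running rated cooling capacity divided by IT heat load.

    A CCF near 1.2 suggests well-utilized cooling; values of 3-4, as in the
    average room cited above, indicate substantial stranded cooling capacity.
    """
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return rated_cooling_kw / it_load_kw

# Example: 400 kW of running rated cooling serving a 100 kW IT load
print(cooling_capacity_factor(400.0, 100.0))  # → 4.0
```

A result of 4.0 matches the "nearly four times the IT heat load" figure cited in the abstract, signaling room for airflow management (AFM) improvements before buying more cooling.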
Tags : 
ccf, upsite technologies, cooling capacity factor, energy costs, cooling, metrics, practical, benchmark
    
Upsite Technologies
Published By: Spectrum Enterprise     Published Date: Oct 29, 2018
Bandwidth. Speed. Throughput. These terms are not interchangeable. They are interrelated concepts in data networking that measure, respectively, capacity, the time it takes to get from one point to the next, and the actual amount of data you’re receiving. When you buy an Internet connection from Spectrum Enterprise, you’re buying a pipe between your office and the Internet with a set capacity, whether it is 25 Mbps, 10 Gbps, or any increment in between. However, the bandwidth we provide does not tell the whole story; it is the throughput of the entire system that matters. Throughput is affected by obstacles, overhead and latency, meaning the throughput of the system will never equal the bandwidth of your Internet connection. The good news is that an Internet connection from Spectrum Enterprise is engineered to ensure you receive the capacity you purchase; we proactively monitor your bandwidth to ensure problems are dealt with promptly, and we are your advocates across the Internet.
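One concrete reason throughput never equals bandwidth, as the abstract notes, is latency: a single TCP flow can never move data faster than its window size divided by the round-trip time, no matter how large the pipe. This sketch (not from the Spectrum Enterprise paper) shows the standard window-limited throughput calculation:

```python
def window_limited_throughput_mbps(window_bytes: int, rtt_seconds: float) -> float:
    """Upper bound on a single TCP flow's throughput: window / RTT.

    The sender can have at most one window of unacknowledged data in flight,
    so each round trip delivers at most window_bytes of payload.
    """
    if rtt_seconds <= 0:
        raise ValueError("RTT must be positive")
    return window_bytes * 8 / rtt_seconds / 1_000_000

# A 64 KiB window over a 50 ms round trip caps a flow near 10.5 Mbps,
# even on a 1 Gbps link
print(round(window_limited_throughput_mbps(64 * 1024, 0.050), 1))  # → 10.5
```

This is why provisioning more bandwidth alone may not raise measured throughput; window scaling, parallel flows, or lower latency paths are often needed as well.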
Tags : 
    
Spectrum Enterprise
Published By: Hitachi Vantara     Published Date: Mar 08, 2019
Today, data center managers are being asked to do more than ever before: Bring on more applications at a faster pace, add more capacity to existing applications and deliver more performance.
Tags : 
    
Hitachi Vantara
Published By: Hewlett Packard Enterprise     Published Date: May 11, 2018
Very little data is available on how effectively enterprises are managing private cloud deployments in the real world. Are they doing so efficiently, or are they facing challenges in areas such as performance, TCO and capacity? Hewlett Packard Enterprise commissioned 451 Research to explore these issues through a survey of IT decision-makers and data from the Cloud Price Index.
Tags : 
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: May 11, 2018
If your business is like most, you are grappling with data storage. In an annual Frost & Sullivan survey of IT decision-makers, storage growth has been listed among the top data center challenges for the past five years. With businesses collecting, replicating, and storing exponentially more data than ever before, simply acquiring sufficient storage capacity is a problem. Even more challenging is that businesses expect more from their stored data. Data is now recognized as a precious corporate asset and competitive differentiator: spawning new business models, new revenue streams, greater intelligence, streamlined operations, and lower costs. Booming market trends such as the Internet of Things and Big Data analytics are generating new opportunities faster than IT organizations can prepare for them.
Tags : 
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: May 10, 2019
Nimble Secondary Flash array represents a new type of data storage, designed to maximize both capacity and performance. By adding high-performance flash storage to a capacity-optimized architecture, it provides a unique backup platform that lets you put your backup data to work. Nimble Secondary Flash array uses flash performance to provide both near-instant backup and recovery from any primary storage system. It is a single device for backup, disaster recovery, and even local archiving. By using flash, you can accomplish real work such as dev/test, QA, and analytics. Deep integration with Veeam’s leading backup software simplifies data lifecycle management and provides a path to cloud archiving.
Tags : 
    
Hewlett Packard Enterprise
Published By: Spectrum Enterprise     Published Date: Mar 01, 2019
Bandwidth. Speed. Throughput. These terms are not interchangeable. They are interrelated concepts in data networking that help measure capacity, the time it takes to get from one point to the next and the actual amount of data you’re receiving, respectively.
Tags : 
    
Spectrum Enterprise
Published By: Dell EMC     Published Date: Nov 08, 2016
Time-to-market, consolidation, and complexity struggles are a thing of the past. Join your peers in database storage nirvana with the Dell EMC All-Flash portfolio, powered by Intel® Xeon® processors.
Tags : 
database, consolidation, capacity, storage, complexity
    
Dell EMC
Published By: Oracle CX     Published Date: Oct 19, 2017
The Software in Silicon design of the SPARC M7 processor, and the recently announced SPARC S7 processor, implements memory-access validation directly in the processor so that you can protect application data that resides in memory. It also includes on-chip Data Analytics Accelerator (DAX) engines that are specifically designed to accelerate analytic functions. The DAX engines make in-memory databases and applications run much faster, and they significantly increase usable memory capacity by allowing compressed databases to be stored in memory without a performance penalty. The following Software in Silicon technologies are implemented in the SPARC S7 and M7 processors. Note: Security in Silicon encompasses both Silicon Secured Memory and cryptographic instruction acceleration, whereas SQL in Silicon includes In-Memory Query Acceleration and In-Line Decompression. Silicon Secured Memory is the first-ever end-to-end implementation of memory-access validation done in hardware.
Tags : 
    
Oracle CX
Published By: Dell and Nutanix     Published Date: Jan 16, 2018
Because many SQL Server implementations are running on virtual machines already, the use of a hyperconverged appliance is a logical choice. The Dell EMC XC Series with Nutanix software delivers high performance and low Opex for both OLTP and analytical database applications. For those moving from SQL Server 2005 to SQL Server 2016, this hyperconverged solution provides particularly significant benefits.
Tags : 
data, security, add capacity, infrastructure, networking, virtualization, dell
    
Dell and Nutanix
Published By: Digital Realty     Published Date: Feb 25, 2015
When measuring competitive differentiation in milliseconds, connectivity is a key component for any financial services company’s data center strategy. In planning the move of its primary data center, a large international futures and commodities trading company needed to find a provider that could deliver the high capacity connectivity it required.
Tags : 
financial services, trade processing, data center, connectivity, data center, it management
    
Digital Realty
Published By: Dell EMC     Published Date: May 12, 2016
Businesses face greater uncertainty than ever. Market conditions, customer desires, competitive landscapes, and regulatory constraints change by the minute. So business success is increasingly contingent on predictive intelligence and hyperagile responsiveness to relentlessly evolving demands. This uncertainty has significant implications for the data center — especially as business becomes pervasively digital. IT has to support business agility by being more agile itself. It has to be able to add services, scale capacity up and down as needed, and nimbly remap itself to changes in organizational structure.
Tags : 
    
Dell EMC
Published By: Dell Brought to you by Intel     Published Date: Dec 09, 2013
Database performance and memory capacity with the Intel Xeon processor E5-2660 v2-powered Dell PowerEdge M620.
Tags : 
dell, xeon processors e5-2660, database performance, intel xeon processor, poweredge m620., software development, it management
    
Dell Brought to you by Intel
Published By: Dell EMC     Published Date: Aug 17, 2017
For many companies the appeal of the public cloud is very real. For tech startups, the cloud may be their only option, since many don’t have the capital or expertise to build and operate the IT systems their businesses need. Existing companies with established data centers are also looking at public clouds to increase IT agility while limiting risk. The idea of building out their production capacity while possibly reducing the costs attached to that infrastructure can be attractive. For most companies the cloud isn’t an “either-or” decision, but an operating model to be evaluated along with on-site infrastructure. And like most infrastructure decisions, the question of cost is certainly a consideration. In this report we’ll explore that question, comparing the cost of an on-site hyperconverged solution with a comparable setup in the cloud. The on-site infrastructure is a Dell EMC VxRail™ hyperconverged appliance cluster, and the cloud solution is Amazon Web Services (AWS).
Tags : 
public cloud, it systems, data center, it agility, hyperconverged solution, hyperconverged appliance
    
Dell EMC
Published By: Butler Technologies     Published Date: Jul 03, 2018
MPO connectors increase your data capacity with a highly efficient use of space. But users have faced challenges such as extra complexities and time required for testing and troubleshooting multi-fiber networks. VIAVI helps overcome these challenges with the industry's most complete portfolio of test solutions for MPO connectivity.
Tags : 
    
Butler Technologies
