traditional data

Results 1 - 25 of 352
Published By: Schneider Electric     Published Date: May 05, 2016
This white paper provides a quantitative TCO analysis of the two architectures, and illustrates the key drivers of both the capex and opex savings of the improved architecture.
Tags : 
datacenters, tco, capacity, modular design, load, cost, architecture, networking
    
Schneider Electric
Published By: Schneider Electric     Published Date: May 05, 2016
In this paper, we focus on quantifying the capital cost differences of a prefabricated vs. traditional 440 kW data center, both built with the same power and cooling architecture, in order to highlight the key cost drivers, and to demonstrate that prefabrication does not come at a capex premium.
Tags : 
datacenters, tco, capacity, modular design, load, cost, architecture, networking
    
Schneider Electric
Published By: Nexans     Published Date: Apr 20, 2017
According to Cisco’s Global Cloud Index (2015), global data centre IP traffic will grow at a CAGR of 25% between 2014 and 2019 and data centre workloads within ‘traditional’ (as opposed to cloud) data centres will more than double. In this guide you will find best practices to meet the demands of the Digital Economy, with help from DatacenterDynamics and Nexans.
Tags : 
data center
    
Nexans
Published By: CloudHealth by VMware     Published Date: Sep 05, 2019
Public clouds have fundamentally changed the way organizations build, operate, and manage applications. Security for applications in the cloud is composed of hundreds of configuration parameters and is vastly different from security in traditional data centers. According to Gartner, “Through 2020, at least 95% of cloud breaches will be due to customer misconfiguration, mismanaged credentials or insider theft, not cloud provider vulnerabilities”1. The uniqueness of cloud requires that security teams rethink classic security concepts and adopt approaches that address serverless, dynamic, and distributed cloud infrastructure. This includes rethinking security practices across asset management, compliance, change management, issue investigation, and incident response, as well as training and education. We interviewed several security experts and asked them how public cloud transformation has changed their cloud security and compliance responsibilities. In this e-book, we will share the top
Tags : 
    
CloudHealth by VMware
Published By: TIBCO Software     Published Date: Feb 14, 2019
Digital business initiatives have expanded in scope and complexity as companies have increased the rate of digital innovation to capture new market opportunities. As applications built using fine-grained microservices and functions become pervasive, many companies are seeing the need to go beyond traditional API management to execute new architectural patterns and use cases. APIs are evolving both in the way they are structured and in how they are used, to not only securely expose data to partners, but to create ecosystems of internal and/or third-party developers. In this datasheet, learn how you can use TIBCO Cloud™ Mashery® to:
- Create an internal and external developer ecosystem
- Secure your data and scale distribution
- Optimize and manage microservices
- Expand your partner network
- Run analytics on your API performance
Tags : 
    
TIBCO Software
Published By: Hewlett Packard Enterprise     Published Date: Jan 31, 2019
Managing infrastructure has always brought frustration, headaches, and wasted time. That’s because IT professionals have to spend their days, nights, and weekends dealing with problems and manually tuning their infrastructure. Traditional monitoring and support are too far removed from infrastructure, resulting in an endless cycle of break-fix-tune-repeat. Infrastructure powered by artificial intelligence, however, can overcome the limitations of humans and traditional tools. This white paper explores how HPE InfoSight, with its recommendation engine, paves the path to an autonomous data center in your hybrid cloud world.
Tags : 
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: Jan 31, 2019
"IT needs to reach beyond the traditional data center and the public cloud to form and manage a hybrid connected system stretching from the edge to the cloud, wherever the cloud may be. We believe this is leading to a new period of disruption and development that will require organizations to rethink and modernize their infrastructure more comprehensively than they have in the past. Hybrid cloud and hybrid cloud management will be the key pillars of this next wave of digital transformation – which is on its way much sooner than many have so far predicted. They have an important role to play as part of a deliberate and proactive cloud strategy, and are essential if the full benefits of moving over to a cloud model are to be fully realized."
Tags : 
    
Hewlett Packard Enterprise
Published By: Hitachi Vantara     Published Date: Mar 08, 2019
Finding the right data protection and recovery solution for complex enterprise infrastructures is a formidable challenge. Traditional backup and recovery solutions cannot keep up with or meet today’s business-critical requirements. Read this white paper to learn smarter approaches to data protection and recovery. See how to radically improve key performance metrics – including business continuity – as well as backup windows, operational recovery and disaster recovery.
Tags : 
    
Hitachi Vantara
Published By: SAP     Published Date: May 18, 2014
New data sources are fueling innovation while stretching the limitations of traditional data management strategies and structures. Data warehouses are giving way to purpose-built platforms more capable of meeting the real-time needs of a more demanding end user and the opportunities presented by Big Data. Significant strategy shifts are under way to transform traditional data ecosystems by creating a unified view of the data terrain necessary to support Big Data and the real-time needs of innovative enterprises.
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management
    
SAP
Published By: SAP     Published Date: May 18, 2014
This white paper discusses the issues involved in the traditional practice of deploying transactional and analytic applications on separate platforms using separate databases. It analyzes the results from a user survey, conducted on SAP's behalf by IDC, that explores these issues.
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management
    
SAP
Published By: SAP     Published Date: May 18, 2014
Leading companies and technology providers are rethinking the fundamental model of analytics, and the contours of a new paradigm are emerging. The new generation of analytics goes beyond Big Data (information too large and complex to manipulate without robust software) and beyond the traditionally narrow approach of analytics, which was restricted to analysing customer and financial data collected from interactions on social media. Today companies are embracing the social revolution, using real-time technologies to unlock deep insights about customers and others, enabling better-informed decisions and richer collaboration in real time.
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management
    
SAP
Published By: Coupa     Published Date: Dec 14, 2018
Many procurement departments are still using traditional manual processes or outdated technology. The result? Rogue spending, missed discounts from supplier contract pricing, reconciliation headaches, and the list goes on. These business risks are driving more organizations towards the cloud-based, secure, and workflow-friendly world of eProcurement solutions. These solutions are saving money and resources, improving use of budgets and personnel, enabling centralization, and using data to improve and streamline end-to-end purchasing processes. Download this report to learn about:
- Procurement trends from 400 organizations surveyed
- Operational and cost-savings benefits of eProcurement
- Leading features and functionality in eProcurement
- Adoption best practices and how to get started
Tags : 
    
Coupa
Published By: Hewlett Packard Enterprise     Published Date: Feb 05, 2018
As businesses plunge into the digital future, no asset will have a greater impact on success than data. The ability to collect, harness, analyze, protect, and manage data will determine which businesses disrupt their industries, and which are disrupted; which businesses thrive, and which disappear. But traditional storage solutions are not designed to optimally handle such a critical business asset. Instead, businesses need to adopt an all-flash data center. In their new role as strategic business enablers, IT leaders have the responsibility to ensure that their businesses are protected, by investing in flexible, future-proof flash storage solutions. The right flash solution can deliver on critical business needs for agility, rapid growth, speed-to-market, data protection, application performance, and cost-effectiveness—while minimizing the maintenance and administration burden.
Tags : 
data, storage, decision makers, hpe
    
Hewlett Packard Enterprise
Published By: Cherwell Software     Published Date: Apr 07, 2016
Cherwell Service Management™ software empowers IT organizations to easily manage their infrastructure at a fraction of the cost and complexity associated with traditional, legacy IT service management software. Designed with a metadata-driven approach, the Cherwell Service Management platform can easily be configured to meet process and integration requirements without writing or touching a single line of code. Furthermore, configurations will never break during version upgrades.
Tags : 
best practices, customer support, business intelligence, business optimization
    
Cherwell Software
Published By: Commvault     Published Date: Jul 06, 2016
Think of a wildfire that quickly spreads as it increases in speed and power. That is what is happening today as data growth increases the volume and management complexity of storage, backup and recovery. Now think of trying to stop that fire with a garden hose. Your traditional backup and recovery process is equally under-equipped to manage and facilitate operations that need more speed, efficiency, scalability and reliability to handle today’s 24/7, always-on environment. Here we examine the benefits of moving from a solution comprised of multiple point products to a holistic data protection platform designed to serve today’s enterprise.
Tags : 
commvault, data protection, storage, backup, recovery, holistic data protection, single pane of glass, common code base
    
Commvault
Published By: Cisco     Published Date: Jun 21, 2016
The demands on IT today are staggering. Most organizations depend on their data to drive everything from product development and sales to communications, operations, and innovation. As a result, IT departments are charged with finding a way to bring new applications online quickly, accommodate massive data growth and complex data analysis, and make data available 24 hours a day, around the world, on any device. The traditional way to deliver data services is with separate infrastructure silos for various applications, processes, and locations, resulting in continually escalating costs for infrastructure and management. These infrastructure silos make it difficult to respond quickly to business opportunities and threats, cause productivity-hindering delays when you need to scale, and drive up operational costs.
Tags : 
    
Cisco
Published By: HPE     Published Date: Feb 11, 2015
Protecting your business-critical applications without impacting performance is proving ever more challenging in the face of unrelenting data growth, stringent recovery service level agreements (SLAs) and increasingly virtualized environments. Traditional approaches to data protection are unable to cost-effectively deliver the end-to-end availability and protection that your applications and hypervisors demand. A faster, easier, more efficient, and reliable way to protect data is needed.
Tags : 
    
HPE
Published By: AWS     Published Date: Sep 05, 2018
Big data alone does not guarantee better business decisions. Often that data needs to be moved and transformed so Insight Platforms can discern useful business intelligence. To deliver those results faster than traditional Extract, Transform, and Load (ETL) technologies, use Matillion ETL for Amazon Redshift. This cloud-native ETL/ELT offering, built specifically for Amazon Redshift, simplifies the process of loading and transforming data and can help reduce your development time. This white paper will focus on approaches that can help you maximize your investment in Amazon Redshift. Learn how the scalable, cloud-native architecture and fast, secure integrations can benefit your organization, and discover ways this cost-effective solution is designed with cloud computing in mind. In addition, we will explore how Matillion ETL and Amazon Redshift make it possible for you to automate data transformation directly in the data warehouse to deliver analytics and business intelligence (BI).
Tags : 
    
AWS
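The ELT pattern this abstract describes — land the raw data in the warehouse first, then run the transformation inside the database engine with SQL — can be sketched in miniature. The snippet below is a hypothetical illustration using Python's sqlite3 as a stand-in for Amazon Redshift; table names and data are invented, and Matillion's actual job definitions are not shown.

```python
import sqlite3

# In-memory database stands in for the warehouse (e.g., Amazon Redshift).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# 1. Extract + Load: land raw records untransformed in a staging table.
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount_cents INTEGER, country TEXT)")
raw_rows = [(1, 1250, "us"), (2, 980, "US"), (3, 4300, "de")]
cur.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)", raw_rows)

# 2. Transform inside the warehouse: the SQL engine does the cleanup and
#    aggregation, rather than a separate external ETL server.
cur.execute("""
    CREATE TABLE fct_orders AS
    SELECT UPPER(country) AS country,
           SUM(amount_cents) / 100.0 AS revenue
    FROM stg_orders
    GROUP BY UPPER(country)
""")

for row in cur.execute("SELECT country, revenue FROM fct_orders ORDER BY country"):
    print(row)
```

Pushing the transform into the warehouse avoids a round trip through an intermediate server, which is the main performance argument for ELT over classic ETL.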
Published By: Dell     Published Date: Jul 08, 2016
Organizations looking to implement desktop and app virtualization traditionally play a guessing game where storage is concerned. When considering local and physical storage, determining what would be necessary for the virtualized world is difficult and can be overwhelming. This is especially true when determining how virtualizing desktops will impact the storage architecture. Organizations risk over-sizing their environment, thereby wasting CapEx, or under-sizing it and potentially ruining the user experience. Software-defined storage solutions, such as VMware Virtual SAN, provide simplified solutions with high-performance data stores that offer fine-grained scalability with linearly predictable performance as demand grows. Dell’s validated and certified desktop virtualization solutions incorporate vSphere and Virtual SAN, and provide a complete end-to-end solution that allows companies to grow and expand without large capital investments in SAN hardware.
Tags : 
technology, best practices, data, virtualization, storage, data
    
Dell
Published By: Oracle CX     Published Date: Oct 19, 2017
Business enterprises today need to become more agile and meet new and increasing workload and security requirements, while reducing overall IT cost and risk. To meet these requirements many companies are turning to cloud computing. To remain competitive, companies need to formulate a strategy that can easily move them from traditional on-premises IT to private or public clouds. A complete cloud strategy will likely include both private and public clouds because some applications and data might not be able to move to a public cloud. Moving to the cloud should not create information silos but should improve data sharing. Any cloud strategy should make sure that it is possible to integrate on-premises, private cloud, and public cloud data and applications. Furthermore, any on-premises cloud deployments must be able to easily migrate to public cloud in the future.
Tags : 
    
Oracle CX
Published By: Oracle CX     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL being required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk and second, analysis is constantly being done on stale data. In-memory databases have helped address p
Tags : 
    
Oracle CX
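The row-versus-column distinction this abstract draws can be made concrete. The toy sketch below is an illustration of the general idea, not Oracle's implementation: a row-format layout keeps each record together, which suits transactional point lookups, while a columnar layout keeps each attribute together, so an analytic aggregate scans one contiguous array and skips unrelated fields.

```python
# Row-format layout (OLTP-friendly): each record is stored together,
# so fetching or updating one order touches a single tuple.
rows = [
    {"order_id": 1, "customer": "acme", "amount": 125.0},
    {"order_id": 2, "customer": "globex", "amount": 98.0},
    {"order_id": 3, "customer": "acme", "amount": 430.0},
]

# Columnar layout (analytics-friendly): each attribute is stored together,
# so an aggregate reads one array and never touches the other fields.
columns = {
    "order_id": [1, 2, 3],
    "customer": ["acme", "globex", "acme"],
    "amount": [125.0, 98.0, 430.0],
}

# Point lookup favors the row layout: one record, all fields in one place.
order_2 = next(r for r in rows if r["order_id"] == 2)

# Column aggregate favors the columnar layout: one list, nothing else read.
total = sum(columns["amount"])

print(order_2["customer"], total)
```

In-memory dual-format databases keep both representations of the same data current, which is how they serve OLTP and analytics from one system without an ETL step.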
Published By: Oracle CX     Published Date: Oct 20, 2017
Security has become top of mind for CIOs, and CEOs. Encryption at rest is a piece of the solution, but not a big piece. Encryption over the network is another piece, but only a small piece. These and other pieces do not fit together well; they need to unencrypt and reencrypt the data when they move through the layers, leaving clear versions that create complex operational issues to monitor and detect intrusion. Larger-scale high-value applications requiring high security often use Oracle middleware, including Java and Oracle database. Traditional security models give the data to the processors to encrypt and unencrypt, often many times. The overhead is large, and as a result encryption is used sparingly on only a few applications. The risk to enterprises is that they may have created an illusion of security, which in reality is ripe for exploitation. The modern best-practice security model is an end-to-end encryption architecture. The application deploys application-led encryption s
Tags : 
    
Oracle CX
Published By: Dell and Nutanix     Published Date: Oct 26, 2017
A related recent development in the data center is converged infrastructure (CI). Instead of the traditional silo deployment approach to storage, compute, and network resources, all infrastructure elements are delivered and managed in a single environment, providing virtualized access to business services in an efficient manner. This is particularly suitable for cloud-based delivery models. However, since CI achieves lower costs through optimization of data center resources, it can be effective for all IT organizations, regardless of the way in which the services are managed or presented.
Tags : 
    
Dell and Nutanix