
Data Availability

Published By: Dell & Microsoft     Published Date: Aug 14, 2013
Organizations everywhere are turning to virtualization, cloud computing, and mobile technologies to support anytime, anywhere access to today’s workload-intensive, data-heavy applications. Dell PowerEdge 12th generation servers—built for high performance, 24/7 availability, and uncompromised reliability—can help IT organizations deliver the benefits of these transformative technologies. With cost-saving power, cooling, space, and management efficiencies, Dell’s new servers offer data centers unparalleled performance, efficiency, and reliability for a diverse range of enterprise applications.
Tags : 
data center, dell poweredge 12th generation servers, virtualization, cloud computing, mobile technologies, data-heavy applications, cost-saving power, management efficiencies
    
Dell & Microsoft
Published By: Oracle ZDLRA     Published Date: Jan 10, 2018
Business leaders expect two things from IT: keep mission-critical applications available and performing well 24x7; and, if something does happen, get back in business quickly without losing any critical data, so there is no impact on the revenue stream. Of course, there is a gap between this de facto expectation from nontechnical business leaders and what current technology is actually capable of delivering. For mission-critical workloads, which are most often hosted on databases, organizations may choose to implement high availability (HA) technologies within the database to avoid downtime and data loss.
Tags : 
recovery point, recovery time, backup appliance, san/nas, service level agreement, oracle
    
Oracle ZDLRA
Published By: Dyn     Published Date: Nov 03, 2016
Global DNS performance and availability are critical to user experience. According to Gartner, “DNS is mission-critical to all organizations that connect to the internet. DNS failure or poor performance leads to applications, data and content becoming unavailable, causing user frustration, lost sales and business reputation damage.” But many businesses still rely on a single, often in-house DNS solution that lacks global scale and resiliency. This white paper reviews the business advantages of implementing a high availability DNS architecture using redundant DNS services (a minimal failover sketch follows below). You will learn:
• The critical role DNS plays in the user experience.
• The risks of relying solely on a single DNS solution.
• The added performance and reliability benefits of a high availability DNS architecture with a redundant managed DNS service.
• Criteria for evaluating a managed DNS service provider.
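To make the redundancy idea concrete, here is a minimal client-side sketch in Python, assuming the dnspython library; the provider pools and nameserver IPs are illustrative placeholders, not actual Dyn infrastructure. In a real redundant-DNS deployment, the zone is delegated to NS records from both providers so resolvers fail over automatically; this sketch simply makes that behavior explicit.

# Minimal sketch: resolve a name against a primary managed DNS provider,
# falling back to a second, independent provider if the first fails.
# Requires dnspython (pip install dnspython); all IPs are placeholders.
import dns.exception
import dns.resolver

PROVIDER_POOLS = [
    ["203.0.113.10", "203.0.113.11"],    # provider A nameservers (placeholder)
    ["198.51.100.10", "198.51.100.11"],  # provider B nameservers (placeholder)
]

def resolve_with_failover(name, record_type="A", timeout=2.0):
    """Try each provider pool in turn so a single provider outage
    never surfaces to the end user."""
    for pool in PROVIDER_POOLS:
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = pool
        resolver.lifetime = timeout
        try:
            answer = resolver.resolve(name, record_type)
            return [rr.to_text() for rr in answer]
        except dns.exception.DNSException:
            continue  # timeout or SERVFAIL; try the next provider
    raise RuntimeError(f"all DNS providers failed for {name!r}")

print(resolve_with_failover("example.com"))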
Tags : 
dyn, dns, downtime risk, end-user experience, networking, enterprise applications
    
Dyn
Published By: IBM     Published Date: Oct 13, 2016
Compare IBM DB2 pureScale with any other offering being considered for implementing a clustered, scalable database configuration, and see how each delivers continuous availability and why that matters. Download now!
Tags : 
data queries, database operations, transactional databases, clustering, it management
    
IBM
Published By: IBM     Published Date: Jul 26, 2017
To compete in today’s fast-paced business climate, enterprises need accurate and frequent sales and customer reports to make real-time operational decisions about pricing, merchandising and inventory management. They also require greater agility to respond to business events as they happen, and more visibility into business activities so information and systems are optimized for peak efficiency and performance. By making use of data capture and business intelligence to integrate and apply data across the enterprise, organizations can capitalize on emerging opportunities and build a competitive advantage. The IBM® data replication portfolio is designed to address these issues through a highly flexible one-stop shop for high-volume, robust, secure information replication across heterogeneous data stores. The portfolio leverages real-time data replication to support high availability, database migration, application consolidation, dynamic warehousing, master data management (MDM), service…
Tags : 
ibm, infosphere, data replication, security, data storage
    
IBM
Published By: NetApp     Published Date: Mar 05, 2018
To keep pace with an increasingly digital world, enterprises are transforming their data infrastructures using all flash storage. As a leading all flash storage provider, NetApp simplifies your infrastructure to improve economics, while accelerating performance and increasing availability to enhance your company’s competitiveness. NetApp future-proofs your IT investments, allowing you to grow with confidence. NetApp® all flash storage reduces your storage footprint, power, and cooling by up to 10x; doubles performance at half the latency of leading competitors; and lets you migrate confidently from your existing SAN with a pathway to the cloud. With NetApp all flash arrays, your business is prepared to take on anything and everything the future can throw at it: rapid growth, new technology, or a shift in the industry. Cut fear out of the equation. Be data ready to bring on the future.
Tags : 
netapp, database performance, flash storage, data management, cost challenges
    
NetApp
Published By: Oracle ODA     Published Date: Aug 15, 2016
Businesses understand more than ever that they depend on data for insight and competitive advantage. And when it comes to data, they have always wanted easy access and fast performance. But how is the situation different now? Today, organizations want those elements and more. They want IT to strip away the limitations of time with faster deployment of new databases and applications. They want IT to reduce the limitations of distance by giving remote and branch offices better and more reliable access. And in a global world where business never stops, they want IT to ensure data availability around the clock. If IT can deliver databases and applications faster, on a more automated and consistent basis, to more locations without having to commit onsite resources, IT will be free to focus on more strategic projects.
Tags : 
    
Oracle ODA
Published By: Oracle ODA     Published Date: Aug 15, 2016
Oracle added two new models to the Oracle Database Appliance family, in addition to the existing high availability model. With an entry list price starting at one-fourth the cost of the prior generation Oracle Database Appliance hardware, and flexible Oracle Database software licensing, these new models bring Oracle Engineered Systems within reach of every organization. Read about how the Oracle Database Appliance X-6 series expands the reach of the database appliance family to support various workloads, deployment scenarios, and database editions. They are especially designed for customers who require only single-instance databases but desire the simplicity, optimization, and affordability of the Oracle Database Appliance. These new models are ideal for customers who seek to avoid the complexity, tuning requirements, and higher costs of “build-your-own” database solutions.
Tags : 
    
Oracle ODA
Published By: CA Technologies     Published Date: Jul 20, 2017
Mainframes continue to provide high business value by combining efficient transaction processing with high-volume access to critical enterprise data. Business organizations are linking mobile devices to mainframe processing and data to support digital applications and drive business transformation. In this rapidly growing scenario, providing an excellent end-user experience becomes critical for business success. This analyst announcement note covers how CA Technologies is addressing the need for high availability and fast response times by optimizing mainframe performance with new machine learning and analytics capabilities.
Tags : 
    
CA Technologies
Published By: Pure Storage     Published Date: Nov 02, 2017
The tremendous growth of unstructured data is creating huge opportunities for organizations. But it is also creating significant challenges for the storage infrastructure. Many application environments that have the potential to maximize unstructured data have been restricted by the limitations of legacy storage systems. For the past several years—at least—users have expressed a need for storage solutions that can deliver extreme performance along with simple manageability, density, high availability and cost efficiency.
Tags : 
high performance, tco, multi protocol, management simplicity, the blade, elasticity software, performance, capacity
    
Pure Storage
Published By: Carbonite     Published Date: Oct 12, 2017
Carbonite provides data protection solutions for businesses and the IT professionals who serve them. Our product suite provides a full complement of backup, disaster recovery, and high availability solutions for any size business in any location around the world, all supported by a state-of-the-art global infrastructure.
Tags : 
draas, data recovery, recovery capability, cloud support, infrastructure
    
Carbonite
Published By: Datastax     Published Date: Dec 27, 2018
Today’s data volume, variety, and velocity have made relational databases nearly obsolete for handling certain types of workloads. But they have also put incredible strain on conventional NoSQL databases. The key is to find one that can deliver the infinite scale and high availability required to support high-volume, web-scale applications in clustered environments. This white paper details the capabilities and use cases of an Active Everywhere database.
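As a rough illustration of the masterless, multi-datacenter pattern behind an “Active Everywhere” database, here is a hedged sketch using the DataStax Python driver for Apache Cassandra; the contact points, datacenter names, and keyspace are assumptions for illustration only, not taken from the white paper.

# Minimal sketch: a keyspace fully replicated in two datacenters, so either
# site can serve reads and writes and the loss of one datacenter does not
# take the application down. Requires cassandra-driver
# (pip install cassandra-driver); node addresses and DC names are placeholders.
from cassandra.cluster import Cluster

cluster = Cluster(["10.0.0.1", "10.0.1.1"])  # one contact point per datacenter
session = cluster.connect()

# Three replicas in each datacenter; NetworkTopologyStrategy places them
# per-DC so each site holds a complete copy of the data.
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS orders
    WITH replication = {
        'class': 'NetworkTopologyStrategy',
        'dc_east': 3,
        'dc_west': 3
    }
""")

Clients in each region can then read and write against their local datacenter (for example, at LOCAL_QUORUM consistency), which is the essence of the active-everywhere approach.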
Tags : 
    
Datastax
Published By: Larsen & Toubro Infotech(LTI)     Published Date: Feb 18, 2019
The largest national multiline insurer had built a repository of insurance policies (P&C and life insurance) on microfilm and microfiche in the early ’90s as a preservation strategy. As that technology became outdated, they grappled with several issues:
• Risk of losing their only source of data for insurance policies and corresponding communications, and a need to improve data availability and the speed of claims evaluation
• Compliance issues, requiring WORM (write once, read many) storage compliant with FINRA regulations, with data encrypted at rest
• An unfavorable total cost of digitization compared with the 10-12 years of support remaining to maintain the insurance policies
• The need for a low-cost, cloud-based, FINRA-compliant document management solution that could provide quick access to stored data
Download the full case study to learn how LTI’s e-Office solution delivered a 50% TCO reduction for the largest national multiline insurer.
Tags : 
    
Larsen & Toubro Infotech(LTI)
Published By: IBM     Published Date: Jul 05, 2016
This white paper discusses the concept of shared data scale-out clusters, as well as how they deliver continuous availability and why they are important for delivering scalable transaction processing support.
Tags : 
ibm, always on business, cloud, big data, oltp, ibm db2 purescale, networking, enterprise applications, data center
    
IBM
Published By: CDW     Published Date: Nov 12, 2012
In this article, you'll find new power-saving and measurement technologies, along with maturing best practices that can help IT managers implement comprehensive strategies to better rein in energy costs.
Tags : 
data center, power and cooling, availability, data center optimization
    
CDW
Published By: NetApp     Published Date: Sep 24, 2013
"Storage system architectures are moving away from monolithic scale-up approaches and adopting scale-out storage – providing a powerful and flexible way to respond to the inevitable data growth and data management challenges in today’s environments. With extensive data growth demands, there needs to be an increase in the levels of storage and application availability, performance, and scalability. Access this technical report that provides an overview of NetApp clustered Data ONTAP 8.2 and shows how it incorporates industry-leading unified architecture, non-disruptive operations, proven storage efficiency, and seamless scalability."
Tags : 
storage infrastructure, clustered storage, technology, scalability, application, storage solution, non-disruptive operations, data center
    
NetApp
Published By: NetApp     Published Date: Sep 24, 2013
"Today, IT’s customers are more mobile and global than ever before and as such expect their applications and data to be available 24x7. Interruptions, whether planned or unplanned, can have a major impact to the bottom line of the business. ESG Lab tested the ability of clustered Data ONTAP to provide continuous application availability and evaluated performance for both SAN and NAS configurations while running an Oracle OLTP workload. Check out this report to see the results."
Tags : 
mobile, global, applications, cloud, configuration, technology, data center
    
NetApp
Published By: Nimble Storage     Published Date: Feb 26, 2016
Download this eBook to learn the steps you can take now to prepare for the all flash data center.
Tags : 
flash storage, ssd, all flash data centers, nimble storage, predictive flash platform, application performance, data velocity, data protection, high availability, big data, predictive analytics, data center
    
Nimble Storage
Published By: Dell EMC     Published Date: Aug 03, 2015
XtremIO reduces datacenter footprint and complexity with unstoppable in-line data reduction capabilities that address storage sprawl for Exchange databases. Thin provisioning doesn’t just eliminate unused space at the end of the drive; it eliminates all of the whitespace found within the database!
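As a filesystem-level analogy for the allocate-on-write idea behind thin provisioning (the array implements this internally; this is not XtremIO code), the sketch below creates a sparse file whose apparent size far exceeds the blocks actually consumed. The path is a placeholder.

# Minimal sketch: a sparse file behaves like a thin-provisioned volume;
# capacity is only consumed where data is actually written. Works on
# Linux/macOS filesystems that support sparse files.
import os

path = "/tmp/thin_demo.bin"
with open(path, "wb") as f:
    f.seek(10 * 1024**3 - 1)  # declare a 10 GiB "volume"
    f.write(b"\0")            # ...but physically write a single byte

st = os.stat(path)
print(f"apparent size: {st.st_size / 2**30:.1f} GiB")
print(f"allocated:     {st.st_blocks * 512 / 2**20:.2f} MiB")  # st_blocks counts 512-byte units
os.remove(path)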
Tags : 
data reduction, deployment approaches, emc, exchange server, database availability, compression, cloud infrastructure, datacenter footprint reduction
    
Dell EMC
Published By: Schneider Electric     Published Date: Feb 12, 2018
Internet use is trending towards bandwidth-intensive content and an increasing number of attached “things”. At the same time, mobile telecom networks and data networks are converging into a cloud computing architecture. To support needs today and tomorrow, computing power and storage is being inserted out on the network edge in order to lower data transport time and increase availability. Edge computing brings bandwidth-intensive content and latency-sensitive applications closer to the user or data source. This white paper explains the drivers of edge computing and explores the various types of edge computing available.
Tags : 
    
Schneider Electric
Published By: IBM APAC     Published Date: Jul 19, 2019
Disruption to data availability can cripple an organisation, and in the consumer goods market, restricted access to data could cause goods to take longer to get to market. To prepare for data outages, it is imperative that businesses have effective backup and recovery solutions in place to minimise downtime and maintain functionality. This case study examines how PT Wings Surya, a leader in the Indonesian consumer goods market, was able to increase business uptime, cut the time taken to run backup jobs from 8 hours to just 2, and reduce backup costs by 60%.
Tags : 
    
IBM APAC
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical hands-on guide to “Day One” challenges of deploying, managing and monitoring PostgreSQL. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that is a misleading and incorrect approach to evaluating the potential return on investment of a database technology migration. An effective monitoring and logging strategy is critical for maintaining the reliability, availability, and performance of database environments. The second section of this eBook provides a detailed analysis of all aspects of monitoring and logging PostgreSQL (a minimal monitoring query sketch follows below):
• Monitoring KPIs
• Metrics and stats
• Monitoring tools
• Passive monitoring versus active notifications
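As a small taste of the KPI-oriented monitoring the eBook covers, here is a hedged sketch that pulls two common health metrics from PostgreSQL’s built-in statistics views using psycopg2; the connection parameters are placeholders.

# Minimal sketch: query the statistics collector for two common KPIs.
# Requires psycopg2 (pip install psycopg2-binary); credentials are placeholders.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="appdb",
                        user="monitor", password="secret")
with conn, conn.cursor() as cur:
    # Buffer cache hit ratio: values well below ~0.99 often mean the
    # working set no longer fits in shared_buffers.
    cur.execute("""
        SELECT round(blks_hit::numeric / nullif(blks_hit + blks_read, 0), 4)
        FROM pg_stat_database
        WHERE datname = current_database()
    """)
    print("cache hit ratio:", cur.fetchone()[0])

    # Active connections: a sudden spike can indicate lock contention
    # or an exhausted application connection pool.
    cur.execute("SELECT count(*) FROM pg_stat_activity WHERE state = 'active'")
    print("active queries:", cur.fetchone()[0])
conn.close()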
Tags : 
    
Stratoscale
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical hands-on guide to “Day Two” challenges of accelerating large-scale PostgreSQL deployments. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that is a misleading and incorrect approach to evaluating the potential return on investment of a database technology migration. After a PostgreSQL deployment is live, there are a variety of day-two scenarios that require planning and strategizing. The third section of this eBook provides a detailed analysis of all aspects of accelerating large-scale PostgreSQL deployments (a minimal availability check is sketched below):
• Backups and availability: strategies, point-in-time recovery, availability and scalability
• Upgrades and DevOps: PostgreSQL upgrade process, application upgrades and CI/CD
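To illustrate one such day-two availability check, here is a hedged sketch that reports a node’s role and its streaming-replication lag via psycopg2; it assumes PostgreSQL 10+ column names (sent_lsn/replay_lsn) and placeholder connection details.

# Minimal sketch: report whether a node is primary or standby, and how far
# each standby lags the primary. Large or growing lag means a failover
# would lose more recent transactions.
import psycopg2

def availability_report(dsn):
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute("SELECT pg_is_in_recovery()")
        if cur.fetchone()[0]:
            print("role: standby (read-only)")
            return
        print("role: primary")
        cur.execute("""
            SELECT application_name, state,
                   pg_wal_lsn_diff(sent_lsn, replay_lsn) AS lag_bytes
            FROM pg_stat_replication
        """)
        for name, state, lag in cur.fetchall():
            print(f"standby {name}: state={state}, lag={lag} bytes")

availability_report("host=localhost dbname=appdb user=monitor password=secret")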
Tags : 
    
Stratoscale
Published By: IBM     Published Date: Sep 28, 2017
Here are six reasons to change your database:
• Lower total cost of ownership
• Increased scalability and availability
• Flexibility for hybrid environments
• A platform for rapid reporting and analytics
• Support for new and emerging applications
• Greater simplicity
Download now to learn more!
Tags : 
scalability, hybrid environment, emerging applications, rapid reporting
    
IBM
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags : 
database, applications, data availability, cognitive applications
    
Group M_IBM Q1'18