
source data analysis

Results 1 - 25 of 32. Sort results by: Published Date | Title | Company Name
Published By: Cisco EMEA     Published Date: Mar 05, 2018
The operation of your organization depends, at least in part, on its data. You can avoid fines and remediation costs, protect your organization’s reputation and employee morale, and maintain business continuity by building a capability to detect and respond to incidents effectively. The simplicity of the incident response process can be misleading. We recommend tabletop exercises as an important step in pressure-testing your program.
Tags : 
human resources, cisco, employees, data, analysis
    
Cisco EMEA
Published By: Oracle     Published Date: Nov 28, 2017
Today’s leading-edge organizations differentiate themselves through analytics, furthering their competitive advantage by extracting value from all their data sources. Other companies are looking to become data-driven by modernizing their data management deployments. These strategies come with challenges, such as managing large and growing volumes of data. Today’s digital world is already creating data at an explosive rate, and the next wave is on the horizon, driven by the emergence of IoT data sources. The physical data warehouses of the past were great for collecting data from across the enterprise for analysis, but the storage and compute resources needed to support them cannot keep pace with this explosive growth. In addition, the manual, cumbersome tasks of patching, updating, and upgrading put data at risk through human error. To reduce risks, costs, complexity, and time to value, many organizations are taking their data warehouses to the cloud. Whether hosted lo…
Tags : 
    
Oracle
Published By: Oracle CX     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL being required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk and second, analysis is constantly being done on stale data. In-memory databases have helped address p…
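The traditional two-tier model the abstract describes (an OLTP database, an ETL step, and a separate warehouse) can be sketched in miniature with Python's built-in sqlite3 module. The table names and the aggregation are invented for illustration, not taken from any vendor product:

```python
import sqlite3

# Stand-ins for the two systems in the traditional model: an OLTP
# database recording individual transactions, and a separate warehouse
# holding aggregates for analysis.
oltp = sqlite3.connect(":memory:")
warehouse = sqlite3.connect(":memory:")

oltp.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
oltp.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "EMEA", 120.0), (2, "EMEA", 80.0), (3, "APAC", 50.0)])

warehouse.execute("CREATE TABLE sales_by_region (region TEXT, total REAL)")

# The ETL step: Extract rows from the OLTP side, Transform them into an
# aggregate, and Load the result into the warehouse. Until this runs
# again, any warehouse analysis is done on stale data -- the second
# problem the abstract points out.
rows = oltp.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region").fetchall()
warehouse.executemany("INSERT INTO sales_by_region VALUES (?, ?)", rows)

print(dict(warehouse.execute("SELECT region, total FROM sales_by_region")))
```

The staleness is visible in the sketch: a new order inserted into `orders` is invisible to the warehouse until the ETL runs again.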
Tags : 
    
Oracle CX
Published By: TriNet     Published Date: Jul 28, 2015
This paper serves as a resource for readers seeking quantitative data on how outsourcing key HR functions can produce positive business results.
Tags : 
hr outsource, peo, data analysis, hr functions, in-house hr processes, co-employment models, compliance risk, healthcare reform regulation
    
TriNet
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical, hands-on guide to analyzing and mitigating the risks of migrating to PostgreSQL. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest-growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that is a misleading and incorrect way to evaluate the potential return on investment of a database technology migration. A key decision criterion for adopting any technology is whether it can support the requirements of existing applications while also fitting into longer-term strategies and needs. The first section of this eBook provides a detailed analysis of all aspects of migrating from legacy and commercial solutions to PostgreSQL:
• Schema and code migration
• Data migration
• Application code migration
• Testing and evaluation
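One concrete piece of the data-migration step is verifying, after the copy, that the target matches the source. A cheap check is to compare row counts plus a simple aggregate per table. The sketch below uses sqlite3 for both sides purely for illustration; a real migration would point the two connections at the legacy system and PostgreSQL, and the table and column names are invented:

```python
import sqlite3

def table_fingerprint(conn, table):
    """Row count plus a sum over a numeric column: a cheap,
    order-independent check that a copied table matches its source."""
    count, total = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}").fetchone()
    return count, total

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE payments (id INTEGER, amount REAL)")

rows = [(1, 10.5), (2, 20.0), (3, 31.0)]
source.executemany("INSERT INTO payments VALUES (?, ?)", rows)
target.executemany("INSERT INTO payments VALUES (?, ?)", rows)  # the "migrated" copy

# If the fingerprints diverge, the copy dropped, duplicated, or mangled rows.
assert table_fingerprint(source, "payments") == table_fingerprint(target, "payments")
print("migration check passed")
```

A production check would extend this with per-column checksums and spot-row comparisons, but the count-plus-aggregate pattern is the usual first gate.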
Tags : 
    
Stratoscale
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical, hands-on guide to the “Day One” challenges of deploying, managing and monitoring PostgreSQL. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest-growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that is a misleading and incorrect way to evaluate the potential return on investment of a database technology migration. An effective monitoring and logging strategy is critical for maintaining the reliability, availability, and performance of database environments. The second section of this eBook provides a detailed analysis of all aspects of monitoring and logging PostgreSQL:
• Monitoring KPIs
• Metrics and stats
• Monitoring tools
• Passive monitoring versus active notifications
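One of the standard PostgreSQL monitoring KPIs is the buffer-cache hit ratio, derived from the `blks_hit` and `blks_read` counters that PostgreSQL exposes in its `pg_stat_database` view. A minimal sketch of the calculation (the sampled numbers are illustrative):

```python
def cache_hit_ratio(blks_hit: int, blks_read: int) -> float:
    """Share of block requests served from shared buffers rather than
    disk. The arguments mirror the blks_hit / blks_read columns of
    PostgreSQL's pg_stat_database view."""
    total = blks_hit + blks_read
    return blks_hit / total if total else 1.0

# Counters as they might be sampled from pg_stat_database
# (illustrative numbers; a sustained ratio well below ~99% on an
# OLTP workload usually signals undersized shared_buffers).
print(f"cache hit ratio: {cache_hit_ratio(99_000, 1_000):.2%}")
```

In practice a monitoring agent samples the counters periodically and computes the ratio over deltas, since the raw columns are cumulative since the last stats reset.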
Tags : 
    
Stratoscale
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical, hands-on guide to the “Day Two” challenges of accelerating large-scale PostgreSQL deployments. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest-growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that is a misleading and incorrect way to evaluate the potential return on investment of a database technology migration. After a PostgreSQL deployment is live, a variety of day-two scenarios require planning and strategizing. The third section of this eBook provides a detailed analysis of all aspects of accelerating large-scale PostgreSQL deployments:
• Backups and availability: strategies, point-in-time recovery, availability and scalability
• Upgrades and DevOps: the PostgreSQL upgrade process, application upgrades and CI/CD
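The point-in-time recovery mentioned above rests on PostgreSQL's continuous WAL archiving. A minimal configuration sketch using the real `archive_mode`, `archive_command`, `restore_command`, and `recovery_target_time` parameters; the archive directory and target timestamp are placeholders to adjust for your environment:

```ini
# postgresql.conf -- enable continuous WAL archiving (the basis of PITR)
archive_mode = on
archive_command = 'test ! -f /mnt/wal_archive/%f && cp %p /mnt/wal_archive/%f'

# Settings used at restore time to replay archived WAL up to a chosen
# moment (on PostgreSQL 12+ these also live in postgresql.conf):
restore_command = 'cp /mnt/wal_archive/%f %p'
recovery_target_time = '2019-02-01 12:00:00'
```

Restoring then means laying down a base backup, supplying the two restore settings, and letting PostgreSQL replay WAL until the target time.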
Tags : 
    
Stratoscale
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical, hands-on guide to deploying, managing and scaling PostgreSQL in various environments, in the public cloud and on-premises. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest-growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that is a misleading and incorrect way to evaluate the potential return on investment of a database technology migration. The eBook provides a detailed analysis of the planning, day-one, and day-two challenges of PostgreSQL deployments in the context of potential operating environments:
• Public cloud
• Private cloud
• Private DC (bare metal / VM)
Tags : 
    
Stratoscale
Published By: IBM     Published Date: Apr 14, 2017
Any organization wishing to process big data from newly identified data sources needs first to determine the characteristics of that data, and then define the requirements that must be met to ingest, profile, clean, transform and integrate it so it is ready for analysis. Having done that, it may well turn out that existing tools cannot handle the data variety, data volume and data velocity these new sources bring. If so, new technology will clearly need to be considered to meet the needs of the business going forward.
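The profiling step described above can be sketched very simply: for each field in a sample, count missing values and tally the types seen, to decide what cleansing and transformation the source needs. The sample records are invented for the example:

```python
from collections import Counter

def profile(rows):
    """Minimal data-profiling pass: per-field missing-value counts and a
    tally of the value types observed in the sample."""
    report = {}
    for row in rows:
        for field, value in row.items():
            stats = report.setdefault(field, {"missing": 0, "types": Counter()})
            if value is None or value == "":
                stats["missing"] += 1
            else:
                stats["types"][type(value).__name__] += 1
    return report

# A tiny sample from a hypothetical new data source.
sample = [
    {"id": 1, "age": 34, "city": "Oslo"},
    {"id": 2, "age": None, "city": "Lyon"},
    {"id": 3, "age": "41", "city": ""},   # age arrived as a string: needs cleansing
]
print(profile(sample))
```

Mixed types under one field (here `age` seen as both `int` and `str`) are exactly the kind of finding that drives the cleansing and transformation requirements.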
Tags : 
data integration, big data, data sources, business needs, technological advancements, scaling data
    
IBM
Published By: Oracle     Published Date: Nov 07, 2018
At many organisations, planning and budgeting take too long and devour resources. Finance teams spend too much time on administrative tasks relating to these processes (collecting, consolidating and reconciling data, for example) and are left with little time for analysis, strategy development and target setting. Accuracy can be patchy, results unreliable.
Tags : 
    
Oracle
Published By: IBM     Published Date: Oct 27, 2016
IBM Analytics for Apache Spark for Bluemix is an open-source cluster computing framework with in-memory processing that speeds analytic applications up to 100 times compared with other technologies on the market today. Optimized for extremely fast, large-scale data processing, it lets you easily perform big data analysis from one application.
Tags : 
ibm, apache spark, bluemix, analytics, enterprise applications
    
IBM
Published By: Alteryx, Inc.     Published Date: Apr 21, 2017
Analysts struggle to incorporate new sources of data into their analysis because they rely on Microsoft Excel or other tools that were not designed for data blending. Deleting columns, parsing data, and writing complicated formulas to clean and combine data every time it changes is not an efficient way for today’s analysts to spend their time. Download The Definitive Guide to Data Blending and:
• Understand how analysts are empowered through data blending
• Learn how to automate time-consuming, manual data preparation tasks
• Gain deeper business insights in hours, not the weeks typical of traditional approaches
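At its core, data blending is joining records from separate sources on a shared key and deriving new fields, without hand-editing spreadsheets. A minimal sketch in plain Python; the source names, field names, and values are invented for the example:

```python
# Two hypothetical sources sharing a customer_id key: a CRM export and
# a billing extract. In a spreadsheet this join would be a VLOOKUP
# rewritten every time either source changes.
crm = [
    {"customer_id": 1, "name": "Acme"},
    {"customer_id": 2, "name": "Globex"},
]
billing = [
    {"customer_id": 1, "revenue": 1200.0},
    {"customer_id": 2, "revenue": 450.0},
]

# Index one source by the join key, then enrich the other.
revenue_by_id = {r["customer_id"]: r["revenue"] for r in billing}
blended = [
    {**c, "revenue": revenue_by_id.get(c["customer_id"], 0.0)}
    for c in crm
]
print(blended)
```

Because the join is code rather than manual edits, it reruns unchanged whenever either source is refreshed, which is the automation point the guide makes.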
Tags : 
    
Alteryx, Inc.
Published By: IBM     Published Date: Jul 09, 2018
As the information age matures, data has become the most powerful resource enterprises have at their disposal. Businesses have embraced digital transformation, often staking their reputations on insights extracted from collected data. While decision-makers home in on hot topics like AI and the potential of data to drive businesses into the future, many underestimate the pitfalls of poor data governance. If business decision-makers can’t trust the data within their organization, how can stakeholders and customers know they are in good hands? Information that is not correctly distributed, or abandoned within an IT silo, can prove harmful to the integrity of business decisions. In search of instant analytical insights, businesses often prioritize data access and analysis over governance and quality. However, without ensuring the data is trustworthy, complete and consistent, leaders cannot be confident their decisions are rooted in facts and reality.
Tags : 
    
IBM
Published By: Landis + Gyr     Published Date: Mar 17, 2016
The rapid acceleration of Distributed Energy Resources (DER) presents a range of challenges for electric utility management. The good news is that a strategic approach, one that taps the power of data analytics, can meet the rising need to manage DER. Download this white paper to learn about the current state of DER and how to get started with a smart grid analytics solution:
• Cost-benefit analysis
• Challenges of integrated DERs
• Planning considerations
• Advanced analytics tools and predictive grid analytics benefits
• Choosing a platform
Tags : 
cost-benefit analysis, der, analytics, platform, utility, smart grid
    
Landis + Gyr
Published By: AWS     Published Date: Aug 20, 2018
A modern data warehouse is designed to support rapid data growth and interactive analytics over a variety of relational, non-relational, and streaming data types through a single, easy-to-use interface. It provides a common architectural platform for bringing new big data technologies to existing data warehouse methods, enabling organizations to derive deeper business insights. Key elements of a modern data warehouse:
• Data ingestion: take advantage of relational, non-relational, and streaming data sources
• Federated querying: the ability to run a query across heterogeneous sources of data
• Data consumption: support numerous types of analysis, from ad-hoc exploration to predefined reporting/dashboards to predictive and advanced analytics
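Federated querying, the second element above, means one SQL statement spanning separate data stores. It can be illustrated in miniature with SQLite's `ATTACH DATABASE`, which lets a single query join tables from two different database files; the two files stand in for the heterogeneous sources a warehouse can reach, and the schema is invented:

```python
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
sales_path = os.path.join(tmp, "sales.db")
ref_path = os.path.join(tmp, "reference.db")

# Two physically separate stores.
with sqlite3.connect(sales_path) as db:
    db.execute("CREATE TABLE sales (region_id INTEGER, amount REAL)")
    db.executemany("INSERT INTO sales VALUES (?, ?)",
                   [(1, 100.0), (1, 50.0), (2, 75.0)])

with sqlite3.connect(ref_path) as db:
    db.execute("CREATE TABLE regions (id INTEGER, name TEXT)")
    db.executemany("INSERT INTO regions VALUES (?, ?)",
                   [(1, "EMEA"), (2, "APAC")])

# One query across both sources: attach the second database, then join.
conn = sqlite3.connect(sales_path)
conn.execute("ATTACH DATABASE ? AS ref", (ref_path,))
rows = conn.execute("""
    SELECT r.name, SUM(s.amount)
    FROM sales s JOIN ref.regions r ON r.id = s.region_id
    GROUP BY r.name ORDER BY r.name
""").fetchall()
print(rows)  # → [('APAC', 75.0), ('EMEA', 150.0)]
```

Warehouse-scale federation (e.g. querying object storage alongside local tables) follows the same shape: the engine resolves each table to its source and executes one joined plan.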
Tags : 
    
AWS
Published By: AWS     Published Date: Nov 15, 2018
It isn’t always easy to keep pace with today’s high volume of data, especially when it’s coming at you from a diverse range of sources. Tracking these analytics can strain IT teams, who must provide the requested information to the C-suite and analysts. Unless this process happens quickly, the insights grow stale. Download your complimentary ebook now to see how Matillion ETL for Amazon Redshift makes it easy for technical and business users alike to participate in and own the entire data and analysis process. With Matillion ETL for Amazon Redshift, everyone from CTOs to marketing analysts can generate valuable business intelligence by automating data and analytics orchestrations.
Tags : 
    
AWS
Published By: Artemis Health     Published Date: Feb 05, 2019
Self-insured employers are mining their health and benefits data to save costs and provide quality care for employees. Data is driving business decisions, but how do you get from millions of rows of data to a consumable graph to taking action? In this white paper, we’ll delve into data analytics best practices that help self-insured employers find actionable insights in their benefits data:
• Which data sources will help you ensure you’re measuring the right thing at the right time
• How to ensure data variety and choose key metrics
• An example of a successful predictive analysis using benefits data
Tags : 
    
Artemis Health
Published By: BMC ASEAN     Published Date: Dec 18, 2018
Big data projects often entail moving data between multiple cloud and legacy on-premise environments. A typical scenario involves moving data from a cloud-based source to a cloud-based normalization application, to an on-premise system for consolidation with other data, and then through various cloud and on-premise applications that analyze the data. Processing and analysis turn the disparate data into business insights delivered through dashboards, reports, and data warehouses, often using cloud-based apps. The workflows that take data from ingestion to delivery are highly complex and have numerous dependencies along the way. Speed, reliability, and scalability are crucial. So, although data scientists and engineers may do things manually during proof of concept, manual processes don't scale.
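The ingestion-to-delivery workflow described above is a dependency graph: each step may only run once everything it depends on has finished. The scheduling core of such an orchestrator can be sketched with the standard library's `graphlib`; the step names are illustrative, not from any BMC product:

```python
from graphlib import TopologicalSorter

# Each step maps to the set of steps it depends on, mirroring the
# scenario in the abstract: ingest -> normalize -> consolidate ->
# analyze -> (dashboard, warehouse).
pipeline = {
    "normalize":   {"ingest"},
    "consolidate": {"normalize"},
    "analyze":     {"consolidate"},
    "dashboard":   {"analyze"},
    "warehouse":   {"analyze"},
}

# A valid execution order: every step appears after its dependencies.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

A real scheduler adds retries, triggers, and parallel execution of independent steps, but the topological ordering is the invariant that makes "numerous dependencies" manageable without manual sequencing.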
Tags : 
    
BMC ASEAN
Published By: DataCore     Published Date: Jul 10, 2019
With a software-based approach, IT organizations see a better return on their storage investment. DataCore’s software-defined storage provides improved resource utilization, seamless integration of new technologies, and reduced administrative time - all resulting in lower CAPEX and OPEX, yielding a superior TCO. A survey of 363 DataCore customers found that over half of them (55%) achieved positive ROI within the first year of deployment, and 21% were able to reach positive ROI in less than 6 months. Download this white paper to learn how software-defined storage can help reduce data center infrastructure costs, including guidelines to help you structure your TCO analysis comparison.
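The "positive ROI within the first year" claim rests on simple payback arithmetic: months until cumulative savings cover the initial investment. A sketch with purely illustrative numbers (not DataCore's figures):

```python
def payback_months(upfront_cost: float, monthly_saving: float) -> float:
    """Simple payback period: months until cumulative CAPEX+OPEX
    savings equal the initial investment."""
    return upfront_cost / monthly_saving

# E.g. a hypothetical $60,000 deployment saving $12,000/month:
print(f"payback in {payback_months(60_000, 12_000):.0f} months")  # → 5 months
```

A full TCO comparison of the kind the white paper's guidelines cover would also discount future savings and include migration and administrative costs on both sides.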
Tags : 
software defined storage, virtualization, storage, storage costs, data center costs, infrastructure costs, tco, roi
    
DataCore
Published By: Datastax     Published Date: Apr 04, 2017
As the big data ecosystem continues to expand, new technologies are addressing the requirements for managing, processing, analyzing, and storing data to help companies benefit from the rich sources of information flowing into their organizations. From NoSQL databases to open source projects to commercial products offered on-premises and in the cloud, the future of big data is being driven by innovative new approaches across the data management lifecycle. The most pressing areas include real-time data processing, interactive analysis, data integration, data governance, and security. Download this report for a better understanding of the current landscape, emerging best practices and real-world successes.
Tags : 
evolution, big data, technology, datastax, nosql
    
Datastax
Published By: Beqom     Published Date: Dec 05, 2018
In this guide, we’ll walk you through a few examples that illustrate the value of drilling down into data for better people decisions.
Tags : 
compensation analysis, compensation analytics, compensation compliance, compensation management, compensation management employees, compensation management sales teams, compensation saas, compensation software
    
Beqom
Published By: Esker     Published Date: Jan 06, 2014
13 Questions for Maximizing Your Order Processing Automation Investment
Tags : 
esker, automation, sales order processing, order-to-cash, erp systems, automated sales, automation investment, saas
    
Esker
Published By: IBM     Published Date: Sep 30, 2013
In a “talent-constrained” economy, attracting and engaging highly talented people, developing global leaders, improving and sharing new skills, and keeping people aligned and working together are major challenges. In this report, we discuss how new approaches and a new science of human resources have emerged to develop a “smarter workforce,” resetting the bar on talent, learning, and leadership strategies.
Tags : 
smarter workforce, workforce management, workforce, global, global businesses, recruiting, human resources, data analysis
    
IBM
Published By: SAP     Published Date: Mar 24, 2011
As you prepare to embark on an ERP implementation or to upgrade an existing system, you have a lot to juggle and plan. You know the horror stories of "go-lives" going bad but you aren't sure how to avoid the pitfalls. This white paper discusses at length how Utopia's formal data migration methodology reduces the risks of large systems deployments by using a tried-and-tested framework for the entire migration process, from source data analysis through post-go-live support.
Tags : 
data migration management, source data analysis, data integrity, erp implementation
    
SAP