
Published By: Oracle CX     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while storing data in column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats, and run analytics on their operational data.
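The row-versus-column trade-off the abstract describes can be sketched in a few lines. This is a toy illustration only, not Oracle's implementation: the same table held as row records (good for OLTP, where a whole record is fetched together) and as per-attribute columns (good for analytics, where one attribute is scanned across all records).

```python
# The same table in row format and in column format.
rows = [
    {"id": 1, "region": "EMEA", "amount": 120.0},
    {"id": 2, "region": "APAC", "amount": 75.5},
    {"id": 3, "region": "EMEA", "amount": 310.0},
]

# Column format: one contiguous list per attribute.
columns = {
    "id": [r["id"] for r in rows],
    "region": [r["region"] for r in rows],
    "amount": [r["amount"] for r in rows],
}

# OLTP-style access: fetch one complete record by key (row format wins).
record = next(r for r in rows if r["id"] == 2)

# Analytic access: aggregate one attribute across all records
# (column format wins -- only the "amount" values are touched).
total = sum(columns["amount"])
print(record["region"], total)  # APAC 505.5
```

Dual-format in-memory databases keep both layouts current so each workload can use the one that suits it.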
Tags : 
    
Oracle CX
Published By: Oracle CX     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL being required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk and second, analysis is constantly being done on stale data. In-memory databases have helped address performance concerns.
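The stale-data problem in the traditional two-system model can be made concrete with a hypothetical sketch: an operational store and a warehouse kept in sync only by a periodic batch ETL, so any analysis sees data as of the last ETL run.

```python
# Toy model of the traditional OLTP + warehouse pattern (illustrative only).
oltp = []       # operational store, receives live transactions
warehouse = []  # analytics copy, refreshed only by the ETL batch

def etl():
    """Batch-copy everything from the OLTP store to the warehouse."""
    warehouse.clear()
    warehouse.extend(oltp)

oltp.append({"order": 1, "amount": 100})
etl()                                     # warehouse is current...
oltp.append({"order": 2, "amount": 50})   # ...until the next transaction

in_oltp = sum(o["amount"] for o in oltp)            # 150: live view
in_warehouse = sum(o["amount"] for o in warehouse)  # 100: stale view
print(in_oltp, in_warehouse)
```

Until the next ETL run, every analytic query against the warehouse misses the newest transactions, which is exactly the gap in-memory hybrid databases aim to close.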
Tags : 
    
Oracle CX
Published By: Oracle PaaS/IaaS/Hardware     Published Date: Jul 25, 2017
This ESG Lab review documents the results of recent testing of the Oracle SPARC M7 processor with a focus on in-memory database performance for the real-time enterprise. Leveraging new advanced features like columnar compression and on-chip in-memory query acceleration, ESG Lab compared the in-memory database performance of a SPARC T7 system with a SPARC M7 processor to an x86-based system.
Tags : 
    
Oracle PaaS/IaaS/Hardware
Published By: IBM     Published Date: May 19, 2016
In our 21-criteria evaluation of the dynamic case management (DCM) market, we identified the 14 most significant software vendors — Appian, bpm’online, Column Technologies, DST Systems, Eccentex, IBM, Isis Papyrus, Lexmark Enterprise Software, MicroPact, Newgen Software, OnBase by Hyland, OpenText, Pegasystems, and TIBCO Software — and researched, analyzed, and scored them. The evaluation focused on providers’ adaptive, analytics, and mobile features, all critical to helping enterprises tackle increasing volumes of varied and unstructured work. This report helps enterprise architecture (EA) professionals select the best providers to meet their unique needs.
Tags : 
ibm, forrester, forrester wave, dynamic case management, dcm, software vendors, software, enterprise applications
    
IBM
Published By: IBM     Published Date: Jul 08, 2016
In our 21-criteria evaluation of the dynamic case management (DCM) market, we identified the 14 most significant software vendors — Appian, bpm’online, Column Technologies, DST Systems, Eccentex, IBM, Isis Papyrus, Lexmark Enterprise Software, MicroPact, Newgen Software, OnBase by Hyland, OpenText, Pegasystems, and TIBCO Software — and researched, analyzed, and scored them. The evaluation focused on providers’ adaptive, analytics, and mobile features, all critical to helping enterprises tackle increasing volumes of varied and unstructured work. This report helps enterprise architecture (EA) professionals select the best providers to meet their unique needs.
Tags : 
ibm, forrester, forrester wave, dynamic case management, dcm, networking, software development, enterprise applications
    
IBM
Published By: IBM     Published Date: Jul 21, 2016
IBM's recently released DB2 version 11.1 for Linux, Unix and Windows (LUW) is a hybrid database that IBM says can handle transactional and analytic workloads thanks to its BLU Acceleration technology, which features an in-memory column store for analytical workloads that can scale across a massively parallel cluster.
Tags : 
ibm, db2, analytics, mpp, data warehousing
    
IBM
Published By: IBM     Published Date: Jul 06, 2017
DB2 is a proven database for handling the most demanding transactional workloads. But the trend as of late is to enable relational databases to handle analytic queries more efficiently by adding an in-memory column store alongside to aggregate data and provide faster results. IBM's BLU Acceleration technology does exactly that. While BLU isn't brand new, the ability to spread the column store across a massively parallel processing (MPP) cluster of up to 1,000 nodes is a new addition to the technology. That, along with simpler monthly pricing options and integration with dashDB data warehousing in the cloud, makes DB2 for LUW a very versatile database.
Tags : 
memory analytics, database, efficiency, acceleration technology, aggregate data
    
IBM
Published By: Alteryx, Inc.     Published Date: Apr 21, 2017
Analysts struggle to incorporate new sources of data into their analysis because they rely on Microsoft Excel or other tools that were not designed for data blending. Deleting columns, parsing data, and writing complicated formulas to clean and combine data every time it changes is not an efficient way for today’s analysts to spend their time. Download The Definitive Guide to Data Blending and:
• Understand how analysts are empowered through data blending
• Learn how to automate time-consuming, manual data preparation tasks
• Gain deeper business insights in hours, not the weeks typical of traditional approaches
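The repeatable pipeline the guide contrasts with hand-editing spreadsheets can be sketched in plain Python: parse two sources, clean a column, and join them by key. The CSV data and field names here are hypothetical.

```python
import csv
import io

# Two hypothetical sources to blend (in practice, files or exports).
sales_csv = "order_id,customer,amount\n1,Acme, 100\n2,Globex,250\n"
regions_csv = "customer,region\nAcme,West\nGlobex,East\n"

sales = list(csv.DictReader(io.StringIO(sales_csv)))
regions = {r["customer"]: r["region"]
           for r in csv.DictReader(io.StringIO(regions_csv))}

# Blend: clean each field and enrich sales rows with the region lookup.
blended = []
for row in sales:
    blended.append({
        "order_id": int(row["order_id"]),
        "customer": row["customer"],
        "amount": float(row["amount"].strip()),  # strip stray whitespace
        "region": regions.get(row["customer"], "unknown"),
    })
print(blended[0])
```

Because the steps are code rather than manual edits, rerunning them when the data changes costs nothing, which is the automation point the guide makes.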
Tags : 
    
Alteryx, Inc.
Published By: MarkLogic     Published Date: Mar 29, 2018
Executives, managers, and users will not trust data unless they understand where it came from. Enterprise metadata is the “data about data” that makes this trust possible. Unfortunately, many healthcare and life sciences organizations struggle to collect and manage metadata with their existing relational and column-family technology tools. MarkLogic’s multi-model architecture makes it easier to manage metadata, and build trust in the quality and lineage of enterprise data. Healthcare and life sciences companies are using MarkLogic’s smart metadata management capabilities to improve search and discovery, simplify regulatory compliance, deliver more accurate and reliable quality reports, and provide better customer service. This paper explains the essence and advantages of the MarkLogic approach.
Tags : 
enterprise, metadata, management, organizations, technology, tools, marklogic
    
MarkLogic
Published By: MarkLogic     Published Date: May 07, 2018
Executives, managers, and users will not trust data unless they understand where it came from. Enterprise metadata is the “data about data” that makes this trust possible. Unfortunately, many healthcare and life sciences organizations struggle to collect and manage metadata with their existing relational and column-family technology tools. MarkLogic’s multi-model architecture makes it easier to manage metadata, and build trust in the quality and lineage of enterprise data. Healthcare and life sciences companies are using MarkLogic’s smart metadata management capabilities to improve search and discovery, simplify regulatory compliance, deliver more accurate and reliable quality reports, and provide better customer service. This paper explains the essence and advantages of the MarkLogic approach.
Tags : 
agile, enterprise, metadata, management, organization
    
MarkLogic
Published By: AstuteIT_ABM_EMEA     Published Date: Feb 02, 2018
MongoDB is an open-source, document database designed with both scalability and developer agility in mind. MongoDB bridges the gap between key-value stores, which are fast and scalable, and relational databases, which have rich functionality. Instead of storing data in rows and columns as one would with a relational database, MongoDB stores JSON documents with dynamic schemas. Customers should consider three primary factors when evaluating databases: technological fit, cost, and topline implications. MongoDB's flexible and scalable data model, robust feature set, and high-performance, high-availability architecture make it suitable for a wide range of database use cases. Given that in many cases relational databases may also be a technological fit, it is helpful to consider the relative costs of each solution when evaluating which database to adopt.
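The dynamic-schema point can be shown with plain data structures. This is illustrative only, not MongoDB's API: each JSON-style document carries its own fields, so no `ALTER TABLE` is needed when records differ in shape; the example documents are hypothetical.

```python
# Documents with dynamic schemas: each record can carry different fields.
products = [
    {"_id": 1, "name": "laptop", "price": 999, "specs": {"ram_gb": 16}},
    {"_id": 2, "name": "ebook", "price": 12, "formats": ["epub", "pdf"]},
]

# A filter in the spirit of MongoDB's find({"price": {"$lt": 100}}).
cheap = [p for p in products if p["price"] < 100]
print([p["name"] for p in cheap])  # ['ebook']
```

A relational schema would need either nullable columns or extra tables to hold the `specs` and `formats` fields; the document model absorbs both shapes directly.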
Tags : 
total, cost, ownership, comparison, mongodb, oracle
    
AstuteIT_ABM_EMEA
Published By: Oracle     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while storing data in column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors companies can now store data in memory in both row and data formats, and run analytics on their operatio
Tags : 
    
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL being required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk and second, analysis is constantly being done on stale data. In-memory databases have helped address p
Tags : 
    
Oracle
Published By: Amazon Web Services     Published Date: Sep 05, 2018
Today’s businesses generate staggering amounts of data, and learning to get the most value from that data is paramount to success. Just as Amazon Web Services (AWS) has transformed IT infrastructure to something that can be delivered on-demand, scalably, quickly, and cost-effectively, Amazon Redshift is doing the same for data warehousing and big data analytics. Amazon Redshift offers a massively parallel columnar data store that can be spun up in just a few minutes to deal with billions of rows of data at a cost of just a few cents an hour. Organizations choose Amazon Redshift for its affordability, flexibility, and powerful feature set:
• Enterprise-class relational database query and management system
• Supports client connections with many types of applications, including business intelligence (BI), reporting, data, and analytics tools
• Execute analytic queries in order to retrieve, compare, and evaluate large amounts of data in multiple-stage operations
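The last point above, multiple-stage analytic operations, can be sketched in miniature. This toy example (not Redshift itself) runs the equivalent of `GROUP BY ... HAVING`: stage one aggregates rows by key, stage two evaluates the aggregates.

```python
from collections import defaultdict

# Hypothetical click events: (region, clicks). Redshift would scan
# billions of such rows from its columnar store.
events = [
    ("us-east", 120), ("us-east", 80), ("eu-west", 40), ("eu-west", 10),
]

# Stage 1: aggregate per region (GROUP BY).
totals = defaultdict(int)
for region, clicks in events:
    totals[region] += clicks

# Stage 2: evaluate the aggregates, keeping regions over a threshold (HAVING).
busy = {region: total for region, total in totals.items() if total >= 100}
print(busy)  # {'us-east': 200}
```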
Tags : 
    
Amazon Web Services
Published By: Amazon Web Services     Published Date: Sep 05, 2018
Just as Amazon Web Services (AWS) has transformed IT infrastructure to something that can be delivered on demand, scalably, quickly, and cost-effectively, Amazon Redshift is doing the same for data warehousing and big data analytics. Redshift offers a massively parallel columnar data store that can be spun up in just a few minutes to deal with billions of rows of data at a cost of just a few cents an hour. It’s designed for speed and ease of use — but to realize all of its potential benefits, organizations still have to configure Redshift for the demands of their particular applications. Whether you’ve been using Redshift for a while, have just implemented it, or are still evaluating it as one of many cloud-based data warehouse and business analytics technology options, your organization needs to understand how to configure it to ensure it delivers the right balance of performance, cost, and scalability for your particular usage scenarios. Since starting to work with this technolog
Tags : 
    
Amazon Web Services
Published By: Oracle Corp.     Published Date: Oct 15, 2012
Please join Tom Kyte, author of the AskTom column and Senior Technical Architect at Oracle, to learn about the threats every IT database and security administrator needs to be aware of. Tom will also discuss best practices for securing your databases.
Tags : 
database security, compliance, sql, it security, oracle
    
Oracle Corp.
Published By: Oracle Corporation     Published Date: May 11, 2012
Exadata Hybrid Columnar Compression is an enabling technology for two new Oracle Exadata Storage Server features: Warehouse Compression and Archive Compression. We will discuss each of these features in detail later in this paper, but first let's explore Exadata Hybrid Columnar Compression - the next generation in compression technology.
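One reason columnar layouts compress so well can be sketched with run-length encoding. This is an illustrative toy, not Oracle's actual Hybrid Columnar Compression algorithm: values within a single column tend to repeat, so runs collapse to (value, count) pairs.

```python
from itertools import groupby

# One column's values, as stored contiguously in a columnar layout.
region_column = ["EMEA", "EMEA", "EMEA", "APAC", "APAC", "EMEA"]

# Run-length encode: each run of equal values becomes (value, count).
encoded = [(value, len(list(run))) for value, run in groupby(region_column)]
print(encoded)  # [('EMEA', 3), ('APAC', 2), ('EMEA', 1)]

# Decoding restores the original column losslessly.
decoded = [value for value, count in encoded for _ in range(count)]
assert decoded == region_column
```

In a row layout the repeated values are interleaved with other attributes, so runs like these rarely form; grouping by column is what creates the redundancy that compression exploits.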
Tags : 
oracle, data warehousing, database, exadata, database machine, infrastructure, operation, operation costs, mobile, growth, payback, architecture, demands, enterprise applications
    
Oracle Corporation
Published By: Vertica     Published Date: Aug 15, 2010
If you are responsible for BI (Business Intelligence) in your organization, there are three questions you should ask yourself:
- Are there applications in my organization for combining operational processes with analytical insight that we can't deploy because of performance and capacity constraints with our existing BI environment?
Tags : 
business intelligence, vertica, aggregated data, olap, rolap, sql, query, data warehouse, oltp
    
Vertica
Published By: SAP     Published Date: Jul 17, 2012
Relational database management systems (RDBMSs) are systems of software that manage databases as structured sets of tables containing rows and columns with references to one another through key values. They include the ability to optimize storage, process transactions, perform queries, and preserve the integrity of data structures. When used with applications, they provide the beating heart of the collection of business functions supported by those applications. They vary considerably in terms of the factors that impact the total cost of running a database application, yet users seldom perform a disciplined procedure to calculate such costs. Most users choose instead to remain with a single vendor's RDBMS and never visit the question of ongoing hardware, software, and staffing fees.
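The relational model described above, tables of rows and columns linked by key values, can be shown end to end with Python's built-in `sqlite3` module; the table and column names are hypothetical.

```python
import sqlite3

# Two tables joined through a key reference, managed by the RDBMS.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(id),
        amount REAL
    );
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

# A query that follows the key reference to combine the tables.
result = conn.execute("""
    SELECT c.name, SUM(o.amount)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(result)  # [('Acme', 150.0), ('Globex', 75.0)]
```

The query optimizer, transaction handling, and integrity enforcement mentioned in the abstract all operate behind this same declarative interface.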
Tags : 
sap, infrastructure, database, data management, white paper, management, storage, business functions
    
SAP
Published By: Cisco     Published Date: Nov 30, 2016
Are meetings a “soul sucking waste of time,” as a columnist for the British newspaper The Guardian suggests? [1] They could be if they’re unproductive, boring, too numerous, or all of the above. The solution may lie in technology, if used properly. “Technology is essential to innovating the conference call and boosting staff engagement,” according to a Harvard Business Review column. [2] Video, for instance, may enhance “connectedness between participants” and help them see reactions to one another’s ideas. [3]
Tags : 
    
Cisco
Published By: Fisher Investments     Published Date: Sep 29, 2010
If you have a $500,000 portfolio, you should download the latest report by Forbes columnist Ken Fisher. This must-read report includes his latest stock market forecast, plus research and analysis you can use in your portfolio right now.
Tags : 
fisher investments, stock market, outlook, finance, forbes, fisher, economics, portfolio
    
Fisher Investments