

database model

Results 1 - 25 of 38
Published By: Oracle     Published Date: Apr 16, 2018
Launched at Oracle OpenWorld 2017, Oracle Autonomous Database Cloud uses a revolutionary machine learning model to enable automation that eliminates human error and manual tuning, resulting in high performance, storage availability, and security at a much lower cost. Learn how it works and why you should adopt it!
Tags : 
next, generation, database, industry, leader
    
Oracle
Published By: AstuteIT_ABM_EMEA     Published Date: Feb 02, 2018
MongoDB is an open-source document database designed with both scalability and developer agility in mind. MongoDB bridges the gap between key-value stores, which are fast and scalable, and relational databases, which have rich functionality. Instead of storing data in rows and columns as one would with a relational database, MongoDB stores JSON documents with dynamic schemas. Customers should consider three primary factors when evaluating databases: technological fit, cost, and top-line implications. MongoDB's flexible and scalable data model, robust feature set, and high-performance, high-availability architecture make it suitable for a wide range of database use cases. Given that in many cases relational databases may also be a technological fit, it is helpful to consider the relative costs of each solution when evaluating which database to adopt.
Tags : 
total, cost, ownership, comparison, mongodb, oracle
    
AstuteIT_ABM_EMEA
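The document model described in the MongoDB abstract above can be sketched in a few lines. This is an illustrative stand-in only: a plain Python list of dicts plays the role of a MongoDB collection, and the `find` helper is a hypothetical query-by-example function, not the pymongo API. With a real server you would use a pymongo client instead.

```python
# Documents in the same "collection" need not share a schema, unlike rows
# in a relational table. A list of dicts stands in for a collection here.
products = []  # stand-in for a db.products collection

# Two documents with different shapes coexist in one collection.
products.append({"_id": 1, "name": "laptop",
                 "specs": {"ram_gb": 16, "cpu": "m-series"}})
products.append({"_id": 2, "name": "gift card",
                 "denominations": [25, 50, 100]})

def find(collection, **criteria):
    """Minimal query-by-example over top-level fields (hypothetical helper)."""
    return [doc for doc in collection
            if all(doc.get(k) == v for k, v in criteria.items())]

hit = find(products, name="gift card")[0]
print(hit["denominations"])  # the second document's list-valued field
```

The point of the sketch is the schema flexibility the abstract describes: adding a third document with yet another shape requires no migration, only another `append`.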
Published By: VMTurbo     Published Date: Mar 25, 2015
An Intelligent Roadmap for Capacity Planning

Many organizations apply overly simplistic principles to determine requirements for compute capacity in their virtualized data centers. These principles are based on a resource allocation model, which takes the total amount of memory and CPU allocated to all virtual machines in a compute cluster and assumes a defined level of overprovisioning (e.g., 2:1, 4:1, 8:1, or 12:1) in order to calculate the requirement for physical resources. Often managed in spreadsheets or simple databases, and augmented by simple alert-based monitoring tools, the resource allocation model does not account for the actual resource consumption driven by each application workload running in the operational environment, and it inherently erodes the efficiency that can be driven from the underlying infrastructure.
Tags : 
capacity planning, vmturbo, resource allocation model, cpu, cloud era, it management, enterprise applications
    
VMTurbo
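The allocation-model arithmetic the VMTurbo abstract criticizes can be made concrete. This is a minimal sketch under assumed numbers (the VM sizes, the 4:1 ratio, and the measured-consumption figures are all hypothetical), showing how the naive estimate ignores actual workload consumption:

```python
def physical_capacity_required(vm_allocations_gb, overprovision_ratio):
    """Naive allocation-model estimate: total allocated memory divided by
    an assumed overprovisioning ratio (e.g. 4:1), as described above."""
    return sum(vm_allocations_gb) / overprovision_ratio

# 10 VMs with 8 GB allocated each, assuming 4:1 overprovisioning
vms = [8] * 10
print(physical_capacity_required(vms, 4))  # -> 20.0 GB of physical RAM

# The paper's critique: actual consumption can diverge from allocation.
actual_use = [1.5, 2, 7.5, 0.5, 3, 1, 6, 2, 0.5, 1]  # hypothetical measured GB
print(sum(actual_use))  # -> 25.0 GB: the 20 GB estimate would be undersized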
Published By: Microsoft Azure     Published Date: Apr 11, 2018
When you extend the global reach of your enterprise, you'll find new markets for your products and services. That means reaching more potential customers, bigger growth potential, and higher ROI. But to tap into those emerging markets, you need to provide the best, most consistent user experience. Now it's possible for you to build, deploy, and manage modern apps at scale with a globally distributed database, without the hassles associated with hosting in your own data center. Read the e-book Build Modern Apps with Big Data at a Global Scale and learn how Azure Cosmos DB, a globally distributed turnkey database service, is transforming the world of modern data management. Keep access to your data available, consistent, and safe, with industry-leading, enterprise-grade security and compliance. Start developing the best app experience for your users based on five well-defined consistency models: Strong, which favors data consistency and is ideal for banks, e-commerce processing, and online booking; Bounded staleness…
Tags : 
    
Microsoft Azure
Published By: Oracle Corporation     Published Date: May 11, 2012
By using the Oracle Exadata Database Machine as your data warehouse platform you have a balanced, high performance hardware configuration. This paper focuses on the other two corner stones, data modeling and data loading.
Tags : 
oracle, data warehousing, database, exadata, database machine, infrastructure, operation, operation costs
    
Oracle Corporation
Published By: MarkLogic     Published Date: Mar 13, 2015
Big Data has been in the spotlight recently as businesses seek to leverage their untapped information resources and win big on the promise of big data. However, the problem with big data initiatives is that organizations try to use existing information management practices and legacy relational database technologies, which often collapse under the sheer weight of the data. In this paper, MarkLogic explains why a new approach is needed to handle the volume, velocity, and variety of big data: the relational model that has been the status quo is no longer working. Learn about the NoSQL paradigm shift, and why NoSQL is gaining significant market traction because it solves the fundamental challenges of big data, achieving better performance, scalability, and flexibility. Learn how MarkLogic's customers are reimagining their data to:
- Make the world more secure
- Provide access to valuable information
- Create new revenue streams
- Gain insights to increase market share
- Reduce …
Tags : 
enterprise, nosql, relational, databases, data storage, management system, application, scalable
    
MarkLogic
Published By: MarkLogic     Published Date: Nov 07, 2017
Business demands a single view of data, and IT strains to cobble together data from multiple data stores to present that view. Multi-model databases, however, can help you integrate data from multiple sources and formats in a simplified way. This eBook explains how organizations use multi-model databases to reduce complexity, save money, lessen risk, and shorten time to value, and includes practical examples. Read this eBook to discover how to:
- Get unified views across disparate data models and formats within a single database
- Learn how multi-model databases leverage the inherent structure of the data being stored
- Load data as is and harmonize unstructured and semi-structured data
- Provide agility in data access and delivery through APIs, interfaces, and indexes
- Scale a multi-model database, and provide ACID capabilities and security
- Examine how a multi-model database would fit into your existing architecture
Tags : 
    
MarkLogic
Published By: Five9     Published Date: Apr 29, 2013
NJ 2-1-1 is a free phone number and online database that connects New Jersey residents quickly and effectively to community resources and emergency information. It is part of a growing national model that serves 190 million Americans across all 50 states.
Tags : 
five9, cloud, cloud computing, cloud-based contact center, contact center software, hurricane sandy, networking, data center
    
Five9
Published By: CA Technologies     Published Date: Aug 24, 2017
• Use CA Live API Creator to deliver running prototypes in hours. Convert business requirements into working software.
• Not just for wireframes, but also for running systems, databases, logic, and user interfaces. It's as simple as a spreadsheet and just as fast.
• Impress your business users with results in hours to get their feedback on the data model and logic. Iterate instantly.
• All you need is in the box. No IDE to install or configure. Just use your browser.
Tags : 
    
CA Technologies
Published By: CA Technologies EMEA     Published Date: May 23, 2018
To move your business from its current state to the connected enterprise, you have to define a common API to your database and other systems while providing the infrastructure to support the new model. The new systems must incorporate security safeguards while ensuring the infrastructure can support the growing, but variable, load. With the rapid adoption of mobile and web-based services across the industry, the REST architecture has emerged as the de facto standard for API integration across systems. This white paper addresses the concepts of REST, creating REST APIs for your databases, and integrating with other systems:
• What is REST?
• Why use REST for database access?
• Building REST infrastructure for database access
• The REST-enabled database
• Integrating REST with other services
• Criteria for selecting a REST services platform
Tags : 
    
CA Technologies EMEA
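The REST-to-database mapping the CA white paper describes can be sketched with a toy dispatcher. This is an illustrative stand-in, not any CA product API: an in-memory dict plays the role of a table, and `handle` is a hypothetical function showing how HTTP verbs and paths conventionally map onto CRUD operations. A real deployment would sit behind an HTTP server and a database driver.

```python
# Minimal sketch of REST-style database access: verbs map onto CRUD.
table = {}          # primary key -> row (stand-in for a database table)
next_id = 1

def handle(method, path, body=None):
    """Dispatch (method, path) pairs the way a REST data API would."""
    global next_id
    parts = path.strip("/").split("/")        # "/rows/3" -> ["rows", "3"]
    if method == "POST" and parts == ["rows"]:
        table[next_id] = body                  # CREATE -> INSERT
        next_id += 1
        return 201, next_id - 1
    if method == "GET" and parts[0] == "rows" and len(parts) == 2:
        key = int(parts[1])                    # READ -> SELECT by key
        return (200, table[key]) if key in table else (404, None)
    if method == "DELETE" and parts[0] == "rows" and len(parts) == 2:
        return (204, table.pop(int(parts[1]), None))  # DELETE row
    return 405, None

status, rid = handle("POST", "/rows", {"name": "widget"})
print(status, handle("GET", f"/rows/{rid}"))
```

The value of the uniform interface is that every backing store exposed this way looks the same to clients, which is the "common API to your database and other systems" idea above.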
Published By: Burton Group     Published Date: Jul 07, 2008
Data modeling has evolved from an arcane technique for database designers into an entire family of interrelated techniques that serves many constituencies, including techno-phobic business stakeholders and users. The new maturity of modeling tools and techniques arrives in the nick of time, because new technical and regulatory realities demand that enterprises maintain scrupulous awareness of their data and how it is used. Data modeling is no longer for databases only, no longer for technologists only, and no longer optional.
Tags : 
modeling, data modeling, data management, data mining, business intelligence, dba, database administration, burton group
    
Burton Group
Published By: Oracle     Published Date: Apr 03, 2012
This white paper describes how to successfully consolidate your database onto a private cloud through several deployment models.
Tags : 
oracle, service models, cloud management, 11g, database services, database consolidation
    
Oracle
Published By: Group M_IBM Q2'19     Published Date: Jul 01, 2019
This white paper considers the pressures that enterprises face as the volume, variety, and velocity of relevant data mount and the time to insight seems unacceptably long. Most IT environments seeking to leverage statistical data in a useful way for analysis that can power decision making must glean that data from many sources, put it together in a relational database that requires special configuration and tuning, and only then make it available for data scientists to build models that are useful for business analysts. The complexity of all this is further compounded by the need to collect and analyze data that may reside in a classic datacenter on the premises as well as in private and public cloud systems. This need demands that the configuration support a hybrid cloud environment. After describing these issues, we consider the usefulness of a purpose-built database system that can accelerate access to and management of relevant data and is designed to deliver high performance…
Tags : 
    
Group M_IBM Q2'19
Published By: MarkLogic     Published Date: Mar 17, 2015
You’ve probably heard about NoSQL, and you may wonder what it is. NoSQL represents a fundamental change in the way people think about storing and accessing data, especially now that most of the information generated is unstructured or semi-structured data, something for which existing database systems such as Oracle, MySQL, SQL Server, and Postgres aren’t well suited. NoSQL means a release from the constraints imposed on database management systems by the relational database model. This free eBook, Enterprise NoSQL for Dummies, MarkLogic Special Edition, provides an overview of NoSQL. You’ll start to understand what it is, what it isn’t, when you should consider using a NoSQL database instead of a relational database management system, and when you may want to use both. In addition, this book introduces enterprise NoSQL and shows how it differs from other NoSQL systems, as well as explains when NoSQL may not be the right solution for your data storage problem. You’ll also learn the NoSQL lingo.
Tags : 
enterprise, nosql, relational, databases, data storage, management system, application, scalable
    
MarkLogic
Published By: MarkLogic     Published Date: Jun 09, 2017
NoSQL means a release from the constraints imposed on database management systems by the relational database model. This quick, concise eBook provides an overview of NoSQL technology, when you should consider using a NoSQL database over a relational one (and when to use both). In addition, this book introduces Enterprise NoSQL and shows how it differs from other NoSQL systems. You’ll also learn the NoSQL lingo, see which customers are already using it and why, and get tips to find the right NoSQL database for you.
Tags : 
    
MarkLogic
Published By: Basho     Published Date: Apr 07, 2015
This whitepaper looks at why companies choose Riak over a relational database. We focus specifically on availability, scalability, and the key/value data model. We then analyze the decision points that should be considered when choosing a non-relational solution and review data modeling, querying, and consistency guarantees. Finally, we cover simple patterns for building common applications in Riak using its key/value design, dealing with data conflicts that emerge in an eventually consistent system, and multi-datacenter replication.
Tags : 
basho, riak, relational database, nosql, common applications, simple deployments, it management
    
Basho
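The conflict-handling pattern the Basho abstract mentions can be sketched without a running cluster. This is a hedged, toy illustration of one common application-level strategy (merging divergent "sibling" values, as in the classic shopping-cart example); the replica dicts and `reconcile` function are hypothetical stand-ins, not the Riak client API:

```python
# Two replicas accept concurrent writes to the same key while partitioned,
# producing sibling values the application must later reconcile.
replica_a = {"cart:42": {"apples"}}
replica_b = {"cart:42": {"oranges"}}          # concurrent write elsewhere

def reconcile(*sibling_values):
    """Application-level merge: union the divergent shopping carts."""
    merged = set()
    for value in sibling_values:
        merged |= value                       # no write is silently lost
    return merged

resolved = reconcile(replica_a["cart:42"], replica_b["cart:42"])
replica_a["cart:42"] = replica_b["cart:42"] = resolved   # converge
print(sorted(resolved))  # -> ['apples', 'oranges']
```

A union merge is only one choice; last-write-wins is simpler but can drop updates, which is why eventually consistent stores push this decision to the application.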
Published By: Objectivity     Published Date: Aug 21, 2009
As business users expand to newer types of applications, and grow older ones, their attempts to use RDBMS encounter the "Relational Wall," where RDBMS technology no longer provides the performance and functionality needed. This paper measures the wall, explains what model and architectural differences cause it, how to foresee it, and how to avoid it.
Tags : 
database, rdbmss, object, object-oriented, object oriented, database performance, database development, database architecture
    
Objectivity
Published By: Vertica     Published Date: Oct 30, 2009
Independent research firm Knowledge Integrity Inc. examines two high-performance computing technologies that are transitioning into the mainstream: massively parallel analytical database management systems (ADBMS) and distributed parallel programming paradigms such as MapReduce (Hadoop, Pig, HDFS, etc.). By providing an overview of both concepts and looking at how the two approaches can be used together, they conclude that combining a high-performance batch programming and execution model with a high-performance analytical database provides significant business benefits for a number of different types of applications.
Tags : 
vertica, analytical computing, adbms, mapreduce, application management, data management, data mining, grid computing
    
Vertica
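The MapReduce programming model paired with an ADBMS in the Vertica abstract can be shown with the canonical word-count example. This is a toy, single-process sketch of the map and reduce phases only (a real job would run distributed on Hadoop, with a shuffle between the phases that is collapsed here):

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit a (word, 1) pair for every word in every input line."""
    for line in records:
        for word in line.split():
            yield word, 1

def reduce_phase(pairs):
    """Reduce: sum the counts per key (shuffle and reduce collapsed)."""
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

docs = ["big data big insight", "big value"]
print(reduce_phase(map_phase(docs)))
# -> {'big': 3, 'data': 1, 'insight': 1, 'value': 1}
```

Because both phases are embarrassingly parallel over their inputs, the same program scales out across nodes, which is the execution-model property the paper combines with a parallel analytical database.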
Published By: Oracle     Published Date: Jan 28, 2019
For more than a decade, Oracle has developed and enhanced its ZFS Storage Appliance, giving its users a formidable unified, enterprise-grade storage offering. The latest release, ZS7-2, boasts upgraded hardware and software and is a timely reminder that more users might do well to evaluate this offering. It has a trifecta of advantages:
(1) Its notable performance, price-performance, and flexibility are all improved in this new release.
(2) There is a surprisingly inclusive set of functionalities, including excellent storage analytics that were developed even before analytics became a contemporary “must-have.”
(3) There’s a compelling group of “better together” elements that make ZFS Storage Appliance a particularly attractive choice for both Oracle Database environments and users that want to seamlessly integrate a cloud component into their IT infrastructure.
Given the proven abilities of Oracle’s prior models, it’s also safe to assume that the new ZS7-2 will outperform…
Tags : 
    
Oracle
Published By: Microsoft Azure     Published Date: Apr 11, 2018
Developing for and in the cloud has never been more dependent on data. Flexibility, performance, security: your applications need a database architecture that matches the innovation of your ideas. Industry analyst Ovum explored how Azure Cosmos DB is positioned to be the flagship database of internet-based products and services, and concluded that Azure Cosmos DB “is the first to open up [cloud] architecture to data that is not restricted by any specific schema, and it is among the most flexible when it comes to specifying consistency.” From security and fraud detection to consumer and industrial IoT, personalized e-commerce, social and gaming networks, smart utilities, and advanced analytics, Azure Cosmos DB is how Microsoft is structuring the database for the age of cloud. Read the full report to learn how a globally distributed, multi-model data service can support your business objectives.
Tags : 
    
Microsoft Azure
Published By: Oracle     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business, so it is no surprise that performance has always been top of mind. Whether it is a traditional row-formatted database handling millions of transactions a day or a columnar database for advanced analytics that helps uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL process required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk; second, analysis is constantly being done on stale data. In-memory databases have helped address…
Tags : 
    
Oracle