
data model

Results 351 - 375 of 382
Published By: Red Hat     Published Date: Jan 21, 2011
Chuck Dubuque - Product Marketing Manager, Red Hat
Server and desktop virtualization technologies, which bring cost and agility advantages to enterprise IT, have become mainstream in most datacenters. However, after nearly 10 years, virtualized servers still represent only about 30 percent of x86 production workloads, due to technology limitations and the expense of proprietary virtualization platforms. In this session, Chuck Dubuque will cover:
* The benefits of datacenter virtualization
* Technologies in the open source community that will enable pervasive virtualization
* The move to the cloud
* The particular advantages of Red Hat Enterprise Virtualization and the Red Hat subscription model
Tags : 
red hat virtual tradeshow, live event, business agility, enterprise virtualization, product marketing, server, desktop virtualization, datacenter, virtualized server, data center design and management
    
Red Hat
Published By: Akamai     Published Date: Jun 04, 2010
Predictive analytics has been used by different industries for years to solve difficult problems that range from detecting credit card fraud to determining patient risk levels for medical conditions. It combines data mining and machine-learning technologies to create statistical models based on historical data, then uses those models to predict future events (a minimal sketch of this train-then-predict pattern follows this entry). Extracting that power from the data requires the sophisticated algorithms behind predictive analytics.
Tags : 
akamai, predictive, online advertising, tracking pixels, online shopping, in-market, site visitors, performance marketing
    
Akamai
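
The train-then-predict pattern described above can be sketched in a few lines. The example below is a minimal illustration, not Akamai's implementation: it assumes scikit-learn and NumPy, and the visitor features and outcome label are synthetic stand-ins for real history.

    # Minimal sketch of the pattern described above: fit a statistical model
    # on historical data, then use it to predict future events.
    # Assumes scikit-learn and NumPy; features and label are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.random((500, 3))                   # e.g. pages viewed, time on site, recency
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)  # stand-in outcome, e.g. "purchased"

    # Hold out part of the history to validate the model before trusting it.
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))

    # Scoring: the fitted model predicts the future event for new visitors.
    new_visitors = rng.random((5, 3))
    print("predicted purchase probability:", model.predict_proba(new_visitors)[:, 1])
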
Published By: TARGUSinfo.     Published Date: May 20, 2010
The arrival of third-party data is changing the landscape of online targeting, and there is an influx of companies offering these services. But what works with your business model? How do you evaluate the options? The questions you need to ask, and why:
• Is there high-fidelity data?
• Is the provided data stable?
• Is your partner built to last?
• Does your partner use a transparent methodology?
Download our guide to evaluating data providers and find out why each of these criteria matters.
Tags : 
targus info
    
TARGUSinfo.
Published By: Attivio     Published Date: Apr 07, 2010
In this paper, well-known data management authority Mark Albala proposes replacing the reliance on data models as the fixed navigation scheme for storing, reporting on and analyzing information with a new unified information access capability that enables more agile, insightful and proactive decisions.
Tags : 
attivio, post-discovery intelligent application, analytic information delivery, business intelligence, enterprise applications, executive information systems, eis
    
Attivio
Published By: Discovery Corps, Inc.     Published Date: Nov 20, 2009
Seeing into the hearts and minds of our customers is impossible, but this article describes how data mining techniques can be used to create strategies and tactics that increase customer retention and value (a brief segmentation sketch follows this entry).
Tags : 
data mining, business intelligence, predictive modeling, customer retention, high-value customers, decision support, clustering, discovery corps
    
Discovery Corps, Inc.
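
One concrete example of the techniques the article covers is segmenting customers by behavior so retention effort can be focused on high-value groups. The sketch below is illustrative only and is not Discovery Corps' method; it assumes scikit-learn and NumPy, with synthetic customer records.

    # Illustrative segmentation sketch: cluster customers by behavior, then
    # profile the segments to target retention effort.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    # Columns: tenure in months, monthly spend, support calls (synthetic).
    customers = np.column_stack([
        rng.integers(1, 120, 300),
        rng.gamma(2.0, 30.0, 300),
        rng.poisson(1.5, 300),
    ]).astype(float)

    # Scale features so no single one dominates the distance metric.
    scaled = StandardScaler().fit_transform(customers)
    segments = KMeans(n_clusters=4, n_init=10, random_state=1).fit_predict(scaled)

    # Profile each segment; average spend and tenure suggest where
    # retention spending pays off most.
    for s in range(4):
        members = customers[segments == s]
        print(f"segment {s}: n={len(members)}, avg spend={members[:, 1].mean():.0f}, "
              f"avg tenure={members[:, 0].mean():.0f} months")
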
Published By: Vertica     Published Date: Oct 30, 2009
Independent research firm Knowledge Integrity Inc. examines two high-performance computing technologies that are transitioning into the mainstream: massively parallel analytical database management systems (ADBMS) and distributed parallel programming paradigms such as MapReduce and its ecosystem (Hadoop, Pig, HDFS, etc.). By providing an overview of both concepts and looking at how the two approaches can be used together, it concludes that combining a high-performance batch programming and execution model with a high-performance analytical database delivers significant business benefits for a number of different types of applications (a toy sketch of this division of labor follows this entry).
Tags : 
vertica, analytical computing, adbms, mapreduce, application management, data management, data mining, grid computing
    
Vertica
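
The division of labor the paper argues for can be made concrete with a toy example: a MapReduce-style batch pass condenses raw records into aggregates, which would then be bulk-loaded into an ADBMS for fast SQL analysis. The pure-Python sketch below only mimics the map/shuffle/reduce phases; a real pipeline would run on Hadoop and load results through the database's bulk loader.

    # Toy illustration: MapReduce-style batch aggregation whose output an
    # analytical database would then serve to SQL queries.
    from collections import defaultdict

    raw_log = [
        "2009-10-01 page_view /home",
        "2009-10-01 page_view /pricing",
        "2009-10-02 page_view /home",
    ]

    # Map: emit a (key, 1) pair per record.
    pairs = []
    for line in raw_log:
        day, _event, url = line.split()
        pairs.append(((day, url), 1))

    # Shuffle: group values by key (Hadoop performs this between phases).
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)

    # Reduce: sum each group, yielding rows ready for bulk load.
    for (day, url), values in sorted(groups.items()):
        print(day, url, sum(values))   # e.g. 2009-10-01 /home 1
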
Published By: ITI     Published Date: Aug 21, 2009
Today, 3D CAD models drive the world's product development processes. Finite element analysis, rapid prototyping, NC programming, data exchange, and other downstream applications rely to a growing extent on the direct use of CAD models to streamline processes, saving time and money.
Tags : 
supply chain, scm, supply chain management, computer aided design, computer-aided design, cad, cad model, cad models, product development, product testing, cae, computer aided engineering, computer-aided engineering, product development lifecycle, iti
    
ITI
Published By: Objectivity     Published Date: Aug 21, 2009
As business users expand to newer types of applications and grow older ones, their attempts to use RDBMSs encounter the "Relational Wall," the point at which RDBMS technology no longer provides the performance and functionality needed. This paper measures the wall, explains which modeling and architectural differences cause it, and shows how to foresee and avoid it.
Tags : 
database, rdbmss, object, object-oriented, object oriented, database performance, database development, database architecture, database model, dba, database administration, database administrator, objectivity, software development
    
Objectivity
Published By: Safe Software     Published Date: Aug 21, 2009
One of the most powerful aspects of GML is the freedom it gives users to define their own custom application schemas. While this capability provides extraordinary flexibility to data modellers, it also creates significant challenges, particularly when the data must later be interpreted (a short sketch of that interpretation problem follows this entry).
Tags : 
spatial, geographic information system, geographic information systems, geological, geography, information management, data processing, database, data transfer, safe, safe software
    
Safe Software
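
The interpretation challenge can be seen in miniature below: a consumer of a custom GML application schema must already know the producer's namespace and element names before the data means anything. The sketch uses Python's standard-library ElementTree, and the "roads" schema and namespace are invented for illustration.

    # Why custom application schemas complicate interpretation: the consumer
    # must already know the producer's namespace and element names.
    # The "roads" schema below is invented.
    import xml.etree.ElementTree as ET

    gml_doc = """
    <roads:Road xmlns:roads="http://example.com/roads"
                xmlns:gml="http://www.opengis.net/gml">
      <roads:name>Main Street</roads:name>
      <roads:centerline>
        <gml:LineString>
          <gml:coordinates>0,0 100,50</gml:coordinates>
        </gml:LineString>
      </roads:centerline>
    </roads:Road>
    """

    root = ET.fromstring(gml_doc)
    ns = {"roads": "http://example.com/roads",
          "gml": "http://www.opengis.net/gml"}

    # Neither this path nor the meaning of <roads:centerline> could be
    # guessed without prior knowledge of the application schema.
    print(root.findtext("roads:name", namespaces=ns),
          "->", root.findtext(".//gml:coordinates", namespaces=ns))
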
Published By: SPSS     Published Date: Jun 30, 2009
The intrepid data miner runs many risks, including being buried under mountains of data or disappearing along with the "mysterious disappearing terabyte."  This article outlines some risks, debunks some myths, and attempts to provide some protective "hard hats" for data miners in the technology sector.
Tags : 
spss, data mining, data miner, data management, business process, best practices, business intelligence, statistics, statistical modeling, algorithms, predictive analysis, data analysis, scalability, optimization, business knowledge, decision-making
    
SPSS
Published By: Stratus Technologies     Published Date: Jun 04, 2009
Cloud computing allows companies to use IT resources as a service, taking advantage of shared applications, processing and storage managed within the cloud - either inside a private cloud at an internal data center, or in an external cloud at a service provider. Greater robustness of the virtualized servers supporting this new IT model will ensure a resilient cloud infrastructure capable of delivering "on-demand" access to users.
Tags : 
demand, stratus, cloud computing, virtualized, on demand, storage management, shared applications, cloud, x86, virtualization, server, disaster recovery, business agility, business continuity, it administration, failure, scalability, smp, symmetric multiprocessing, data integrity
    
Stratus Technologies
Published By: SRC,LLC     Published Date: Jun 01, 2009
We work in an ever-changing landscape of competitors, market forces, customer relationships, partner alliances, political and regulatory pressures, and global climate effects. The common mechanism for coping effectively with change is to refer to canned reports and dashboards, dump reporting data into a spreadsheet and look for answers to today's questions. All too often, we have to rely on estimates rather than models, and gut feel rather than data, because we can't get answers in a timely fashion. Yet we know that the answers lie hidden in the morass of corporate data.
Tags : 
business intelligence, src, capabilities, unified enterprise, spatial intelligence, alteryx, oracle, netezza, teradata, gui, ford, experian, sql, enterprise applications
    
SRC,LLC
Published By: VeriSign Incorp.     Published Date: May 08, 2009
Web application threats are evolving. Are your security efforts keeping pace? Today, Web application security threats are not only more abundant than ever, but also more difficult to detect and more complex to solve. Many organizations are responding to these unique vulnerabilities with traditional network security approaches. However, sophisticated Web application threats require a more sophisticated security strategy. What's worked in the past won't necessarily work today; what's more, Web application security requires a comprehensive solution, not simply a series of a la carte provisions. For detailed steps toward improving your Web application security strategy, download the VeriSign® Enterprise Security Services white paper, Best Practices That Improve Web Application Security.
Tags : 
verisign, web application security, sensitive data, intellectual property, business processes, operational costs, verisign enterprise security services, point-of-sale, pos, application-layer vulnerabilities, web 2.0, virtual servers, service oriented architecture (soa), lightweight technologies, insider threat, holistic control model, software development lifecycle, sdlc, wafs
    
VeriSign Incorp.
Published By: Mentor Graphics     Published Date: Apr 03, 2009
A powerful signal integrity analysis tool must be flexible, easy to use, and integrated into an existing EDA framework and design flow. It must also be sufficiently accurate. This report reviews a validation study of the Mentor Graphics HyperLynx 8.0 PI tool, undertaken to establish confidence in using it for power integrity analysis.
Tags : 
mentor graphics, pdn simulation, eda framework, mentor hyperlynx 8.0 pi, integrity analysis, virtual prototypes, esr, capacitor, power distribution network, vrm, voltage regulator module, signal, smas, analog models, backward crosstalk, capacitive crosstalk, controlling crosstalk, correct emc problems, correct emi problems, cross talk
    
Mentor Graphics
Published By: Mentor Graphics     Published Date: Apr 03, 2009
For advanced signaling over high-loss channels, designs today use equalization, and several new measurement methods have emerged to evaluate the performance of the link. Both simulation and measurement tools support equalization and the new measurement methods, but correlation of results throughout the design flow is unclear. In this paper, a high-performance equalized serial data link is measured and its performance compared to that predicted by simulation. The differences between simulation and measurement are then discussed, along with methods to correlate the two.
Tags : 
mentor graphics, equalized serial data links, design flow, high loss channels, tektronix, pcb, bit error rate, ber, ieee, serdes, simulation, system configuration, mentor graphics hyperlynx, simplified symmetric trapezoidal input, duty cycle distortion, ber contours, electronics, analog models, backward crosstalk, capacitive crosstalk
    
Mentor Graphics
Published By: SPSS, Inc.     Published Date: Mar 31, 2009
The intrepid data miner runs many risks, including being buried under mountains of data or disappearing along with the "mysterious disappearing terabyte."  This article outlines some risks, debunks some myths, and attempts to provide some protective "hard hats" for data miners in the marketing sector.
Tags : 
spss, data mining, data miner, data management, business process, best practices, business intelligence, statistics, statistical modeling, algorithms, predictive analysis, data analysis, scalability, optimization, business knowledge, decision-making
    
SPSS, Inc.
Published By: DataDirect Technologies     Published Date: Mar 17, 2009
Connecting to a database requires a number of independent layers. The application needs to incorporate software that establishes the connection and makes calls to the database. A database connectivity layer needs to be in place to help manage security, communications, and data flow with the database. The database has a set of interfaces that help translate client requests into actions within the database engine. And with the advent of .NET, the costs of managed versus unmanaged code must also be considered (a language-neutral sketch of these layers follows this entry).
Tags : 
.net, data direct, microsoft .net architecture, scaling, effective architectural design, virtual machine, intermediate language, component object model, com, windows operating system, com-based code, cpu utilization, optimize database connectivity, software development
    
DataDirect Technologies
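
Although the paper's context is .NET, the layering it describes is language-neutral. The sketch below uses Python's DB-API with sqlite3 (chosen only so it runs without a database server) to show the same separation: application code, a connectivity layer (driver and cursor), and the engine's own interfaces.

    # Language-neutral sketch of the layers described above, using Python's
    # DB-API rather than ADO.NET.
    import sqlite3

    # The application asks the connectivity layer to establish a connection.
    conn = sqlite3.connect(":memory:")

    # The connectivity layer manages communication and data flow; the cursor
    # carries requests to the engine and results back out.
    cur = conn.cursor()
    cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
    cur.execute("INSERT INTO orders (total) VALUES (?)", (99.50,))

    # The engine's interfaces translate the request into actions and return rows.
    cur.execute("SELECT id, total FROM orders")
    print(cur.fetchall())   # [(1, 99.5)]

    conn.close()
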
Published By: HP Data Center     Published Date: Feb 18, 2009
Today's data centers are embarking on a path in which "old world" business, technology, and facility metrics are being pushed aside in order to provide unparalleled service delivery capabilities, processes, and methodologies. Expectations arising from today's high-density technology deployments are driving service delivery models to extremes, with very high service delivery capabilities adopted as baseline requirements within today's stringent business models. Part of the "revolution" driving today's data center modeling to unprecedented performance and efficiency levels is the fact that advances in computer processing, toward both higher performance and smaller footprints, have run counter to each other.
Tags : 
hp data center, data center enfironment, high density computing, rack-mount servers, mep mechanical, electrical, and plumbing, virtualization, consolidation, it deployments, server consolidation, networking, data center design and management
    
HP Data Center
Published By: IBM     Published Date: Feb 02, 2009
A comprehensive solution for leveraging data in today's retail environment. From customer data to product placement statistics, retail organizations are constantly juggling information. As the sheer amount of data continues to grow, it becomes increasingly difficult to manage. Not only does data come in many different forms—such as reports, memos and e-mails—but often it’s scattered across multiple repositories.
Tags : 
ibm, ibm balanced warehouses, ibm master data management server, ibm omnifind, ibm industry data models, dynamic warehousing, retail buyer’s guide, leveraging data, customer data, product placement statistics, data management, scalable system, customer relationships, cross-sell, up-sell opportunities, db2 warehouse, oltp-based transactional data server, information management software, enterprise applications
    
IBM
Published By: IBM     Published Date: Feb 02, 2009
A comprehensive solution for leveraging data in today's financial industry. Most organizations realize that the key to success lies in how well they manage data—and the banking industry is no exception. From customer statistics to strategic plans to employee communications, financial institutions are constantly juggling endless types of information.
Tags : 
ibm, information management software, leveraging data, dynamic warehousing, data management, improve customer service, real-time risk analysis, analytics capabilities, information on demand framework, ibm db2 warehouse, ibm master data management server, ibm omnifind, ibm industry data models, ibm balanced warehouses, oltp-based transactional data, enterprise applications
    
IBM
Published By: IBM     Published Date: Dec 30, 2008
Most long-standing data warehouses are designed to support a relatively small number of users who access information to support strategic decisions, financial planning and the production of standard reports that track performance. Today, many more users need to access information in context and on demand so that critical functions are optimized to run efficiently. Learn how to create a roadmap for a truly dynamic warehousing infrastructure, and move ahead of your competition with your business intelligence system.
Tags : 
warehousing infrastructure, ibm, business intelligence, data warehouse, dynamic warehousing, data warehouse model, master data
    
IBM
Published By: IBM     Published Date: Dec 09, 2008
Watch an overview of IBM Information Server FastTrack, a revolutionary new module of the IBM Information Server. IBM Information Server FastTrack was designed to help enterprises of all sizes maximize team collaboration, facilitate automation and help ensure projects are completed successfully and on time. By creating an integrated environment that includes business analysts, developers and data modelers, Information Server FastTrack accelerates collaborative development across user roles, products and geographies.
Tags : 
ibm, fasttrack, ibm information server, server fasttrack, data integration projects, project completion, ibm information server fasttrack, process automation
    
IBM
Published By: Tail-f Systems     Published Date: Sep 16, 2008
This whitepaper describes how NETCONF and YANG can help drastically simplify network configuration management. The IETF has recently standardized the NETCONF configuration management protocol and is currently standardizing a NETCONF-oriented data modeling language called YANG (a minimal client sketch follows this entry).
Tags : 
configuration management, netconf, yang, next-generation network management, next-generation networks, netconf protocol, tail-f, tailf, network management
    
Tail-f Systems
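
As an illustration of the protocol in action, the sketch below retrieves a device's running configuration over NETCONF using the open source ncclient library. The host, credentials, and the device's support for NETCONF over SSH on port 830 are assumptions; it cannot run without a real device.

    # Minimal NETCONF client sketch using the open source ncclient library.
    # Address and credentials are hypothetical.
    from ncclient import manager

    with manager.connect(
        host="192.0.2.1",         # hypothetical device address
        port=830,                 # standard NETCONF-over-SSH port
        username="admin",         # hypothetical credentials
        password="secret",
        hostkey_verify=False,     # skip host-key checking for the sketch
    ) as m:
        # <get-config> is a base NETCONF operation; the XML that comes back
        # is structured according to the device's data models (e.g. YANG).
        reply = m.get_config(source="running")
        print(reply.xml)
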
Published By: HP     Published Date: Jul 29, 2008
This white paper describes an energy audit tool that can lead to significant decreases in the cost of running a data center. Thermal Zone Mapping (TZM) is a visualization tool developed to present, in graphical format, the high-level thermal metrics developed by HP and targeted for use as part of the HP Data Center Thermal Assessment service. The tool takes data generated from computer models of the data center, generates metrics, and then post-processes and visualizes them in the three-dimensional data center space.
Tags : 
heat, data center, heat monitoring, hardware, datacenter, mission critical, monitoring, network management, servers, green computing, power and cooling
    
HP
Published By: Burton Group     Published Date: Jul 07, 2008
Data modeling has evolved from an arcane technique for database designers into an entire family of interrelated techniques that serves many constituencies, including techno-phobic business stakeholders and users. The new maturity of modeling tools and techniques arrives in the nick of time, because new technical and regulatory realities demand that enterprises maintain scrupulous awareness of their data and how it is used. Data modeling is no longer for databases only, no longer for technologists only, and no longer optional.
Tags : 
modeling, data modeling, data management, data mining, business intelligence, dba, database administration, burton group
    
Burton Group