

data streaming

Results 1 - 25 of 33
Published By: SAS     Published Date: Jan 17, 2018
The Industrial Internet of Things (IIoT) is flooding today’s industrial sector with data. Information is streaming in from many sources — equipment on production lines, sensors at customer facilities, sales data, and much more. Harvesting insights means filtering out the noise to arrive at actionable intelligence. This report shows how to craft a strategy to gain a competitive edge. It explains how to evaluate IIoT solutions, including what to look for in end-to-end analytics solutions. Finally, it shows how SAS has combined its analytics expertise with Intel’s leadership in IIoT information architecture to create solutions that turn raw data into valuable insights.
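A minimal sketch of the "filtering out the noise" idea, under stated assumptions: the readings, the exponential-moving-average smoothing, and the 80.0 alert threshold are hypothetical illustrations, not taken from the SAS/Intel report.

    # Hypothetical illustration: smooth a noisy IIoT sensor stream with an
    # exponential moving average (EMA), then alert only on sustained shifts.
    def ema_filter(readings, alpha=0.2):
        """Yield smoothed values from a noisy stream of sensor readings."""
        smoothed = None
        for value in readings:
            smoothed = value if smoothed is None else alpha * value + (1 - alpha) * smoothed
            yield smoothed

    raw = [71.2, 70.8, 95.1, 71.5, 72.0, 86.3, 88.9, 90.4]  # e.g. bearing temperature
    for reading in ema_filter(raw):
        if reading > 80.0:  # hypothetical alert threshold
            print(f"actionable alert: smoothed reading {reading:.1f} exceeds threshold")

The lone spike at 95.1 is smoothed away; only the sustained rise at the end of the stream trips the alert, which is the sense in which filtering turns raw data into actionable intelligence.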
Tags : 
    
SAS
Published By: StreamSets     Published Date: Sep 24, 2018
If you’ve ever built real-time data pipelines or streaming applications, you know how useful the Apache Kafka™ distributed streaming platform can be. Then again, you’ve probably also bumped up against the challenges of working with Kafka. Whether you’re new to Kafka or ready to simplify your implementation, we present common challenges you may be facing and five ways that StreamSets can make your efforts much more efficient and reliable.
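For readers new to Kafka, a minimal produce/consume round trip, as a sketch only: it assumes the kafka-python client (pip install kafka-python), a broker at localhost:9092, and a hypothetical "clickstream" topic; none of these details come from the StreamSets paper.

    # Sketch only: assumes kafka-python, a local broker, and a test topic.
    from kafka import KafkaProducer, KafkaConsumer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    producer.send("clickstream", key=b"user-42", value=b'{"page": "/pricing"}')
    producer.flush()  # block until the broker acknowledges the write

    consumer = KafkaConsumer(
        "clickstream",
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",  # read from the start of the topic
        consumer_timeout_ms=5000,      # stop iterating when no new messages arrive
    )
    for message in consumer:
        print(message.key, message.value)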
Tags : 
apache, kafka, streamsets, data
    
StreamSets
Published By: Attunity     Published Date: Feb 12, 2019
This technical whitepaper by Radiant Advisors covers key findings from their work with a network of Fortune 1000 companies and clients from various industries. It assesses major trends and offers tips for gaining access to, and optimizing, streaming data for more valuable insights. Read this report to learn from real-world successes in modern data integration and to better understand how to maximize the use of streaming data. You will also learn about the value of populating a cloud data lake with streaming operational data by leveraging database replication, automation, and other key modern data integration techniques. Download this whitepaper today to learn about the latest approaches to modern data integration and streaming data technologies.
Tags : 
streaming data, cloud data lakes, cloud data lake, data lake, cloud, data lakes, change data capture, cloud computing, modern data integration, data integration, data analytics, cloud-based data lake, enterprise data, self-service data
    
Attunity
Published By: IBM     Published Date: Jul 07, 2015
In this book you will also learn how cognitive computing systems, like IBM Watson, fit into the Big Data world. Learn about the concept of data-in-motion and InfoSphere Streams, the world’s fastest and most flexible platform for streaming data.
Tags : 
big data, mobility, compute-intensive apps, virtualization, cloud computing, scalable infrastructure, reliability, data center
    
IBM
Published By: IBM     Published Date: Aug 05, 2014
There is a lot of discussion in the press about Big Data. Big Data is traditionally defined in terms of the three V’s of Volume, Velocity, and Variety. In other words, Big Data is often characterized as high-volume, streaming, and including semi-structured and unstructured formats. Healthcare organizations have produced enormous volumes of unstructured data, such as the notes by physicians and nurses in electronic medical records (EMRs). In addition, healthcare organizations produce streaming data, such as from patient monitoring devices. Now, thanks to emerging technologies such as Hadoop and streams, healthcare organizations are in a position to harness this Big Data to reduce costs and improve patient outcomes. However, this Big Data has profound implications from an Information Governance perspective. In this white paper, we discuss Big Data Governance from the standpoint of three case studies.
Tags : 
ibm, data, big data, information, healthcare, governance, technology, it management
    
IBM
Published By: SAS     Published Date: Jun 06, 2018
Data integration (DI) may be an old technology, but it is far from extinct. Today, rather than being done on a batch basis with internal data, DI has evolved to a point where it needs to be implicit in everyday business operations. Big data – of many types, and from vast sources like the Internet of Things – joins with the rapid growth of emerging technologies to extend beyond the reach of traditional data management software. To stay relevant, data integration needs to work with both indigenous and exogenous sources while operating at different latencies, from real time to streaming. This paper examines how data integration has gotten to this point, how it’s continuing to evolve and how SAS can help organizations keep their approach to DI current.
Tags : 
    
SAS
Published By: SAS     Published Date: Jan 17, 2018
Executives, managers and information workers have all come to respect the role that data management plays in the success of their organizations. But organizations don’t always do a good job of communicating and encouraging better ways of managing information. In this e-book you will find easy-to-digest resources on the value and importance of data preparation, data governance, data integration, data quality, data federation, streaming data, and master data management.
Tags : 
    
SAS
Published By: SAS     Published Date: Sep 19, 2018
We are offering this second edition as a business-oriented working guide to core data management practices. In this e-book you will find easy-to-digest resources on the value and importance of data preparation, data governance, data integration, data quality, data federation, streaming data, and master data management.
Tags : 
    
SAS
Published By: SnapLogic     Published Date: Aug 17, 2015
This report summarizes the changes that are occurring in data integration, the new and emerging patterns of data integration, and the data integration technology you can buy today that lives up to these new expectations.
Tags : 
data integration, cloud computing, mass data storage, integration requirements, integration strategies, non-persisted data streaming, device native data, data encryption, data center design and management
    
SnapLogic
Published By: AWS     Published Date: Jun 20, 2018
Data and analytics have become an indispensable part of gaining and keeping a competitive edge. But many legacy data warehouses introduce a new challenge for organizations trying to manage large data sets: only a fraction of their data is ever made available for analysis. We call this the “dark data” problem: companies know there is value in the data they collected, but their existing data warehouse is too complex, too slow, and just too expensive to use. A modern data warehouse is designed to support rapid data growth and interactive analytics over a variety of relational, non-relational, and streaming data types leveraging a single, easy-to-use interface. It provides a common architectural platform for leveraging new big data technologies to existing data warehouse methods, thereby enabling organizations to derive deeper business insights. Key elements of a modern data warehouse: • Data ingestion: take advantage of relational, non-relational, and streaming data sources • Federated querying: ability to run a query across heterogeneous sources of data • Data consumption: support numerous types of analysis - ad-hoc exploration, predefined reporting/dashboards, predictive and advanced analytics
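A toy sketch of the single-interface idea: micro-batches of "streaming" events are ingested next to relational data and queried together. The standard library's sqlite3 module stands in for a real warehouse here; the tables, columns, and values are hypothetical.

    # sqlite3 as a stand-in for a warehouse; tables are hypothetical.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT)")
    db.execute("CREATE TABLE click_events (customer_id INTEGER, page TEXT)")
    db.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "EMEA"), (2, "APAC")])

    # Ingest a micro-batch of "streaming" click events next to the relational data.
    stream = [(1, "/pricing"), (2, "/docs"), (1, "/signup")]
    db.executemany("INSERT INTO click_events VALUES (?, ?)", stream)

    # One interface queries both together: clicks per region.
    query = """SELECT c.region, COUNT(*) FROM click_events e
               JOIN customers c ON c.id = e.customer_id GROUP BY c.region"""
    for region, clicks in db.execute(query):
        print(region, clicks)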
Tags : 
    
AWS
Published By: Attunity     Published Date: Nov 15, 2018
Change data capture (CDC) technology can modernize your data and analytics environment with scalable, efficient and real-time data replication that does not impact production systems. To realize these benefits, enterprises need to understand how this critical technology works, why it’s needed, and what their Fortune 500 peers have learned from their CDC implementations. This book serves as a practical guide for enterprise architects, data managers and CIOs as they enable modern data lake, streaming and cloud architectures with CDC. Read this book to understand: • The rise of data lake, streaming and cloud platforms • How CDC works and enables these architectures • Case studies of leading-edge enterprises • Planning and implementation approaches
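A minimal sketch of how CDC works on the consuming side: captured change events are replayed against a target without touching the source system. The event shape (op/key/row) is a hypothetical simplification, not Attunity's actual format.

    # Hypothetical change-event shape; the dict stands in for a target table.
    target = {}

    def apply_change(event):
        """Replicate one captured change to the target, not the source."""
        op, key = event["op"], event["key"]
        if op in ("insert", "update"):
            target[key] = event["row"]
        elif op == "delete":
            target.pop(key, None)

    changes = [
        {"op": "insert", "key": 7, "row": {"status": "new"}},
        {"op": "update", "key": 7, "row": {"status": "shipped"}},
        {"op": "delete", "key": 7, "row": None},
    ]
    for event in changes:
        apply_change(event)
    print(target)  # {} -- the delete was the last change replicated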
Tags : 
optimize customer service
    
Attunity
Published By: HP     Published Date: Jan 20, 2015
"Improving the operational aspects of a business can span the organizational chart, from line of business teams focused on the supply chain to IT teams reporting on communication networks and their switches. The goal is to capture the data streaming in from these various processes, and put Big Data techniques to work for you."
Tags : 
big data, hp haven, scalable, secure data platform, ecosystem, security
    
HP
Published By: SAS     Published Date: Apr 16, 2015
ITS technology is a general term. Two common and related forms of ITS communication technology using event stream processing are referred to as vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2X) in the US, and car-to-infrastructure (Car2X) in Europe. The two types of connected-car research and development programs often overlap and can be integrated. Car2X enables vehicle communication with the road transportation infrastructure and provides the ability to send or receive local information about traffic conditions, geo-markers (e.g. to identify pothole locations), road hazards, alerts, safety vehicles, etc. V2V focuses on connected-car technology and the anonymous communication of sensor data continuously transmitted to and from cars. Using event stream processing, this streaming data enables the real-time synthesis of information that promotes driver safety, reduces crashes, and improves vehicle transportation efficiency.
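A standard-library sketch of the event-stream-processing idea in miniature: a sliding window over per-vehicle speed readings flags a sudden slowdown ahead. The window size, speeds, and 30 km/h drop threshold are hypothetical values chosen for illustration.

    # Hypothetical values throughout; a real V2V system would be far richer.
    from collections import deque

    def detect_slowdown(speed_stream, window=5, drop_kmh=30.0):
        recent = deque(maxlen=window)
        for speed in speed_stream:
            recent.append(speed)
            if len(recent) == window and recent[0] - recent[-1] >= drop_kmh:
                yield f"hazard: speed fell {recent[0] - recent[-1]:.0f} km/h over {window} readings"

    speeds = [112, 110, 108, 74, 41, 38]  # e.g. vehicles braking hard ahead
    for alert in detect_slowdown(speeds):
        print(alert)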
Tags : 
    
SAS
Published By: Attunity     Published Date: Jan 14, 2019
This whitepaper explores how to automate your data lake pipeline to address common challenges, including how to prevent data lakes from devolving into useless data swamps and how to deliver analytics-ready data via automation. Read Increase Data Lake ROI with Streaming Data Pipelines to learn about: • Common data lake origins and challenges, including integrating diverse data from multiple source platforms, with lakes on premises and in the cloud. • Delivering real-time integration, with change data capture (CDC) technology that integrates live transactions with the data lake. • Rethinking the data lake with a multi-stage methodology, continuous data ingestion and merging processes that assemble a historical data store. • Leveraging a scalable and autonomous streaming data pipeline to deliver analytics-ready data sets for better business insights. Read this Attunity whitepaper now to get ahead on your data lake strategy in 2019.
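A toy sketch of the multi-stage methodology, under stated assumptions: raw change records land in one stage, then a merge step assembles a keyed historical store. The record shapes are hypothetical, not Attunity's pipeline format.

    # Hypothetical record shapes: raw records land first, then merge into a
    # keyed store that keeps every version for history.
    landing_zone = [                      # stage 1: records as ingested
        {"id": 1, "status": "open"},
        {"id": 2, "status": "open"},
        {"id": 1, "status": "closed"},
    ]

    historical_store = {}                 # stage 2: analytics-ready history per key

    def merge(records, store):
        for rec in records:
            store.setdefault(rec["id"], []).append(rec)

    merge(landing_zone, historical_store)
    print(historical_store[1][-1]["status"])  # latest state: "closed"
    print(len(historical_store[1]))           # 2 versions retained for history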
Tags : 
data lake, data pipeline, change data capture, data swamp, hybrid data integration, data ingestion, streaming data, real-time data, big data, hadoop, agile analytics, cloud data lake, cloud data warehouse, data lake ingestion
    
Attunity
Published By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms provide businesses a method for extracting strategic value from data-in-motion in a manner similar to how traditional analytics tools operate on data-at-rest.
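One way to picture data-in-motion versus data-at-rest, as a sketch only: Welford's online algorithm maintains running statistics over a stream without ever storing it, whereas a data-at-rest tool would first load the full set. The sample values are hypothetical.

    # Welford's online algorithm: running mean/variance over data-in-motion.
    def online_stats(stream):
        n, mean, m2 = 0, 0.0, 0.0
        for x in stream:
            n += 1
            delta = x - mean
            mean += delta / n
            m2 += delta * (x - mean)
            yield mean, m2 / n  # population variance so far

    for mean, variance in online_stats([4.0, 7.0, 13.0, 16.0]):
        pass  # each event updates the stats as it arrives
    print(f"mean={mean}, variance={variance}")  # mean=10.0, variance=22.5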
Tags : 
impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics, monitoring, network architecture
    
Impetus
Published By: Intel     Published Date: Jul 27, 2017
To stay ahead of the competition in a fast-paced, cost-driven cloud services marketplace, LeCloud must innovate new services and revenue streams to retain customers and drive profit. By using future-forward data center solutions from Intel, LeCloud is able to reduce latency in its video transcoding and improve the user experience when streaming new 4K and H.265 real-time video services to millions of customers concurrently.
Tags : 
    
Intel
Published By: AWS     Published Date: Apr 27, 2018
Until recently, businesses that were seeking information about their customers, products, or applications, in real time, were challenged to do so. Streaming data, such as website clickstreams, application logs, and IoT device telemetry, could be ingested but not analyzed in real time for any kind of immediate action. For years, analytics were understood to be a snapshot of the past, but never a window into the present. Reports could show us yesterday’s sales figures, but not what customers are buying right now. Then, along came the cloud. With the emergence of cloud computing, and new technologies leveraging its inherent scalability and agility, streaming data can now be processed in memory, and more significantly, analyzed as it arrives, in real time. Millions to hundreds of millions of events (such as video streams or application alerts) can be collected and analyzed per hour to deliver insights that can be acted upon in an instant. From financial services to manufacturing, this rev
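A sketch of analyzing events "as they arrive", assuming a one-minute tumbling window over a hypothetical clickstream; the event shape and timestamps are illustrative, not from the AWS paper.

    # Hypothetical clickstream: (timestamp_seconds, page), ordered by time.
    from collections import Counter

    def tumbling_counts(events, window_s=60):
        counts, window_start = Counter(), None
        for ts, page in events:
            if window_start is None:
                window_start = ts
            if ts - window_start >= window_s:
                yield window_start, counts  # emit the closed window immediately
                counts, window_start = Counter(), ts
            counts[page] += 1
        if counts:
            yield window_start, counts

    clicks = [(0, "/home"), (12, "/pricing"), (45, "/home"), (61, "/buy")]
    for start, counts in tumbling_counts(clicks):
        print(f"window starting t={start}s: {dict(counts)}")

Because each window is emitted the moment it closes, the counts describe what customers are doing right now rather than a snapshot of the past.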
Tags : 
    
AWS
Published By: AWS     Published Date: May 18, 2018
We’ve become a world of instant information. We carry mobile devices that answer questions in seconds and we track our morning runs from screens on our wrists. News spreads immediately across our social feeds, and traffic alerts direct us away from road closures. As consumers, we have come to expect answers now, in real time. Until recently, businesses that were seeking information about their customers, products, or applications, in real time, were challenged to do so. Streaming data, such as website clickstreams, application logs, and IoT device telemetry, could be ingested but not analyzed in real time for any kind of immediate action. For years, analytics were understood to be a snapshot of the past, but never a window into the present. Reports could show us yesterday’s sales figures, but not what customers are buying right now. Then, along came the cloud. With the emergence of cloud computing, and new technologies leveraging its inherent scalability and agility, streaming data can now be processed in memory, and more significantly, analyzed as it arrives, in real time. Millions to hundreds of millions of events (such as video streams or application alerts) can be collected and analyzed per hour to deliver insights that can be acted upon in an instant.
Tags : 
    
AWS
Published By: TIBCO Software     Published Date: Feb 01, 2019
The current trend in manufacturing is towards tailor-made products in smaller lots with shorter delivery times. This change may lead to frequent production modifications resulting in increased machine downtime, higher production cost, product waste, and the need to rework faulty products. To satisfy the customer demand behind this trend, manufacturers must move quickly to new production models. Quality assurance is the key area that IT must support. At the same time, the traceability of products becomes central to compliance as well as quality. Traceability can be achieved by interconnecting data sources across the factory, analyzing historical and streaming data for insights, and taking immediate action to control the entire end-to-end process. Doing so can lead to noticeable cost reductions and gains in efficiency, process reliability, and speed of new product delivery. Additionally, analytics helps manufacturers find the best setups for machinery.
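A sketch of the kind of quality-assurance check such an analytics layer might run, with hypothetical limits and sample values: a 3-sigma control-limit test applied to streaming measurements from a machine.

    # Hypothetical baseline run and live measurements; not from the TIBCO paper.
    from statistics import mean, stdev

    baseline = [10.02, 9.98, 10.01, 9.97, 10.03, 10.00]  # in-control reference run
    center, sigma = mean(baseline), stdev(baseline)
    ucl, lcl = center + 3 * sigma, center - 3 * sigma     # 3-sigma control limits

    for measurement in [10.01, 9.99, 10.19]:              # live stream from a machine
        if not lcl <= measurement <= ucl:
            print(f"out of control: {measurement} outside [{lcl:.3f}, {ucl:.3f}]")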
Tags : 
data, product, manufacturers, manufacturing, processes, technologies, analysis, quality, machine, analytics
    
TIBCO Software
Published By: AWS     Published Date: Aug 20, 2018
A modern data warehouse is designed to support rapid data growth and interactive analytics over a variety of relational, non-relational, and streaming data types leveraging a single, easy-to-use interface. It provides a common architectural platform for leveraging new big data technologies to existing data warehouse methods, thereby enabling organizations to derive deeper business insights. Key elements of a modern data warehouse: • Data ingestion: take advantage of relational, non-relational, and streaming data sources • Federated querying: ability to run a query across heterogeneous sources of data • Data consumption: support numerous types of analysis - ad-hoc exploration, predefined reporting/dashboards, predictive and advanced analytics
Tags : 
    
AWS