
data streaming

Results 1 - 25 of 33. Sort results by: Published Date | Title | Company Name
Published By: Attunity     Published Date: Nov 15, 2018
Change data capture (CDC) technology can modernize your data and analytics environment with scalable, efficient and real-time data replication that does not impact production systems. To realize these benefits, enterprises need to understand how this critical technology works, why it’s needed, and what their Fortune 500 peers have learned from their CDC implementations. This book serves as a practical guide for enterprise architects, data managers and CIOs as they enable modern data lake, streaming and cloud architectures with CDC. Read this book to understand:
• The rise of data lake, streaming and cloud platforms
• How CDC works and enables these architectures
• Case studies of leading-edge enterprises
• Planning and implementation approaches
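The core idea behind CDC can be sketched in a few lines: rather than repeatedly scanning whole source tables, a CDC process reads the database's change log and applies only the inserts, updates and deletes to a replica. The log-entry format and the dict-based target below are illustrative assumptions, not any vendor's actual implementation.

```python
# Minimal CDC sketch: tail a (simulated) transaction log and replicate
# only the changes into a target store, never touching the source tables.
# The (op, table, row) tuple format is a hypothetical log representation.

def read_change_log(log_entries):
    """Yield structured change events from a simulated transaction log."""
    for op, table, row in log_entries:
        yield {"op": op, "table": table, "row": row}

def apply_changes(target, log_entries):
    """Replicate changes into a target dict keyed by primary key."""
    for event in read_change_log(log_entries):
        key = event["row"]["id"]
        if event["op"] == "delete":
            target.pop(key, None)
        else:  # insert or update both upsert the latest row image
            target[key] = event["row"]
    return target

replica = apply_changes({}, [
    ("insert", "orders", {"id": 1, "total": 40}),
    ("update", "orders", {"id": 1, "total": 55}),
    ("insert", "orders", {"id": 2, "total": 10}),
    ("delete", "orders", {"id": 2}),
])
```

Because only deltas move, the production system is read once per change rather than once per refresh, which is where the "no impact on production" claim comes from.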
Tags : 
optimize customer service
    
Attunity
Published By: Attunity     Published Date: Jan 14, 2019
This whitepaper explores how to automate your data lake pipeline to address common challenges including how to prevent data lakes from devolving into useless data swamps and how to deliver analytics-ready data via automation. Read Increase Data Lake ROI with Streaming Data Pipelines to learn about:
• Common data lake origins and challenges including integrating diverse data from multiple data source platforms, including lakes on premises and in the cloud.
• Delivering real-time integration, with change data capture (CDC) technology that integrates live transactions with the data lake.
• Rethinking the data lake with multi-stage methodology, continuous data ingestion and merging processes that assemble a historical data store.
• Leveraging a scalable and autonomous streaming data pipeline to deliver analytics-ready data sets for better business insights.
Read this Attunity whitepaper now to get ahead on your data lake strategy in 2019.
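The "merging processes that assemble a historical data store" step can be illustrated with a small sketch: incoming change events are appended as timestamped versions of each record, so the lake keeps full history instead of only the latest state. The field names here are assumptions for illustration.

```python
# Sketch of the multi-stage merge: continuously ingested change events are
# folded into a historical store that retains every version of each record.

def merge_into_history(history, events):
    """Append each incoming change as a new timestamped version."""
    for ev in events:
        history.setdefault(ev["id"], []).append(
            {"value": ev["value"], "ts": ev["ts"]}
        )
    return history

history = merge_into_history({}, [
    {"id": "sensor-1", "value": 20.5, "ts": "2019-01-01T00:00:00"},
    {"id": "sensor-1", "value": 21.0, "ts": "2019-01-01T00:05:00"},
])
latest = history["sensor-1"][-1]["value"]  # most recent version
```

Analytics jobs can then read either the latest version (current state) or the full version list (history) from the same store.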
Tags : 
data lake, data pipeline, change data capture, data swamp, hybrid data integration, data ingestion, streaming data, real-time data, big data, hadoop, agile analytics, cloud data lake, cloud data warehouse, data lake ingestion
    
Attunity
Published By: Attunity     Published Date: Feb 12, 2019
This technical whitepaper by Radiant Advisors covers key findings from their work with a network of Fortune 1000 companies and clients from various industries. It assesses the major trends and tips to gain access to and optimize data streaming for more valuable insights. Read this report to learn from real-world successes in modern data integration, and better understand how to maximize the use of streaming data. You will also learn about the value of populating a cloud data lake with streaming operational data, leveraging database replication, automation and other key modern data integration techniques. Download this whitepaper today to learn about the latest approaches to modern data integration and streaming data technologies.
Tags : 
streaming data, cloud data lakes, cloud data lake, data lake, cloud, data lakes, change data capture, cloud computing, modern data integration, data integration, data analytics, cloud-based data lake, enterprise data, self-service data
    
Attunity
Published By: Attunity     Published Date: Feb 12, 2019
Read this technical whitepaper to learn how data architects and DBAs can avoid the struggle of complex scripting for Kafka in modern data environments. You’ll also gain tips on how to avoid the time-consuming hassle of manually configuring data producers and data type conversions. Specifically, this paper will guide you on how to overcome these challenges by leveraging innovative technology such as Attunity Replicate. The solution can easily integrate source metadata and schema changes for automated configuration of real-time data feeds, along with best practices.
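The manual work the paper describes, hand-configuring producers and data type conversions per source table, can be pictured as a metadata-driven mapping. The type names and topic naming convention below are assumptions for illustration, not Attunity Replicate's actual implementation.

```python
# Deriving per-table streaming configuration from source metadata instead
# of hand-writing it: each source column type maps to a target stream type,
# and the topic name follows a simple convention. All names are assumptions.

SOURCE_TO_TARGET_TYPES = {
    "NUMBER": "double",
    "VARCHAR2": "string",
    "DATE": "timestamp-millis",
}

def build_producer_config(table_name, columns):
    """Build a per-table streaming config from source table metadata."""
    return {
        "topic": f"cdc.{table_name.lower()}",
        "schema": {
            col: SOURCE_TO_TARGET_TYPES.get(src_type, "string")
            for col, src_type in columns.items()
        },
    }

config = build_producer_config("ORDERS", {"ID": "NUMBER", "PLACED_AT": "DATE"})
```

When a schema change arrives in the source metadata, rerunning the same function regenerates the feed configuration, which is the automation the scripting approach lacks.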
Tags : 
data streaming, kafka, metadata integration, metadata, apache kafka, data integration, data analytics, database transactions, streaming environments, real-time data replication, data configuration
    
Attunity
Published By: AWS     Published Date: Apr 27, 2018
Until recently, businesses that were seeking information about their customers, products, or applications, in real time, were challenged to do so. Streaming data, such as website clickstreams, application logs, and IoT device telemetry, could be ingested but not analyzed in real time for any kind of immediate action. For years, analytics were understood to be a snapshot of the past, but never a window into the present. Reports could show us yesterday’s sales figures, but not what customers are buying right now. Then, along came the cloud. With the emergence of cloud computing, and new technologies leveraging its inherent scalability and agility, streaming data can now be processed in memory, and more significantly, analyzed as it arrives, in real time. Millions to hundreds of millions of events (such as video streams or application alerts) can be collected and analyzed per hour to deliver insights that can be acted upon in an instant. From financial services to manufacturing, this rev
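In-memory stream processing of the kind described above, analyzing events as they arrive rather than reporting on yesterday's snapshot, can be sketched with a sliding-window counter. The window length and event fields are illustrative assumptions.

```python
# Events are counted per key over a sliding time window, so the answer to
# "what are customers buying right now?" is always current, not historical.

from collections import deque

class SlidingWindowCounter:
    """Count events per key within the last `window_seconds`."""

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()  # (timestamp, key), oldest first

    def add(self, timestamp, key):
        self.events.append((timestamp, key))
        # Evict events that have fallen out of the window [t - window, t].
        while self.events and self.events[0][0] < timestamp - self.window:
            self.events.popleft()

    def counts(self):
        tally = {}
        for _, key in self.events:
            tally[key] = tally.get(key, 0) + 1
        return tally

counter = SlidingWindowCounter(window_seconds=60)
counter.add(0, "sku-1")
counter.add(30, "sku-2")
counter.add(90, "sku-1")  # the event at t=0 has fallen out of the window
```

Production systems (e.g. Amazon Kinesis Data Analytics) apply the same windowing idea at millions of events per hour; the data structure above is just the concept in miniature.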
Tags : 
    
AWS
Published By: AWS     Published Date: May 18, 2018
We’ve become a world of instant information. We carry mobile devices that answer questions in seconds and we track our morning runs from screens on our wrists. News spreads immediately across our social feeds, and traffic alerts direct us away from road closures. As consumers, we have come to expect answers now, in real time. Until recently, businesses that were seeking information about their customers, products, or applications, in real time, were challenged to do so. Streaming data, such as website clickstreams, application logs, and IoT device telemetry, could be ingested but not analyzed in real time for any kind of immediate action. For years, analytics were understood to be a snapshot of the past, but never a window into the present. Reports could show us yesterday’s sales figures, but not what customers are buying right now. Then, along came the cloud. With the emergence of cloud computing, and new technologies leveraging its inherent scalability and agility, streaming data
Tags : 
    
AWS
Published By: AWS     Published Date: Jun 20, 2018
Data and analytics have become an indispensable part of gaining and keeping a competitive edge. But many legacy data warehouses introduce a new challenge for organizations trying to manage large data sets: only a fraction of their data is ever made available for analysis. We call this the “dark data” problem: companies know there is value in the data they collected, but their existing data warehouse is too complex, too slow, and just too expensive to use. A modern data warehouse is designed to support rapid data growth and interactive analytics over a variety of relational, non-relational, and streaming data types leveraging a single, easy-to-use interface. It provides a common architectural platform for leveraging new big data technologies to existing data warehouse methods, thereby enabling organizations to derive deeper business insights. Key elements of a modern data warehouse:
• Data ingestion: take advantage of relational, non-relational, and streaming data sources
• Federated querying: ability to run a query across heterogeneous sources of data
Tags : 
    
AWS
Published By: AWS     Published Date: Aug 20, 2018
A modern data warehouse is designed to support rapid data growth and interactive analytics over a variety of relational, non-relational, and streaming data types leveraging a single, easy-to-use interface. It provides a common architectural platform for leveraging new big data technologies to existing data warehouse methods, thereby enabling organizations to derive deeper business insights. Key elements of a modern data warehouse:
• Data ingestion: take advantage of relational, non-relational, and streaming data sources
• Federated querying: ability to run a query across heterogeneous sources of data
• Data consumption: support numerous types of analysis - ad-hoc exploration, predefined reporting/dashboards, predictive and advanced analytics
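Federated querying, one logical query spanning heterogeneous sources, can be illustrated with a toy example: a relational-style customer table is joined against a buffer of recent streaming events without first copying either into the other. The source names and fields are invented for the sketch.

```python
# One "query" spans two heterogeneous sources: a relational table of
# customers and a stream buffer of recent click events. A real federated
# engine would push this down as SQL; the join logic is the same idea.

customers = [  # relational source
    {"id": 1, "name": "Acme"},
    {"id": 2, "name": "Globex"},
]
click_stream = [  # streaming source (recent events)
    {"customer_id": 1, "page": "/pricing"},
    {"customer_id": 1, "page": "/docs"},
    {"customer_id": 2, "page": "/pricing"},
]

def federated_click_counts(customers, click_stream):
    """Join both sources and count clicks per customer name."""
    by_id = {c["id"]: c["name"] for c in customers}
    counts = {}
    for event in click_stream:
        name = by_id.get(event["customer_id"], "unknown")
        counts[name] = counts.get(name, 0) + 1
    return counts

result = federated_click_counts(customers, click_stream)
```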
Tags : 
    
AWS
Published By: AWS - ROI DNA     Published Date: Jun 12, 2018
Traditional data processing infrastructures, especially those that support applications, weren’t designed for our mobile, streaming, and online world. However, some organizations today are building real-time data pipelines and using machine learning to improve active operations. Learn how to make sense of every format of log data, from security to infrastructure and application monitoring, with IT operational analytics, enabling you to reduce operational risks and quickly adapt to changing business conditions.
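Making sense of "every format of log data" starts with parsing heterogeneous lines into structured records. A minimal sketch, assuming a simple `timestamp LEVEL message` log format, computes an error rate while tolerating unparseable lines, a basic property of any log-analytics pipeline.

```python
# Parse raw application log lines and compute an error rate. The log
# format (timestamp, level, message) is an assumption for illustration.

import re

LOG_PATTERN = re.compile(r"^(?P<ts>\S+) (?P<level>[A-Z]+) (?P<msg>.*)$")

def error_rate(lines):
    """Fraction of parseable log lines at ERROR level."""
    total = errors = 0
    for line in lines:
        match = LOG_PATTERN.match(line)
        if not match:
            continue  # skip unparseable lines instead of failing the pipeline
        total += 1
        if match.group("level") == "ERROR":
            errors += 1
    return errors / total if total else 0.0

rate = error_rate([
    "2018-06-12T10:00:00 INFO request served",
    "2018-06-12T10:00:01 ERROR upstream timeout",
    "garbage line",
    "2018-06-12T10:00:02 INFO request served",
])
```

Alerting on a rising error rate is then a threshold check on this metric computed over a moving window of the stream.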
Tags : 
    
AWS - ROI DNA
Published By: Cisco     Published Date: Nov 18, 2015
The Internet of Everything (IoE) is a continuous interaction among people, processes, data, and things. Sensors, networks, and smart devices are ubiquitous, providing a torrent of streaming data or big data. The Internet of Things (IoT), which is a network of physical objects accessed through the Internet that can sense and communicate, is a component of IoE. Cisco is helping customers and strategic partners leverage the full potential of IoE to achieve radical results across all sectors and industries. Indeed, IoE is capable of helping public safety and justice agencies increase cost efficiency, improve safety and security, provide better response times, and increase productivity.
Tags : 
ioe, public safety, justice, emergency response, networking, security, enterprise applications
    
Cisco
Published By: HP     Published Date: Jan 20, 2015
Improving the operational aspects of a business can span the organizational chart, from line of business teams focused on the supply chain to IT teams reporting on communication networks and their switches. The goal is to capture the data streaming in from these various processes, and put Big Data techniques to work for you.
Tags : 
big data, hp haven, scalable, secure data platform, ecosystem, security
    
HP
Published By: IBM     Published Date: Aug 05, 2014
There is a lot of discussion in the press about Big Data. Big Data is traditionally defined in terms of the three V’s of Volume, Velocity, and Variety. In other words, Big Data is often characterized as high-volume, streaming, and including semi-structured and unstructured formats. Healthcare organizations have produced enormous volumes of unstructured data, such as the notes by physicians and nurses in electronic medical records (EMRs). In addition, healthcare organizations produce streaming data, such as from patient monitoring devices. Now, thanks to emerging technologies such as Hadoop and streams, healthcare organizations are in a position to harness this Big Data to reduce costs and improve patient outcomes. However, this Big Data has profound implications from an Information Governance perspective. In this white paper, we discuss Big Data Governance from the standpoint of three case studies.
Tags : 
ibm, data, big data, information, healthcare, governance, technology, it management
    
IBM
Published By: IBM     Published Date: Jul 07, 2015
In this book you will also learn how cognitive computing systems, like IBM Watson, fit into the Big Data world. Learn about the concept of data-in-motion and InfoSphere Streams, the world’s fastest and most flexible platform for streaming data.
Tags : 
big data, mobility, compute-intensive apps, virtualization, cloud computing, scalable infrastructure, reliability, data center
    
IBM
Published By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms provide businesses a method for extracting strategic value from data-in-motion in a manner similar to how traditional analytics tools operate on data-at-rest.
Tags : 
impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics, monitoring, network architecture
    
Impetus
Published By: Intel     Published Date: Jul 27, 2017
To stay ahead of the competition in a fast-paced, cost-driven cloud services marketplace, LeCloud must innovate new services and revenue streams to retain customers and drive profit. By using future-forward data center solutions from Intel, LeCloud is able to reduce latency in its video transcoding and improve the user experience when streaming new 4K and H.265 real-time video services to millions of customers concurrently.
Tags : 
    
Intel
Published By: MemSQL     Published Date: Nov 15, 2017
THE LAMBDA ARCHITECTURE SIMPLIFIED Your Guide to Building a Scalable Data Architecture for Real-Time Workloads YOU'LL LEARN: - What defines the Lambda Architecture, broken down by each layer - How to simplify the Lambda Architecture by consolidating the speed layer and batch layer into one system - How to implement a scalable Lambda Architecture that accommodates streaming and immutable data - How companies like Comcast and Tapjoy use Lambda Architectures in production
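The layers named above can be sketched in miniature: a batch layer periodically recomputes views over all immutable history, a speed layer incrementally absorbs events the last batch run hasn't seen, and a serving layer merges both at query time. This is a generic sketch of the pattern, not MemSQL's consolidated single-system approach.

```python
# The three Lambda Architecture layers, each reduced to a few lines.

def batch_view(all_events):
    """Batch layer: full recomputation over the immutable event history."""
    totals = {}
    for key, amount in all_events:
        totals[key] = totals.get(key, 0) + amount
    return totals

class SpeedLayer:
    """Speed layer: incrementally absorbs events since the last batch run."""

    def __init__(self):
        self.deltas = {}

    def absorb(self, key, amount):
        self.deltas[key] = self.deltas.get(key, 0) + amount

def serve(batch, speed):
    """Serving layer: merge batch and real-time views at query time."""
    merged = dict(batch)
    for key, delta in speed.deltas.items():
        merged[key] = merged.get(key, 0) + delta
    return merged

batch = batch_view([("clicks", 100), ("buys", 5)])
speed = SpeedLayer()
speed.absorb("clicks", 3)   # arrived after the batch run
result = serve(batch, speed)
```

The consolidation argument the guide makes is that maintaining two parallel codebases (batch and speed) for the same computation is the pattern's main cost.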
Tags : 
data, scalable, architecture, production
    
MemSQL
Published By: SAS     Published Date: Apr 16, 2015
ITS technology is a general term. Two common and related forms of ITS communication technology using event stream processing are referred to as vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2X) in the US, and car-to-infrastructure (Car2X) in Europe. The two types of connected-car research and development programs often overlap and can be integrated. Car2X enables vehicle communication with the road transportation infrastructure and provides the ability to send or receive local information about traffic conditions, geo-markers (e.g. to identify pothole locations), road hazards, alerts, safety vehicles, etc. V2V focuses on connected-car technology and the anonymous communication of sensor data continuously transmitted to and from cars. Using event stream processing, this streaming data enables the real-time synthesis of information to communicate what will improve and promote driver safety, reduce crashes, and improve vehicle transportation efficiency.
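Event stream processing in this setting can be pictured as a rule applied to a continuous feed of vehicle sensor readings, for instance turning acceleration spikes into geo-marked pothole alerts. The reading fields and threshold below are illustrative assumptions, not part of any V2X standard.

```python
# Turn a stream of raw vehicle sensor readings into geo-marked hazard
# alerts. The threshold and field names are assumptions for illustration.

POTHOLE_SHOCK_THRESHOLD = 8.0  # vertical acceleration spike, in m/s^2

def hazard_alerts(readings):
    """Flag readings whose vertical shock suggests a pothole."""
    alerts = []
    for r in readings:
        if r["vertical_accel"] >= POTHOLE_SHOCK_THRESHOLD:
            alerts.append({"type": "pothole", "lat": r["lat"], "lon": r["lon"]})
    return alerts

alerts = hazard_alerts([
    {"vehicle": "car-1", "lat": 48.1, "lon": 11.5, "vertical_accel": 2.1},
    {"vehicle": "car-2", "lat": 48.2, "lon": 11.6, "vertical_accel": 9.4},
])
```

In a real Car2X deployment the alert would be broadcast back to nearby vehicles and the road infrastructure; here it simply accumulates in a list.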
Tags : 
    
SAS
Published By: SAS     Published Date: Apr 25, 2017
If you’re in the data world, you know it’s full of discord. Multiple data sources, inconsistent standards and definitions, inaccurate reports and a lack of governance are enough to derail any organization. What’s an enterprise architect to do? With the right data governance and master data management (MDM) solution, you can set and enforce policies and establish a consistent view of your data without micromanaging it. You can eliminate duplicate and inconsistent data. You can combine traditional data and new big data sources – like streaming data from the IoT – into one harmonious view. Read this e-book for expert advice and case studies that will show you new ways to manage your big data – and make sure everyone’s on the same page.
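Eliminating duplicate and inconsistent data is the core MDM operation: records for the same entity arriving from different source systems are merged into a single "golden record" under a survivorship rule. The rule below (most recently updated non-empty value wins) and the field names are assumptions for illustration.

```python
# Merge duplicate records for one entity into a golden record, letting the
# newest non-empty value win for each field. Survivorship rules vary; this
# one is a common, simple choice.

def golden_record(records):
    """Merge duplicates; newest non-empty value per field survives."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):  # oldest first
        for field, value in rec.items():
            if field != "updated" and value:
                merged[field] = value  # later (newer) records overwrite
    return merged

customer = golden_record([
    {"name": "ACME Corp", "phone": "", "updated": "2016-01-01"},
    {"name": "Acme Corporation", "phone": "555-0100", "updated": "2017-03-01"},
])
```

Governance enters when the survivorship rule itself is a policy: who decides which source is authoritative per field, and the MDM tool enforces it consistently.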
Tags : 
    
SAS
Published By: SAS     Published Date: Jun 05, 2017
Analytics is now an expected part of the bottom line. The irony is that as more companies become adept at analytics, it becomes less of a competitive advantage. Enter machine learning. Recent advances have led to increased interest in adopting this technology as part of a larger, more comprehensive analytics strategy. But incorporating modern machine learning techniques into production data infrastructures is not easy. Businesses are now being forced to look deeper into their data to increase efficiency and competitiveness. Read this report to learn more about modern applications for machine learning, including recommendation systems, streaming analytics, deep learning and cognitive computing. And learn from the experiences of two companies that have successfully navigated both organizational and technological challenges to adopt machine learning and embark on their own analytics evolution.
Tags : 
    
SAS
Published By: SAS     Published Date: Jun 05, 2017
The Industrial Internet of Things (IIoT) is flooding today’s industrial sector with data. Information is streaming in from many sources — equipment on production lines, sensors at customer facilities, sales data, and much more. Harvesting insights means filtering out the noise to arrive at actionable intelligence. This report shows how to craft a strategy to gain a competitive edge. It explains how to evaluate IIoT solutions, including what to look for in end-to-end analytics solutions. Finally, it shows how SAS has combined its analytics expertise with Intel’s leadership in IIoT information architecture to create solutions that turn raw data into valuable insights.
Tags : 
    
SAS
Published By: SAS     Published Date: Jan 17, 2018
The Industrial Internet of Things (IIoT) is flooding today’s industrial sector with data. Information is streaming in from many sources — equipment on production lines, sensors at customer facilities, sales data, and much more. Harvesting insights means filtering out the noise to arrive at actionable intelligence. This report shows how to craft a strategy to gain a competitive edge. It explains how to evaluate IIoT solutions, including what to look for in end-to-end analytics solutions. Finally, it shows how SAS has combined its analytics expertise with Intel’s leadership in IIoT information architecture to create solutions that turn raw data into valuable insights.
Tags : 
    
SAS
Published By: SAS     Published Date: Jan 17, 2018
Executives, managers and information workers have all come to respect the role that data management plays in the success of their organizations. But organizations don’t always do a good job of communicating and encouraging better ways of managing information. In this e-book you will find easy to digest resources on the value and importance of data preparation, data governance, data integration, data quality, data federation, streaming data, and master data management.
Tags : 
    
SAS
Published By: SAS     Published Date: Jun 06, 2018
Data integration (DI) may be an old technology, but it is far from extinct. Today, rather than being done on a batch basis with internal data, DI has evolved to a point where it needs to be implicit in everyday business operations. Big data – of many types, and from vast sources like the Internet of Things – joins with the rapid growth of emerging technologies to extend beyond the reach of traditional data management software. To stay relevant, data integration needs to work with both indigenous and exogenous sources while operating at different latencies, from real time to streaming. This paper examines how data integration has gotten to this point, how it’s continuing to evolve and how SAS can help organizations keep their approach to DI current.
Tags : 
    
SAS