

Tips for Real-Time Database Streaming for Kafka

White Paper Published By: Attunity
Published:  Feb 12, 2019
Type:  White Paper
Length:  10 pages

Apache Kafka, an ultra-low-latency, highly scalable, distributed data streaming platform, has ushered in a new era of real-time data integration, processing, and analytics. With Kafka, enterprises can address new advanced analytics use cases and extract more value from more data. Production database transactions provide a rich vein of data to drive these use cases.

However, architects and DBAs struggle with the scripting and complexity involved in publishing database transactions to Kafka and other streaming environments. Skilled programmers must manually configure each data producer and data type conversion, and often cannot easily integrate source metadata or propagate schema changes.
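To illustrate the kind of hand-rolled scripting involved, the sketch below (plain Python, standard library only; all names are hypothetical and the final producer call is indicated only as a comment, since this is not Attunity's actual mechanism) shows the per-row serialization and data type conversion a programmer would otherwise have to write and maintain for each source:

```python
import json
from datetime import date, datetime
from decimal import Decimal

def to_json_safe(value):
    # Hand-rolled data type conversion: Kafka messages are bytes, so
    # database-native types must be mapped to JSON-friendly equivalents.
    if isinstance(value, Decimal):
        return str(value)
    if isinstance(value, (date, datetime)):
        return value.isoformat()
    return value

def row_to_message(table, op, row, key_column):
    # One change event per row: keying by primary key sends all changes
    # for the same row to the same Kafka partition, preserving order.
    payload = {
        "table": table,
        "op": op,  # "insert" | "update" | "delete"
        "data": {col: to_json_safe(v) for col, v in row.items()},
    }
    key = str(row[key_column]).encode("utf-8")
    value = json.dumps(payload).encode("utf-8")
    return key, value

key, value = row_to_message(
    "orders", "insert",
    {"order_id": 42, "amount": Decimal("19.99"), "placed": date(2019, 2, 12)},
    key_column="order_id",
)
# A real pipeline would then hand these bytes to a Kafka producer, e.g.:
# producer.send("orders-topic", key=key, value=value)
```

Multiply this by every source table, every type quirk, and every schema change, and the maintenance burden the paper describes becomes clear.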

Attunity Replicate provides a simple, real-time and universal solution for converting production databases into live data streams. 

Read this whitepaper to understand:

  • Motivations for data streaming
  • Key architectural components of Kafka
  • The role of Attunity Replicate in streaming environments
  • Methods for automated configuration, one-to-many publication, automatic data type mapping, and simpler metadata integration
  • Best practices based on two enterprise case studies
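Of the methods listed above, one-to-many publication can be pictured as a routing step: a single captured change stream fans out to several topics, such as a per-table topic plus a catch-all firehose. A minimal sketch (plain Python; the topic names and routing rules are illustrative assumptions, not Attunity's actual implementation):

```python
def route(change_event, rules):
    # One-to-many publication: one change event may be published to
    # several topics, e.g. a per-table topic plus a firehose topic.
    topics = ["all-changes"]  # every event also goes to the firehose
    table = change_event["table"]
    topics.extend(rules.get(table, []))
    return topics

# Hypothetical routing rules mapping source tables to target topics.
rules = {
    "orders": ["orders-topic", "analytics-orders"],
    "customers": ["customers-topic"],
}

topics = route({"table": "orders", "op": "insert"}, rules)
# topics is now ["all-changes", "orders-topic", "analytics-orders"]
```

In a tool-managed setup, rules like these are configured rather than coded, which is the automation the whitepaper discusses.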

Tags : 
data streaming, kafka, apache kafka, metadata integration, metadata, data integration, data analytics