You can create new business value by injecting database transactions into Kafka, Amazon Kinesis, Azure Event Hubs, and other streaming systems. This enables advanced analytics use cases such as real-time event processing, machine learning, and microservices.

The challenge lies in replicating database updates to message streams at scale, without cumbersome scripting or impact on production systems.

Attunity Replicate addresses these challenges with CDC technology that provides efficient, real-time, and low-impact replication from many source databases at once.

“Attunity is an important partner for both Confluent and the broader Kafka community. Their technology simplifies integration with Kafka, enabling customers to more quickly derive greater business value from their data with less effort.”

VP Business Development at Confluent,
the company founded by the creators of Apache Kafka

Real-Time and Low Impact

With Attunity Replicate, IT organizations gain:

  • Real-time data capture. Feed live database updates to message brokers with low latency
  • Low-impact operation. Log-based change data capture (CDC) minimizes load on source systems, and a zero-footprint architecture eliminates the need to install agents on source databases

Kafka and Big Data Integration

  • Metadata updates. Support source schema evolution and integrate with schema registries
  • Universal access.  Feed message brokers that stream to sinks such as Hadoop, S3, Hive, Cassandra and MongoDB
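To illustrate the pattern, a downstream consumer of such a message stream typically applies each change record to a sink in order. The sketch below is a minimal, self-contained example; the field names and payload shape (`operation`, `before`, `after`) are illustrative assumptions, not Replicate's actual message format, and a real consumer would read from a broker rather than a hardcoded string.

```python
import json

# Hypothetical CDC change record: log-based CDC tools generally emit one
# message per committed row operation. Field names here are assumptions.
sample_change_event = json.dumps({
    "operation": "UPDATE",
    "table": "accounts",
    "before": {"id": 42, "balance": 100.0},
    "after": {"id": 42, "balance": 250.0},
})

def apply_change(event_json, store):
    """Apply a single CDC event to an in-memory key/value 'sink'."""
    event = json.loads(event_json)
    if event["operation"] == "DELETE":
        # Deletes carry only the pre-image of the row
        store.pop(event["before"]["id"], None)
    else:
        # Inserts and updates carry the post-image
        store[event["after"]["id"]] = event["after"]
    return store

store = {}
apply_change(sample_change_event, store)
print(store)  # {42: {'id': 42, 'balance': 250.0}}
```

In practice the same loop runs against a Kafka consumer, with the sink being Hadoop, S3, Hive, Cassandra, MongoDB, or any other target the broker streams to.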

Sign up for a Free Trial Today

Simple and High Scale

  • No scripting. Rapidly configure, manage and monitor data flows with no manual scripting
  • Scale.  Support hundreds of sources, topics and targets

Customer Case Study: Leading Asset Management Firm
