For all its quirks and licence fees, we all love Oracle for what it does. But sometimes we want to get the data out to use elsewhere. Maybe we want to build analytics on it; perhaps we want to drive applications with it; sometimes we might even want to move it to another non-Oracle database—can you imagine that!
With Apache Kafka as our scalable, distributed event streaming platform, we can ingest data from Oracle as a stream of events. We can use Kafka to transform and enrich the events if we want to, even joining them to data from other sources. We can stream the resulting events to target systems, as well as use them to create event-driven microservices.
This talk will show some basics of Kafka and then dive into ingesting data from Oracle into Kafka, applying stream processing with ksqlDB, and then pushing that data to systems including PostgreSQL as well as back into Oracle itself.
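To give a flavour of the ingest step, here is a hedged sketch of a query-based CDC source connector declared from ksqlDB, pulling an Oracle table into Kafka. The connector name, connection details, credentials, and table/column names are illustrative placeholders, not the exact configuration used in the demo:

```sql
-- Query-based CDC: poll Oracle over JDBC, using a timestamp column
-- to detect changed rows (names and URL are illustrative)
CREATE SOURCE CONNECTOR ORACLE_SOURCE WITH (
  'connector.class'       = 'io.confluent.connect.jdbc.JdbcSourceConnector',
  'connection.url'        = 'jdbc:oracle:thin:@//oracle:1521/ORCLPDB1',
  'connection.user'       = 'connect_user',
  'connection.password'   = '********',
  'mode'                  = 'timestamp',
  'timestamp.column.name' = 'UPDATE_TS',
  'table.whitelist'       = 'ORDERS',
  'topic.prefix'          = 'oracle-'
);
```

This is the query-based flavour of CDC; the talk also covers log-based CDC, which reads the database's transaction log instead of polling tables.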
As presented at the ACEs @ Home meetup on 15th June 2020

Slides and resources:
ℹ️ Table of contents:
1:34 What is Kafka?
10:00 What are the reasons for integrating Oracle with Kafka?
14:41 Kafka Connect
17:50 The two types of Change Data Capture (CDC)
19:40 Live demo – Oracle into Kafka
24:30 Live demo – Difference between CDC methods illustrated
28:40 Live demo – Streaming data from Kafka to another database (Postgres)
32:59 Live demo – ksqlDB
37:19 Live demo – Joining a stream of events to a table in ksqlDB
40:14 Live demo – Building aggregates in ksqlDB
41:24 Live demo – Creating a sink connector from ksqlDB to Postgres
44:04 Live demo – ksqlDB stream/table duality, push and pull queries
46:29 Live demo – Key/Value lookup against state in ksqlDB using REST API
47:44 CDC recap, how to choose which to use
49:29 ksqlDB overview
52:50 Summary & useful links
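The ksqlDB demos listed above (stream-table join, aggregation, and a sink connector to Postgres) can be sketched roughly as follows. All stream, table, topic, and column names here are invented for illustration and will differ from the demo:

```sql
-- Declare a stream over a topic ingested from Oracle (schema illustrative)
CREATE STREAM ORDERS WITH (KAFKA_TOPIC = 'oracle-ORDERS', VALUE_FORMAT = 'AVRO');

-- Join the stream of events to a table (e.g. customer reference data)
CREATE STREAM ORDERS_ENRICHED AS
  SELECT O.ORDER_ID, O.AMOUNT, C.CUSTOMER_NAME
  FROM ORDERS O
  LEFT JOIN CUSTOMERS C ON O.CUSTOMER_ID = C.CUSTOMER_ID;

-- Build an aggregate: a materialised table that supports pull queries
-- (key/value lookups) as well as push queries over its changelog
CREATE TABLE ORDERS_BY_CUSTOMER AS
  SELECT CUSTOMER_ID, COUNT(*) AS ORDER_COUNT
  FROM ORDERS
  GROUP BY CUSTOMER_ID;

-- Push the enriched stream to Postgres with a sink connector
CREATE SINK CONNECTOR SINK_POSTGRES WITH (
  'connector.class' = 'io.confluent.connect.jdbc.JdbcSinkConnector',
  'connection.url'  = 'jdbc:postgresql://postgres:5432/demo',
  'topics'          = 'ORDERS_ENRICHED',
  'auto.create'     = 'true'
);
```

The table created by the aggregation is what backs the pull-query lookup over the REST API shown at 46:29.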
☁️ Confluent Cloud ☁️
Confluent Cloud is a managed Apache Kafka and Confluent Platform service. It scales to zero and lets you get started with Apache Kafka at the click of a mouse. You can sign up at and use code 60DEVADV for $60 towards your bill (small print: