Kafka Connect 101: Introduction to Connectors, Sinks, and Sources with Examples

https://cnfl.io/kafka-connect-101-module-1 | Kafka Connect enables streaming integration into and out of Apache Kafka® with external systems such as databases, message queues, flat files, and more. In this video, Tim Berglund (Senior Director of Developer Advocacy, Confluent) gives an overview of the primary use cases for Kafka Connect and shows it in action. The demo connects a database to Elasticsearch and Neo4j.
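The pattern shown in the demo (streaming database rows into Kafka, then out to a downstream system) is driven entirely by connector configuration rather than code. As a rough sketch, a JDBC source connector feeding Kafka might be configured like this; the connector name, database URL, credentials, and table name below are illustrative assumptions, not the demo's actual settings:

```properties
# Hypothetical JDBC source connector config (all names/URLs are assumptions).
# Streams new rows from the "orders" table into the Kafka topic "mysql-orders".
name=jdbc-source-demo
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:mysql://localhost:3306/demo
connection.user=connect
connection.password=secret
table.whitelist=orders
mode=incrementing
incrementing.column.name=id
topic.prefix=mysql-
```

A sink connector (for example, the Elasticsearch sink used in the demo) follows the same shape: you specify a `connector.class`, the topics to read, and the target system's connection details, and Kafka Connect handles the data movement.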

Use the promo code CONNECT101 to get $101 of free Confluent Cloud usage: https://cnfl.io/try-cloud-with-kafka-connect

Promo code details: https://cnfl.io/promo-code-details-connect101

► Kafka Connect in 60 Seconds: https://www.youtube.com/watch?v=RQn3tYvkeh8&list=PL5T99fPsK7ppB_AbZhBhTyKHtHWZLWIJ8&index=2
► Kafka Connect: https://docs.confluent.io/platform/current/connect/index.html?utm_source=youtube&utm_medium=video&utm_campaign=tm.devx_ch.cd-kafka-connect-101_content.connecting-to-apache-kafka
► Kafka Data Pipelines ft. Robin Moffatt: https://softwareengineeringdaily.com/2019/09/23/kafka-data-pipelines-with-robin-moffatt
► Why Kafka Connect? ft. Robin Moffatt: https://developer.confluent.io/podcast/why-kafka-connect-ft-robin-moffatt?utm_source=youtube&utm_medium=video&utm_campaign=tm.devx_ch.cd-kafka-connect-101_content.connecting-to-apache-kafka
► Demo Scene Repository: https://github.com/confluentinc/demo-scene/#kafka-connect
► Kafka Connect Deep Dive – JDBC Source Connector: https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector?utm_source=youtube&utm_medium=video&utm_campaign=tm.devx_ch.cd-kafka-connect-101_content.connecting-to-apache-kafka
► Confluent Hub – Discover Apache Kafka Connectors and More: https://hub.confluent.io/?utm_source=youtube&utm_medium=video&utm_campaign=tm.devx_ch.cd-kafka-connect-101_content.connecting-to-apache-kafka
► Pipeline to the Cloud – Streaming On-Premises Data for Cloud Analytics: https://www.confluent.io/blog/cloud-analytics-for-on-premises-data-streams-with-kafka?utm_source=youtube&utm_medium=video&utm_campaign=tm.devx_ch.cd-kafka-connect-101_content.connecting-to-apache-kafka
► Kafka Summit: Apache Kafka and ksqlDB in Action – Let’s Build a Streaming Data Pipeline: https://rmoff.dev/build-a-streaming-data-pipeline
► Data + AI Summit Europe 2020: From Zero to Hero with Kafka Connect: https://rmoff.dev/zero-to-hero
► On Track with Apache Kafka – Building a Streaming ETL Solution with Rail Data: https://videos.confluent.io/watch/cWoYxcP95cWSSWnTLNQmcW?utm_source=youtube&utm_medium=video&utm_campaign=tm.devx_ch.cd-kafka-connect-101_content.connecting-to-apache-kafka
► Intro to Kafka Connect: Core Components and Architecture ft. Robin Moffatt: https://developer.confluent.io/podcast/intro-to-kafka-connect-core-components-and-architecture-ft-robin-moffatt/?utm_source=youtube&utm_medium=video&utm_campaign=tm.devx_ch.cd-kafka-connect-101_content.connecting-to-apache-kafka
► What is Kafka Connect? Basic Fundamentals and Architecture: https://www.confluent.io/blog/kafka-connect-tutorial/?utm_source=youtube&utm_medium=video&utm_campaign=tm.devx_ch.cd-kafka-connect-101_content.connecting-to-apache-kafka

Subscribe: https://youtube.com/c/confluent?sub_confirmation=1
Site: https://confluent.io
GitHub: https://github.com/confluentinc
Facebook: https://facebook.com/confluentinc
Twitter: https://twitter.com/confluentinc
LinkedIn: https://www.linkedin.com/company/confluent
Instagram: https://www.instagram.com/confluent_inc

Confluent, founded by the creators of Apache Kafka®, enables organizations to harness the business value of live data. The Confluent Platform manages the barrage of streaming data and makes it available throughout an organization. It provides various industries, from retail, logistics, and manufacturing to financial services and online social networking, a scalable, unified, real-time data pipeline that enables applications ranging from large-volume data integration to big data analysis with Hadoop to real-time stream processing. To learn more, please visit https://confluent.io

#apachekafka #kafka #kafkaconnect


15 thoughts on “Kafka Connect 101: Introduction to Connectors, Sinks, and Sources with Examples”
  1. Thank you very much, a 10/10 explanation, you were born to be a teacher… please create a series for a Kafka automotive streaming project locally, or another project from A to Z

  2. What if your external data source is an API? Can you still use Kafka Connect to get that data into Kafka topics?

  3. How is the ordering at the sink guaranteed if there is an update in SQL followed by a delete of the same row?

  4. Any alternative from the Kafka server to SQL Server (Kafka server -> SQL Server database)?
