Kafka Connect 101: Troubleshooting Common Issues and How to Debug Them

https://cnfl.io/kafka-connect-101-module-5 | Learn from Tim Berglund (Senior Director of Developer Advocacy, Confluent) important troubleshooting steps and techniques to debug problems with your Kafka Connect pipelines. This video includes details that will help you understand log messages, increase log verbosity, and view the status of tasks.
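The task status mentioned above comes from the Kafka Connect REST API (`GET /connectors/{name}/status`). As a rough sketch of how to spot failed tasks in that response, here is a small Python snippet; the payload below is illustrative (the connector name `jdbc-source` and the truncated trace are made up), but it follows the shape the status endpoint returns:

```python
import json

# Illustrative payload shaped like a GET /connectors/{name}/status response.
sample_status = json.loads("""
{
  "name": "jdbc-source",
  "connector": {"state": "RUNNING", "worker_id": "worker-1:8083"},
  "tasks": [
    {"id": 0, "state": "RUNNING", "worker_id": "worker-1:8083"},
    {"id": 1, "state": "FAILED", "worker_id": "worker-2:8083",
     "trace": "org.apache.kafka.connect.errors.ConnectException: ..."}
  ],
  "type": "source"
}
""")

def failed_tasks(status):
    """Return (task id, stack trace) for every task in the FAILED state."""
    return [(t["id"], t.get("trace", ""))
            for t in status["tasks"] if t["state"] == "FAILED"]

for task_id, trace in failed_tasks(sample_status):
    print(f"task {task_id} failed:")
    print(trace.splitlines()[0])  # first line of the stack trace
```

In a live cluster you would fetch the same JSON from a worker, e.g. `curl http://localhost:8083/connectors/jdbc-source/status`, and restart a failed task with `POST /connectors/{name}/tasks/{id}/restart`.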

Use the promo code CONNECT101 to get $101 of free Confluent Cloud usage: https://cnfl.io/try-cloud-with-kafka-connect

Promo code details: https://cnfl.io/promo-code-details-connect101

► Changing the Logging Level for Kafka Connect Dynamically: https://rmoff.net/2020/01/16/changing-the-logging-level-for-kafka-connect-dynamically
► Kafka Connect Deep Dive – Error Handling and Dead Letter Queues: https://www.confluent.io/blog/kafka-connect-deep-dive-error-handling-dead-letter-queues?utm_source=youtube&utm_medium=video&utm_campaign=tm.devx_ch.cd-kafka-connect-101_content.connecting-to-apache-kafka
► Automatically Restarting Failed Kafka Connect Tasks: https://rmoff.net/2019/06/06/automatically-restarting-failed-kafka-connect-tasks
► Reset Kafka Connect Source Connector Offsets: https://rmoff.net/2019/08/15/reset-kafka-connect-source-connector-offsets
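The dead letter queue approach covered in the error-handling post above is configured per sink connector. A minimal sketch of the relevant properties (the DLQ topic name `dlq-my-sink` is a placeholder; these keys are added to the sink connector's configuration):

```json
{
  "errors.tolerance": "all",
  "errors.deadletterqueue.topic.name": "dlq-my-sink",
  "errors.deadletterqueue.context.headers.enable": true,
  "errors.log.enable": true,
  "errors.log.include.messages": true
}
```

With `errors.tolerance` set to `all`, records that fail conversion or transformation are routed to the DLQ topic instead of killing the task, and the context headers record why each record was rejected.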

Subscribe: https://youtube.com/c/confluent?sub_confirmation=1
Site: https://confluent.io
GitHub: https://github.com/confluentinc
Facebook: https://facebook.com/confluentinc
Twitter: https://twitter.com/confluentinc
LinkedIn: https://www.linkedin.com/company/confluent
Instagram: https://www.instagram.com/confluent_inc

Confluent, founded by the creators of Apache Kafka®, enables organizations to harness the business value of live data. Confluent manages the barrage of stream data and makes it available throughout an organization. It provides various industries, from retail, logistics, and manufacturing to financial services and online social networking, with a scalable, unified, real-time data pipeline that enables applications ranging from large-volume data integration to big data analysis with Hadoop to real-time stream processing. To learn more, please visit https://confluent.io

#apachekafka #kafka #kafkaconnect


3 thoughts on “Kafka Connect 101: Troubleshooting Common Issues and How to Debug Them”
  1. It's great that the connect API gives the exception in the trace property but it sure would be nice if a timestamp was also provided. When running many connectors in a cluster with many nodes the volume of logs can be huge and locating the exception without a timestamp can be painful.
