https://cnfl.io/kafka-streams-101-module-1 | To understand Kafka Streams, you have to begin with Apache Kafka®, a distributed, scalable, elastic, and fault-tolerant event streaming platform. Kafka’s storage nodes, called brokers, are just instances of the Kafka storage-layer process running on your laptop or server. At the heart of each broker is a log: an append-only file that holds events. Logs are immutable, so unlike in the messaging systems you may be familiar with, Kafka’s records aren’t destroyed right away; since you may not want to store your data forever, Kafka lets you set a retention time on your logs. You put data into Kafka with producers and get it out with consumers, with offsets marking the position of each record in a topic. Kafka Connect makes it easy to link your various data sources and sinks to Kafka. As for the benefits of Kafka Streams, imagine that you have a topic from which you’d like to filter all records marked with the color “red.” You could accomplish this with plain Kafka, but the equivalent Kafka Streams code takes only about a third of the lines. And that’s really the reason to use it with Kafka: it’s declarative, so you state what you want to do rather than how to do it. Follow along as Sophie Blee-Goldman (Apache Kafka Committer and Software Engineer, Confluent) covers all of this in detail.
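The “filter red records” idea might look roughly like this in the Kafka Streams DSL — a minimal sketch, assuming the kafka-streams dependency on the classpath; the topic names, application ID, and bootstrap server here are hypothetical placeholders, and the filter is read as keeping the “red” records:

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class FilterRed {
    public static void main(String[] args) {
        // Hypothetical config values for illustration only.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "filter-red-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Declarative topology: read, filter, write — no poll loop,
        // no manual offset handling, no producer bookkeeping.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> colors = builder.stream("colors-topic");
        colors.filter((key, value) -> "red".equals(value))
              .to("red-colors-topic");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
    }
}
```

The whole topology is those three DSL lines in the middle; the equivalent plain consumer/producer version needs a poll loop, deserialization, the filter check, re-serialization, and a producer send.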
Use the promo code STREAMS101 to get $101 of free Confluent Cloud usage: https://cnfl.io/try-cloud-with-kafka-streams-101-course
Promo code details: https://cnfl.io/promo-code-disclaimer-kafka-streams-101-course
► Creating a Streams Application | Apache Kafka® Streams API: https://www.youtube.com/watch?v=LxxeXI1mPKo&list=RDCMUCmZz-Gj3caLLzEWBtbYUXaA&index=15
► Streams Developer Guide: https://docs.confluent.io/platform/current/streams/developer-guide/index.html?utm_source=youtube&utm_medium=video&utm_campaign=tm.devx_ch.cd-kafka-streams-101_content.stream-processing
► Kafka 101 – Kafka Streams: https://developer.confluent.io/learn-kafka/apache-kafka/kafka-streams/?utm_source=youtube&utm_medium=video&utm_campaign=tm.devx_ch.cd-kafka-streams-101_content.stream-processing
Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Confluent’s cloud-native offering is the foundational platform for data in motion – designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly stream across the organization. With Confluent, organizations can meet the new business imperative of delivering rich, digital front-end customer experiences and transitioning to sophisticated, real-time, software-driven backend operations. To learn more, please visit www.confluent.io.
#kafka #kafkastreams #streamprocessing #apachekafka #confluent