https://cnfl.io/kafka-streams-101-module-2 | Kafka Streams works with unbounded sequences of events—event streams—which correspond to Kafka topics. When you define a Kafka Streams application, you’re really defining a processor topology, a directed acyclic graph (DAG). A topology typically has source processor nodes, user processor nodes, and sink processor nodes. You start a Kafka Streams application by creating an instance of the StreamsBuilder class, then use it to create a KStream. Once you have a KStream, you can transform your data with operations such as mapping or filtering.
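A minimal sketch of what this looks like in code, assuming the kafka-streams dependency is on the classpath; the topic names ("input-topic", "output-topic") and the specific transformations are illustrative, not from the video:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class StreamsTopologySketch {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Source node: read the event stream from a Kafka topic.
        // "input-topic" is a placeholder name for this sketch.
        KStream<String, String> stream = builder.stream(
                "input-topic",
                Consumed.with(Serdes.String(), Serdes.String()));

        // User processor nodes: transform the stream with a mapping
        // operation, then a filtering operation.
        stream.mapValues(value -> value.toUpperCase())
              .filter((key, value) -> !value.isEmpty())
              // Sink node: write the transformed events to an output topic.
              .to("output-topic",
                  Produced.with(Serdes.String(), Serdes.String()));

        // Build the processor topology (a DAG of the nodes above);
        // describe() prints it without connecting to a broker.
        Topology topology = builder.build();
        System.out.println(topology.describe());
    }
}
```

Note that building and describing the Topology needs no running Kafka cluster; a broker is only required once you wrap the topology in a KafkaStreams instance and call start().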
Use the promo code STREAMS101 to get $101 of free Confluent Cloud usage: https://cnfl.io/try-cloud-with-kafka-streams-101-course
Promo code details: https://cnfl.io/promo-code-disclaimer-kafka-streams-101-course
► Apache Kafka® 101: Kafka Streams: https://www.youtube.com/watch?v=UbNoL5tJEjc
► Class Topology: https://kafka.apache.org/23/javadoc/index.html?org/apache/kafka/streams/Topology.html
► Class StreamsBuilder: https://docs.confluent.io/5.5.0/streams/javadocs/index.html?org%2Fapache%2Fkafka%2Fstreams%2FStreamsBuilder.html
Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Confluent’s cloud-native offering is the foundational platform for data in motion – designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly stream across the organization. With Confluent, organizations can meet the new business imperative of delivering rich, digital front-end customer experiences and transitioning to sophisticated, real-time, software-driven backend operations. To learn more, please visit www.confluent.io.
#kafka #kafkastreams #streamprocessing #apachekafka #confluent