From the course: Apache Flink: Real-Time Data Engineering
Writing to a Kafka sink - Flink Tutorial
- Kafka is a popular event source and sink for Flink pipelines. In this example, we will look at using Kafka as a sink for a Flink pipeline. We will write the one-second summaries we created earlier with event time to a Kafka sink. The code for this example is in the same event time operations class in chapter four. To write to Kafka, we first need to create a Kafka producer. For this, we need to define a properties object with the bootstrap server list for Kafka. Then we create a producer that consumes strings as events. We provide the default topic name, which in this case is flink.kafka.streaming.sink. Next, we need to provide a serialization schema for the strings. This implements a producer record method that returns the name of the topic and the output data in bytes. This is a standard Kafka implementation. We also add the properties object we defined earlier. Finally, we also set the Kafka producer semantic to…
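The steps described in the transcript can be sketched as follows, using the legacy `FlinkKafkaProducer` connector API. This is a hedged sketch, not the course's exact code: the class name `KafkaSinkSketch`, the method name `attachKafkaSink`, the broker address `localhost:9092`, and the `AT_LEAST_ONCE` semantic are all assumptions (the transcript is cut off before naming the semantic); only the topic `flink.kafka.streaming.sink` comes from the transcript.

```java
import java.nio.charset.StandardCharsets;
import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaSinkSketch {

    public static void attachKafkaSink(DataStream<String> summaries) {
        // Properties object with the Kafka bootstrap server list
        Properties kafkaProps = new Properties();
        kafkaProps.setProperty("bootstrap.servers", "localhost:9092"); // assumed broker address

        // Default topic name from the transcript
        String topic = "flink.kafka.streaming.sink";

        // Serialization schema: turns each String into a ProducerRecord
        // carrying the topic name and the output data in bytes
        KafkaSerializationSchema<String> schema =
                (element, timestamp) ->
                        new ProducerRecord<>(topic, element.getBytes(StandardCharsets.UTF_8));

        // Producer that consumes strings as events; the semantic controls
        // the delivery guarantee (assumed here, since the transcript is cut off)
        FlinkKafkaProducer<String> producer = new FlinkKafkaProducer<>(
                topic,
                schema,
                kafkaProps,
                FlinkKafkaProducer.Semantic.AT_LEAST_ONCE);

        // Attach the Kafka producer as a sink for the one-second summaries
        summaries.addSink(producer);
    }
}
```

Note that newer Flink releases replace `FlinkKafkaProducer` with the `KafkaSink` builder API; the sketch above matches the connector generation this course appears to use.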