com.github.benfradet.spark.kafka010.writer
Class used to write DStreams and RDDs to Kafka. Example usage:
    import java.util.Properties

    import com.github.benfradet.spark.kafka010.writer.KafkaWriter._
    import org.apache.kafka.clients.producer.ProducerRecord
    import org.apache.kafka.common.serialization.StringSerializer
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.dstream.DStream

    val topic = "my-topic"

    val producerConfig = {
      val p = new Properties()
      p.setProperty("bootstrap.servers", "127.0.0.1:9092")
      p.setProperty("key.serializer", classOf[StringSerializer].getName)
      p.setProperty("value.serializer", classOf[StringSerializer].getName)
      p
    }

    val dStream: DStream[String] = ...
    dStream.writeToKafka(
      producerConfig,
      s => new ProducerRecord[String, String](topic, s)
    )

    val rdd: RDD[String] = ...
    rdd.writeToKafka(
      producerConfig,
      s => new ProducerRecord[String, String](topic, s)
    )
Writes a DStream or RDD to Kafka.
properties for an org.apache.kafka.clients.producer.KafkaProducer
a function used to transform values of type T into ProducerRecords
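The transform function is the heart of the API: each element of the DStream or RDD is mapped to a ProducerRecord, which is then handed to a producer. The dependency-free sketch below illustrates that pattern without Spark or Kafka on the classpath; ProducerRecord, StubProducer, and writeToSink are simplified stand-ins invented here for illustration, not the library's actual classes.

```scala
// Hypothetical, dependency-free sketch of the transform-then-send pattern
// behind writeToKafka. ProducerRecord and StubProducer are stand-ins for
// the real Kafka classes; writeToSink mimics the extension-method style
// that KafkaWriter._ adds to DStreams and RDDs.
final case class ProducerRecord[K, V](topic: String, value: V)

class StubProducer[K, V] {
  private val buffer =
    scala.collection.mutable.ArrayBuffer.empty[ProducerRecord[K, V]]
  // In the real library this would be KafkaProducer.send
  def send(record: ProducerRecord[K, V]): Unit = buffer += record
  def sent: Seq[ProducerRecord[K, V]] = buffer.toSeq
}

// Enriches any Seq[T] with a writeToSink method, the way the library
// enriches DStream[T] and RDD[T] with writeToKafka
implicit class SinkWriter[T](data: Seq[T]) {
  def writeToSink[K, V](producer: StubProducer[K, V],
                        transformFunc: T => ProducerRecord[K, V]): Unit =
    // Apply the user-supplied transform to each element, then send
    data.foreach(t => producer.send(transformFunc(t)))
}

val producer = new StubProducer[String, String]
Seq("a", "b").writeToSink(
  producer,
  s => ProducerRecord[String, String]("my-topic", s)
)
```

The same shape carries over to the real API: swap StubProducer for a KafkaProducer configured from the producer properties, and the transform function for one building org.apache.kafka.clients.producer.ProducerRecord instances.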