Producer

akka.kafka.scaladsl.Producer$
object Producer

Akka Stream connector for publishing messages to Kafka topics.

Attributes

Source:
Producer.scala
Supertypes
class Object
trait Matchable
class Any

Members list


Value members

Concrete methods

def committableSink[K, V](producerSettings: ProducerSettings[K, V], committerSettings: CommitterSettings): Sink[Envelope[K, V, Committable], Future[Done]]

Create a sink that is aware of the committable offset from a Consumer.committableSource. The offsets are batched and committed regularly.

It publishes records to Kafka topics conditionally:

  • Message publishes a single message to its topic, and commits the offset

  • MultiMessage publishes all messages in its records field, and commits the offset

  • PassThroughMessage does not publish anything, but commits the offset

Note that there is a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.

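A minimal at-least-once pipeline using this sink might look like the following sketch. The broker address, topic names, group id, and String serializers are placeholder assumptions, not part of this API:

```scala
import akka.actor.ActorSystem
import akka.kafka.{CommitterSettings, ConsumerSettings, ProducerMessage, ProducerSettings, Subscriptions}
import akka.kafka.scaladsl.{Consumer, Producer}
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

implicit val system: ActorSystem = ActorSystem("producer-example")

val consumerSettings =
  ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092") // assumed broker address
    .withGroupId("example-group")

val producerSettings =
  ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092")

// Offsets are batched and committed according to these settings
val committerSettings = CommitterSettings(system)

val control =
  Consumer
    .committableSource(consumerSettings, Subscriptions.topics("source-topic"))
    .map { msg =>
      // Pair the outgoing record with the offset to commit after the publish
      ProducerMessage.single(
        new ProducerRecord[String, String]("target-topic", msg.record.key, msg.record.value),
        msg.committableOffset
      )
    }
    .to(Producer.committableSink(producerSettings, committerSettings))
    .run()
```

The materialized `Consumer.Control` can be used to shut the consumer down gracefully.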
@ApiMayChange(issue = "https://github.com/akka/alpakka-kafka/issues/880")
def committableSinkWithOffsetContext[K, V](producerSettings: ProducerSettings[K, V], committerSettings: CommitterSettings): Sink[(Envelope[K, V, _], Committable), Future[Done]]

Create a sink that is aware of the committable offset passed as context from a Consumer.sourceWithOffsetContext. The offsets are batched and committed regularly.

It publishes records to Kafka topics conditionally:

  • Message publishes a single message to its topic, and commits the offset

  • MultiMessage publishes all messages in its records field, and commits the offset

  • PassThroughMessage does not publish anything, but commits the offset

Note that there is a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.

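With the context-propagating variant, the offset rides alongside each element and needs no explicit plumbing in the `map` stage. A sketch, again with placeholder broker address, topics, and String serde:

```scala
import akka.actor.ActorSystem
import akka.kafka.{CommitterSettings, ConsumerSettings, ProducerMessage, ProducerSettings, Subscriptions}
import akka.kafka.scaladsl.{Consumer, Producer}
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

implicit val system: ActorSystem = ActorSystem("producer-example")

val consumerSettings =
  ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092") // assumed broker address
    .withGroupId("example-group")

val producerSettings =
  ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092")

// The CommittableOffset travels in the context slot of the stream
val done =
  Consumer
    .sourceWithOffsetContext(consumerSettings, Subscriptions.topics("source-topic"))
    .map(record => ProducerMessage.single(new ProducerRecord("target-topic", record.key, record.value)))
    .runWith(Producer.committableSinkWithOffsetContext(producerSettings, CommitterSettings(system)))
```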
def flexiFlow[K, V, PassThrough](settings: ProducerSettings[K, V]): Flow[Envelope[K, V, PassThrough], Results[K, V, PassThrough], NotUsed]

Create a flow to conditionally publish records to Kafka topics and then pass it on.

It publishes records to Kafka topics conditionally:

  • Message publishes a single message to its topic, and continues in the stream as Result

  • MultiMessage publishes all messages in its records field, and continues in the stream as MultiResult

  • PassThroughMessage does not publish anything, and continues in the stream as PassThroughResult

The messages can carry arbitrary pass-through data, for example a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.

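A sketch of the pass-through mechanism, here using a plain Int as the pass-through value instead of a committable offset (broker address and topic name are assumed):

```scala
import akka.Done
import akka.actor.ActorSystem
import akka.kafka.{ProducerMessage, ProducerSettings}
import akka.kafka.scaladsl.Producer
import akka.stream.scaladsl.{Sink, Source}
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer
import scala.concurrent.Future

implicit val system: ActorSystem = ActorSystem("producer-example")

val producerSettings =
  ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092") // assumed broker address

val done: Future[Done] =
  Source(1 to 3)
    .map { n =>
      // The pass-through value (here the Int itself) survives the publish
      ProducerMessage.single(new ProducerRecord[String, String]("example-topic", n.toString), n)
    }
    .via(Producer.flexiFlow(producerSettings))
    .map {
      case ProducerMessage.Result(metadata, message) =>
        s"wrote offset ${metadata.offset()} for pass-through ${message.passThrough}"
      case other =>
        other.toString
    }
    .runWith(Sink.foreach(println))
```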
@ApiMayChange(issue = "https://github.com/akka/alpakka-kafka/issues/880")
def flowWithContext[K, V, C](settings: ProducerSettings[K, V]): FlowWithContext[Envelope[K, V, NotUsed], C, Results[K, V, C], C, NotUsed]

API MAY CHANGE

Create a flow to conditionally publish records to Kafka topics and then pass it on.

It publishes records to Kafka topics conditionally:

  • Message publishes a single message to its topic, and continues in the stream as Result

  • MultiMessage publishes all messages in its records field, and continues in the stream as MultiResult

  • PassThroughMessage does not publish anything, and continues in the stream as PassThroughResult

This flow is intended to be used with Akka's flow with context.

Type parameters

C

the flow context type
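A consume-transform-produce sketch where the committable offset travels as stream context and Committer.sinkWithOffsetContext commits it (broker address, topics, group id, and serde are placeholder assumptions):

```scala
import akka.actor.ActorSystem
import akka.kafka.scaladsl.{Committer, Consumer, Producer}
import akka.kafka.{CommitterSettings, ConsumerSettings, ProducerMessage, ProducerSettings, Subscriptions}
import akka.stream.scaladsl.Keep
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

implicit val system: ActorSystem = ActorSystem("producer-example")

val consumerSettings =
  ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092") // assumed broker address
    .withGroupId("example-group")

val producerSettings =
  ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092")

// The CommittableOffset stays in the context slot through the producer flow,
// so the business logic in `map` never has to handle it.
val (control, done) =
  Consumer
    .sourceWithOffsetContext(consumerSettings, Subscriptions.topics("source-topic"))
    .map(record => ProducerMessage.single(new ProducerRecord("target-topic", record.key, record.value)))
    .via(Producer.flowWithContext(producerSettings))
    .toMat(Committer.sinkWithOffsetContext(CommitterSettings(system)))(Keep.both)
    .run()
```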
def plainSink[K, V](settings: ProducerSettings[K, V]): Sink[ProducerRecord[K, V], Future[Done]]

Create a sink for publishing records to Kafka topics.

The Kafka ProducerRecord contains the topic name to which the record is being sent, an optional partition number, and an optional key and value.

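The simplest producer pipeline is a source of ProducerRecords run into this sink; a sketch with an assumed broker address and topic name:

```scala
import akka.Done
import akka.actor.ActorSystem
import akka.kafka.ProducerSettings
import akka.kafka.scaladsl.Producer
import akka.stream.scaladsl.Source
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer
import scala.concurrent.Future

implicit val system: ActorSystem = ActorSystem("producer-example")

val producerSettings =
  ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092") // assumed broker address

// Publish ten records; the Future completes when the stream finishes
val done: Future[Done] =
  Source(1 to 10)
    .map(n => new ProducerRecord[String, String]("example-topic", s"key-$n", s"value-$n"))
    .runWith(Producer.plainSink(producerSettings))
```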

Deprecated methods

def committableSink[K, V](settings: ProducerSettings[K, V]): Sink[Envelope[K, V, Committable], Future[Done]]

Create a sink that is aware of the committable offset from a Consumer.committableSource. It will commit the consumer offset when the message has been published successfully to the topic.

It publishes records to Kafka topics conditionally:

  • Message publishes a single message to its topic, and commits the offset

  • MultiMessage publishes all messages in its records field, and commits the offset

  • PassThroughMessage does not publish anything, but commits the offset

Note that there is a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.

Deprecated
true
def committableSink[K, V](settings: ProducerSettings[K, V], producer: Producer[K, V]): Sink[Envelope[K, V, Committable], Future[Done]]

Create a sink that is aware of the committable offset from a Consumer.committableSource. It will commit the consumer offset when the message has been published successfully to the topic.

It publishes records to Kafka topics conditionally:

  • Message publishes a single message to its topic, and commits the offset

  • MultiMessage publishes all messages in its records field, and commits the offset

  • PassThroughMessage does not publish anything, but commits the offset

Note that there is always a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.

Supports sharing a Kafka Producer instance.

Deprecated
true
def flexiFlow[K, V, PassThrough](settings: ProducerSettings[K, V], producer: Producer[K, V]): Flow[Envelope[K, V, PassThrough], Results[K, V, PassThrough], NotUsed]

Create a flow to conditionally publish records to Kafka topics and then pass it on.

It publishes records to Kafka topics conditionally:

  • Message publishes a single message to its topic, and continues in the stream as Result

  • MultiMessage publishes all messages in its records field, and continues in the stream as MultiResult

  • PassThroughMessage does not publish anything, and continues in the stream as PassThroughResult

The messages can carry arbitrary pass-through data, for example a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.

Supports sharing a Kafka Producer instance.

Deprecated
true
def flow[K, V, PassThrough](settings: ProducerSettings[K, V]): Flow[Message[K, V, PassThrough], Result[K, V, PassThrough], NotUsed]

Create a flow to publish records to Kafka topics and then pass it on.

The records must be wrapped in a Message and continue in the stream as Result.

The messages can carry arbitrary pass-through data, for example a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.

Deprecated
true
def flow[K, V, PassThrough](settings: ProducerSettings[K, V], producer: Producer[K, V]): Flow[Message[K, V, PassThrough], Result[K, V, PassThrough], NotUsed]

Create a flow to publish records to Kafka topics and then pass it on.

The records must be wrapped in a Message and continue in the stream as Result.

The messages can carry arbitrary pass-through data, for example a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.

Supports sharing a Kafka Producer instance.

Deprecated
true
@ApiMayChange(issue = "https://github.com/akka/alpakka-kafka/issues/880")
def flowWithContext[K, V, C](settings: ProducerSettings[K, V], producer: Producer[K, V]): FlowWithContext[Envelope[K, V, NotUsed], C, Results[K, V, C], C, NotUsed]

API MAY CHANGE

Create a flow to conditionally publish records to Kafka topics and then pass it on.

It publishes records to Kafka topics conditionally:

  • Message publishes a single message to its topic, and continues in the stream as Result

  • MultiMessage publishes all messages in its records field, and continues in the stream as MultiResult

  • PassThroughMessage does not publish anything, and continues in the stream as PassThroughResult

This flow is intended to be used with Akka's flow with context.

Supports sharing a Kafka Producer instance.

Type parameters

C

the flow context type

Deprecated
true
def plainSink[K, V](settings: ProducerSettings[K, V], producer: Producer[K, V]): Sink[ProducerRecord[K, V], Future[Done]]

Create a sink for publishing records to Kafka topics.

The Kafka ProducerRecord contains the topic name to which the record is being sent, an optional partition number, and an optional key and value.

Supports sharing a Kafka Producer instance.

Deprecated
true