net.manub.embeddedkafka

object EmbeddedKafka extends EmbeddedKafka with RunningEmbeddedKafkaOps[EmbeddedKafkaConfig, EmbeddedK]

Related Docs: trait EmbeddedKafka | package embeddedkafka

Linear Supertypes
  EmbeddedKafka, RunningEmbeddedKafkaOps[EmbeddedKafkaConfig, EmbeddedK], RunningKafkaOps, RunningZooKeeperOps, ServerStarter, RunningServersOps, EmbeddedKafka, EmbeddedKafkaOps[EmbeddedKafkaConfig, EmbeddedK], KafkaOps, ZooKeeperOps, ProducerOps, ConsumerOps, AdminOps[EmbeddedKafkaConfig], EmbeddedKafkaSupport[EmbeddedKafkaConfig], AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. val adminClientCloseTimeout: FiniteDuration

    Attributes
    protected
    Definition Classes
    AdminOps
  5. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  6. val autoCreateTopics: Boolean

    Attributes
    protected
    Definition Classes
    KafkaOps
  7. val brokerId: Short

    Attributes
    protected
    Definition Classes
    KafkaOps
  8. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  9. def consumeFirstKeyedMessageFrom[K, V](topic: String, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig, keyDeserializer: Deserializer[K], valueDeserializer: Deserializer[V]): (K, V)

    Consumes the first message available in a given topic, deserializing it as type (K, V).

    Only the message that is returned is committed if autoCommit is false. If autoCommit is true then all messages that were polled will be committed.

    topic

    the topic to consume a message from

    autoCommit

    If false, only the offset for the consumed message will be committed. If true, the offset for the last polled message will be committed instead. Defaults to false.

    config

    an implicit EmbeddedKafkaConfig

    keyDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type K

    valueDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type V

    returns

    the first message consumed from the given topic, as a (K, V) pair

    Definition Classes
    ConsumerOps
    Annotations
    @throws( classOf[TimeoutException] ) @throws( classOf[KafkaUnavailableException] )
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

    TimeoutException if unable to consume a message within 5 seconds
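
    Example (a minimal sketch, not part of the generated Scaladoc; the topic name and String serde are illustrative, and an embedded broker is assumed to be running, e.g. via start() or withRunningKafka):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.common.serialization.{Deserializer, StringDeserializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      // A single implicit Deserializer[String] satisfies both the key and the value parameter.
      implicit val stringDeserializer: Deserializer[String] = new StringDeserializer

      // Reads the first available record from "greetings" as a (key, value) tuple.
      val (key, value) = EmbeddedKafka.consumeFirstKeyedMessageFrom[String, String]("greetings")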

  10. def consumeFirstMessageFrom[V](topic: String, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig, valueDeserializer: Deserializer[V]): V

    Consumes the first message available in a given topic, deserializing it as type V.

    Only the message that is returned is committed if autoCommit is false. If autoCommit is true then all messages that were polled will be committed.

    topic

    the topic to consume a message from

    autoCommit

    If false, only the offset for the consumed message will be committed. If true, the offset for the last polled message will be committed instead. Defaults to false.

    config

    an implicit EmbeddedKafkaConfig

    valueDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type V

    returns

    the first message consumed from the given topic, with a type V

    Definition Classes
    ConsumerOps
    Annotations
    @throws( classOf[TimeoutException] ) @throws( classOf[KafkaUnavailableException] )
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

    TimeoutException if unable to consume a message within 5 seconds

  11. def consumeFirstStringMessageFrom(topic: String, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig): String

    Definition Classes
    ConsumerOps
  12. def consumeNumberKeyedMessagesFrom[K, V](topic: String, number: Int, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig, keyDeserializer: Deserializer[K], valueDeserializer: Deserializer[V]): List[(K, V)]

    Definition Classes
    ConsumerOps
  13. def consumeNumberKeyedMessagesFromTopics[K, V](topics: Set[String], number: Int, autoCommit: Boolean = false, timeout: Duration = 5.seconds, resetTimeoutOnEachMessage: Boolean = true)(implicit config: EmbeddedKafkaConfig, keyDeserializer: Deserializer[K], valueDeserializer: Deserializer[V]): Map[String, List[(K, V)]]

    Consumes the first n messages available in given topics, deserializes them as type (K, V), and returns the n messages in a Map from topic name to List[(K, V)].

    Only the messages that are returned are committed if autoCommit is false. If autoCommit is true then all messages that were polled will be committed.

    topics

    the topics to consume messages from

    number

    the number of messages to consume in a batch

    autoCommit

    If false, only the offset for the consumed messages will be committed. If true, the offset for the last polled message will be committed instead. Defaults to false.

    timeout

    the interval to wait for messages before throwing TimeoutException

    resetTimeoutOnEachMessage

    when true, throw TimeoutException if we have a silent period (no incoming messages) for the timeout interval; when false, throw TimeoutException after the timeout interval if we haven't received all of the expected messages

    config

    an implicit EmbeddedKafkaConfig

    keyDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type K

    valueDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type V

    returns

    the List of messages consumed from the given topics, each of type (K, V)

    Definition Classes
    ConsumerOps
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

    TimeoutException if unable to consume messages within specified timeout
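
    Example (a minimal sketch; topic names are illustrative and an embedded broker is assumed to be running):

      import scala.concurrent.duration._
      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.common.serialization.{Deserializer, StringDeserializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      implicit val stringDeserializer: Deserializer[String] = new StringDeserializer

      // Waits up to 10 seconds for 3 keyed records from the two topics,
      // grouped by topic name in the returned Map.
      val consumed: Map[String, List[(String, String)]] =
        EmbeddedKafka.consumeNumberKeyedMessagesFromTopics[String, String](
          topics  = Set("orders", "payments"),
          number  = 3,
          timeout = 10.seconds
        )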

  14. def consumeNumberMessagesFrom[V](topic: String, number: Int, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig, valueDeserializer: Deserializer[V]): List[V]

    Definition Classes
    ConsumerOps
  15. def consumeNumberMessagesFromTopics[V](topics: Set[String], number: Int, autoCommit: Boolean = false, timeout: Duration = 5.seconds, resetTimeoutOnEachMessage: Boolean = true)(implicit config: EmbeddedKafkaConfig, valueDeserializer: Deserializer[V]): Map[String, List[V]]

    Consumes the first n messages available in given topics, deserializes them as type V, and returns the n messages in a Map from topic name to List[V].

    Only the messages that are returned are committed if autoCommit is false. If autoCommit is true then all messages that were polled will be committed.

    topics

    the topics to consume messages from

    number

    the number of messages to consume in a batch

    autoCommit

    If false, only the offset for the consumed messages will be committed. If true, the offset for the last polled message will be committed instead. Defaults to false.

    timeout

    the interval to wait for messages before throwing TimeoutException

    resetTimeoutOnEachMessage

    when true, throw TimeoutException if we have a silent period (no incoming messages) for the timeout interval; when false, throw TimeoutException after the timeout interval if we haven't received all of the expected messages

    config

    an implicit EmbeddedKafkaConfig

    valueDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type V

    returns

    the List of messages consumed from the given topics, each with a type V

    Definition Classes
    ConsumerOps
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

    TimeoutException if unable to consume messages within specified timeout

  16. def consumeNumberStringMessagesFrom(topic: String, number: Int, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig): List[String]

    Definition Classes
    ConsumerOps
  17. val consumerPollingTimeout: FiniteDuration

    Attributes
    protected
    Definition Classes
    ConsumerOps
  18. def createCustomTopic(topic: String, topicConfig: Map[String, String] = Map.empty, partitions: Int = 1, replicationFactor: Int = 1)(implicit config: EmbeddedKafkaConfig): Unit

    Creates a topic with a custom configuration.

    topic

    the topic name

    topicConfig

    per topic configuration Map

    partitions

    number of partitions Int

    replicationFactor

    replication factor Int

    config

    an implicit EmbeddedKafkaConfig

    Definition Classes
    AdminOps
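
    Example (a minimal sketch; the topic name and settings are illustrative and an embedded broker is assumed to be running):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()

      // Creates a compacted topic with 3 partitions on the single embedded broker.
      EmbeddedKafka.createCustomTopic(
        topic             = "users",
        topicConfig       = Map("cleanup.policy" -> "compact"),
        partitions        = 3,
        replicationFactor = 1
      )
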
  19. def deleteTopics(topics: List[String])(implicit config: EmbeddedKafkaConfig): Try[Unit]

    Either deletes or marks for deletion a list of topics.

    topics

    the topic names

    config

    an implicit EmbeddedKafkaConfig

    Definition Classes
    AdminOps
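
    Example (a minimal sketch; topic names are illustrative and an embedded broker is assumed to be running):

      import scala.util.{Failure, Success}
      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()

      // The result is a Try, so a failed deletion can be inspected rather than thrown.
      EmbeddedKafka.deleteTopics(List("users", "orders")) match {
        case Success(_)  => println("topics deleted or marked for deletion")
        case Failure(ex) => println(s"topic deletion failed: ${ex.getMessage}")
      }
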
  20. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  21. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  22. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  23. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  24. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  25. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  26. def isRunning: Boolean

    Returns whether the in-memory servers are running.

    Definition Classes
    EmbeddedKafka → RunningServersOps
  27. val logCleanerDedupeBufferSize: Int

    Attributes
    protected
    Definition Classes
    KafkaOps
  28. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  29. final def notify(): Unit

    Definition Classes
    AnyRef
  30. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  31. val producerPublishTimeout: FiniteDuration

    Attributes
    protected
    Definition Classes
    ProducerOps
  32. def publishStringMessageToKafka(topic: String, message: String)(implicit config: EmbeddedKafkaConfig): Unit

    Publishes synchronously a message of type String to the running Kafka broker.

    topic

    the topic to publish the message to (it will be auto-created)

    message

    the String message to publish

    config

    an implicit EmbeddedKafkaConfig

    Definition Classes
    ProducerOps
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

    See also

    EmbeddedKafka#publishToKafka
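
    Example (a minimal sketch; the topic name and message are illustrative and an embedded broker is assumed to be running):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()

      // Blocks until the broker acknowledges the record; the topic is auto-created if missing.
      EmbeddedKafka.publishStringMessageToKafka("greetings", "hello world")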

  33. def publishToKafka[K, T](topic: String, messages: Seq[(K, T)])(implicit config: EmbeddedKafkaConfig, keySerializer: Serializer[K], serializer: Serializer[T]): Unit

    Publishes synchronously a batch of messages to the running Kafka broker.

    topic

    the topic to publish the messages to (it will be auto-created)

    messages

    the keys and messages, of type (K, T), to publish

    config

    an implicit EmbeddedKafkaConfig

    keySerializer

    an implicit Serializer for the type K

    serializer

    an implicit Serializer for the type T

    Definition Classes
    ProducerOps
    Annotations
    @throws( classOf[KafkaUnavailableException] )
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka
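
    Example (a minimal sketch; topic, keys and values are illustrative and an embedded broker is assumed to be running):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.common.serialization.{Serializer, StringSerializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      // A single implicit Serializer[String] satisfies both the key and the value parameter.
      implicit val stringSerializer: Serializer[String] = new StringSerializer

      // Publishes the whole batch synchronously; keys and values are both Strings here.
      EmbeddedKafka.publishToKafka[String, String]("orders", Seq("o-1" -> "created", "o-2" -> "shipped"))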

  34. def publishToKafka[K, T](topic: String, key: K, message: T)(implicit config: EmbeddedKafkaConfig, keySerializer: Serializer[K], serializer: Serializer[T]): Unit

    Publishes synchronously a message to the running Kafka broker.

    topic

    the topic to publish the message to (it will be auto-created)

    key

    the key of type K to publish

    message

    the message of type T to publish

    config

    an implicit EmbeddedKafkaConfig

    keySerializer

    an implicit Serializer for the type K

    serializer

    an implicit Serializer for the type T

    Definition Classes
    ProducerOps
    Annotations
    @throws( classOf[KafkaUnavailableException] )
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

  35. def publishToKafka[T](producerRecord: ProducerRecord[String, T])(implicit config: EmbeddedKafkaConfig, serializer: Serializer[T]): Unit

    Publishes synchronously a message to the running Kafka broker.

    producerRecord

    the producerRecord of type T to publish

    config

    an implicit EmbeddedKafkaConfig

    serializer

    an implicit Serializer for the type T

    Definition Classes
    ProducerOps
    Annotations
    @throws( classOf[KafkaUnavailableException] )
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka
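
    Example (a minimal sketch; record contents are illustrative and an embedded broker is assumed to be running):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.clients.producer.ProducerRecord
      import org.apache.kafka.common.serialization.{Serializer, StringSerializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      implicit val serializer: Serializer[String] = new StringSerializer

      // Building the ProducerRecord yourself lets you set the key (and, via other
      // constructors, the partition or headers) explicitly.
      val record = new ProducerRecord[String, String]("audit", "user-42", """{"event":"login"}""")
      EmbeddedKafka.publishToKafka(record)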

  36. def publishToKafka[T](topic: String, message: T)(implicit config: EmbeddedKafkaConfig, serializer: Serializer[T]): Unit

    Publishes synchronously a message to the running Kafka broker.

    topic

    the topic to publish the message to (it will be auto-created)

    message

    the message of type T to publish

    config

    an implicit EmbeddedKafkaConfig

    serializer

    an implicit Serializer for the type T

    Definition Classes
    ProducerOps
    Annotations
    @throws( classOf[KafkaUnavailableException] )
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

  37. def start()(implicit config: EmbeddedKafkaConfig): EmbeddedK

    Starts in-memory servers, using temporary directories for storing logs. The log directories will be cleaned after calling the EmbeddedServer.stop() method or on JVM exit, whichever happens earlier.

    config

    an implicit EmbeddedKafkaConfig

    Definition Classes
    EmbeddedKafka → ServerStarter
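
    Example (a minimal sketch; the ports are illustrative):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      // Boots ZooKeeper and a Kafka broker on the chosen ports and keeps them running
      // until stop() is called (or the JVM exits).
      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig(kafkaPort = 7001, zooKeeperPort = 7000)
      val server = EmbeddedKafka.start()

      // ... exercise code against localhost:7001 ...

      EmbeddedKafka.stop()
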
  38. def startKafka(kafkaLogsDir: Path, factory: Option[EmbeddedZ] = None)(implicit config: EmbeddedKafkaConfig): EmbeddedK

    Starts a Kafka broker in memory, storing logs in a specific location.

    kafkaLogsDir

    the path for the Kafka logs

    factory

    an optional EmbeddedZ server

    config

    an implicit EmbeddedKafkaConfig

    returns

    an EmbeddedK server

    Definition Classes
    RunningKafkaOps
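
    Example (a minimal sketch using explicit temporary directories; it also exercises startZooKeeper below):

      import java.nio.file.Files
      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()

      // Start ZooKeeper and Kafka separately, keeping their data in explicit temp directories.
      val zk    = EmbeddedKafka.startZooKeeper(Files.createTempDirectory("zk-logs"))
      val kafka = EmbeddedKafka.startKafka(Files.createTempDirectory("kafka-logs"), factory = Some(zk))

      // ... exercise code ...

      EmbeddedKafka.stop() // stops every running embedded server and deletes the log directories
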
  39. def startKafka(config: EmbeddedKafkaConfig, kafkaLogDir: Path): KafkaServer

    Definition Classes
    KafkaOps
  40. def startZooKeeper(zkLogsDir: Path)(implicit config: EmbeddedKafkaConfig): EmbeddedZ

    Starts a ZooKeeper instance in memory, storing logs in a specific location.

    zkLogsDir

    the path for the ZooKeeper logs

    config

    an implicit EmbeddedKafkaConfig

    returns

    an EmbeddedZ server

    Definition Classes
    RunningZooKeeperOps
  41. def startZooKeeper(zooKeeperPort: Int, zkLogsDir: Path): ServerCnxnFactory

    Definition Classes
    ZooKeeperOps
  42. def stop(server: EmbeddedServer): Unit

    Stops a specific EmbeddedServer instance, and deletes the log directory.

    server

    the EmbeddedServer to be stopped.

    Definition Classes
    RunningServersOps
  43. def stop(): Unit

    Stops all in-memory servers and deletes the log directories.

    Definition Classes
    RunningServersOps
  44. def stopKafka(): Unit

    Stops all in-memory Kafka instances, preserving the log directories.

    Definition Classes
    RunningKafkaOps
  45. def stopZooKeeper(): Unit

    Stops all in-memory ZooKeeper instances, preserving the log directories.

    Definition Classes
    RunningZooKeeperOps
  46. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  47. def toString(): String

    Definition Classes
    AnyRef → Any
  48. val topicCreationTimeout: FiniteDuration

    Attributes
    protected
    Definition Classes
    AdminOps
  49. val topicDeletionTimeout: FiniteDuration

    Attributes
    protected
    Definition Classes
    AdminOps
  50. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  51. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  52. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  53. def withAdminClient[T](body: (AdminClient) ⇒ T)(implicit config: EmbeddedKafkaConfig): Try[T]

    Creates an AdminClient, then executes the body passed as a parameter.

    body

    the function to execute

    config

    an implicit EmbeddedKafkaConfig

    Attributes
    protected
    Definition Classes
    AdminOps
  54. def withConsumer[K, V, T](body: (KafkaConsumer[K, V]) ⇒ T)(implicit config: EmbeddedKafkaConfig, keyDeserializer: Deserializer[K], valueDeserializer: Deserializer[V]): T

    Loaner pattern that allows running a code block with a newly created consumer. The consumer's lifecycle will be automatically handled and closed at the end of the given code block.

    body

    the function to execute that returns T

    config

    an implicit EmbeddedKafkaConfig

    keyDeserializer

    an implicit Deserializer for the type K

    valueDeserializer

    an implicit Deserializer for the type V

    Definition Classes
    ConsumerOps
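
    Example (a minimal sketch; the topic name is illustrative, an embedded broker is assumed to be running, and subscribing/polling is left to the caller):

      import java.time.Duration
      import java.util.Collections
      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.common.serialization.{Deserializer, StringDeserializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      implicit val stringDeserializer: Deserializer[String] = new StringDeserializer

      // The consumer is created for the block and closed when it returns.
      val firstValue: Option[String] =
        EmbeddedKafka.withConsumer[String, String, Option[String]] { consumer =>
          consumer.subscribe(Collections.singletonList("greetings"))
          val records = consumer.poll(Duration.ofSeconds(5)).iterator()
          if (records.hasNext) Some(records.next().value()) else None
        }
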
  55. def withProducer[K, V, T](body: (KafkaProducer[K, V]) ⇒ T)(implicit config: EmbeddedKafkaConfig, keySerializer: Serializer[K], valueSerializer: Serializer[V]): T

    Loaner pattern that allows running a code block with a newly created producer. The producer's lifecycle will be automatically handled and closed at the end of the given code block.

    body

    the function to execute that returns T

    config

    an implicit EmbeddedKafkaConfig

    keySerializer

    an implicit Serializer for the type K

    valueSerializer

    an implicit Serializer for the type V

    Definition Classes
    ProducerOps
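
    Example (a minimal sketch; the topic name is illustrative and an embedded broker is assumed to be running):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.clients.producer.ProducerRecord
      import org.apache.kafka.common.serialization.{Serializer, StringSerializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      implicit val stringSerializer: Serializer[String] = new StringSerializer

      // The producer is created for the block and closed (flushing pending sends) when it returns.
      EmbeddedKafka.withProducer[String, String, Unit] { producer =>
        producer.send(new ProducerRecord[String, String]("greetings", "key-1", "hello"))
        ()
      }
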
  56. def withRunningKafka[T](body: ⇒ T)(implicit config: EmbeddedKafkaConfig): T

    Starts a ZooKeeper instance and a Kafka broker (and performs additional logic, if any), then executes the body passed as a parameter.

    body

    the function to execute

    config

    an implicit EmbeddedKafkaConfig

    Definition Classes
    EmbeddedKafkaSupport
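
    Example (a minimal sketch; the topic name and message are illustrative):

      import net.manub.embeddedkafka.EmbeddedKafka._
      import net.manub.embeddedkafka.EmbeddedKafkaConfig

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()

      // ZooKeeper and Kafka are started before the block runs and stopped when it finishes.
      val roundTrip: String = withRunningKafka {
        publishStringMessageToKafka("greetings", "ping")
        consumeFirstStringMessageFrom("greetings")   // returns "ping"
      }
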
  57. def withRunningKafkaOnFoundPort[T](config: EmbeddedKafkaConfig)(body: (EmbeddedKafkaConfig) ⇒ T): T

    Starts a ZooKeeper instance and a Kafka broker (and performs additional logic, if any), then executes the body passed as a parameter. The actual ports of the servers will be detected and inserted into a copied version of the EmbeddedKafkaConfig that gets passed to body. This is useful if you set any port to 0, which will listen on an arbitrary available port.

    config

    the user-defined EmbeddedKafkaConfig

    body

    the function to execute, given an EmbeddedKafkaConfig with the actual ports the servers are running on

    Definition Classes
    EmbeddedKafkaSupport
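
    Example (a minimal sketch; the topic name and message are illustrative):

      import net.manub.embeddedkafka.EmbeddedKafka._
      import net.manub.embeddedkafka.EmbeddedKafkaConfig

      // Port 0 asks the OS for any free port; the config handed to the block carries the real ports.
      val userDefinedConfig = EmbeddedKafkaConfig(kafkaPort = 0, zooKeeperPort = 0)

      withRunningKafkaOnFoundPort(userDefinedConfig) { implicit actualConfig =>
        // actualConfig.kafkaPort now holds the port the broker actually bound to
        publishStringMessageToKafka("greetings", "ping")
      }
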
  58. val zkConnectionTimeout: FiniteDuration

    Attributes
    protected
    Definition Classes
    KafkaOps
  59. val zkConnectionTimeoutMs: Int

    Definition Classes
    AdminOps
  60. val zkSessionTimeoutMs: Int

    Definition Classes
    AdminOps

Deprecated Value Members

  1. object aKafkaProducer

    Definition Classes
    ProducerOps
    Annotations
    @deprecated
    Deprecated

    (Since version 2.4.1) Direct usage of KafkaProducer is discouraged, see loan method withProducer

  2. def kafkaConsumer[K, T](implicit config: EmbeddedKafkaConfig, keyDeserializer: Deserializer[K], deserializer: Deserializer[T]): KafkaConsumer[K, T]

    Definition Classes
    ConsumerOps
    Annotations
    @deprecated
    Deprecated

    (Since version 2.4.1) Direct usage of KafkaConsumer is discouraged, see loan method withConsumer

  3. def kafkaProducer[K, T](topic: String, key: K, message: T)(implicit config: EmbeddedKafkaConfig, keySerializer: Serializer[K], serializer: Serializer[T]): KafkaProducer[K, T]

    Definition Classes
    ProducerOps
    Annotations
    @deprecated
    Deprecated

    (Since version 2.4.1) Direct usage of KafkaProducer is discouraged, see loan method withProducer

Inherited from RunningEmbeddedKafkaOps[EmbeddedKafkaConfig, EmbeddedK]

Inherited from RunningKafkaOps

Inherited from RunningZooKeeperOps

Inherited from RunningServersOps

Inherited from EmbeddedKafka

Inherited from EmbeddedKafkaOps[EmbeddedKafkaConfig, EmbeddedK]

Inherited from KafkaOps

Inherited from ZooKeeperOps

Inherited from AdminOps[EmbeddedKafkaConfig]

Inherited from EmbeddedKafkaSupport[EmbeddedKafkaConfig]

Inherited from AnyRef

Inherited from Any
