
net.manub.embeddedkafka

EmbeddedKafka

Related Docs: trait EmbeddedKafka | package embeddedkafka


object EmbeddedKafka extends EmbeddedKafkaSupport

Linear Supertypes
EmbeddedKafkaSupport, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. object aKafkaProducer

    Definition Classes
    EmbeddedKafkaSupport
  5. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  6. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  7. def consumeFirstKeyedMessageFrom[K, V](topic: String, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig, keyDeserializer: Deserializer[K], valueDeserializer: Deserializer[V]): (K, V)

    Consumes the first message available in a given topic, deserializing it as type (K, V).

    Only the message that is returned is committed if autoCommit is false. If autoCommit is true then all messages that were polled will be committed.

    topic

    the topic to consume a message from

    autoCommit

    If false, only the offset for the consumed message will be committed; if true, the offset for the last polled message will be committed instead. Defaults to false.

    config

    an implicit EmbeddedKafkaConfig

    keyDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type K

    valueDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type V

    returns

    the first message consumed from the given topic, of type (K, V)

    Definition Classes
    EmbeddedKafkaSupport
    Annotations
    @throws( classOf[TimeoutException] ) @throws( classOf[KafkaUnavailableException] )
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

    TimeoutException if unable to consume a message within 5 seconds
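
    For example, a minimal sketch assuming String keys and values, an already-running broker, and an illustrative "users" topic:

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.common.serialization.{Deserializer, StringDeserializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      // A single implicit Deserializer[String] serves both the key and the value.
      implicit val stringDeserializer: Deserializer[String] = new StringDeserializer

      // Consumes (and commits) only the first keyed message found on the topic.
      val (key, value) = EmbeddedKafka.consumeFirstKeyedMessageFrom[String, String]("users")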

  8. def consumeFirstMessageFrom[V](topic: String, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig, valueDeserializer: Deserializer[V]): V

    Consumes the first message available in a given topic, deserializing it as type V.

    Only the message that is returned is committed if autoCommit is false. If autoCommit is true then all messages that were polled will be committed.

    topic

    the topic to consume a message from

    autoCommit

    If false, only the offset for the consumed message will be committed; if true, the offset for the last polled message will be committed instead. Defaults to false.

    config

    an implicit EmbeddedKafkaConfig

    valueDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type V

    returns

    the first message consumed from the given topic, of type V

    Definition Classes
    EmbeddedKafkaSupport
    Annotations
    @throws( classOf[TimeoutException] ) @throws( classOf[KafkaUnavailableException] )
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

    TimeoutException if unable to consume a message within 5 seconds
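
    For example, a minimal sketch assuming String values and an illustrative "greetings" topic:

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.common.serialization.{Deserializer, StringDeserializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      implicit val stringDeserializer: Deserializer[String] = new StringDeserializer

      // Consumes (and commits) only the first message found on the topic.
      val first: String = EmbeddedKafka.consumeFirstMessageFrom[String]("greetings")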

  9. def consumeFirstStringMessageFrom(topic: String, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig): String

    Definition Classes
    EmbeddedKafkaSupport
  10. def consumeNumberKeyedMessagesFrom[K, V](topic: String, number: Int, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig, keyDeserializer: Deserializer[K], valueDeserializer: Deserializer[V]): List[(K, V)]

    Definition Classes
    EmbeddedKafkaSupport
  11. def consumeNumberKeyedMessagesFromTopics[K, V](topics: Set[String], number: Int, autoCommit: Boolean = false, timeout: Duration = 5.seconds, resetTimeoutOnEachMessage: Boolean = true)(implicit config: EmbeddedKafkaConfig, keyDeserializer: Deserializer[K], valueDeserializer: Deserializer[V]): Map[String, List[(K, V)]]

    Consumes the first n messages available in given topics, deserializes them as type (K, V), and returns the n messages in a Map from topic name to List[(K, V)].

    Only the messages that are returned are committed if autoCommit is false. If autoCommit is true then all messages that were polled will be committed.

    topics

    the topics to consume messages from

    number

    the number of messages to consume in a batch

    autoCommit

    If false, only the offsets for the consumed messages will be committed; if true, the offset for the last polled message will be committed instead. Defaults to false.

    timeout

    the interval to wait for messages before throwing TimeoutException

    resetTimeoutOnEachMessage

    when true, throw TimeoutException if we have a silent period (no incoming messages) for the timeout interval; when false, throw TimeoutException after the timeout interval if we haven't received all of the expected messages

    config

    an implicit EmbeddedKafkaConfig

    keyDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type K

    valueDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type V

    returns

    the List of messages consumed from the given topics, each of type (K, V)

    Definition Classes
    EmbeddedKafkaSupport
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

    TimeoutException if unable to consume messages within specified timeout
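
    For example, a sketch assuming String keys and values and illustrative topic names:

      import scala.concurrent.duration._
      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.common.serialization.{Deserializer, StringDeserializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      implicit val stringDeserializer: Deserializer[String] = new StringDeserializer

      // Gathers 3 keyed messages from the two topics, grouped by topic name, waiting
      // at most 10 seconds between messages (resetTimeoutOnEachMessage defaults to true).
      val byTopic: Map[String, List[(String, String)]] =
        EmbeddedKafka.consumeNumberKeyedMessagesFromTopics[String, String](
          topics = Set("clicks", "views"),
          number = 3,
          timeout = 10.seconds
        )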

  12. def consumeNumberMessagesFrom[V](topic: String, number: Int, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig, valueDeserializer: Deserializer[V]): List[V]

    Definition Classes
    EmbeddedKafkaSupport
  13. def consumeNumberMessagesFromTopics[V](topics: Set[String], number: Int, autoCommit: Boolean = false, timeout: Duration = 5.seconds, resetTimeoutOnEachMessage: Boolean = true)(implicit config: EmbeddedKafkaConfig, valueDeserializer: Deserializer[V]): Map[String, List[V]]

    Consumes the first n messages available in given topics, deserializes them as type V, and returns the n messages in a Map from topic name to List[V].

    Only the messages that are returned are committed if autoCommit is false. If autoCommit is true then all messages that were polled will be committed.

    topics

    the topics to consume messages from

    number

    the number of messages to consume in a batch

    autoCommit

    If false, only the offsets for the consumed messages will be committed; if true, the offset for the last polled message will be committed instead. Defaults to false.

    timeout

    the interval to wait for messages before throwing TimeoutException

    resetTimeoutOnEachMessage

    when true, throw TimeoutException if we have a silent period (no incoming messages) for the timeout interval; when false, throw TimeoutException after the timeout interval if we haven't received all of the expected messages

    config

    an implicit EmbeddedKafkaConfig

    valueDeserializer

    an implicit org.apache.kafka.common.serialization.Deserializer for the type V

    returns

    the List of messages consumed from the given topics, each of type V

    Definition Classes
    EmbeddedKafkaSupport
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

    TimeoutException if unable to consume messages within specified timeout
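
    For example, a sketch assuming String values and illustrative topic names:

      import scala.concurrent.duration._
      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.common.serialization.{Deserializer, StringDeserializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      implicit val stringDeserializer: Deserializer[String] = new StringDeserializer

      // Gathers 5 messages from the two topics, failing with a TimeoutException
      // if they have not all arrived within 10 seconds overall.
      val messages: Map[String, List[String]] =
        EmbeddedKafka.consumeNumberMessagesFromTopics[String](
          topics = Set("events", "audit"),
          number = 5,
          timeout = 10.seconds,
          resetTimeoutOnEachMessage = false
        )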

  14. def consumeNumberStringMessagesFrom(topic: String, number: Int, autoCommit: Boolean = false)(implicit config: EmbeddedKafkaConfig): List[String]

    Definition Classes
    EmbeddedKafkaSupport
  15. def createCustomTopic(topic: String, topicConfig: Map[String, String] = Map.empty, partitions: Int = 1, replicationFactor: Int = 1)(implicit config: EmbeddedKafkaConfig): Unit

    Creates a topic with a custom configuration.

    topic

    the topic name

    topicConfig

    the per-topic configuration Map

    partitions

    the number of partitions

    replicationFactor

    the replication factor

    config

    an implicit EmbeddedKafkaConfig

    Definition Classes
    EmbeddedKafkaSupport
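
    For example, a sketch creating a compacted topic (the topic name and settings are illustrative):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()

      // Creates "user-snapshots" with 3 partitions and log compaction enabled.
      EmbeddedKafka.createCustomTopic(
        topic = "user-snapshots",
        topicConfig = Map("cleanup.policy" -> "compact"),
        partitions = 3,
        replicationFactor = 1
      )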
  16. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  18. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  19. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  20. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  21. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  22. def isRunning: Boolean

    Returns whether the in-memory Kafka broker and ZooKeeper instance are running.

  23. def kafkaConsumer[K, T](implicit config: EmbeddedKafkaConfig, keyDeserializer: Deserializer[K], deserializer: Deserializer[T]): KafkaConsumer[K, T]

    Definition Classes
    EmbeddedKafkaSupport
  24. def kafkaProducer[K, T](topic: String, key: K, message: T)(implicit config: EmbeddedKafkaConfig, keySerializer: Serializer[K], serializer: Serializer[T]): KafkaProducer[K, T]

    Definition Classes
    EmbeddedKafkaSupport
  25. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  26. final def notify(): Unit

    Definition Classes
    AnyRef
  27. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  28. def publishStringMessageToKafka(topic: String, message: String)(implicit config: EmbeddedKafkaConfig): Unit

    Synchronously publishes a message of type String to the running Kafka broker.

    topic

    the topic to publish the message to (it will be auto-created)

    message

    the String message to publish

    config

    an implicit EmbeddedKafkaConfig

    Definition Classes
    EmbeddedKafkaSupport
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka

    See also

    EmbeddedKafka#publishToKafka
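
    For example (topic name and message are illustrative):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()

      // The "greetings" topic is auto-created if it does not exist yet.
      EmbeddedKafka.publishStringMessageToKafka("greetings", "hello world")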

  29. def publishToKafka[K, T](topic: String, messages: Seq[(K, T)])(implicit config: EmbeddedKafkaConfig, keySerializer: Serializer[K], serializer: Serializer[T]): Unit

    Synchronously publishes a batch of messages to the running Kafka broker.

    topic

    the topic to publish the messages to (it will be auto-created)

    messages

    the key/message pairs of type (K, T) to publish

    config

    an implicit EmbeddedKafkaConfig

    keySerializer

    an implicit Serializer for the type K

    serializer

    an implicit Serializer for the type T

    Definition Classes
    EmbeddedKafkaSupport
    Annotations
    @throws( classOf[KafkaUnavailableException] )
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka
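
    For example, a sketch assuming String keys and values (topic name and payloads are illustrative):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.common.serialization.{Serializer, StringSerializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      // A single implicit Serializer[String] serves both the keys and the values.
      implicit val stringSerializer: Serializer[String] = new StringSerializer

      // Publishes three keyed messages in a single synchronous call.
      EmbeddedKafka.publishToKafka(
        "users",
        Seq("id-1" -> "alice", "id-2" -> "bob", "id-3" -> "carol")
      )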

  30. def publishToKafka[K, T](topic: String, key: K, message: T)(implicit config: EmbeddedKafkaConfig, keySerializer: Serializer[K], serializer: Serializer[T]): Unit

    Synchronously publishes a message to the running Kafka broker.

    topic

    the topic to publish the message to (it will be auto-created)

    key

    the key of type K to publish

    message

    the message of type T to publish

    config

    an implicit EmbeddedKafkaConfig

    serializer

    an implicit Serializer for the type T

    Definition Classes
    EmbeddedKafkaSupport
    Annotations
    @throws( classOf[KafkaUnavailableException] )
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka
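
    For example, a sketch assuming String keys and values (names are illustrative):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.common.serialization.{Serializer, StringSerializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      implicit val stringSerializer: Serializer[String] = new StringSerializer

      // Publishes one message under the key "id-42" to the auto-created "users" topic.
      EmbeddedKafka.publishToKafka("users", "id-42", "dave")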

  31. def publishToKafka[T](topic: String, message: T)(implicit config: EmbeddedKafkaConfig, serializer: Serializer[T]): Unit

    Synchronously publishes a message to the running Kafka broker.

    topic

    the topic to publish the message to (it will be auto-created)

    message

    the message of type T to publish

    config

    an implicit EmbeddedKafkaConfig

    serializer

    an implicit Serializer for the type T

    Definition Classes
    EmbeddedKafkaSupport
    Annotations
    @throws( classOf[KafkaUnavailableException] )
    Exceptions thrown

    KafkaUnavailableException if unable to connect to Kafka
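
    For example, a sketch assuming String values (topic name and message are illustrative):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
      import org.apache.kafka.common.serialization.{Serializer, StringSerializer}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()
      implicit val stringSerializer: Serializer[String] = new StringSerializer

      // Publishes a single un-keyed message to the auto-created "greetings" topic.
      EmbeddedKafka.publishToKafka("greetings", "hello world")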

  32. def start()(implicit config: EmbeddedKafkaConfig): Unit

    Starts a ZooKeeper instance and a Kafka broker in memory, using temporary directories for storing logs. The log directories will be cleaned after calling the stop() method or on JVM exit, whichever happens earlier.

    config

    an implicit EmbeddedKafkaConfig
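
    For example, a sketch of a full start/stop life cycle (the ports are illustrative and must be free):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      implicit val config: EmbeddedKafkaConfig =
        EmbeddedKafkaConfig(kafkaPort = 6001, zooKeeperPort = 6000)

      EmbeddedKafka.start()
      try {
        assert(EmbeddedKafka.isRunning)
        // ... publish and consume test messages here ...
      } finally {
        EmbeddedKafka.stop() // also deletes the temporary log directories
      }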

  33. def startKafka(kafkaLogDir: Directory)(implicit config: EmbeddedKafkaConfig): Unit

    Starts a Kafka broker in memory, storing logs in a specific location.

    kafkaLogDir

    the path for the Kafka logs

    config

    an implicit EmbeddedKafkaConfig
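
    For example, a sketch assuming ZooKeeper has already been started (e.g. via startZooKeeper) and that the expected Directory type is scala.reflect.io.Directory:

      import scala.reflect.io.Directory
      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()

      // Starts only the Kafka broker, keeping its logs in a dedicated temporary directory.
      val kafkaLogDir = Directory.makeTemp("kafka-logs")
      EmbeddedKafka.startKafka(kafkaLogDir)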

  34. def startKafka(config: EmbeddedKafkaConfig, kafkaLogDir: Directory): KafkaServer

    Definition Classes
    EmbeddedKafkaSupport
  35. def startZooKeeper(zkLogsDir: Directory)(implicit config: EmbeddedKafkaConfig): Unit

    Starts a ZooKeeper instance in memory, storing logs in a specific location.

    zkLogsDir

    the path for the ZooKeeper logs

    config

    an implicit EmbeddedKafkaConfig
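
    For example, a sketch assuming the expected Directory type is scala.reflect.io.Directory:

      import scala.reflect.io.Directory
      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()

      // Starts only ZooKeeper, keeping its logs in a dedicated temporary directory.
      val zkLogsDir = Directory.makeTemp("zookeeper-logs")
      EmbeddedKafka.startZooKeeper(zkLogsDir)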

  36. def startZooKeeper(zooKeeperPort: Int, zkLogsDir: Directory): ServerCnxnFactory

    Definition Classes
    EmbeddedKafkaSupport
  37. def stop(): Unit

    Stops the in-memory ZooKeeper instance and Kafka broker, and deletes the log directories.

  38. def stopKafka(): Unit

    Stops the in-memory Kafka instance, preserving the logs directory.

  39. def stopZooKeeper(): Unit

    Stops the in-memory ZooKeeper instance, preserving the logs directory.

  40. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  41. def toString(): String

    Definition Classes
    AnyRef → Any
  42. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  43. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  44. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  45. def withRunningKafka[T](body: ⇒ T)(implicit config: EmbeddedKafkaConfig): T

    Starts a ZooKeeper instance and a Kafka broker, then executes the body passed as a parameter.

    body

    the function to execute

    config

    an implicit EmbeddedKafkaConfig

    Definition Classes
    EmbeddedKafkaSupport
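
    For example, a round-trip sketch with an illustrative topic name (both servers are stopped when the block completes):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      implicit val config: EmbeddedKafkaConfig = EmbeddedKafkaConfig()

      val roundTrip: String = EmbeddedKafka.withRunningKafka {
        EmbeddedKafka.publishStringMessageToKafka("greetings", "hello")
        EmbeddedKafka.consumeFirstStringMessageFrom("greetings")
      }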
  46. def withRunningKafkaOnFoundPort[T](config: EmbeddedKafkaConfig)(body: (EmbeddedKafkaConfig) ⇒ T): T

    Starts a ZooKeeper instance and a Kafka broker, then executes the body passed as a parameter. The actual ZooKeeper and Kafka ports will be detected and inserted into a copied version of the EmbeddedKafkaConfig that gets passed to body. This is useful if you set either or both ports to 0, in which case an arbitrary available port will be chosen.

    config

    the user-defined EmbeddedKafkaConfig

    body

    the function to execute, given an EmbeddedKafkaConfig with the actual ports Kafka and ZooKeeper are running on

    Definition Classes
    EmbeddedKafkaSupport
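
    For example, a sketch that lets the operating system pick both ports (topic name is illustrative):

      import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}

      // Port 0 requests an arbitrary free port; the actual ports end up in the
      // config passed to the body.
      val userConfig = EmbeddedKafkaConfig(kafkaPort = 0, zooKeeperPort = 0)

      val echoed: String = EmbeddedKafka.withRunningKafkaOnFoundPort(userConfig) { actualConfig =>
        EmbeddedKafka.publishStringMessageToKafka("greetings", "hello")(actualConfig)
        EmbeddedKafka.consumeFirstStringMessageFrom("greetings")(actualConfig)
      }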
  47. val zkConnectionTimeoutMs: Int

    Definition Classes
    EmbeddedKafkaSupport
  48. val zkSecurityEnabled: Boolean

    Definition Classes
    EmbeddedKafkaSupport
  49. val zkSessionTimeoutMs: Int

    Definition Classes
    EmbeddedKafkaSupport
