All Classes and Interfaces

Can be used for forcing an asynchronous manual offset commit when using the Kafka consumer.
Factory to create new Kafka Consumer and Kafka Producer instances.
A fatal exception thrown when the Kafka consumer cannot be created or cannot subscribe to the Kafka brokers within the given backoff period, causing camel-kafka to give up and terminate the Kafka consumer thread; the consumer will not try to recover.
Kafka consumer readiness health-check
Send and receive messages to/from an Apache Kafka broker.
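As an illustration of the endpoint described above, a minimal route sketch using the camel-kafka URI format; the topic name, broker address and group id are placeholder values:

    import org.apache.camel.builder.RouteBuilder;

    public class KafkaRoutes extends RouteBuilder {
        @Override
        public void configure() {
            // consume from a topic (topic, broker and group id are illustrative values)
            from("kafka:my-topic?brokers=localhost:9092&groupId=my-group")
                .log("Received: ${body}");

            // produce to a topic
            from("direct:send")
                .to("kafka:my-topic?brokers=localhost:9092");
        }
    }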
Deserializer for Kafka header values.
Serializer for Kafka header values.
A Kafka topic-based implementation of IdempotentRepository.
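A minimal sketch of using this repository with the idempotentConsumer EIP, assuming the two-argument constructor taking a topic name and the bootstrap servers; the topic, broker address and header name are placeholders:

    import org.apache.camel.builder.RouteBuilder;
    import org.apache.camel.processor.idempotent.kafka.KafkaIdempotentRepository;

    public class IdempotentRoute extends RouteBuilder {
        @Override
        public void configure() {
            // the repository persists seen keys in a Kafka topic (values are placeholders)
            KafkaIdempotentRepository repository =
                new KafkaIdempotentRepository("idempotent-ids", "localhost:9092");

            from("direct:in")
                .idempotentConsumer(header("messageId"), repository)
                .to("mock:out");
        }
    }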
Can be used for forcing a manual offset commit when using the Kafka consumer.
Factory to create a new KafkaManualCommit to store on the Exchange.
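A minimal sketch of how the manual commit described in the two entries above is typically obtained and invoked, assuming the allowManualCommit and autoCommitEnable endpoint options and the KafkaConstants.MANUAL_COMMIT header; package names and the exact commit behaviour (synchronous vs. asynchronous) vary between Camel versions, and the topic, broker and group values are placeholders:

    import org.apache.camel.builder.RouteBuilder;
    import org.apache.camel.component.kafka.KafkaConstants;
    import org.apache.camel.component.kafka.consumer.KafkaManualCommit;

    public class ManualCommitRoute extends RouteBuilder {
        @Override
        public void configure() {
            // allowManualCommit=true stores a KafkaManualCommit instance on the exchange
            from("kafka:my-topic?brokers=localhost:9092&groupId=my-group"
                    + "&allowManualCommit=true&autoCommitEnable=false")
                .process(exchange -> {
                    KafkaManualCommit manual = exchange.getMessage()
                        .getHeader(KafkaConstants.MANUAL_COMMIT, KafkaManualCommit.class);
                    if (manual != null) {
                        manual.commit(); // force the offset commit for this record
                    }
                });
        }
    }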
A holder class for the Camel exchange-related payload, such as the exchange itself, the consumer, the thread ID, etc.
A holder class for the payload related to the Kafka record, such as partition and topic information.
Kafka producer readiness health-check
Base interface for resume strategies that publish the offsets to a Kafka topic.
A configuration suitable for use with the KafkaResumeStrategy and any of its implementations.
A configuration builder appropriate for building configurations for the SingleNodeKafkaResumeStrategy.
Kafka-based SendDynamicAware that allows optimising the Kafka component with the toD (dynamic to) DSL in Camel.
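For the toD optimisation mentioned above, a minimal sketch in which the target topic is resolved from a header at runtime; the header name and broker address are placeholders:

    import org.apache.camel.builder.RouteBuilder;

    public class DynamicKafkaRoute extends RouteBuilder {
        @Override
        public void configure() {
            // the topic is chosen per message; SendDynamicAware lets Camel reuse a single
            // pre-configured producer instead of creating one endpoint per topic
            from("direct:publish")
                .toD("kafka:${header.targetTopic}?brokers=localhost:9092");
        }
    }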
A NO-OP resume strategy that does nothing (i.e. no resume).
A resume strategy that uses Kafka's offset for resuming.
Defines adapters for handling resume operations.
Strategy to decide how to handle a Kafka exception thrown during polling.
DISCARD will discard the message and continue to poll the next message.
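A minimal sketch of selecting one of these reactions via the pollOnError endpoint option (other values include ERROR_HANDLER, RECONNECT, RETRY and STOP); the topic and broker values are placeholders:

    import org.apache.camel.builder.RouteBuilder;

    public class PollErrorRoute extends RouteBuilder {
        @Override
        public void configure() {
            // DISCARD skips the record that caused the poll exception and keeps polling
            from("kafka:my-topic?brokers=localhost:9092&pollOnError=DISCARD")
                .log("Received: ${body}");
        }
    }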
Used to provide individual Kafka header values if the "batchWithIndividualHeaders" feature is enabled.
Contains the error details when failing to produce records
BEGINNING configures the consumer to consume from the beginning of the topic/partition.
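A minimal sketch of applying this policy through the seekTo endpoint option (END would skip to the latest offset instead); the topic and broker values are placeholders, and the accepted value spelling may differ between Camel versions:

    import org.apache.camel.builder.RouteBuilder;

    public class SeekToBeginningRoute extends RouteBuilder {
        @Override
        public void configure() {
            // rewind to the start of each assigned partition on startup
            from("kafka:my-topic?brokers=localhost:9092&seekTo=BEGINNING")
                .log("Replayed: ${body}");
        }
    }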
A resume strategy that uses Camel's seekTo configuration for resuming
A resume strategy that publishes offsets to a Kafka topic.