Class KafkaComponent

  • All Implemented Interfaces:
    AutoCloseable, org.apache.camel.CamelContextAware, org.apache.camel.Component, org.apache.camel.Service, org.apache.camel.ShutdownableService, org.apache.camel.SSLContextParametersAware, org.apache.camel.StatefulService, org.apache.camel.SuspendableService

    @Component("kafka")
    public class KafkaComponent
    extends org.apache.camel.support.DefaultComponent
    implements org.apache.camel.SSLContextParametersAware
    • Field Summary

      • Fields inherited from class org.apache.camel.support.service.BaseService

        BUILT, FAILED, INITIALIZED, INITIALIZING, lock, NEW, SHUTDOWN, SHUTTING_DOWN, STARTED, STARTING, status, STOPPED, STOPPING, SUSPENDED, SUSPENDING
    • Constructor Detail

      • KafkaComponent

        public KafkaComponent()
      • KafkaComponent

        public KafkaComponent​(org.apache.camel.CamelContext context)
    • Method Detail

      • setConfiguration

        public void setConfiguration​(KafkaConfiguration configuration)
        Allows pre-configuring the Kafka component with common options that the endpoints will reuse.
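        A minimal sketch of pre-configuring the component with a shared KafkaConfiguration; the broker list and group id are illustrative values, and the KafkaConfiguration setters shown are assumed from the standard Kafka component options:

        import org.apache.camel.CamelContext;
        import org.apache.camel.component.kafka.KafkaComponent;
        import org.apache.camel.component.kafka.KafkaConfiguration;
        import org.apache.camel.impl.DefaultCamelContext;

        public class KafkaComponentConfigurationExample {
            public static void main(String[] args) throws Exception {
                CamelContext context = new DefaultCamelContext();

                // Common options that every kafka: endpoint created by this component will reuse
                KafkaConfiguration configuration = new KafkaConfiguration();
                configuration.setBrokers("broker1:9092,broker2:9092"); // illustrative broker list
                configuration.setGroupId("my-consumer-group");         // illustrative group id

                KafkaComponent kafka = new KafkaComponent(context);
                kafka.setConfiguration(configuration);

                // Register under the "kafka" scheme so kafka: endpoints pick up the shared options
                context.addComponent("kafka", kafka);
            }
        }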
      • isUseGlobalSslContextParameters

        public boolean isUseGlobalSslContextParameters()
        Specified by:
        isUseGlobalSslContextParameters in interface org.apache.camel.SSLContextParametersAware
      • setUseGlobalSslContextParameters

        public void setUseGlobalSslContextParameters​(boolean useGlobalSslContextParameters)
        Enable usage of global SSL context parameters.
        Specified by:
        setUseGlobalSslContextParameters in interface org.apache.camel.SSLContextParametersAware
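        A minimal sketch, assuming the global SSL context parameters are registered on the CamelContext; the truststore resource and password are illustrative values:

        import org.apache.camel.CamelContext;
        import org.apache.camel.component.kafka.KafkaComponent;
        import org.apache.camel.impl.DefaultCamelContext;
        import org.apache.camel.support.jsse.KeyStoreParameters;
        import org.apache.camel.support.jsse.SSLContextParameters;
        import org.apache.camel.support.jsse.TrustManagersParameters;

        public class KafkaGlobalSslExample {
            public static void main(String[] args) throws Exception {
                CamelContext context = new DefaultCamelContext();

                // Global SSL configuration registered on the CamelContext (values are illustrative)
                KeyStoreParameters truststore = new KeyStoreParameters();
                truststore.setResource("truststore.jks");
                truststore.setPassword("changeit");

                TrustManagersParameters trustManagers = new TrustManagersParameters();
                trustManagers.setKeyStore(truststore);

                SSLContextParameters sslContextParameters = new SSLContextParameters();
                sslContextParameters.setTrustManagers(trustManagers);

                context.setSSLContextParameters(sslContextParameters);

                // Opt the Kafka component into the globally configured SSL context parameters
                KafkaComponent kafka = new KafkaComponent(context);
                kafka.setUseGlobalSslContextParameters(true);
                context.addComponent("kafka", kafka);
            }
        }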
      • setKafkaManualCommitFactory

        public void setKafkaManualCommitFactory​(KafkaManualCommitFactory kafkaManualCommitFactory)
        Factory to use for creating KafkaManualCommit instances. This allows plugging in a custom factory that creates custom KafkaManualCommit instances, in case manual commits require special logic that deviates from the default out-of-the-box implementation.
      • setKafkaClientFactory

        public void setKafkaClientFactory​(KafkaClientFactory kafkaClientFactory)
        Factory to use for creating KafkaConsumer and KafkaProducer instances. This allows configuring a custom factory that creates instances with logic extending the vanilla Kafka clients.
      • setPollExceptionStrategy

        public void setPollExceptionStrategy​(PollExceptionStrategy pollExceptionStrategy)
        To use a custom strategy with the consumer to control how to handle exceptions thrown from the Kafka broker while polling messages.
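        A minimal wiring sketch covering setKafkaManualCommitFactory, setKafkaClientFactory, and setPollExceptionStrategy; the custom implementations are assumed to be built elsewhere in the application, and the import package names assume these interfaces live alongside KafkaComponent (their locations can differ between Camel versions):

        import org.apache.camel.component.kafka.KafkaClientFactory;
        import org.apache.camel.component.kafka.KafkaComponent;
        import org.apache.camel.component.kafka.KafkaManualCommitFactory;
        import org.apache.camel.component.kafka.PollExceptionStrategy;

        public final class KafkaComponentPluginWiring {

            private KafkaComponentPluginWiring() {
            }

            // Wires already-built custom plugins into the component; the factory and
            // strategy implementations themselves are assumed to be provided by the caller.
            public static void applyCustomPlugins(KafkaComponent kafka,
                                                  KafkaManualCommitFactory manualCommitFactory,
                                                  KafkaClientFactory clientFactory,
                                                  PollExceptionStrategy pollExceptionStrategy) {
                kafka.setKafkaManualCommitFactory(manualCommitFactory);
                kafka.setKafkaClientFactory(clientFactory);
                kafka.setPollExceptionStrategy(pollExceptionStrategy);
            }
        }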
      • getCreateConsumerBackoffMaxAttempts

        public int getCreateConsumerBackoffMaxAttempts()
      • setCreateConsumerBackoffMaxAttempts

        public void setCreateConsumerBackoffMaxAttempts​(int createConsumerBackoffMaxAttempts)
        Maximum attempts to create the Kafka consumer (kafka-client) before giving up and failing. Errors during consumer creation may be fatal due to invalid configuration, in which case recovery is not possible. However, part of the validation is DNS resolution of the bootstrap broker hostnames, which may fail because of a temporary networking problem and could therefore be recoverable, whereas other errors, such as some invalid Kafka configurations, are fatal. Unfortunately the kafka-client does not distinguish between these kinds of errors. By default Camel retries forever and never gives up. If you want to give up after a number of attempts, set this option, and Camel will terminate the consumer when it gives up. You can then manually restart the consumer by stopping and starting the route to try again.
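        A minimal sketch of bounding the create-consumer retries instead of retrying forever; the attempt count and interval are illustrative values:

        import org.apache.camel.CamelContext;
        import org.apache.camel.component.kafka.KafkaComponent;
        import org.apache.camel.impl.DefaultCamelContext;

        public class KafkaCreateConsumerBackoffExample {
            public static void main(String[] args) throws Exception {
                CamelContext context = new DefaultCamelContext();
                KafkaComponent kafka = new KafkaComponent(context);

                // Give up creating the kafka-client consumer after 10 failed attempts
                // instead of the default behaviour of retrying forever
                kafka.setCreateConsumerBackoffMaxAttempts(10);
                // Wait 5 seconds between creation attempts (illustrative value)
                kafka.setCreateConsumerBackoffInterval(5000);

                context.addComponent("kafka", kafka);
            }
        }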
      • getCreateConsumerBackoffInterval

        public long getCreateConsumerBackoffInterval()
      • setCreateConsumerBackoffInterval

        public void setCreateConsumerBackoffInterval​(long createConsumerBackoffInterval)
        The delay in milliseconds to wait before trying again to create the Kafka consumer (kafka-client).
      • getSubscribeConsumerBackoffMaxAttempts

        public int getSubscribeConsumerBackoffMaxAttempts()
      • setSubscribeConsumerBackoffMaxAttempts

        public void setSubscribeConsumerBackoffMaxAttempts​(int subscribeConsumerBackoffMaxAttempts)
        Maximum number of attempts the Kafka consumer will make to subscribe to the Kafka broker before giving up and failing. Errors while subscribing the consumer to the Kafka topic may be temporary, for example due to network issues, and could therefore be recoverable. By default Camel retries forever and never gives up. If you want to give up after a number of attempts, set this option, and Camel will terminate the consumer when it gives up. You can then manually restart the consumer by stopping and starting the route to try again.
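        A minimal sketch of bounding the subscribe retries and manually restarting a consumer route afterwards; the attempt count, interval, and the route id "kafka-route" are illustrative:

        import org.apache.camel.CamelContext;
        import org.apache.camel.component.kafka.KafkaComponent;
        import org.apache.camel.impl.DefaultCamelContext;

        public class KafkaSubscribeBackoffExample {
            public static void main(String[] args) throws Exception {
                CamelContext context = new DefaultCamelContext();
                KafkaComponent kafka = new KafkaComponent(context);

                // Give up subscribing after 5 failed attempts instead of retrying forever
                kafka.setSubscribeConsumerBackoffMaxAttempts(5);
                // Wait 10 seconds between subscribe attempts (illustrative value)
                kafka.setSubscribeConsumerBackoffInterval(10000);

                context.addComponent("kafka", kafka);

                // Once the consumer has given up, it can be restarted manually by
                // stopping and starting the route (the route id is illustrative)
                context.getRouteController().stopRoute("kafka-route");
                context.getRouteController().startRoute("kafka-route");
            }
        }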
      • getSubscribeConsumerBackoffInterval

        public long getSubscribeConsumerBackoffInterval()
      • setSubscribeConsumerBackoffInterval

        public void setSubscribeConsumerBackoffInterval​(long subscribeConsumerBackoffInterval)
        The delay in milliseconds to wait before trying again to subscribe to the Kafka broker.
      • doInit

        protected void doInit()
                       throws Exception
        Overrides:
        doInit in class org.apache.camel.support.DefaultComponent
        Throws:
        Exception