org.apache.spark.sql.kafka010

KafkaContinuousStream

class KafkaContinuousStream extends ContinuousStream with Logging

A ContinuousStream that reads data from Kafka.
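KafkaContinuousStream is an internal class; applications normally reach it through the public DataStreamReader API by combining the "kafka" source with a continuous trigger. A minimal sketch (broker address and topic name below are placeholders):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.Trigger

val spark = SparkSession.builder().appName("kafka-continuous").getOrCreate()

val df = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "host1:9092") // placeholder brokers
  .option("subscribe", "events")                   // placeholder topic
  .option("startingOffsets", "earliest")           // becomes initialOffsets
  .option("failOnDataLoss", "true")                // becomes failOnDataLoss
  .load()

// Trigger.Continuous selects the continuous processing engine, which backs
// the query with a KafkaContinuousStream instead of a micro-batch stream.
val query = df.writeStream
  .format("console")
  .trigger(Trigger.Continuous("1 second"))
  .start()
```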

Linear Supertypes
Logging, ContinuousStream, SparkDataStream, AnyRef, Any

Instance Constructors

  1. new KafkaContinuousStream(offsetReader: KafkaOffsetReader, kafkaParams: Map[String, AnyRef], options: CaseInsensitiveStringMap, metadataPath: String, initialOffsets: KafkaOffsetRangeLimit, failOnDataLoss: Boolean)

    offsetReader

    A reader used to get Kafka offsets. Note that the actual data will be read by per-task consumers generated later.

    kafkaParams

    String parameters for per-task Kafka consumers.

    options

    Parameters that are not Kafka consumer parameters.

    metadataPath

    Path to a directory this reader can use for writing metadata.

    initialOffsets

    The Kafka offsets to start reading data at.

    failOnDataLoss

    Flag indicating whether reading should fail in data loss scenarios, where some offsets after the specified initial ones can't be properly read.

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  6. def commit(end: Offset): Unit
    Definition Classes
    KafkaContinuousStream → SparkDataStream
  7. def createContinuousReaderFactory(): ContinuousPartitionReaderFactory
    Definition Classes
    KafkaContinuousStream → ContinuousStream
  8. def deserializeOffset(json: String): Offset
    Definition Classes
    KafkaContinuousStream → SparkDataStream
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  13. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  14. def initialOffset(): Offset
    Definition Classes
    KafkaContinuousStream → SparkDataStream
  15. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  16. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  17. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  18. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  19. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  20. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  21. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  22. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  23. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  24. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  25. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  26. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  27. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  28. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  29. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  30. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  31. def mergeOffsets(offsets: Array[PartitionOffset]): Offset
    Definition Classes
    KafkaContinuousStream → ContinuousStream
  32. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  33. def needsReconfiguration(): Boolean
    Definition Classes
    KafkaContinuousStream → ContinuousStream
  34. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  35. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  36. def planInputPartitions(start: Offset): Array[InputPartition]
    Definition Classes
    KafkaContinuousStream → ContinuousStream
  37. def stop(): Unit

    Stop this source and free any resources it has allocated.


    Definition Classes
    KafkaContinuousStream → SparkDataStream
  38. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  39. def toString(): String
    Definition Classes
    KafkaContinuousStream → AnyRef → Any
  40. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  41. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  42. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
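The overridden members above form the lifecycle the continuous engine drives. A hedged sketch of that loop (simplified; the real driver logic lives inside Spark and also handles epoch tracking, task scheduling, and fault tolerance):

```scala
// Sketch only: how the engine might exercise a ContinuousStream.
def runSketch(stream: ContinuousStream): Unit = {
  val start      = stream.initialOffset()                   // resume point
  val partitions = stream.planInputPartitions(start)        // one per Kafka partition
  val factory    = stream.createContinuousReaderFactory()   // builds per-task readers
  // ... launch long-running tasks that read via `factory` over `partitions` ...
  // Periodically, as tasks report their PartitionOffsets:
  //   val global = stream.mergeOffsets(reportedOffsets)    // Array[PartitionOffset] => Offset
  //   stream.commit(global)                                // epoch is durable
  // If stream.needsReconfiguration() returns true (e.g. the set of Kafka
  // partitions changed), stop the tasks and re-plan from the last commit.
  stream.stop()                                             // free resources
}
```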
