Object

org.apache.spark.streaming

SnappyStreamingContext


object SnappyStreamingContext extends internal.Logging with Serializable

Linear Supertypes
Serializable, Serializable, internal.Logging, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. def getActive: Option[SnappyStreamingContext]

    :: Experimental ::

    Get the currently active context, if there is one. Active means started but not stopped. A usage sketch follows the member list below.

    Annotations
    @Experimental()
  10. def getActiveOrCreate(checkpointPath: String, creatingFunc: () ⇒ SnappyStreamingContext, hadoopConf: Configuration = SparkHadoopUtil.get.conf, createOnError: Boolean = false): SnappyStreamingContext

    :: Experimental ::

    Either get the currently active StreamingContext (that is, started but not stopped), OR recreate a StreamingContext from checkpoint data in the given path. If checkpoint data does not exist in the provided path, then create a new StreamingContext by calling the provided creatingFunc. A usage sketch follows the member list below.

    checkpointPath

    Checkpoint directory used in an earlier StreamingContext program

    creatingFunc

    Function to create a new StreamingContext

    hadoopConf

    Optional Hadoop configuration if necessary for reading from the file system

    createOnError

    Optional, whether to create a new StreamingContext if there is an error in reading checkpoint data. By default, an exception will be thrown on error.

    Annotations
    @Experimental()
  11. def getActiveOrCreate(creatingFunc: () ⇒ SnappyStreamingContext): SnappyStreamingContext

    :: Experimental ::

    Either return the "active" StreamingContext (that is, started but not stopped), or create a new StreamingContext that is started by the creating function. A usage sketch follows the member list below.

    creatingFunc

    Function to create a new StreamingContext

    Annotations
    @Experimental()
  12. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  13. def getInstance(): Option[SnappyStreamingContext]

    :: Experimental ::

    Get the currently created context; it may or may not have been started, but it has not been stopped. A usage sketch follows the member list below.

    Annotations
    @Experimental()
  14. def getOrCreate(checkpointPath: String, creatingFunc: () ⇒ SnappyStreamingContext, hadoopConf: Configuration = SparkHadoopUtil.get.conf, createOnError: Boolean = false): SnappyStreamingContext

    Either recreate a SnappyStreamingContext from checkpoint data or create a new SnappyStreamingContext. If checkpoint data exists in the provided checkpointPath, then the SnappyStreamingContext will be recreated from the checkpoint data. If the data does not exist, then the SnappyStreamingContext will be created by calling the provided creatingFunc. A usage sketch follows the member list below.

    checkpointPath

    Checkpoint directory used in an earlier StreamingContext program

    creatingFunc

    Function to create a new SnappyStreamingContext

    hadoopConf

    Optional Hadoop configuration if necessary for reading from the file system

    createOnError

    Optional, whether to create a new SnappyStreamingContext if there is an error in reading checkpoint data. By default, an exception will be thrown on error.

  15. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  16. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  17. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  18. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  19. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  20. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  21. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  22. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  23. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  24. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  25. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  26. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  27. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  28. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  29. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  30. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  31. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  32. final def notify(): Unit

    Definition Classes
    AnyRef
  33. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  34. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  35. def toString(): String

    Definition Classes
    AnyRef → Any
  36. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  37. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  38. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
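
Usage Examples

The sketches below are illustrative only, not part of the generated documentation. Application names, batch intervals, and checkpoint paths are hypothetical, and the SnappyStreamingContext(SparkContext, Duration) constructor is assumed from the class documentation.

Checking for an active context with getActive; the match on the returned Option and the printed messages are placeholders:

    import org.apache.spark.streaming.SnappyStreamingContext

    // Some(ssc) only if a context has been started and not yet stopped.
    SnappyStreamingContext.getActive match {
      case Some(ssc) => println(s"Reusing active context: $ssc")
      case None      => println("No active streaming context")
    }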
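
Recovering or creating a context with getActiveOrCreate(checkpointPath, creatingFunc, ...). A minimal sketch in which hadoopConf and createOnError keep their default values; the checkpoint directory and application name are hypothetical:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.streaming.{Seconds, SnappyStreamingContext}

    val checkpointDir = "/tmp/snappy-streaming-checkpoint"  // hypothetical path

    def createContext(): SnappyStreamingContext = {
      val sc = SparkContext.getOrCreate(new SparkConf().setAppName("ActiveOrCreateExample"))
      val ssc = new SnappyStreamingContext(sc, Seconds(1))
      ssc.checkpoint(checkpointDir)  // enable checkpointing so the context can be recovered later
      ssc
    }

    // Resolution order: 1) reuse the active context if one is running,
    // 2) otherwise recreate it from checkpoint data under checkpointDir,
    // 3) otherwise build a fresh context with createContext().
    val ssc = SnappyStreamingContext.getActiveOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTermination()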
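
Reusing the active context or creating a new one, without checkpoint recovery, via getActiveOrCreate(creatingFunc). Again a sketch with a hypothetical application name and batch interval:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.streaming.{Seconds, SnappyStreamingContext}

    // The creating function is only invoked when no context is currently active,
    // so this call is safe to repeat.
    val ssc = SnappyStreamingContext.getActiveOrCreate { () =>
      val sc = SparkContext.getOrCreate(new SparkConf().setAppName("ActiveOrNewExample"))
      new SnappyStreamingContext(sc, Seconds(1))
    }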
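
Distinguishing getInstance() from getActive. A short sketch; the printed message is a placeholder:

    import org.apache.spark.streaming.SnappyStreamingContext

    // Unlike getActive, getInstance() also returns a context that has been
    // created but not yet started; None means no context exists or the
    // last one was stopped.
    SnappyStreamingContext.getInstance().foreach { ssc =>
      println(s"A context exists (started or not): $ssc")
    }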
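
Checkpoint-driven recovery with getOrCreate, this time passing hadoopConf and createOnError explicitly. A sketch with hypothetical paths and names; createOnError = true falls back to a fresh context instead of throwing when the checkpoint data cannot be read:

    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.streaming.{Seconds, SnappyStreamingContext}

    val checkpointDir = "/tmp/snappy-streaming-checkpoint"  // hypothetical path

    def createContext(): SnappyStreamingContext = {
      val sc = SparkContext.getOrCreate(new SparkConf().setAppName("GetOrCreateExample"))
      val ssc = new SnappyStreamingContext(sc, Seconds(10))
      ssc.checkpoint(checkpointDir)
      ssc
    }

    // Recreate from checkpoint data if it exists, otherwise call createContext().
    val ssc = SnappyStreamingContext.getOrCreate(
      checkpointDir,
      createContext _,
      hadoopConf = new Configuration(),
      createOnError = true)

    ssc.start()
    ssc.awaitTermination()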

Inherited from Serializable

Inherited from Serializable

Inherited from internal.Logging

Inherited from AnyRef

Inherited from Any
