Class com.datawizards.sparklocal.impl.scala.eager.session.SparkSessionAPIScalaEagerImpl


class SparkSessionAPIScalaEagerImpl extends SparkSessionAPIScalaBase

Linear Supertypes

SparkSessionAPIScalaBase, SparkSessionAPI, AnyRef, Any

Instance Constructors

  1. new SparkSessionAPIScalaEagerImpl()


Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def broadcast[T](value: T)(implicit arg0: ClassTag[T]): BroadcastAPI[T]

    Broadcast a read-only variable to the cluster, returning a broadcast object for reading it in distributed functions. The variable will be sent to each cluster node only once.

    Definition Classes
    SparkSessionAPIScalaBase → SparkSessionAPI
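A minimal sketch of broadcast usage. It assumes that BroadcastAPI exposes a value accessor and RDDAPI a map method, mirroring Spark's Broadcast and RDD; neither is documented on this page.

```scala
import com.datawizards.sparklocal.impl.scala.eager.session.SparkSessionAPIScalaEagerImpl

val session = new SparkSessionAPIScalaEagerImpl()

// Broadcast a small lookup table once instead of shipping it with every task.
val countryNames = session.broadcast(Map("PL" -> "Poland", "DE" -> "Germany"))

// Read the broadcast value inside a distributed function
// (assumes BroadcastAPI.value and RDDAPI.map, as in plain Spark).
val codes = session.createRDD(Seq("PL", "DE"))
val names = codes.map(code => countryNames.value(code))
```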
  6. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  7. def collectionAccumulator[T](name: String): CollectionAccumulatorAPI[T]

    Create and register a CollectionAccumulator, which starts with an empty list and accumulates inputs by adding them to the list.

    Definition Classes
    SparkSessionAPIScalaBase → SparkSessionAPI
  8. def collectionAccumulator[T]: CollectionAccumulatorAPI[T]

    Create and register a CollectionAccumulator, which starts with an empty list and accumulates inputs by adding them to the list.

    Definition Classes
    SparkSessionAPIScalaBase → SparkSessionAPI
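A sketch of the named collectionAccumulator overload. It assumes CollectionAccumulatorAPI mirrors Spark's CollectionAccumulator (an add method and a value accessor) and that RDDAPI supports foreach; these details are not shown on this page.

```scala
import com.datawizards.sparklocal.impl.scala.eager.session.SparkSessionAPIScalaEagerImpl

val session = new SparkSessionAPIScalaEagerImpl()

// Named overload: in plain Spark the name identifies the accumulator in the UI.
val badRecords = session.collectionAccumulator[String]("badRecords")

session.createRDD(Seq("1", "x", "3")).foreach { s =>
  if (!s.forall(_.isDigit)) badRecords.add(s) // collect malformed inputs on the side
}
// badRecords.value now holds the rejected records
```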
  9. def createDataset[T](data: Seq[T])(implicit arg0: ClassTag[T], enc: Encoder[T]): DataSetAPI[T]

    Create a new DataSet from a Scala collection.

    Definition Classes
    SparkSessionAPIScalaEagerImpl → SparkSessionAPI
  10. def createDataset[T](data: RDDAPI[T])(implicit arg0: ClassTag[T], enc: Encoder[T]): DataSetAPI[T]

    Create a new DataSet from an RDD.

    Definition Classes
    SparkSessionAPI
  11. def createRDD[T](data: Seq[T])(implicit arg0: ClassTag[T]): RDDAPI[T]

    Create a new RDD from a Scala collection.

    Definition Classes
    SparkSessionAPIScalaEagerImpl → SparkSessionAPI
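A sketch of the three factory methods above. The import of session.implicits._ to supply the required Encoder instances is an assumption, based on the implicits object listed further down and on SparkSession.implicits in plain Spark.

```scala
import com.datawizards.sparklocal.impl.scala.eager.session.SparkSessionAPIScalaEagerImpl

case class Person(name: String, age: Int)

val session = new SparkSessionAPIScalaEagerImpl()
import session.implicits._ // assumed to provide Encoder instances, like SparkSession.implicits

val people = session.createDataset(Seq(Person("Ann", 30), Person("Bob", 25)))
val ids    = session.createRDD(Seq(1, 2, 3))

// An existing RDD can be lifted into a DataSet as well:
val idsDs = session.createDataset(ids)
```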
  12. def doubleAccumulator(name: String): DoubleAccumulatorAPI

    Create and register a double accumulator, which starts with 0 and accumulates inputs by add.

    Definition Classes
    SparkSessionAPIScalaBase → SparkSessionAPI
  13. def doubleAccumulator: DoubleAccumulatorAPI

    Create and register a double accumulator, which starts with 0 and accumulates inputs by add.

    Definition Classes
    SparkSessionAPIScalaBase → SparkSessionAPI
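A sketch of the named doubleAccumulator overload (the long variants below work the same way). It assumes DoubleAccumulatorAPI mirrors Spark's DoubleAccumulator (add and value) and that RDDAPI supports foreach; these are not documented on this page.

```scala
import com.datawizards.sparklocal.impl.scala.eager.session.SparkSessionAPIScalaEagerImpl

val session = new SparkSessionAPIScalaEagerImpl()

val totalPrice = session.doubleAccumulator("totalPrice")

// Accumulate a value from each element.
session.createRDD(Seq(9.99, 4.50, 20.00)).foreach(price => totalPrice.add(price))
// totalPrice.value should now hold the running sum
```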
  14. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  16. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  17. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  18. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  19. object implicits

    Definition Classes
    SparkSessionAPIScalaBase
  20. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  21. def longAccumulator(name: String): LongAccumulatorAPI

    Create and register a long accumulator, which starts with 0 and accumulates inputs by add.

    Definition Classes
    SparkSessionAPIScalaBase → SparkSessionAPI
  22. def longAccumulator: LongAccumulatorAPI

    Create and register a long accumulator, which starts with 0 and accumulates inputs by add.

    Definition Classes
    SparkSessionAPIScalaBase → SparkSessionAPI
  23. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  24. final def notify(): Unit

    Definition Classes
    AnyRef
  25. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  26. def read[T]: ReaderExecutor[T]

    Returns a ReaderExecutor that can be used to read non-streaming data in as a DataSet.

    Definition Classes
    SparkSessionAPIScalaEagerImpl → SparkSessionAPI
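Only the entry point of the reader can be sketched here, since ReaderExecutor itself is not documented on this page; the comment describes the Spark-like flow this API presumably follows.

```scala
import com.datawizards.sparklocal.impl.scala.eager.session.SparkSessionAPIScalaEagerImpl

case class Person(name: String, age: Int)

val session = new SparkSessionAPIScalaEagerImpl()

// Obtain a typed reader; the ReaderExecutor is then pointed at a concrete
// source (file, table, ...) to produce a DataSetAPI[Person]. The exact
// source-selection methods are not shown on this page.
val reader = session.read[Person]
```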
  27. def register(acc: AccumulatorV2API[_, _]): Unit

    Register the given accumulator.

    Definition Classes
    SparkSessionAPIScalaBase → SparkSessionAPI
  28. def register(acc: AccumulatorV2API[_, _], name: String): Unit

    Register the given accumulator with the given name.

    Definition Classes
    SparkSessionAPIScalaBase → SparkSessionAPI
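A sketch of the named register overload. The custom accumulator is left abstract (???) because constructing an AccumulatorV2API is not documented on this page; the name "myCustomAccumulator" is illustrative.

```scala
import com.datawizards.sparklocal.impl.scala.eager.session.SparkSessionAPIScalaEagerImpl

val session = new SparkSessionAPIScalaEagerImpl()

// The helper factories (longAccumulator, doubleAccumulator, ...) register
// their accumulators for you; register is for accumulators you build yourself.
val custom: AccumulatorV2API[Long, Long] = ??? // some custom accumulator implementation
session.register(custom, "myCustomAccumulator")
```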
  29. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  30. def textFile(path: String, minPartitions: Int = 2): RDDAPI[String]

    Read a text file from HDFS, a local file system (available on all nodes), or any Hadoop-supported file system URI, and return it as an RDD of Strings.

    Definition Classes
    SparkSessionAPIScalaEagerImpl → SparkSessionAPI
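A sketch of textFile usage. The path is illustrative, and the downstream filter call assumes RDDAPI mirrors RDD's filter, which is not shown on this page.

```scala
import com.datawizards.sparklocal.impl.scala.eager.session.SparkSessionAPIScalaEagerImpl

val session = new SparkSessionAPIScalaEagerImpl()

// Local path shown here; HDFS and other Hadoop-supported URIs work too.
val lines = session.textFile("data/input.txt", minPartitions = 4)

// Assumes RDDAPI mirrors RDD's filter:
val nonEmpty = lines.filter(_.nonEmpty)
```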
  31. def toString(): String

    Definition Classes
    AnyRef → Any
  32. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  33. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from SparkSessionAPIScalaBase

Inherited from SparkSessionAPI

Inherited from AnyRef

Inherited from Any
