Class com.datawizards.sparklocal.impl.spark.session.SparkSessionAPISparkImpl

class SparkSessionAPISparkImpl extends SparkSessionAPI

The Spark-backed implementation of the SparkSessionAPI facade, delegating to a wrapped org.apache.spark.sql.SparkSession.

Linear Supertypes
SparkSessionAPI, AnyRef, Any

Instance Constructors

  1. new SparkSessionAPISparkImpl(spark: SparkSession)

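    Example (a minimal sketch; the session value built here is reused by the examples below):

      import org.apache.spark.sql.SparkSession
      import com.datawizards.sparklocal.impl.spark.session.SparkSessionAPISparkImpl

      // wrap an existing SparkSession in the Spark-backed implementation of the facade
      val spark = SparkSession.builder().master("local[*]").appName("docs").getOrCreate()
      val session = new SparkSessionAPISparkImpl(spark)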

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def broadcast[T](value: T)(implicit arg0: ClassTag[T]): BroadcastAPI[T]

    Broadcast a read-only variable to the cluster, returning a broadcast object for reading it in distributed functions. The variable will be sent to each cluster only once.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
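    Example (a sketch assuming BroadcastAPI exposes value and RDDAPI exposes map, mirroring Spark's Broadcast and RDD):

      // share a small lookup table with distributed functions instead of closing over it
      val countryNames = session.broadcast(Map("PL" -> "Poland", "DE" -> "Germany"))
      val codes = session.createRDD(Seq("PL", "DE", "FR"))
      val names = codes.map(code => countryNames.value.getOrElse(code, "unknown"))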
  6. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  7. def collectionAccumulator[T](name: String): CollectionAccumulatorAPI[T]

    Create and register a named CollectionAccumulator, which starts with an empty list and accumulates inputs by adding them into the list.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
  8. def collectionAccumulator[T]: CollectionAccumulatorAPI[T]

    Create and register a CollectionAccumulator, which starts with an empty list and accumulates inputs by adding them into the list.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
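    Example (a sketch assuming CollectionAccumulatorAPI mirrors Spark's CollectionAccumulator add/value, and that RDDAPI exposes flatMap and collect):

      // gather records that fail parsing as a side channel of the main pass
      val badRecords = session.collectionAccumulator[String]("badRecords")
      val parsed = session.createRDD(Seq("1", "2", "x")).flatMap { s =>
        try Some(s.toInt)
        catch { case _: NumberFormatException => badRecords.add(s); None }
      }
      parsed.collect()           // force evaluation so the accumulator is populated
      println(badRecords.value)  // [x]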
  9. def createDataset[T](data: Seq[T])(implicit arg0: ClassTag[T], enc: Encoder[T]): DataSetAPI[T]

    Create a new DataSet from a Scala collection.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
  10. def createDataset[T](data: RDDAPI[T])(implicit arg0: ClassTag[T], enc: Encoder[T]): DataSetAPI[T]

    Create a new DataSet from an RDD.

    Definition Classes
    SparkSessionAPI
  11. def createRDD[T](data: Seq[T])(implicit arg0: ClassTag[T]): RDDAPI[T]

    Create a new RDD from a Scala collection.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
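    Example (a sketch covering all three factory methods above, assuming session.implicits supplies the required Encoder, as Spark's SQLImplicits does):

      import session.implicits._

      case class Person(name: String, age: Int)

      val peopleDs  = session.createDataset(Seq(Person("Ala", 30), Person("Ola", 25)))
      val peopleRdd = session.createRDD(Seq(Person("Ela", 40)))
      val combined  = session.createDataset(peopleRdd)   // RDDAPI[T] -> DataSetAPI[T]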
  12. def doubleAccumulator(name: String): DoubleAccumulatorAPI

    Create and register a named double accumulator, which starts with 0 and accumulates inputs by add.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
  13. def doubleAccumulator: DoubleAccumulatorAPI

    Create and register a double accumulator, which starts with 0 and accumulates inputs by add.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
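    Example (a sketch assuming DoubleAccumulatorAPI mirrors Spark's DoubleAccumulator add/value, and that RDDAPI exposes foreach):

      // sum amounts in a single pass over the data
      val total = session.doubleAccumulator("totalAmount")
      session.createRDD(Seq(1.5, 2.5, 3.0)).foreach(x => total.add(x))
      println(total.value)  // 7.0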
  14. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  16. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  17. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  18. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  19. object implicitImpl extends SQLImplicits

  20. val implicits: SQLImplicits

  21. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  22. def longAccumulator(name: String): LongAccumulatorAPI

    Create and register a named long accumulator, which starts with 0 and accumulates inputs by add.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
  23. def longAccumulator: LongAccumulatorAPI

    Create and register a long accumulator, which starts with 0 and accumulates inputs by add.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
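    Example (a sketch in the same spirit as doubleAccumulator above: counting matching records during a single pass):

      val errors = session.longAccumulator("errorCount")
      session.createRDD(Seq("ok", "ERROR: disk", "ok"))
        .foreach(line => if (line.startsWith("ERROR")) errors.add(1L))
      println(errors.value)  // 1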
  24. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  25. final def notify(): Unit

    Definition Classes
    AnyRef
  26. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  27. def read[T]: ReaderExecutor[T]

    Returns a ReaderExecutor that can be used to read non-streaming data in as a DataSet.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
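    Example (a hedged sketch: this page does not show ReaderExecutor's overloads, so the CSVDataStore name, its import location, and the apply-style call below are assumptions to be checked against ReaderExecutor):

      import com.datawizards.sparklocal.datastore.CSVDataStore  // assumed location
      import session.implicits._

      case class Person(name: String, age: Int)
      val people = session.read[Person](CSVDataStore("data/people.csv"))  // assumed overload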
  28. def register(acc: AccumulatorV2API[_, _]): Unit

    Register the given accumulator.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
  29. def register(acc: AccumulatorV2API[_, _], name: String): Unit

    Register the given accumulator with the given name.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
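    Example (a hedged sketch; constructing an AccumulatorV2API is implementation-specific and not shown on this page, so it is elided with ??? here):

      val custom: AccumulatorV2API[Long, Long] = ???  // wrap or build a custom accumulator
      session.register(custom, "myCustomAcc")         // named registration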
  30. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  31. def textFile(path: String, minPartitions: Int = 2): RDDAPI[String]

    Read a text file from HDFS, a local file system (available on all nodes), or any Hadoop-supported file system URI, and return it as an RDD of Strings.

    Definition Classes
    SparkSessionAPISparkImpl → SparkSessionAPI
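    Example (a sketch assuming RDDAPI exposes filter and count, mirroring Spark's RDD):

      // count non-empty lines of a local text file
      val lines = session.textFile("data/logs.txt", minPartitions = 4)
      val nonEmpty = lines.filter(_.trim.nonEmpty).count()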
  32. def toString(): String

    Definition Classes
    AnyRef → Any
  33. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  35. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
