Package org.apache.spark.sql

package sql

Linear Supertypes
AnyRef, Any

Type Members

  1. case class AQPDataFrame(snappySession: SnappySession, qe: QueryExecution) extends DataFrame with Product with Serializable
  2. final class BlockAndExecutorId extends Externalizable
  3. abstract class ClusterMode extends AnyRef
  4. case class DMLExternalTable(tableName: TableIdentifier, child: LogicalPlan, command: String) extends LogicalPlan with Command with Product with Serializable
  5. type DataFrame = Dataset[Row]
  6. class DataFrameJavaFunctions extends AnyRef
  7. final class DataFrameWithTime extends DataFrame with Serializable
  8. class DataFrameWriterJavaFunctions extends AnyRef
  9. class DummyRDD extends RDD[InternalRow]
  10. case class ExternalClusterMode(sc: SparkContext, url: String) extends ClusterMode with Product with Serializable

    A regular Spark/Yarn/Mesos or any other non-snappy cluster.

  11. case class ExternalEmbeddedMode(sc: SparkContext, url: String) extends ClusterMode with Product with Serializable

    This is for the "old way" of starting GemFireXD inside an existing Spark/Yarn cluster, where the cluster nodes themselves boot up as a GemFireXD cluster.

  12. final class Keyword extends AnyRef
  13. case class LocalMode(sc: SparkContext, url: String) extends ClusterMode with Product with Serializable

    The local mode, which hosts the data, executor, and driver (and optionally even the job server) all in the same node.

  14. final class SampleDataFrame extends DataFrame with Serializable
  15. trait SampleDataFrameContract extends AnyRef
  16. abstract class SnappyBaseParser extends Parser

    Base parsing facilities for all SnappyData SQL parsers.

  17. class SnappyContext extends SQLContext with Serializable with internal.Logging

    Main entry point for SnappyData extensions to Spark. A SnappyContext extends Spark's org.apache.spark.sql.SQLContext to work with Row and Column tables. Any DataFrame can be managed as a SnappyData table, and any table can be accessed as a DataFrame. This integrates the SQLContext functionality with the Snappy store.

    When running in embedded mode (i.e. Spark executors collocated with the Snappy data store), applications typically submit jobs to the Snappy-JobServer (provide link) and do not explicitly create a SnappyContext. A single shared context managed by SnappyData makes it possible to re-use executors across client connections or applications.

    SnappyContext uses a persistent HiveMetaStore for its catalog, which allows table metadata to be recreated on driver restart.

    Users should obtain a reference to a SnappyContext instance as below:

    val snc: SnappyContext = SnappyContext.getOrCreate(sparkContext)

    To do

    Provide links to the above descriptions; document the Job server API.

    See also

    https://github.com/SnappyDataInc/snappydata#interacting-with-snappydata

    https://github.com/SnappyDataInc/snappydata#step-1---start-the-snappydata-cluster
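    A minimal usage sketch for this entry (assuming a SnappyData artifact on the classpath and an existing SparkContext named sparkContext; the table name t1 and its schema are illustrative):

    ```scala
    import org.apache.spark.SparkContext
    import org.apache.spark.sql.{DataFrame, SnappyContext}

    // Obtain (or create) the single shared SnappyContext for an
    // existing SparkContext.
    val snc: SnappyContext = SnappyContext.getOrCreate(sparkContext)

    // Any table can be accessed as a DataFrame, and any DataFrame can
    // be managed as a SnappyData table (the DDL shown is illustrative).
    snc.sql("CREATE TABLE t1 (id INT) USING column")
    val df: DataFrame = snc.table("t1")
    ```

    Note that in embedded mode the context is typically obtained inside a job submitted to the job server rather than created by the application itself.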

  18. abstract class SnappyDDLParser extends SnappyBaseParser
  19. case class SnappyEmbeddedMode(sc: SparkContext, url: String) extends ClusterMode with Product with Serializable

    The regular snappy cluster where each node is both a Spark executor as well as a GemFireXD data store. There is a "lead node" which is the Spark driver that also hosts a job-server and GemFireXD accessor.

  20. class SnappyParser extends SnappyDDLParser
  21. class SnappySession extends SparkSession with internal.Logging
  22. class SnappySqlParser extends AbstractSqlParser
  23. case class SplitClusterMode(sc: SparkContext, url: String) extends ClusterMode with Product with Serializable

    This is for the two-cluster mode: one is the normal snappy cluster, while the other is a separate local/Spark/Yarn/Mesos cluster that fetches data from the snappy cluster on demand and treats it simply as an external datastore.

  24. type Strategy = SparkStrategy

     Annotations
     @DeveloperApi()
  25. class TableNotFoundException extends AnalysisException with Serializable
  26. class TimeEpoch extends AnyRef

    Manages a time epoch and how to index into it.

Value Members

  1. object DataFrameUtil
  2. object LockUtils
  3. object SampleDataFrameContract
  4. object SnappyContext extends internal.Logging with Serializable
  5. object SnappyParserConsts
  6. object SnappySession extends Serializable
  7. package aqp
  8. package collection
  9. package execution
  10. package hive
  11. package internal
  12. package row
  13. object snappy extends Serializable

     Implicit conversions used by Snappy.

  14. package sources
  15. package store
  16. package streaming
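The snappy object listed above supplies implicit conversions; a hedged sketch of bringing them into scope (the putInto extension on DataFrameWriter is an assumption for illustration; consult the object's members for the actual conversions it provides):

```scala
import org.apache.spark.sql.snappy._  // Snappy implicit conversions

// With the implicits in scope, Snappy-specific operations become
// available on standard Spark types, e.g. (assumed here) an
// upsert-style write on DataFrameWriter:
df.write.putInto("targetTable")
```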
