Package org.apache.spark.sql

package sql

Linear Supertypes
AnyRef, Any

Type Members

  1. case class AQPDataFrame(snappySession: SnappySession, qe: QueryExecution) extends DataFrame with Product with Serializable

  2. final class AggregatePartialDataIterator extends Iterator[Any]

  3. final class BlockAndExecutorId extends Externalizable

  4. class CachedDataFrame extends Dataset[Row] with Logging

  5. abstract class ClusterMode extends AnyRef

  6. case class CollapseCollocatedPlans(session: SparkSession) extends Rule[SparkPlan] with Product with Serializable

    Rule to collapse the partial and final aggregates if the grouping keys match or are a superset of the child distribution. Also introduces an exchange when inserting into a partitioned table if the number of partitions does not match.
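
    For orientation, a physical-plan rule like this follows Spark's Rule[SparkPlan] shape. The skeleton below is only an illustrative sketch (ExampleCollapseRule is hypothetical, not the actual implementation):

      import org.apache.spark.sql.catalyst.rules.Rule
      import org.apache.spark.sql.execution.SparkPlan

      // Illustrative skeleton: a Rule[SparkPlan] rewrites the physical plan
      // bottom-up, pattern-matching operator shapes (e.g. a final aggregate
      // sitting directly over its partial aggregate) and returning a
      // collapsed node where the match succeeds.
      case class ExampleCollapseRule() extends Rule[SparkPlan] {
        override def apply(plan: SparkPlan): SparkPlan = plan transformUp {
          // case finalAgg @ ...(partialAgg @ ...) => collapsed operator here
          case other => other
        }
      }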

  7. case class CreateTableUsing(tableIdent: TableIdentifier, userSpecifiedSchema: Option[StructType], provider: String, temporary: Boolean, options: Map[String, String], partitionColumns: Array[String], bucketSpec: Option[BucketSpec], allowExisting: Boolean, managedIfNoPath: Boolean) extends LeafNode with Command with Product with Serializable

    Used to represent the operation of creating a table using a data source.

    allowExisting

    If true, nothing is done when the table already exists; if false, an exception is thrown.
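
    As a hedged illustration (the table name, schema and "row" provider below are made up, and sc is assumed to be an existing SparkContext), allowExisting = true corresponds to IF NOT EXISTS in the DDL:

      import org.apache.spark.sql.SnappyContext

      val snc = SnappyContext.getOrCreate(sc)
      // No-op if the table already exists (allowExisting = true).
      snc.sql("CREATE TABLE IF NOT EXISTS customers (id INT, name STRING) USING row")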

  8. case class CreateTableUsingAsSelect(tableIdent: TableIdentifier, provider: String, partitionColumns: Array[String], bucketSpec: Option[BucketSpec], mode: SaveMode, options: Map[String, String], query: LogicalPlan) extends LeafNode with Command with Product with Serializable

    A node used to support CTAS statements and saveAsTable for the data source API.
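
    As a hedged sketch (assuming snc is a SnappyContext, df an existing DataFrame, and the "column" provider is illustrative), both of the following typically produce this node:

      // CTAS via SQL:
      snc.sql("CREATE TABLE big_customers USING column AS SELECT * FROM customers WHERE volume > 1000")

      // saveAsTable via the data source API:
      df.write.format("column").saveAsTable("big_customers2")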

  9. case class DMLExternalTable(tableName: TableIdentifier, query: LogicalPlan, command: String) extends LeafNode with Command with Product with Serializable

  10. type DataFrame = Dataset[Row]

  11. class DataFrameJavaFunctions extends AnyRef

  12. final class DataFrameWithTime extends DataFrame with Serializable

  13. class DataFrameWriterJavaFunctions extends AnyRef

  14. class DelegateRDD[T] extends RDD[T] with Serializable

    RDD that delegates calls to the base RDD. However, the dependencies and preferred locations of this RDD can be altered.
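
    A minimal sketch of the delegation pattern (LocationOverrideRDD is hypothetical, not the actual DelegateRDD): wrap a base RDD, forward the computation, and override the preferred locations.

      import scala.reflect.ClassTag
      import org.apache.spark.{Partition, TaskContext}
      import org.apache.spark.rdd.RDD

      // Forwards partitions and computation to `base` while reporting
      // caller-supplied preferred locations, one Seq[String] per partition.
      class LocationOverrideRDD[T: ClassTag](
          base: RDD[T], locations: Seq[Seq[String]]) extends RDD[T](base) {

        override def compute(split: Partition, context: TaskContext): Iterator[T] =
          base.iterator(split, context)

        override protected def getPartitions: Array[Partition] = base.partitions

        override protected def getPreferredLocations(split: Partition): Seq[String] =
          locations(split.index)
      }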

  15. case class EmptyIteratorWithRowCount[U](rowCount: Long) extends Iterator[U] with Product with Serializable

  16. case class ExternalClusterMode(sc: SparkContext, url: String) extends ClusterMode with Product with Serializable

    A regular Spark/Yarn/Mesos or any other non-snappy cluster.

  17. case class ExternalEmbeddedMode(sc: SparkContext, url: String) extends ClusterMode with Product with Serializable

    This is for the "old way" of starting GemFireXD inside an existing Spark/Yarn cluster, where the cluster nodes themselves boot up as a GemFireXD cluster.

  18. case class InsertCachedPlanHelper(session: SnappySession, topLevel: Boolean) extends Rule[SparkPlan] with Product with Serializable

    Rule to collapse the partial and final aggregates if the grouping keys match or are superset of the child distribution.

  19. final class Keyword extends AnyRef

  20. case class LocalMode(sc: SparkContext, url: String) extends ClusterMode with Product with Serializable

    The local mode which hosts the data, executor, driver (and optionally even jobserver) all in the same node.

  21. class PartitionResult extends (Array[Byte], Int) with Serializable

    Encapsulates the result of a partition: its data and the number of rows.

    Note: this uses an optimized external serializer for PooledKryoSerializer, so any changes to this class need to be reflected in the serializer.
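
    The note's concern can be illustrated with Kryo's generic read/write pattern (a sketch only, not the actual PartitionResult serializer): whatever serializes this class must read and write exactly the fields it carries, in order.

      import com.esotericsoftware.kryo.{Kryo, KryoSerializable}
      import com.esotericsoftware.kryo.io.{Input, Output}

      // Hypothetical holder mirroring a (data, rowCount) shape; write() and
      // read() must stay in lock-step with the class's fields.
      class PartitionData(var data: Array[Byte], var numRows: Int)
          extends KryoSerializable {
        override def write(kryo: Kryo, output: Output): Unit = {
          output.writeInt(data.length)
          output.writeBytes(data)
          output.writeInt(numRows)
        }
        override def read(kryo: Kryo, input: Input): Unit = {
          data = input.readBytes(input.readInt())
          numRows = input.readInt()
        }
      }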

  22. final class SampleDataFrame extends DataFrame with Serializable

  23. trait SampleDataFrameContract extends AnyRef

  24. class SmartConnectorHelper extends Logging

  25. class SnappyAggregationStrategy extends Strategy

    Used to plan the aggregate operator for expressions using the optimized SnappyData aggregation operators.

    Adapted from Spark's Aggregation strategy.
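
    For context, a planning Strategy maps a logical plan to candidate physical plans, returning Nil to fall back to other strategies. A minimal hedged skeleton (ExampleStrategy is hypothetical) and its registration:

      import org.apache.spark.sql.{SparkSession, Strategy}
      import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
      import org.apache.spark.sql.execution.SparkPlan

      object ExampleStrategy extends Strategy {
        // Returns Nil so the built-in strategies handle everything; a real
        // strategy pattern-matches aggregate nodes and emits physical plans.
        override def apply(plan: LogicalPlan): Seq[SparkPlan] = Nil
      }

      val spark = SparkSession.builder().master("local[*]").getOrCreate()
      spark.experimental.extraStrategies = Seq(ExampleStrategy)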

  26. abstract class SnappyBaseParser extends Parser

    Base parsing facilities for all SnappyData SQL parsers.

  27. class SnappyContext extends SQLContext with Serializable

    Main entry point for SnappyData extensions to Spark. A SnappyContext extends Spark's org.apache.spark.sql.SQLContext to work with Row and Column tables. Any DataFrame can be managed as a SnappyData table, and any table can be accessed as a DataFrame. This integrates the SQLContext functionality with the Snappy store.

    When running in the embedded mode (i.e. Spark executors collocated with the Snappy data store), applications typically submit jobs to the Snappy-JobServer and do not explicitly create a SnappyContext. A single shared context managed by SnappyData makes it possible to reuse executors across client connections or applications.

    SnappyContext uses a persistent HiveMetaStore for its catalog, which enables table metadata to be recreated on driver restart.

    Users should obtain a reference to a SnappyContext instance as below; a fuller usage sketch follows the links at the end of this entry:

    val snc: SnappyContext = SnappyContext.getOrCreate(sparkContext)

    To do

    Provide links to the above descriptions, and a document describing the Job server API.

    See also

    https://github.com/SnappyDataInc/snappydata#interacting-with-snappydata

    https://github.com/SnappyDataInc/snappydata#step-1---start-the-snappydata-cluster
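
    A short usage sketch based on the snippet above (the table name, schema, and "column" provider are illustrative; sc is an existing SparkContext):

      import org.apache.spark.sql.SnappyContext

      val snc: SnappyContext = SnappyContext.getOrCreate(sc)
      snc.sql("CREATE TABLE IF NOT EXISTS quotes (symbol STRING, price DOUBLE) USING column")

      // Any table is accessible as a DataFrame:
      val df = snc.table("quotes")
      df.filter(df("price") > 100.0).show()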

  28. abstract class SnappyDDLParser extends SnappyBaseParser

  29. case class SnappyEmbeddedMode(sc: SparkContext, url: String) extends ClusterMode with Product with Serializable

    The regular snappy cluster where each node is both a Spark executor as well as a GemFireXD data store. There is a "lead node" which is the Spark driver that also hosts a job-server and GemFireXD accessor.

  30. class SnappyParser extends SnappyDDLParser

  31. class SnappySession extends SparkSession

  32. class SnappySqlParser extends AbstractSqlParser

  33. type Strategy = SparkStrategy

    Annotations
    @DeveloperApi() @Unstable()
  34. class TableNotFoundException extends AnalysisException with Serializable

  35. case class ThinClientConnectorMode(sc: SparkContext, url: String) extends ClusterMode with Product with Serializable

    This is for the two-cluster mode: one is the normal snappy cluster, and the other is a separate local/Spark/Yarn/Mesos cluster that fetches data from the snappy cluster on demand, treating it just like an external datastore.
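
    A hedged connector-side sketch: the "spark.snappydata.connection" property and the locator host:port below are assumptions; verify them against the SnappyData docs for your release.

      import org.apache.spark.sql.SparkSession

      // A regular Spark application that fetches data from the snappy
      // cluster on demand, treating it like an external datastore.
      val spark = SparkSession.builder()
        .appName("connector-app")
        .master("local[*]")
        .config("spark.snappydata.connection", "locatorHost:1527") // assumed property
        .getOrCreate()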

  36. class TimeEpoch extends AnyRef

    Manages a time epoch and how to index into it.

Value Members

  1. object CachedDataFrame extends (TaskContext, Iterator[InternalRow]) ⇒ PartitionResult with Serializable with KryoSerializable with Logging

  2. object DataFrameUtil

  3. object LockUtils

  4. object RDDs

  5. object SampleDataFrameContract

  6. object SmartConnectorHelper

  7. object SnappyContext extends Logging with Serializable

  8. object SnappyParserConsts

  9. object SnappySession extends Logging with Serializable

  10. package aqp

  11. package catalyst

  12. package collection

  13. package execution

  14. package hive

  15. package internal

  16. package row

  17. object snappy extends Serializable

    Implicit conversions used by Snappy.
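
    For example, the implicits are brought into scope with a single import (the exact members provided are version-dependent):

      // Makes Snappy's implicit conversions available, e.g. extra
      // operations on DataFrame and RDD types.
      import org.apache.spark.sql.snappy._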

  18. package sources

  19. package store

  20. package streaming

  21. package types
