Package com.memsql.spark.connector

package connector

Linear Supertypes: AnyRef, Any

Type Members

  1. class DataFrameFunctions extends AnyRef

  2. class DefaultSource extends RelationProvider with CreatableRelationProvider with DataSourceRegister

    This class is a proxy for the actual implementation in org.apache.spark. It allows you to write data to MemSQL via the Spark RelationProvider API.

    Example: df.write.format("com.memsql.spark.connector").save("foo.bar")
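    A slightly fuller sketch of the same write path. This is illustrative only: the session settings, sample data, and the "foo.bar" table path are placeholders, not part of this API.

    import org.apache.spark.sql.{SaveMode, SparkSession}

    // Placeholder session; the spark.memsql.* keys are documented under MemSQLConf below.
    val spark = SparkSession.builder()
      .appName("memsql-write-sketch")
      .config("spark.memsql.host", "127.0.0.1")
      .config("spark.memsql.port", "3306")
      .getOrCreate()

    // Placeholder data.
    val df = spark.createDataFrame(Seq((1, "a"), (2, "b"))).toDF("id", "value")

    // Routes the write through DefaultSource; "foo.bar" stands for database.table.
    df.write
      .format("com.memsql.spark.connector")
      .mode(SaveMode.Append)
      .save("foo.bar")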

  3. abstract class IngestStrategy extends AnyRef

  4. case class InsertStrategy(tableFragment: QueryFragment, conf: SaveToMemSQLConf) extends IngestStrategy with Product with Serializable

  5. case class LoadDataStrategy(tableFragment: QueryFragment, conf: SaveToMemSQLConf) extends IngestStrategy with Product with Serializable

  6. case class MemSQLCluster(conf: MemSQLConf) extends Product with Serializable
  7. case class MemSQLConf(masterHost: String, masterPort: Int, user: String, password: String, defaultDBName: String, defaultSaveMode: SaveMode, defaultCreateMode: CreateMode, defaultInsertBatchSize: Int, defaultLoadDataCompression: CompressionType, disablePartitionPushdown: Boolean) extends Product with Serializable

    Configuration for a MemSQL cluster. By default these parameters are set by the corresponding value in the Spark configuration.

    masterHost

    Hostname of the MemSQL Master Aggregator. Corresponds to "spark.memsql.host" in the Spark configuration.

    masterPort

    Port of the MemSQL Master Aggregator. Corresponds to "spark.memsql.port" in the Spark configuration.

    user

    Username to use when connecting to the MemSQL Master Aggregator. Corresponds to "spark.memsql.user" in the Spark configuration.

    password

    Password to use when connecting to the MemSQL Master Aggregator. Corresponds to "spark.memsql.password" in the Spark configuration.

    defaultDBName

    The default database to use when connecting to the cluster. Corresponds to "spark.memsql.defaultDatabase" in the Spark configuration.

    defaultSaveMode

    The default org.apache.spark.sql.SaveMode to use when saving org.apache.spark.sql.DataFrames to a MemSQL table. Corresponds to "spark.memsql.defaultSaveMode" in the Spark configuration.

    defaultCreateMode

    The default com.memsql.spark.connector.CreateMode to use when creating a MemSQL table. Corresponds to "spark.memsql.defaultCreateMode" in the Spark configuration.

    defaultInsertBatchSize

    The default batch insert size to use when writing to a MemSQL table using com.memsql.spark.connector.InsertStrategy. Corresponds to "spark.memsql.defaultInsertBatchSize" in the Spark configuration.

    defaultLoadDataCompression

    The default com.memsql.spark.connector.CompressionType to use when writing to a MemSQL table using com.memsql.spark.connector.LoadDataStrategy. Corresponds to "spark.memsql.defaultLoadDataCompression" in the Spark configuration.
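
    As an illustrative sketch, these keys can be set on the Spark configuration when the session is built. All values below are placeholders; only keys documented above are used.

    import org.apache.spark.sql.SparkSession

    // Placeholder values; each key corresponds to a MemSQLConf field as documented above.
    val spark = SparkSession.builder()
      .appName("memsql-conf-sketch")
      .config("spark.memsql.host", "master-agg.example.com")
      .config("spark.memsql.port", "3306")
      .config("spark.memsql.user", "spark_user")
      .config("spark.memsql.password", "secret")
      .config("spark.memsql.defaultDatabase", "analytics")
      .config("spark.memsql.defaultInsertBatchSize", "10000")
      // spark.memsql.defaultSaveMode, spark.memsql.defaultCreateMode, and
      // spark.memsql.defaultLoadDataCompression (documented above) take the names
      // of the corresponding enumeration values.
      .getOrCreate()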

  8. case class MemSQLQueryRelation(cluster: MemSQLCluster, query: String, databaseName: Option[String], sqlContext: SQLContext, disablePartitionPushdown: Boolean, enableStreaming: Boolean) extends BaseRelation with TableScan with Product with Serializable

  9. case class MemSQLTableRelation(cluster: MemSQLCluster, tableIdentifier: TableIdentifier, sqlContext: SQLContext, disablePartitionPushdown: Boolean, enableStreaming: Boolean) extends BaseRelation with PrunedFilteredScan with InsertableRelation with Product with Serializable

  10. case class SaveToMemSQLConf(saveMode: SaveMode, createMode: CreateMode, onDuplicateKeySQL: Option[String], insertBatchSize: Int, loadDataCompression: CompressionType, useKeylessShardingOptimization: Boolean, extraColumns: Seq[ColumnDefinition], extraKeys: Seq[MemSQLKey], dryRun: Boolean, writeToMaster: Boolean) extends Product with Serializable

  11. class SchemaFunctions extends Serializable

  12. class SparkSessionFunctions extends Serializable

Value Members

  1. object CompressionType extends Enumeration

  2. object CreateMode extends Enumeration

  3. object DefaultSource

  4. object MemSQLConf extends Serializable

  5. object MemSQLConnectionPool

  6. object SaveToMemSQLConf extends Serializable
  7. implicit def dataFrameFunctions(df: DataFrame): DataFrameFunctions

     See the usage sketch after this list.

  8. package dataframe

  9. package rdd

  10. implicit def schemaFunctions(schema: StructType): SchemaFunctions

  11. implicit def sparkSessionFunctions(sparkSession: SparkSession): SparkSessionFunctions

  12. package sql

  13. package util
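To show how the implicit defs above are used, a minimal sketch: importing the package object brings dataFrameFunctions into scope, so methods defined on DataFrameFunctions become available directly on a DataFrame. The saveToMemSQL method name is an assumption about DataFrameFunctions, and "foo.bar" is a placeholder.

import org.apache.spark.sql.DataFrame
import com.memsql.spark.connector._  // the implicit defs listed above

// Hypothetical helper: the implicit dataFrameFunctions conversion wraps df in
// DataFrameFunctions, so its methods (saveToMemSQL is an assumed example) can be
// called as if they were defined on DataFrame itself.
def writeToMemSQL(df: DataFrame): Unit =
  df.saveToMemSQL("foo.bar")  // "foo.bar" is a placeholder database.table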
