class DefaultSource extends RelationProvider with SchemaRelationProvider with CreatableRelationProvider

Redshift data source implementation for Spark SQL

Linear Supertypes
CreatableRelationProvider, SchemaRelationProvider, RelationProvider, AnyRef, Any

Instance Constructors

  1. new DefaultSource()

    Default no-argument constructor required by the Data Source API; Spark instantiates it reflectively when the source is referenced by its format name.

  2. new DefaultSource(jdbcWrapper: JDBCWrapper, s3ClientFactory: (AWSCredentialsProvider, MergedParameters) ⇒ AmazonS3)

    Constructor that takes an explicit JDBCWrapper and S3 client factory, allowing these dependencies to be injected (for example, in tests).

Value Members

  1. def createRelation(sqlContext: SQLContext, saveMode: SaveMode, parameters: Map[String, String], data: DataFrame): BaseRelation

    Creates a Relation instance by first writing the contents of the given DataFrame to Redshift (see the write sketch after the member list).

    Definition Classes
    DefaultSource → CreatableRelationProvider
  2. def createRelation(sqlContext: SQLContext, parameters: Map[String, String], schema: StructType): BaseRelation

    Loads a RedshiftRelation using the user-provided schema, so no schema inference over JDBC is performed (see the schema-read sketch after the member list).

    Definition Classes
    DefaultSource → SchemaRelationProvider
  3. def createRelation(sqlContext: SQLContext, parameters: Map[String, String]): BaseRelation

    Creates a new RedshiftRelation instance using parameters from Spark SQL DDL. Resolves the schema over a JDBC connection to the provided URL, which must contain credentials (see the read sketch after the member list).

    Definition Classes
    DefaultSource → RelationProvider
  4. def enablePushdownSession(session: SparkSession): Unit

    Enables more advanced query pushdowns to Redshift (see the pushdown sketch after the member list).

    session
    The SparkSession for which pushdowns are to be enabled.

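A minimal read sketch for the plain RelationProvider path, where the schema is inferred over JDBC. The format name, JDBC URL, table, and S3 temp directory are placeholder assumptions; substitute the values for your spark-redshift build and AWS account.

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder().appName("redshift-read").getOrCreate()

  // No .schema(...) call, so DefaultSource resolves the schema over JDBC;
  // the URL must therefore embed credentials.
  val df = spark.read
    .format("io.github.spark_redshift_community.spark.redshift") // assumed format name
    .option("url", "jdbc:redshift://host:5439/db?user=USER&password=PASS") // placeholder
    .option("dbtable", "public.events")        // placeholder table
    .option("tempdir", "s3a://my-bucket/tmp/") // placeholder S3 staging path
    .load()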

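A read sketch for the SchemaRelationProvider path, assuming the same placeholder format name and options as above. Supplying a schema routes the load through the schema-taking createRelation overload, so no JDBC inference is performed.

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

  val spark = SparkSession.builder().appName("redshift-schema-read").getOrCreate()

  // Explicit schema: skips schema inference over JDBC entirely.
  val schema = StructType(Seq(
    StructField("id", LongType, nullable = false),
    StructField("name", StringType, nullable = true)
  ))

  val df = spark.read
    .format("io.github.spark_redshift_community.spark.redshift") // assumed format name
    .schema(schema)
    .option("url", "jdbc:redshift://host:5439/db?user=USER&password=PASS") // placeholder
    .option("dbtable", "public.users")         // placeholder table
    .option("tempdir", "s3a://my-bucket/tmp/") // placeholder S3 staging path
    .load()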

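A write sketch for the CreatableRelationProvider path. In spark-redshift a write stages the DataFrame in the S3 tempdir and then loads it into Redshift; the option names and values here are the same placeholder assumptions as above.

  import org.apache.spark.sql.{SaveMode, SparkSession}

  val spark = SparkSession.builder().appName("redshift-write").getOrCreate()
  val df = spark.range(10).selectExpr("id", "cast(id as string) as name")

  // Saving triggers createRelation(sqlContext, saveMode, parameters, data):
  // the DataFrame contents are written to Redshift before the Relation is returned.
  df.write
    .format("io.github.spark_redshift_community.spark.redshift") // assumed format name
    .option("url", "jdbc:redshift://host:5439/db?user=USER&password=PASS") // placeholder
    .option("dbtable", "public.events_copy")   // placeholder table
    .option("tempdir", "s3a://my-bucket/tmp/") // placeholder S3 staging path
    .mode(SaveMode.Append)
    .save()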

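A pushdown sketch. enablePushdownSession is a plain instance method, so the sketch below calls it on a fresh DefaultSource; the import path is an assumption that depends on which spark-redshift distribution you use.

  import org.apache.spark.sql.SparkSession
  // Assumed package; adjust to your spark-redshift distribution.
  import io.github.spark_redshift_community.spark.redshift.DefaultSource

  val spark = SparkSession.builder().appName("redshift-pushdown").getOrCreate()

  // Enables more advanced query pushdowns to Redshift for this session.
  new DefaultSource().enablePushdownSession(spark)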