
class DefaultSource extends RelationProvider with SchemaRelationProvider with CreatableRelationProvider

Redshift data source implementation for Spark SQL.
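
This source is normally used through the DataFrame reader/writer interface rather than by instantiating DefaultSource directly. A minimal usage sketch, assuming the source is registered under the package name com.databricks.spark.redshift and the conventional url, dbtable, and tempdir options; all connection values below are placeholders:

  import org.apache.spark.sql.{SaveMode, SparkSession}

  val spark = SparkSession.builder().appName("redshift-example").getOrCreate()

  // Read path: resolved through createRelation(sqlContext, parameters).
  // The schema is inferred over JDBC, so the URL must carry credentials.
  val df = spark.read
    .format("com.databricks.spark.redshift")
    .option("url", "jdbc:redshift://host:5439/db?user=USER&password=PASS")
    .option("dbtable", "events")
    .option("tempdir", "s3n://bucket/tmp/")
    .load()

  // Write path: resolved through the CreatableRelationProvider overload,
  // which writes the contents of the DataFrame to Redshift, honoring the
  // requested SaveMode.
  df.write
    .format("com.databricks.spark.redshift")
    .option("url", "jdbc:redshift://host:5439/db?user=USER&password=PASS")
    .option("dbtable", "events_copy")
    .option("tempdir", "s3n://bucket/tmp/")
    .mode(SaveMode.Append)
    .save()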

Linear Supertypes
CreatableRelationProvider, SchemaRelationProvider, RelationProvider, AnyRef, Any

Instance Constructors

  1. new DefaultSource()

    Default constructor required by the Data Source API.

  2. new DefaultSource(jdbcWrapper: JDBCWrapper, s3ClientFactory: (AWSCredentialsProvider) ⇒ AmazonS3Client)

    Constructor taking an explicit JDBC wrapper and S3 client factory, so that these dependencies can be injected (for example, in tests).

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  6. def createRelation(sqlContext: SQLContext, saveMode: SaveMode, parameters: Map[String, String], data: DataFrame): BaseRelation

    Creates a Relation instance by first writing the contents of the given DataFrame to Redshift.

    Definition Classes
    DefaultSource → CreatableRelationProvider
  7. def createRelation(sqlContext: SQLContext, parameters: Map[String, String], schema: StructType): BaseRelation

    Loads a RedshiftRelation using a user-provided schema, so no schema inference over JDBC is performed (see the sketch after this member list).

    Definition Classes
    DefaultSource → SchemaRelationProvider
  8. def createRelation(sqlContext: SQLContext, parameters: Map[String, String]): BaseRelation

    Creates a new RedshiftRelation instance using parameters from Spark SQL DDL. Resolves the schema using a JDBC connection over the provided URL, which must contain credentials (see the sketch after this member list).

    Definition Classes
    DefaultSource → RelationProvider
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  13. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  14. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  15. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  16. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  17. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  18. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  19. def toString(): String
    Definition Classes
    AnyRef → Any
  20. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  21. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
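
The two read-side createRelation overloads above can also be exercised directly from user code. A minimal sketch, assuming the same package name and options as in the usage sketch near the top of this page and Spark 2.x DDL syntax; table names, columns, and credentials are placeholders:

  // RelationProvider path: parameters come from Spark SQL DDL, and the
  // schema is resolved over JDBC using the credentials embedded in the URL.
  spark.sql("""
    CREATE TEMPORARY VIEW events
    USING com.databricks.spark.redshift
    OPTIONS (
      url 'jdbc:redshift://host:5439/db?user=USER&password=PASS',
      dbtable 'events',
      tempdir 's3n://bucket/tmp/'
    )
  """)

  // SchemaRelationProvider path: the caller supplies the schema, so no
  // inference over JDBC is performed.
  import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

  val eventSchema = StructType(Seq(
    StructField("id", LongType),
    StructField("name", StringType)))

  val events = spark.read
    .format("com.databricks.spark.redshift")
    .schema(eventSchema)
    .option("url", "jdbc:redshift://host:5439/db?user=USER&password=PASS")
    .option("dbtable", "events")
    .option("tempdir", "s3n://bucket/tmp/")
    .load()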
