Class/Object

io.smartdatalake.workflow.connection

JdbcTableConnection

Related Docs: object JdbcTableConnection | package connection


case class JdbcTableConnection(id: ConnectionId, url: String, driver: String, authMode: Option[AuthMode] = None, db: Option[String] = None, maxParallelConnections: Int = 1, connectionPoolMaxIdleTimeSec: Int = 3, metadata: Option[ConnectionMetadata] = None) extends Connection with SmartDataLakeLogger with Product with Serializable

Connection information for JDBC tables. If authentication is required, user and password must be provided via the authMode parameter.

id

unique id of this connection

url

jdbc connection url

driver

class name of jdbc driver

authMode

optional authentication information; currently only BasicAuthMode is supported.

db

jdbc database

maxParallelConnections

number of parallel JDBC connections created by an instance of this connection. Note that Spark manages JDBC connections on its own; this setting only applies to JDBC connections used by SDL for validating metadata or executing pre/post SQL statements.

connectionPoolMaxIdleTimeSec

timeout in seconds after which unused connections in the pool are closed
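In the SDL HOCON configuration, a connection of this type is declared with keys mirroring the case-class parameters above. The following is an illustrative sketch; the URL, driver class, and the BasicAuthMode keys shown are placeholder assumptions and should be checked against the BasicAuthMode documentation:

```
connections {
  myJdbcCon {
    type = JdbcTableConnection
    url = "jdbc:postgresql://localhost:5432/mydb"   # placeholder URL
    driver = org.postgresql.Driver                  # placeholder driver class
    db = mydb
    authMode {
      type = BasicAuthMode
      userVariable = "ENV#JDBC_USER"                # placeholder; see BasicAuthMode docs
      passwordVariable = "ENV#JDBC_PASSWORD"
    }
  }
}
```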

Linear Supertypes
Serializable, Serializable, Product, Equals, SmartDataLakeLogger, Connection, AtlasExportable, ParsableFromConfig[Connection], SdlConfigObject, AnyRef, Any

Instance Constructors

  1. new JdbcTableConnection(id: ConnectionId, url: String, driver: String, authMode: Option[AuthMode] = None, db: Option[String] = None, maxParallelConnections: Int = 1, connectionPoolMaxIdleTimeSec: Int = 3, metadata: Option[ConnectionMetadata] = None)


Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def atlasName: String

    Definition Classes
    Connection → AtlasExportable
  6. def atlasQualifiedName(prefix: String): String

    Definition Classes
    AtlasExportable
  7. val authMode: Option[AuthMode]

    optional authentication information; currently only BasicAuthMode is supported.

  8. val catalog: SQLCatalog

  9. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  10. val connectionPoolMaxIdleTimeSec: Int

    timeout in seconds after which unused connections in the pool are closed

  11. val db: Option[String]

    jdbc database

  12. val driver: String

    class name of jdbc driver

  13. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  14. def execJdbcQuery[A](sql: String, evalResultSet: (ResultSet) ⇒ A): A

    Execute an SQL query and evaluate its ResultSet.

    sql

    the SQL query to execute

    evalResultSet

    function to evaluate the JDBC ResultSet

    returns

    the evaluated result
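The key contract of this method is that evalResultSet must fully materialize its result, because the statement and connection are released once the function returns. A minimal sketch of that pattern, using a hypothetical FakeResultSet stand-in for java.sql.ResultSet (this is not SDL's actual implementation):

```scala
object ExecJdbcQueryDemo {
  // Stand-in for java.sql.ResultSet that iterates over fake rows.
  final class FakeResultSet(rows: List[String]) {
    private var remaining = rows
    def next(): Boolean = remaining.nonEmpty
    // Column index is ignored in this toy stand-in.
    def getString(col: Int): String = { val h = remaining.head; remaining = remaining.tail; h }
  }

  var closed = false // tracks whether the "connection" was released

  // Sketch of the execJdbcQuery shape: run the query, hand the result set
  // to evalResultSet, then release resources no matter what.
  def execJdbcQuery[A](sql: String, evalResultSet: FakeResultSet => A): A = {
    val rs = new FakeResultSet(List("alice", "bob")) // pretend we executed `sql`
    try evalResultSet(rs) finally closed = true      // resources released after eval
  }

  // Usage: collect all rows eagerly inside evalResultSet.
  def firstColumn(): List[String] =
    execJdbcQuery("select name from users", { rs =>
      val buf = scala.collection.mutable.ListBuffer.empty[String]
      while (rs.next()) buf += rs.getString(1)
      buf.toList
    })
}
```

Returning the raw ResultSet from evalResultSet would be a bug: by the time the caller touches it, the underlying statement is already closed.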

  15. def execJdbcStatement(sql: String, logging: Boolean = true): Boolean

    Execute an SQL statement.

    returns

    true if the first result is a ResultSet object; false if it is an update count or there are no results

  16. def execWithJdbcConnection[A](func: (java.sql.Connection) ⇒ A): A

    Get a connection from the pool and execute an arbitrary function with it.

  17. def execWithJdbcStatement[A](func: (Statement) ⇒ A): A

    Get a JDBC connection from the pool, create a JDBC statement, and execute an arbitrary function with it.
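Both methods follow the loan pattern: borrow a resource, apply the caller's function, and always return the resource afterwards, even on failure. A self-contained sketch under that assumption (Conn is a hypothetical stand-in for java.sql.Connection; this is not SDL's actual code):

```scala
object LoanPatternDemo {
  final class Conn { def runSql(sql: String): Boolean = true }

  // Toy pool: a queue of idle connections.
  private val pool = new java.util.concurrent.ConcurrentLinkedQueue[Conn]()
  var borrowed = 0 // counts how many times a connection was handed out

  // Loan pattern: borrow (or create) a connection, run `func`, always return it.
  def execWithConnection[A](func: Conn => A): A = {
    val conn = Option(pool.poll()).getOrElse(new Conn)
    borrowed += 1
    try func(conn)
    finally pool.offer(conn) // returned even if func throws
  }

  def demo(): Boolean = execWithConnection(_.runSql("drop table if exists tmp"))
}
```

The try/finally is what makes the pattern safe: the caller's function can throw, and the connection still goes back to the pool instead of leaking.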

  18. def factory: FromConfigFactory[Connection]

    Returns the factory that can parse this type (that is, type CO).

    Typically, implementations of this method should return the companion object of the implementing class. The companion object in turn should implement FromConfigFactory.

    returns

    the factory (object) for this class.

    Definition Classes
    JdbcTableConnection → ParsableFromConfig
  19. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  20. def getAuthModeSparkOptions: Map[String, String]

  21. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  22. val id: ConnectionId

    unique id of this connection

    Definition Classes
    JdbcTableConnection → Connection → SdlConfigObject
  23. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  24. lazy val logger: Logger

    Attributes
    protected
    Definition Classes
    SmartDataLakeLogger
  25. val maxParallelConnections: Int

    number of parallel JDBC connections created by an instance of this connection. Note that Spark manages JDBC connections on its own; this setting only applies to JDBC connections used by SDL for validating metadata or executing pre/post SQL statements.

  26. val metadata: Option[ConnectionMetadata]

    Additional metadata for the Connection

    Definition Classes
    JdbcTableConnection → Connection
  27. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  28. final def notify(): Unit

    Definition Classes
    AnyRef
  29. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  30. val pool: GenericObjectPool[java.sql.Connection]

  31. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  32. def test(): Unit

  33. def toStringShort: String

    Definition Classes
    Connection
  34. val url: String

    jdbc connection url

  35. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  36. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  37. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
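The effect of maxParallelConnections can be pictured as a semaphore capping how many pooled connections are in use at once. The following is a simplified sketch of that idea, not SDL's implementation (which uses Commons Pool's GenericObjectPool, as the pool member above shows); all names here are illustrative:

```scala
object BoundedPoolDemo {
  import java.util.concurrent.Semaphore

  final class Conn

  // Corresponds to maxParallelConnections = 2 in the sketch.
  private val maxParallelConnections = 2
  private val permits = new Semaphore(maxParallelConnections)

  // A caller blocks here when maxParallelConnections are already in use,
  // and a permit is released again when the function finishes.
  def withConnection[A](func: Conn => A): A = {
    permits.acquire()
    try func(new Conn)
    finally permits.release()
  }

  def availableNow(): Int = permits.availablePermits()
}
```

In the real class, connectionPoolMaxIdleTimeSec additionally tells the pool to close connections that have sat idle longer than that many seconds.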

Inherited from Serializable

Inherited from Serializable

Inherited from Product

Inherited from Equals

Inherited from SmartDataLakeLogger

Inherited from Connection

Inherited from AtlasExportable

Inherited from ParsableFromConfig[Connection]

Inherited from SdlConfigObject

Inherited from AnyRef

Inherited from Any
