org.apache.spark.sql.cassandra

CassandraSQLContext

Related Docs: object CassandraSQLContext | package cassandra

class CassandraSQLContext extends SQLContext

Allows executing SQL queries against Cassandra and accessing the results as DataFrame collections. Predicate pushdown to Cassandra is supported.

Example:

import com.datastax.spark.connector._

val sparkMasterHost = "127.0.0.1"
val cassandraHost = "127.0.0.1"

// Tell Spark the address of one Cassandra node:
val conf = new SparkConf(true).set("spark.cassandra.connection.host", cassandraHost)

// Connect to the Spark cluster:
val sc = new SparkContext("spark://" + sparkMasterHost + ":7077", "example", conf)

// Create CassandraSQLContext:
val cc = new CassandraSQLContext(sc)

// Execute SQL query:
val df = cc.sql("SELECT * FROM keyspace.table ...")
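A default keyspace can also be set so that table names need not be qualified. A minimal sketch, continuing from the `cc` created above (the keyspace `test`, table `words`, and its columns are hypothetical):

```scala
// Set a default keyspace for unqualified table names:
cc.setKeyspace("test")

// cassandraSql (to which sql delegates) returns a DataFrame:
val df = cc.cassandraSql("SELECT word, count FROM words WHERE count > 10")
```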
Linear Supertypes
SQLContext, Serializable, Serializable, Logging, AnyRef, Any

Instance Constructors

  1. new CassandraSQLContext(sc: SparkContext)

Type Members

  1. class QueryExecution extends AnyRef

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  2. class SQLSession extends AnyRef

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  3. class SparkPlanner extends SparkStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. def +(other: String): String

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to any2stringadd[CassandraSQLContext] performed by method any2stringadd in scala.Predef.
    Definition Classes
    any2stringadd
  4. def ->[B](y: B): (CassandraSQLContext, B)

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to ArrowAssoc[CassandraSQLContext] performed by method ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc
    Annotations
    @inline()
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  6. def addClusterLevelCassandraConnConf(cluster: String, conf: CassandraConnectorConf): CassandraSQLContext

    Add cluster level Cassandra connection configuration settings

  7. def addClusterLevelCassandraConnConf(cluster: String, conf: SparkConf): CassandraSQLContext

    Add cluster level Cassandra connection configuration settings

  8. def addClusterLevelReadConf(cluster: String, conf: ReadConf): CassandraSQLContext

    Add cluster level read configuration settings

  9. def addClusterLevelReadConf(cluster: String, conf: SparkConf): CassandraSQLContext

    Add cluster level read configuration settings

  10. def addClusterLevelWriteConf(cluster: String, conf: WriteConf): CassandraSQLContext

    Add cluster level write configuration settings

  11. def addClusterLevelWriteConf(cluster: String, conf: SparkConf): CassandraSQLContext

    Add cluster level write configuration settings

  12. def addKeyspaceLevelReadConf(keyspace: String, conf: ReadConf, cluster: Option[String]): CassandraSQLContext

    Add keyspace level read configuration settings. Set cluster to None for a single cluster.

  13. def addKeyspaceLevelReadConf(keyspace: String, conf: SparkConf, cluster: Option[String]): CassandraSQLContext

    Add keyspace level read configuration settings. Set cluster to None for a single cluster.

  14. def addKeyspaceLevelWriteConf(keyspace: String, writeConf: WriteConf, cluster: Option[String]): CassandraSQLContext

    Add keyspace level write configuration settings. Set cluster to None for a single cluster.

  15. def addKeyspaceLevelWriteConf(keyspace: String, conf: SparkConf, cluster: Option[String]): CassandraSQLContext

    Add keyspace level write configuration settings. Set cluster to None for a single cluster.

  16. def addTableReadConf(keyspace: String, table: String, conf: ReadConf, cluster: Option[String]): CassandraSQLContext

    Add table level read configuration settings. Set cluster to None for a single cluster.

  17. def addTableReadConf(keyspace: String, table: String, conf: SparkConf, cluster: Option[String]): CassandraSQLContext

    Add table level read configuration settings. Set cluster to None for a single cluster.

  18. def addTableWriteConf(keyspace: String, table: String, conf: WriteConf, cluster: Option[String]): CassandraSQLContext

    Add table level write configuration settings. Set cluster to None for a single cluster.

  19. def addTableWriteConf(keyspace: String, table: String, conf: SparkConf, cluster: Option[String]): CassandraSQLContext

    Add table level write configuration settings. Set cluster to None for a single cluster.

  20. lazy val analyzer: Analyzer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  21. def applySchemaToPythonRDD(rdd: RDD[Array[Any]], schema: StructType): DataFrame

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  22. def applySchemaToPythonRDD(rdd: RDD[Array[Any]], schemaString: String): DataFrame

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  23. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  24. def baseRelationToDataFrame(baseRelation: BaseRelation): DataFrame

    Definition Classes
    SQLContext
  25. val cacheManager: execution.CacheManager

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  26. def cacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  27. def cassandraSql(cassandraQuery: String): DataFrame

    Executes a SQL query against Cassandra and returns a DataFrame representing the result.

  28. lazy val catalog: CassandraCatalog with OverrideCatalog

    A catalyst metadata catalog that points to Cassandra.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    CassandraSQLContext → SQLContext
  29. def clearCache(): Unit

    Definition Classes
    SQLContext
  30. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. def conf: SQLConf

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  32. def createDataFrame(rdd: JavaRDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
  33. def createDataFrame(rdd: RDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
  34. def createDataFrame(rowRDD: JavaRDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  35. def createDataFrame(rowRDD: RDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  36. def createDataFrame[A <: Product](data: Seq[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  37. def createDataFrame[A <: Product](rdd: RDD[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  38. def createExternalTable(tableName: String, source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  39. def createExternalTable(tableName: String, source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  40. def createExternalTable(tableName: String, source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  41. def createExternalTable(tableName: String, source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  42. def createExternalTable(tableName: String, path: String, source: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  43. def createExternalTable(tableName: String, path: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  44. def createSession(): SQLSession

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  45. def currentSession(): SQLSession

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  46. val ddlParser: DDLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  47. val defaultSession: SQLSession

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  48. def detachSession(): Unit

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  49. def dialectClassName: String

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  50. def dropTempTable(tableName: String): Unit

    Definition Classes
    SQLContext
  51. lazy val emptyDataFrame: DataFrame

    Definition Classes
    SQLContext
  52. lazy val emptyResult: RDD[Row]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  53. def ensuring(cond: (CassandraSQLContext) ⇒ Boolean, msg: ⇒ Any): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  54. def ensuring(cond: (CassandraSQLContext) ⇒ Boolean): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  55. def ensuring(cond: Boolean, msg: ⇒ Any): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  56. def ensuring(cond: Boolean): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  57. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  58. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  59. def executePlan(plan: LogicalPlan): QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    CassandraSQLContext → SQLContext
  60. def executeSql(sql: String): QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  61. val experimental: ExperimentalMethods

    Definition Classes
    SQLContext
  62. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  63. def formatted(fmtstr: String): String

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to StringFormat[CassandraSQLContext] performed by method StringFormat in scala.Predef.
    Definition Classes
    StringFormat
    Annotations
    @inline()
  64. lazy val functionRegistry: FunctionRegistry

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  65. def getAllConfs: Map[String, String]

    Definition Classes
    SQLContext
  66. def getCassandraConnConf(cluster: Option[String]): CassandraConnectorConf

    Get Cassandra connection configuration settings, resolved in order: cluster level, then default settings

  67. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  68. def getConf(key: String, defaultValue: String): String

    Definition Classes
    SQLContext
  69. def getConf(key: String): String

    Definition Classes
    SQLContext
  70. def getKeyspace: String

    Returns the keyspace previously set by setKeyspace, or throws IllegalStateException if no keyspace has been set yet.

  71. def getReadConf(keyspace: String, table: String, cluster: Option[String]): ReadConf

    Get read configuration settings, resolved in order: table level, keyspace level, cluster level, then default settings

  72. def getSQLDialect(): ParserDialect

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  73. def getSchema(beanClass: Class[_]): Seq[AttributeReference]

    Attributes
    protected
    Definition Classes
    SQLContext
  74. def getWriteConf(keyspace: String, table: String, cluster: Option[String]): WriteConf

    Get write configuration settings, resolved in order: table level, keyspace level, cluster level, then default settings

  75. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  76. def isCached(tableName: String): Boolean

    Definition Classes
    SQLContext
  77. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  78. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  79. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  80. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  81. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  82. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  83. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  84. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  85. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  86. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  87. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  88. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  89. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  90. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  91. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  92. final def notify(): Unit

    Definition Classes
    AnyRef
  93. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  94. def openSession(): SQLSession

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  95. lazy val optimizer: Optimizer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  96. def parseDataType(dataTypeString: String): DataType

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  97. def parseSql(sql: String): LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  98. val planner: SparkPlanner with CassandraStrategies { val strategies: Seq[org.apache.spark.sql.Strategy] }

    Modified Catalyst planner that does Cassandra-specific predicate pushdown.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    CassandraSQLContext → SQLContext
  99. val prepareForExecution: RuleExecutor[SparkPlan] { val batches: List[this.Batch] }

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  100. def range(start: Long, end: Long, step: Long, numPartitions: Int): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  101. def range(start: Long, end: Long): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  102. def read: DataFrameReader

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  103. def setConf(key: String, value: String): Unit

    Definition Classes
    SQLContext
  104. def setConf(props: Properties): Unit

    Definition Classes
    SQLContext
  105. def setKeyspace(ks: String): Unit

    Sets the default Cassandra keyspace to be used when accessing tables with unqualified names.

  106. val sparkConf: SparkConf

  107. val sparkContext: SparkContext

    Definition Classes
    SQLContext
  108. def sql(cassandraQuery: String): DataFrame

    Delegates to cassandraSql.

    Definition Classes
    CassandraSQLContext → SQLContext
  109. val sqlParser: SparkSQLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  110. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  111. def table(tableName: String): DataFrame

    Definition Classes
    SQLContext
  112. def tableNames(databaseName: String): Array[String]

    Definition Classes
    SQLContext
  113. def tableNames(): Array[String]

    Definition Classes
    SQLContext
  114. def tables(databaseName: String): DataFrame

    Definition Classes
    SQLContext
  115. def tables(): DataFrame

    Definition Classes
    SQLContext
  116. val tlSession: ThreadLocal[SQLSession]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  117. def toString(): String

    Definition Classes
    AnyRef → Any
  118. val udf: UDFRegistration

    Definition Classes
    SQLContext
  119. def uncacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  120. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  121. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  122. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  123. def →[B](y: B): (CassandraSQLContext, B)

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to ArrowAssoc[CassandraSQLContext] performed by method ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc
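Since each of the add*Conf members above returns the context, configuration calls can be chained from broadest to narrowest scope. A minimal sketch, assuming a `cc: CassandraSQLContext` as in the class example, and assuming `ReadConf` accepts a `fetchSizeInRows` named parameter (the cluster, keyspace, and table names are hypothetical):

```scala
import com.datastax.spark.connector.rdd.ReadConf

// Scope read settings per cluster, keyspace, and table. At query time,
// getReadConf resolves them in order: table, keyspace, cluster, defaults.
cc.addClusterLevelReadConf("ClusterOne", ReadConf(fetchSizeInRows = 1000))
  .addKeyspaceLevelReadConf("ks", ReadConf(fetchSizeInRows = 500), Some("ClusterOne"))
  .addTableReadConf("ks", "words", ReadConf(fetchSizeInRows = 100), Some("ClusterOne"))
```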

Deprecated Value Members

  1. def applySchema(rdd: JavaRDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) use createDataFrame

  2. def applySchema(rdd: RDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) use createDataFrame

  3. def applySchema(rowRDD: JavaRDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) use createDataFrame

  4. def applySchema(rowRDD: RDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) use createDataFrame

  5. def jdbc(url: String, table: String, theParts: Array[String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) use read.jdbc()

  6. def jdbc(url: String, table: String, columnName: String, lowerBound: Long, upperBound: Long, numPartitions: Int): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) use read.jdbc()

  7. def jdbc(url: String, table: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) use read.jdbc()

  8. def jsonFile(path: String, samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json()

  9. def jsonFile(path: String, schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json()

  10. def jsonFile(path: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json()

  11. def jsonRDD(json: JavaRDD[String], samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json()

  12. def jsonRDD(json: RDD[String], samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json()

  13. def jsonRDD(json: JavaRDD[String], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json()

  14. def jsonRDD(json: RDD[String], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json()

  15. def jsonRDD(json: JavaRDD[String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json()

  16. def jsonRDD(json: RDD[String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json()

  17. def load(source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).schema(schema).options(options).load()

  18. def load(source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).schema(schema).options(options).load()

  19. def load(source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).options(options).load()

  20. def load(source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).options(options).load()

  21. def load(path: String, source: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).load(path)

  22. def load(path: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.load(path)

  23. def parquetFile(paths: String*): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated @varargs()
    Deprecated

    (Since version 1.4.0) Use read.parquet()

Inherited from SQLContext

Inherited from Serializable

Inherited from Serializable

Inherited from Logging

Inherited from AnyRef

Inherited from Any

Inherited by implicit conversion any2stringadd from CassandraSQLContext to any2stringadd[CassandraSQLContext]

Inherited by implicit conversion StringFormat from CassandraSQLContext to StringFormat[CassandraSQLContext]

Inherited by implicit conversion Ensuring from CassandraSQLContext to Ensuring[CassandraSQLContext]

Inherited by implicit conversion ArrowAssoc from CassandraSQLContext to ArrowAssoc[CassandraSQLContext]
