com.datastax.spark.connector.rdd.partitioner

CassandraPartitionedRDD

class CassandraPartitionedRDD[T] extends RDD[T]

RDD created by repartitionByCassandraReplica, with preferred locations mapping to the Cassandra replicas each partition was created for.
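
Example (a minimal sketch of how this RDD is typically obtained; the keyspace, table, and key case class names are hypothetical):

    import com.datastax.spark.connector._

    // Rows must be convertible to the partition key of the target table
    case class Key(key: Int)

    val keys = sc.parallelize(1 to 1000).map(Key(_))
    // Returns a CassandraPartitionedRDD[Key]; each Spark partition's preferred
    // locations are the Cassandra replicas that own the corresponding data
    val partitioned = keys.repartitionByCassandraReplica("test", "kv", partitionsPerHost = 10)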

Linear Supertypes
RDD[T], Logging, Serializable, Serializable, AnyRef, Any
Implicitly
  1. by rddToPairRDDFunctions
  2. by numericRDDToDoubleRDDFunctions
  3. by doubleRDDToDoubleRDDFunctions
  4. by rddToOrderedRDDFunctions
  5. by rddToSequenceFileRDDFunctions
  6. by rddToAsyncRDDActions
  7. by toPairRDDFunctions
  8. by toRDDFunctions
  9. by any2stringadd
  10. by any2stringfmt
  11. by any2ArrowAssoc
  12. by any2Ensuring

Instance Constructors

  1. new CassandraPartitionedRDD(prev: RDD[T], keyspace: String, table: String)(implicit ct: ClassTag[T])

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. def +(other: String): String

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to StringAdd performed by method any2stringadd in scala.Predef.
    Definition Classes
    StringAdd
  5. def ++(other: RDD[T]): RDD[T]

    Definition Classes
    RDD
  6. def ->[B](y: B): (CassandraPartitionedRDD[T], B)

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to ArrowAssoc[CassandraPartitionedRDD[T]] performed by method any2ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc
    Annotations
    @inline()
  7. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  8. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  9. def aggregate[U](zeroValue: U)(seqOp: (U, T) ⇒ U, combOp: (U, U) ⇒ U)(implicit arg0: ClassTag[U]): U

    Definition Classes
    RDD
  10. def aggregateByKey[U](zeroValue: U)(seqOp: (U, V) ⇒ U, combOp: (U, U) ⇒ U)(implicit arg0: ClassTag[U]): RDD[(K, U)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  11. def aggregateByKey[U](zeroValue: U, numPartitions: Int)(seqOp: (U, V) ⇒ U, combOp: (U, U) ⇒ U)(implicit arg0: ClassTag[U]): RDD[(K, U)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  12. def aggregateByKey[U](zeroValue: U, partitioner: Partitioner)(seqOp: (U, V) ⇒ U, combOp: (U, U) ⇒ U)(implicit arg0: ClassTag[U]): RDD[(K, U)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  13. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  14. def cache(): CassandraPartitionedRDD.this.type

    Definition Classes
    RDD
  15. def cartesian[U](other: RDD[U])(implicit arg0: ClassTag[U]): RDD[(T, U)]

    Definition Classes
    RDD
  16. def checkpoint(): Unit

    Definition Classes
    RDD
  17. def clearDependencies(): Unit

    Attributes
    protected
    Definition Classes
    RDD
  18. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  19. def coalesce(numPartitions: Int, shuffle: Boolean, partitionCoalescer: Option[PartitionCoalescer])(implicit ord: Ordering[T]): RDD[T]

    Definition Classes
    RDD
  20. def cogroup[W1, W2, W3](other1: RDD[(K, W1)], other2: RDD[(K, W2)], other3: RDD[(K, W3)], numPartitions: Int): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2], Iterable[W3]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  21. def cogroup[W1, W2](other1: RDD[(K, W1)], other2: RDD[(K, W2)], numPartitions: Int): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  22. def cogroup[W](other: RDD[(K, W)], numPartitions: Int): RDD[(K, (Iterable[V], Iterable[W]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  23. def cogroup[W1, W2](other1: RDD[(K, W1)], other2: RDD[(K, W2)]): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  24. def cogroup[W](other: RDD[(K, W)]): RDD[(K, (Iterable[V], Iterable[W]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  25. def cogroup[W1, W2, W3](other1: RDD[(K, W1)], other2: RDD[(K, W2)], other3: RDD[(K, W3)]): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2], Iterable[W3]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  26. def cogroup[W1, W2](other1: RDD[(K, W1)], other2: RDD[(K, W2)], partitioner: Partitioner): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  27. def cogroup[W](other: RDD[(K, W)], partitioner: Partitioner): RDD[(K, (Iterable[V], Iterable[W]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  28. def cogroup[W1, W2, W3](other1: RDD[(K, W1)], other2: RDD[(K, W2)], other3: RDD[(K, W3)], partitioner: Partitioner): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2], Iterable[W3]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  29. def collect[U](f: PartialFunction[T, U])(implicit arg0: ClassTag[U]): RDD[U]

    Definition Classes
    RDD
  30. def collect(): Array[T]

    Definition Classes
    RDD
  31. def collectAsMap(): Map[K, V]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  32. def collectAsync(): FutureAction[Seq[T]]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to AsyncRDDActions[T] performed by method rddToAsyncRDDActions in org.apache.spark.rdd.RDD. This conversion will take place only if T is accompanied by a ClassTag, which is a runtime representation of its type that survives erasure (T: ClassTag).
    Definition Classes
    AsyncRDDActions
  33. def combineByKey[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C): RDD[(K, C)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  34. def combineByKey[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C, numPartitions: Int): RDD[(K, C)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  35. def combineByKey[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C, partitioner: Partitioner, mapSideCombine: Boolean, serializer: Serializer): RDD[(K, C)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  36. def combineByKeyWithClassTag[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C)(implicit ct: ClassTag[C]): RDD[(K, C)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
    Annotations
    @Experimental()
  37. def combineByKeyWithClassTag[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C, numPartitions: Int)(implicit ct: ClassTag[C]): RDD[(K, C)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
    Annotations
    @Experimental()
  38. def combineByKeyWithClassTag[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C, partitioner: Partitioner, mapSideCombine: Boolean, serializer: Serializer)(implicit ct: ClassTag[C]): RDD[(K, C)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
    Annotations
    @Experimental()
  39. def compute(split: Partition, context: TaskContext): Iterator[T]

    Definition Classes
    CassandraPartitionedRDD → RDD
  40. def context: SparkContext

    Definition Classes
    RDD
  41. def count(): Long

    Definition Classes
    RDD
  42. def countApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]

    Definition Classes
    RDD
  43. def countApproxDistinct(relativeSD: Double): Long

    Definition Classes
    RDD
  44. def countApproxDistinct(p: Int, sp: Int): Long

    Definition Classes
    RDD
  45. def countApproxDistinctByKey(relativeSD: Double): RDD[(K, Long)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  46. def countApproxDistinctByKey(relativeSD: Double, numPartitions: Int): RDD[(K, Long)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  47. def countApproxDistinctByKey(relativeSD: Double, partitioner: Partitioner): RDD[(K, Long)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  48. def countApproxDistinctByKey(p: Int, sp: Int, partitioner: Partitioner): RDD[(K, Long)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  49. def countAsync(): FutureAction[Long]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to AsyncRDDActions[T] performed by method rddToAsyncRDDActions in org.apache.spark.rdd.RDD. This conversion will take place only if T is accompanied by a ClassTag, which is a runtime representation of its type that survives erasure (T: ClassTag).
    Definition Classes
    AsyncRDDActions
  50. def countByKey(): Map[K, Long]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  51. def countByKeyApprox(timeout: Long, confidence: Double): PartialResult[Map[K, BoundedDouble]]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  52. def countByValue()(implicit ord: Ordering[T]): Map[T, Long]

    Definition Classes
    RDD
  53. def countByValueApprox(timeout: Long, confidence: Double)(implicit ord: Ordering[T]): PartialResult[Map[T, BoundedDouble]]

    Definition Classes
    RDD
  54. def deleteFromCassandra(keyspaceName: String, tableName: String, deleteColumns: ColumnSelector = SomeColumns(), keyColumns: ColumnSelector = PrimaryKeyColumns, writeConf: WriteConf = ...)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), rwf: RowWriterFactory[T]): Unit

    Delete data from a Cassandra table, using data from the RDD as primary keys.

    Delete data from a Cassandra table, using data from the RDD as primary keys. Uses the specified column names.

    keyspaceName

    the name of the Keyspace to use

    tableName

    the name of the Table to use

    deleteColumns

    The list of column names to delete; an empty ColumnSelector means the full row is deleted.

    keyColumns

    Primary key column selector, optional. All RDD primary key columns will be checked by default.

    writeConf

    additional configuration object that allows setting the consistency level, batch size, etc.
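
    Example (a minimal sketch, assuming import com.datastax.spark.connector._; the keyspace, table, and key case class are hypothetical):

    case class KeyOnly(key: Int)
    val toDelete = sc.parallelize(Seq(KeyOnly(1), KeyOnly(2)))
    // deletes the full rows whose primary key values match the RDD contents
    toDelete.deleteFromCassandra("test", "kv")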

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to RDDFunctions[T] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctionsWritableToCassandra
    See also

    com.datastax.spark.connector.writer.WritableToCassandra

  55. final def dependencies: Seq[Dependency[_]]

    Definition Classes
    RDD
  56. def distinct(): RDD[T]

    Definition Classes
    RDD
  57. def distinct(numPartitions: Int)(implicit ord: Ordering[T]): RDD[T]

    Definition Classes
    RDD
  58. def ensuring(cond: (CassandraPartitionedRDD[T]) ⇒ Boolean, msg: ⇒ Any): CassandraPartitionedRDD[T]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to Ensuring[CassandraPartitionedRDD[T]] performed by method any2Ensuring in scala.Predef. This conversion will take place only if T is a superclass of Any and a subclass of (Nothing, Nothing) with Double (T >: Any <: (Nothing, Nothing) with Double).
    Definition Classes
    Ensuring
  59. def ensuring(cond: (CassandraPartitionedRDD[T]) ⇒ Boolean): CassandraPartitionedRDD[T]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to Ensuring[CassandraPartitionedRDD[T]] performed by method any2Ensuring in scala.Predef. This conversion will take place only if T is a superclass of Any and a subclass of (Nothing, Nothing) with Double (T >: Any <: (Nothing, Nothing) with Double).
    Definition Classes
    Ensuring
  60. def ensuring(cond: Boolean, msg: ⇒ Any): CassandraPartitionedRDD[T]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to Ensuring[CassandraPartitionedRDD[T]] performed by method any2Ensuring in scala.Predef. This conversion will take place only if T is a superclass of Any and a subclass of (Nothing, Nothing) with Double (T >: Any <: (Nothing, Nothing) with Double).
    Definition Classes
    Ensuring
  61. def ensuring(cond: Boolean): CassandraPartitionedRDD[T]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to Ensuring[CassandraPartitionedRDD[T]] performed by method any2Ensuring in scala.Predef. This conversion will take place only if T is a superclass of Any and a subclass of (Nothing, Nothing) with Double (T >: Any <: (Nothing, Nothing) with Double).
    Definition Classes
    Ensuring
  62. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  63. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  64. def filter(f: (T) ⇒ Boolean): RDD[T]

    Definition Classes
    RDD
  65. def filterByRange(lower: K, upper: K): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to OrderedRDDFunctions[K, V, (K, V)] performed by method rddToOrderedRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type Ordering[K] is in scope
    2. an implicit value of type ClassTag[K] is in scope
    3. an implicit value of type ClassTag[V] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    OrderedRDDFunctions
  66. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  67. def first(): T

    Definition Classes
    RDD
  68. def firstParent[U](implicit arg0: ClassTag[U]): RDD[U]

    Attributes
    protected[org.apache.spark]
    Definition Classes
    RDD
  69. def flatMap[U](f: (T) ⇒ TraversableOnce[U])(implicit arg0: ClassTag[U]): RDD[U]

    Definition Classes
    RDD
  70. def flatMapValues[U](f: (V) ⇒ TraversableOnce[U]): RDD[(K, U)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  71. def fold(zeroValue: T)(op: (T, T) ⇒ T): T

    Definition Classes
    RDD
  72. def foldByKey(zeroValue: V)(func: (V, V) ⇒ V): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  73. def foldByKey(zeroValue: V, numPartitions: Int)(func: (V, V) ⇒ V): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  74. def foldByKey(zeroValue: V, partitioner: Partitioner)(func: (V, V) ⇒ V): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  75. def foreach(f: (T) ⇒ Unit): Unit

    Definition Classes
    RDD
  76. def foreachAsync(f: (T) ⇒ Unit): FutureAction[Unit]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to AsyncRDDActions[T] performed by method rddToAsyncRDDActions in org.apache.spark.rdd.RDD. This conversion will take place only if T is accompanied by a ClassTag, which is a runtime representation of its type that survives erasure (T: ClassTag).
    Definition Classes
    AsyncRDDActions
  77. def foreachPartition(f: (Iterator[T]) ⇒ Unit): Unit

    Definition Classes
    RDD
  78. def foreachPartitionAsync(f: (Iterator[T]) ⇒ Unit): FutureAction[Unit]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to AsyncRDDActions[T] performed by method rddToAsyncRDDActions in org.apache.spark.rdd.RDD. This conversion will take place only if T is accompanied by a ClassTag, which is a runtime representation of its type that survives erasure (T: ClassTag).
    Definition Classes
    AsyncRDDActions
  79. def formatted(fmtstr: String): String

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to StringFormat performed by method any2stringfmt in scala.Predef.
    Definition Classes
    StringFormat
    Annotations
    @inline()
  80. def fullOuterJoin[W](other: RDD[(K, W)], numPartitions: Int): RDD[(K, (Option[V], Option[W]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  81. def fullOuterJoin[W](other: RDD[(K, W)]): RDD[(K, (Option[V], Option[W]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  82. def fullOuterJoin[W](other: RDD[(K, W)], partitioner: Partitioner): RDD[(K, (Option[V], Option[W]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  83. def getCheckpointFile: Option[String]

    Definition Classes
    RDD
  84. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  85. def getDependencies: Seq[Dependency[_]]

    Attributes
    protected
    Definition Classes
    RDD
  86. final def getNumPartitions: Int

    Definition Classes
    RDD
    Annotations
    @Since( "1.6.0" )
  87. def getPartitions: Array[Partition]

    Definition Classes
    CassandraPartitionedRDD → RDD
  88. def getPreferredLocations(split: Partition): Seq[String]

    Definition Classes
    CassandraPartitionedRDD → RDD
  89. def getStorageLevel: StorageLevel

    Definition Classes
    RDD
  90. def glom(): RDD[Array[T]]

    Definition Classes
    RDD
  91. def groupBy[K](f: (T) ⇒ K, p: Partitioner)(implicit kt: ClassTag[K], ord: Ordering[K]): RDD[(K, Iterable[T])]

    Definition Classes
    RDD
  92. def groupBy[K](f: (T) ⇒ K, numPartitions: Int)(implicit kt: ClassTag[K]): RDD[(K, Iterable[T])]

    Definition Classes
    RDD
  93. def groupBy[K](f: (T) ⇒ K)(implicit kt: ClassTag[K]): RDD[(K, Iterable[T])]

    Definition Classes
    RDD
  94. def groupByKey(): RDD[(K, Iterable[V])]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  95. def groupByKey(numPartitions: Int): RDD[(K, Iterable[V])]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  96. def groupByKey(partitioner: Partitioner): RDD[(K, Iterable[V])]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  97. def groupWith[W1, W2, W3](other1: RDD[(K, W1)], other2: RDD[(K, W2)], other3: RDD[(K, W3)]): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2], Iterable[W3]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  98. def groupWith[W1, W2](other1: RDD[(K, W1)], other2: RDD[(K, W2)]): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  99. def groupWith[W](other: RDD[(K, W)]): RDD[(K, (Iterable[V], Iterable[W]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  100. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  101. val id: Int

    Definition Classes
    RDD
  102. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  103. def intersection(other: RDD[T], numPartitions: Int): RDD[T]

    Definition Classes
    RDD
  104. def intersection(other: RDD[T], partitioner: Partitioner)(implicit ord: Ordering[T]): RDD[T]

    Definition Classes
    RDD
  105. def intersection(other: RDD[T]): RDD[T]

    Definition Classes
    RDD
  106. def isCheckpointed: Boolean

    Definition Classes
    RDD
  107. def isEmpty(): Boolean

    Definition Classes
    RDD
  108. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  109. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  110. final def iterator(split: Partition, context: TaskContext): Iterator[T]

    Definition Classes
    RDD
  111. def join[W](other: RDD[(K, W)], numPartitions: Int): RDD[(K, (V, W))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  112. def join[W](other: RDD[(K, W)]): RDD[(K, (V, W))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  113. def join[W](other: RDD[(K, W)], partitioner: Partitioner): RDD[(K, (V, W))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  114. def joinWithCassandraTable[R](keyspaceName: String, tableName: String, selectedColumns: ColumnSelector = AllColumns, joinColumns: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), newType: ClassTag[R], rrf: RowReaderFactory[R], ev: ValidRDDType[R], currentType: ClassTag[T], rwf: RowWriterFactory[T]): CassandraJoinRDD[T, R]

    Uses the data from the RDD to join with a Cassandra table without retrieving the entire table.

    Uses the data from the RDD to join with a Cassandra table without retrieving the entire table. Any RDD which can be used to saveToCassandra can be used to joinWithCassandraTable, as well as any RDD which only specifies the partition key of a Cassandra table. This method executes single-partition requests against the Cassandra table and accepts the functional modifiers that a normal com.datastax.spark.connector.rdd.CassandraTableScanRDD takes.

    By default this method only uses the partition key for joining, but any combination of columns acceptable to Cassandra can be used in the join. Specify columns using the joinColumns parameter or the on() method.

    Example With Prior Repartitioning:

    val source = sc.parallelize(keys).map(x => new KVRow(x))
    val repart = source.repartitionByCassandraReplica(keyspace, tableName, 10)
    val someCass = repart.joinWithCassandraTable(keyspace, tableName)

    Example Joining on Clustering Columns:

    val source = sc.parallelize(keys).map(x => (x, x * 100))
    val someCass = source.joinWithCassandraTable(keyspace, wideTable).on(SomeColumns("key", "group"))
    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to RDDFunctions[T] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
  115. def keyBy[K](f: (T) ⇒ K): RDD[(K, T)]

    Definition Classes
    RDD
  116. def keyByCassandraReplica(keyspaceName: String, tableName: String, partitionKeyMapper: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), currentType: ClassTag[T], rwf: RowWriterFactory[T]): RDD[(Set[InetAddress], T)]

    Key every row in the RDD with the IP addresses of all of the Cassandra nodes which contain a replica of the data specified by that row.

    Key every row in the RDD with the IP addresses of all of the Cassandra nodes which contain a replica of the data specified by that row. The calling RDD must have rows that can be converted into the partition key of the given Cassandra table.
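
    Example (a minimal sketch, assuming import com.datastax.spark.connector._; the keyspace, table, and key case class are hypothetical):

    case class Key(key: Int)
    // each element is paired with the Set[InetAddress] of replica nodes owning its partition key
    val keyed = sc.parallelize(1 to 100).map(Key(_)).keyByCassandraReplica("test", "kv")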

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to RDDFunctions[T] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
  117. def keys: RDD[K]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  118. def leftJoinWithCassandraTable[R](keyspaceName: String, tableName: String, selectedColumns: ColumnSelector = AllColumns, joinColumns: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), newType: ClassTag[R], rrf: RowReaderFactory[R], ev: ValidRDDType[R], currentType: ClassTag[T], rwf: RowWriterFactory[T]): CassandraLeftJoinRDD[T, R]

    Uses the data from the RDD to left join with a Cassandra table without retrieving the entire table.

    Uses the data from the RDD to left join with a Cassandra table without retrieving the entire table. Any RDD which can be used to saveToCassandra can be used to leftJoinWithCassandraTable, as well as any RDD which only specifies the partition key of a Cassandra table. This method executes single-partition requests against the Cassandra table and accepts the functional modifiers that a normal com.datastax.spark.connector.rdd.CassandraTableScanRDD takes.

    By default this method only uses the partition key for joining, but any combination of columns acceptable to Cassandra can be used in the join. Specify columns using the joinColumns parameter or the on() method.

    Example With Prior Repartitioning:

    val source = sc.parallelize(keys).map(x => new KVRow(x))
    val repart = source.repartitionByCassandraReplica(keyspace, tableName, 10)
    val someCass = repart.leftJoinWithCassandraTable(keyspace, tableName)

    Example Joining on Clustering Columns:

    val source = sc.parallelize(keys).map(x => (x, x * 100))
    val someCass = source.leftJoinWithCassandraTable(keyspace, wideTable).on(SomeColumns("key", "group"))
    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to RDDFunctions[T] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
  119. def leftOuterJoin[W](other: RDD[(K, W)], numPartitions: Int): RDD[(K, (V, Option[W]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  120. def leftOuterJoin[W](other: RDD[(K, W)]): RDD[(K, (V, Option[W]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  121. def leftOuterJoin[W](other: RDD[(K, W)], partitioner: Partitioner): RDD[(K, (V, Option[W]))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  122. def localCheckpoint(): CassandraPartitionedRDD.this.type

    Definition Classes
    RDD
  123. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  124. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  125. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  126. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  127. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  128. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  129. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  130. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  131. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  132. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  133. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  134. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  135. def lookup(key: K): Seq[V]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  136. def map[U](f: (T) ⇒ U)(implicit arg0: ClassTag[U]): RDD[U]

    Definition Classes
    RDD
  137. def mapPartitions[U](f: (Iterator[T]) ⇒ Iterator[U], preservesPartitioning: Boolean)(implicit arg0: ClassTag[U]): RDD[U]

    Definition Classes
    RDD
  138. def mapPartitionsWithIndex[U](f: (Int, Iterator[T]) ⇒ Iterator[U], preservesPartitioning: Boolean)(implicit arg0: ClassTag[U]): RDD[U]

    Definition Classes
    RDD
  139. def mapValues[U](f: (V) ⇒ U): RDD[(K, U)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  140. def max()(implicit ord: Ordering[T]): T

    Definition Classes
    RDD
  141. def min()(implicit ord: Ordering[T]): T

    Definition Classes
    RDD
  142. var name: String

    Definition Classes
    RDD
  143. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  144. final def notify(): Unit

    Definition Classes
    AnyRef
  145. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  146. def parent[U](j: Int)(implicit arg0: ClassTag[U]): RDD[U]

    Attributes
    protected[org.apache.spark]
    Definition Classes
    RDD
  147. def partitionBy(partitioner: Partitioner): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  148. val partitioner: Option[Partitioner]

    Definition Classes
    CassandraPartitionedRDD → RDD
  149. final def partitions: Array[Partition]

    Definition Classes
    RDD
  150. def persist(): CassandraPartitionedRDD.this.type

    Definition Classes
    RDD
  151. def persist(newLevel: StorageLevel): CassandraPartitionedRDD.this.type

    Definition Classes
    RDD
  152. def pipe(command: Seq[String], env: Map[String, String], printPipeContext: ((String) ⇒ Unit) ⇒ Unit, printRDDElement: (T, (String) ⇒ Unit) ⇒ Unit, separateWorkingDir: Boolean, bufferSize: Int, encoding: String): RDD[String]

    Definition Classes
    RDD
  153. def pipe(command: String, env: Map[String, String]): RDD[String]

    Definition Classes
    RDD
  154. def pipe(command: String): RDD[String]

    Definition Classes
    RDD
  155. final def preferredLocations(split: Partition): Seq[String]

    Definition Classes
    RDD
  156. def randomSplit(weights: Array[Double], seed: Long): Array[RDD[T]]

    Definition Classes
    RDD
  157. def reduce(f: (T, T) ⇒ T): T

    Definition Classes
    RDD
  158. def reduceByKey(func: (V, V) ⇒ V): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  159. def reduceByKey(func: (V, V) ⇒ V, numPartitions: Int): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  160. def reduceByKey(partitioner: Partitioner, func: (V, V) ⇒ V): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  161. def reduceByKeyLocally(func: (V, V) ⇒ V): Map[K, V]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  162. def repartition(numPartitions: Int)(implicit ord: Ordering[T]): RDD[T]

    Definition Classes
    RDD
  163. def repartitionAndSortWithinPartitions(partitioner: Partitioner): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to OrderedRDDFunctions[K, V, (K, V)] performed by method rddToOrderedRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type Ordering[K] is in scope
    2. an implicit value of type ClassTag[K] is in scope
    3. an implicit value of type ClassTag[V] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    OrderedRDDFunctions
  164. def repartitionByCassandraReplica(keyspaceName: String, tableName: String, partitionsPerHost: Int = 10, partitionKeyMapper: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), currentType: ClassTag[T], rwf: RowWriterFactory[T]): CassandraPartitionedRDD[T]

    Repartitions the data (via a shuffle) based upon the replication of the given keyspaceName and tableName.

    Repartitions the data (via a shuffle) based upon the replication of the given keyspaceName and tableName. Calling this method before joinWithCassandraTable ensures that requests are coordinator-local. The calling RDD must have rows that can be converted into the partition key of the given Cassandra table. A usage sketch follows this entry.

    partitionsPerHost

    controls the number of Spark partitions created by this repartitioning event

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to RDDFunctions[T] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
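
    For example, a minimal usage sketch (the keyspace test, table users, the key class and the connection host are hypothetical):

      import com.datastax.spark.connector._
      import org.apache.spark.{SparkConf, SparkContext}

      val conf = new SparkConf()
        .setAppName("replica-local-join")
        .set("spark.cassandra.connection.host", "127.0.0.1")   // hypothetical host
      val sc = new SparkContext(conf)

      // Hypothetical key class whose field maps to the partition key column user_id of test.users.
      case class UserKey(userId: Int)

      // Keys to look up; repartitioning them by the replica placement of test.users
      // makes the subsequent join coordinator-local.
      val keys      = sc.parallelize(1 to 1000).map(UserKey(_))
      val localKeys = keys.repartitionByCassandraReplica("test", "users", partitionsPerHost = 10)
      val joined    = localKeys.joinWithCassandraTable("test", "users")
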
  165. def rightOuterJoin[W](other: RDD[(K, W)], numPartitions: Int): RDD[(K, (Option[V], W))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  166. def rightOuterJoin[W](other: RDD[(K, W)]): RDD[(K, (Option[V], W))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  167. def rightOuterJoin[W](other: RDD[(K, W)], partitioner: Partitioner): RDD[(K, (Option[V], W))]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  168. def sample(withReplacement: Boolean, fraction: Double, seed: Long): RDD[T]

    Definition Classes
    RDD
  169. def sampleByKey(withReplacement: Boolean, fractions: Map[K, Double], seed: Long): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  170. def sampleByKeyExact(withReplacement: Boolean, fractions: Map[K, Double], seed: Long): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  171. def saveAsCassandraTable(keyspaceName: String, tableName: String, columns: ColumnSelector = AllColumns, writeConf: WriteConf = ...)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), rwf: RowWriterFactory[T], columnMapper: ColumnMapper[T]): Unit

    Saves the data from RDD to a new table with definition taken from the ColumnMapper for this class.

    Saves the data from RDD to a new table with definition taken from the ColumnMapper for this class.

    keyspaceName

    the keyspace in which to create the new table

    tableName

    name of the table to create; the table must not exist

    columns

    Selects the columns to which data will be saved. Only unique column names are used, and at least all primary key columns must be selected; all other fields are discarded. Non-selected properties/columns are left unchanged. This parameter does not affect table creation.

    writeConf

    additional configuration object that allows setting the consistency level, batch size, etc.

    connector

    optional, implicit connector to Cassandra

    rwf

    factory for obtaining the row writer to be used to extract column values from items of the RDD

    columnMapper

    a column mapper determining the definition of the table

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to RDDFunctions[T] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
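
    For example, a minimal sketch of saveAsCassandraTable (keyspace test and the WordCount class are hypothetical; sc is set up as in the sketch for repartitionByCassandraReplica above):

      import com.datastax.spark.connector._

      // Hypothetical element class; the default ColumnMapper derives the columns word and count from it.
      case class WordCount(word: String, count: Long)

      val counts = sc.parallelize(Seq(WordCount("foo", 3L), WordCount("bar", 5L)))

      // Creates test.word_counts (which must not exist yet) and writes the rows into it.
      counts.saveAsCassandraTable("test", "word_counts")
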
  172. def saveAsCassandraTableEx(table: TableDef, columns: ColumnSelector = AllColumns, writeConf: WriteConf = ...)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), rwf: RowWriterFactory[T]): Unit

    Saves the data from RDD to a new table defined by the given TableDef.

    Saves the data from RDD to a new table defined by the given TableDef.

    First it creates a new table with all columns from the TableDef and then it saves RDD content in the same way as saveToCassandra. The table must not exist prior to this call.

    table

    table definition used to create a new table

    columns

    Selects the columns to which data will be saved. Only unique column names are used, and at least all primary key columns must be selected; all other fields are discarded. Non-selected properties/columns are left unchanged. This parameter does not affect table creation.

    writeConf

    additional configuration object that allows setting the consistency level, batch size, etc.

    connector

    optional, implicit connector to Cassandra

    rwf

    factory for obtaining the row writer to be used to extract column values from items of the RDD

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to RDDFunctions[T] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
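
    For example, a minimal sketch of saveAsCassandraTableEx (keyspace and column names are hypothetical, and the TableDef constructor follows the shape documented for the connector, which may differ between versions):

      import com.datastax.spark.connector._
      import com.datastax.spark.connector.cql.{ColumnDef, PartitionKeyColumn, RegularColumn, TableDef}
      import com.datastax.spark.connector.types.{IntType, TextType}

      val table = TableDef(
        "test",                                                 // keyspace
        "words",                                                // table to create (must not exist yet)
        Seq(ColumnDef("word", PartitionKeyColumn, TextType)),   // partition key
        Seq.empty,                                              // clustering columns
        Seq(ColumnDef("count", RegularColumn, IntType)))        // regular columns

      // Creates the table from the TableDef, then saves the tuples like saveToCassandra would.
      sc.parallelize(Seq(("cat", 30), ("fox", 40)))
        .saveAsCassandraTableEx(table, SomeColumns("word", "count"))
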
  173. def saveAsHadoopDataset(conf: JobConf): Unit

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  174. def saveAsHadoopFile(path: String, keyClass: Class[_], valueClass: Class[_], outputFormatClass: Class[_ <: OutputFormat[_, _]], conf: JobConf, codec: Option[Class[_ <: CompressionCodec]]): Unit

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  175. def saveAsHadoopFile(path: String, keyClass: Class[_], valueClass: Class[_], outputFormatClass: Class[_ <: OutputFormat[_, _]], codec: Class[_ <: CompressionCodec]): Unit

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  176. def saveAsHadoopFile[F <: OutputFormat[K, V]](path: String, codec: Class[_ <: CompressionCodec])(implicit fm: ClassTag[F]): Unit

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  177. def saveAsHadoopFile[F <: OutputFormat[K, V]](path: String)(implicit fm: ClassTag[F]): Unit

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  178. def saveAsNewAPIHadoopDataset(conf: Configuration): Unit

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  179. def saveAsNewAPIHadoopFile(path: String, keyClass: Class[_], valueClass: Class[_], outputFormatClass: Class[_ <: OutputFormat[_, _]], conf: Configuration): Unit

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  180. def saveAsNewAPIHadoopFile[F <: OutputFormat[K, V]](path: String)(implicit fm: ClassTag[F]): Unit

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  181. def saveAsObjectFile(path: String): Unit

    Definition Classes
    RDD
  182. def saveAsSequenceFile(path: String, codec: Option[Class[_ <: CompressionCodec]]): Unit

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to SequenceFileRDDFunctions[K, V] performed by method rddToSequenceFileRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type WritableFactory[K] is in scope
    4. an implicit value of type WritableFactory[V] is in scope
    5. T is (K, V) (T =:= (K, V))
    Definition Classes
    SequenceFileRDDFunctions
  183. def saveAsTextFile(path: String, codec: Class[_ <: CompressionCodec]): Unit

    Definition Classes
    RDD
  184. def saveAsTextFile(path: String): Unit

    Definition Classes
    RDD
  185. def saveToCassandra(keyspaceName: String, tableName: String, columns: ColumnSelector = AllColumns, writeConf: WriteConf = ...)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), rwf: RowWriterFactory[T]): Unit

    Saves the data from RDD to a Cassandra table.

    Saves the data from RDD to a Cassandra table. Uses the specified column names.

    keyspaceName

    the name of the keyspace to use

    tableName

    the name of the table to use

    writeConf

    additional configuration object that allows setting the consistency level, batch size, etc.

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to RDDFunctions[T] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions → WritableToCassandra
    See also

    com.datastax.spark.connector.writer.WritableToCassandra
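
    For example, a minimal sketch of saveToCassandra, assuming a pre-existing table test.words(word text PRIMARY KEY, count int):

      import com.datastax.spark.connector._

      // Write tuples to the existing table, mapping the tuple elements to the listed columns.
      sc.parallelize(Seq(("cat", 30), ("fox", 40)))
        .saveToCassandra("test", "words", SomeColumns("word", "count"))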

  186. def setName(_name: String): CassandraPartitionedRDD.this.type

    Definition Classes
    RDD
  187. def sortBy[K](f: (T) ⇒ K, ascending: Boolean, numPartitions: Int)(implicit ord: Ordering[K], ctag: ClassTag[K]): RDD[T]

    Definition Classes
    RDD
  188. def sortByKey(ascending: Boolean, numPartitions: Int): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to OrderedRDDFunctions[K, V, (K, V)] performed by method rddToOrderedRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type Ordering[K] is in scope
    2. an implicit value of type ClassTag[K] is in scope
    3. an implicit value of type ClassTag[V] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    OrderedRDDFunctions
  189. def spanBy[U](f: (T) ⇒ U): RDD[(U, Iterable[T])]

    Applies a function to each item, and groups consecutive items having the same value together.

    Applies a function to each item, and groups consecutive items having the same value together. Unlike groupBy, items from the same group must already be next to each other in the original collection. It works locally on each partition, so items from different partitions will never be placed in the same group.

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to RDDFunctions[T] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
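
    For example, a minimal sketch of spanBy, assuming a hypothetical table test.events whose primary key ((user_id), event_time) keeps rows of the same user_id adjacent within each Spark partition:

      import com.datastax.spark.connector._

      // Rows with equal user_id are read consecutively, so they can be grouped without a shuffle.
      val events = sc.cassandraTable("test", "events")
      val byUser = events.spanBy(row => row.getInt("user_id"))
      // byUser: RDD[(Int, Iterable[CassandraRow])]
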
  190. def spanByKey: RDD[(K, Seq[V])]

    Groups items with the same key, assuming the items with the same key are next to each other in the collection.

    Groups items with the same key, assuming the items with the same key are next to each other in the collection. It does not perform a shuffle, so it is much faster than the more general Spark RDD groupByKey. For this method to be useful with Cassandra tables, the key must represent a prefix of the primary key, containing at least the partition key of the Cassandra table.

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to PairRDDFunctions[K, V] performed by method toPairRDDFunctions in com.datastax.spark.connector. This conversion will take place only if T is (K, V) (T =:= (K, V)).
    Definition Classes
    PairRDDFunctions
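
    For example, a minimal sketch of spanByKey on the same hypothetical test.events table; the key (user_id) is a prefix of the primary key, so equal keys are adjacent:

      import com.datastax.spark.connector._

      // Key each row by the partition key, then group consecutive values without a shuffle.
      val byUser = sc.cassandraTable("test", "events")
        .map(row => (row.getInt("user_id"), row.getString("event_type")))
        .spanByKey
      // byUser: RDD[(Int, Seq[String])]
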
  191. def sparkContext: SparkContext

    Definition Classes
    RDD
  192. def subtract(other: RDD[T], p: Partitioner)(implicit ord: Ordering[T]): RDD[T]

    Definition Classes
    RDD
  193. def subtract(other: RDD[T], numPartitions: Int): RDD[T]

    Definition Classes
    RDD
  194. def subtract(other: RDD[T]): RDD[T]

    Definition Classes
    RDD
  195. def subtractByKey[W](other: RDD[(K, W)], p: Partitioner)(implicit arg0: ClassTag[W]): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  196. def subtractByKey[W](other: RDD[(K, W)], numPartitions: Int)(implicit arg0: ClassTag[W]): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  197. def subtractByKey[W](other: RDD[(K, W)])(implicit arg0: ClassTag[W]): RDD[(K, V)]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  198. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  199. def take(num: Int): Array[T]

    Definition Classes
    RDD
  200. def takeAsync(num: Int): FutureAction[Seq[T]]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to AsyncRDDActions[T] performed by method rddToAsyncRDDActions in org.apache.spark.rdd.RDD. This conversion will take place only if T is accompanied by a ClassTag, which is a runtime representation of its type that survives erasure (T: ClassTag).
    Definition Classes
    AsyncRDDActions
  201. def takeOrdered(num: Int)(implicit ord: Ordering[T]): Array[T]

    Definition Classes
    RDD
  202. def takeSample(withReplacement: Boolean, num: Int, seed: Long): Array[T]

    Definition Classes
    RDD
  203. def toDebugString: String

    Definition Classes
    RDD
  204. def toJavaRDD(): JavaRDD[T]

    Definition Classes
    RDD
  205. def toLocalIterator: Iterator[T]

    Definition Classes
    RDD
  206. def toString(): String

    Definition Classes
    RDD → AnyRef → Any
  207. def top(num: Int)(implicit ord: Ordering[T]): Array[T]

    Definition Classes
    RDD
  208. def treeAggregate[U](zeroValue: U)(seqOp: (U, T) ⇒ U, combOp: (U, U) ⇒ U, depth: Int)(implicit arg0: ClassTag[U]): U

    Definition Classes
    RDD
  209. def treeReduce(f: (T, T) ⇒ T, depth: Int): T

    Definition Classes
    RDD
  210. def union(other: RDD[T]): RDD[T]

    Definition Classes
    RDD
  211. def unpersist(blocking: Boolean): CassandraPartitionedRDD.this.type

    Definition Classes
    RDD
  212. def values: RDD[V]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. T is (K, V) (T =:= (K, V))
    Definition Classes
    PairRDDFunctions
  213. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  214. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  215. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  216. def zip[U](other: RDD[U])(implicit arg0: ClassTag[U]): RDD[(T, U)]

    Definition Classes
    RDD
  217. def zipPartitions[B, C, D, V](rdd2: RDD[B], rdd3: RDD[C], rdd4: RDD[D])(f: (Iterator[T], Iterator[B], Iterator[C], Iterator[D]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[D], arg3: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  218. def zipPartitions[B, C, D, V](rdd2: RDD[B], rdd3: RDD[C], rdd4: RDD[D], preservesPartitioning: Boolean)(f: (Iterator[T], Iterator[B], Iterator[C], Iterator[D]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[D], arg3: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  219. def zipPartitions[B, C, V](rdd2: RDD[B], rdd3: RDD[C])(f: (Iterator[T], Iterator[B], Iterator[C]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  220. def zipPartitions[B, C, V](rdd2: RDD[B], rdd3: RDD[C], preservesPartitioning: Boolean)(f: (Iterator[T], Iterator[B], Iterator[C]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  221. def zipPartitions[B, V](rdd2: RDD[B])(f: (Iterator[T], Iterator[B]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  222. def zipPartitions[B, V](rdd2: RDD[B], preservesPartitioning: Boolean)(f: (Iterator[T], Iterator[B]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[V]): RDD[V]

    Definition Classes
    RDD
  223. def zipWithIndex(): RDD[(T, Long)]

    Definition Classes
    RDD
  224. def zipWithUniqueId(): RDD[(T, Long)]

    Definition Classes
    RDD
  225. def →[B](y: B): (CassandraPartitionedRDD[T], B)

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to ArrowAssoc[CassandraPartitionedRDD[T]] performed by method any2ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc

Shadowed Implicit Value Members

  1. def histogram(buckets: Array[Double], evenBuckets: Boolean): Array[Long]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is a numeric class, such as Int, Long, Float or Double (T: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).histogram(buckets, evenBuckets)
    Definition Classes
    DoubleRDDFunctions
  2. def histogram(bucketCount: Int): (Array[Double], Array[Long])

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is a numeric class, such as Int, Long, Float or Double (T: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).histogram(bucketCount)
    Definition Classes
    DoubleRDDFunctions
  3. def histogram(buckets: Array[Double], evenBuckets: Boolean): Array[Long]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is Double (T =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).histogram(buckets, evenBuckets)
    Definition Classes
    DoubleRDDFunctions
  4. def histogram(bucketCount: Int): (Array[Double], Array[Long])

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is Double (T =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).histogram(bucketCount)
    Definition Classes
    DoubleRDDFunctions
  5. def mean(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is a numeric class, such as Int, Long, Float or Double (T: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).mean()
    Definition Classes
    DoubleRDDFunctions
  6. def mean(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is Double (T =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).mean()
    Definition Classes
    DoubleRDDFunctions
  7. def meanApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is a numeric class, such as Int, Long, Float or Double (T: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).meanApprox(timeout, confidence)
    Definition Classes
    DoubleRDDFunctions
  8. def meanApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is Double (T =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).meanApprox(timeout, confidence)
    Definition Classes
    DoubleRDDFunctions
  9. def sampleStdev(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is a numeric class, such as Int, Long, Float or Double (T: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).sampleStdev()
    Definition Classes
    DoubleRDDFunctions
  10. def sampleStdev(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is Double (T =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).sampleStdev()
    Definition Classes
    DoubleRDDFunctions
  11. def sampleVariance(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is a numeric class, such as Int, Long, Float or Double (T: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).sampleVariance()
    Definition Classes
    DoubleRDDFunctions
  12. def sampleVariance(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is Double (T =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).sampleVariance()
    Definition Classes
    DoubleRDDFunctions
  13. val self: Any

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to StringAdd performed by method any2stringadd in scala.Predef.
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: StringAdd).self
    Definition Classes
    StringAdd
  14. val self: Any

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to StringFormat performed by method any2stringfmt in scala.Predef.
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: StringFormat).self
    Definition Classes
    StringFormat
  15. val sparkContext: SparkContext

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to RDDFunctions[T] performed by method toRDDFunctions in com.datastax.spark.connector.
    Shadowing
    This implicitly inherited member is shadowed by one or more members in this class.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: RDDFunctions[T]).sparkContext
    Definition Classes
    RDDFunctions → WritableToCassandra
  16. def stats(): StatCounter

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is a numeric class, such as Int, Long, Float or Double (T: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).stats()
    Definition Classes
    DoubleRDDFunctions
  17. def stats(): StatCounter

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is Double (T =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).stats()
    Definition Classes
    DoubleRDDFunctions
  18. def stdev(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is a numeric class, such as Int, Long, Float or Double (T: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).stdev()
    Definition Classes
    DoubleRDDFunctions
  19. def stdev(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is Double (T =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).stdev()
    Definition Classes
    DoubleRDDFunctions
  20. def sum(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is a numeric class, such as Int, Long, Float or Double (T: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).sum()
    Definition Classes
    DoubleRDDFunctions
  21. def sum(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is Double (T =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).sum()
    Definition Classes
    DoubleRDDFunctions
  22. def sumApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is a numeric class, such as Int, Long, Float or Double (T: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).sumApprox(timeout, confidence)
    Definition Classes
    DoubleRDDFunctions
  23. def sumApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is Double (T =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).sumApprox(timeout, confidence)
    Definition Classes
    DoubleRDDFunctions
  24. def variance(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is a numeric class, such as Int, Long, Float or Double (T: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).variance()
    Definition Classes
    DoubleRDDFunctions
  25. def variance(): Double

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if T is Double (T =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: DoubleRDDFunctions).variance()
    Definition Classes
    DoubleRDDFunctions

Deprecated Value Members

  1. def x: CassandraPartitionedRDD[T]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to ArrowAssoc[CassandraPartitionedRDD[T]] performed by method any2ArrowAssoc in scala.Predef.
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: ArrowAssoc[CassandraPartitionedRDD[T]]).x
    Definition Classes
    ArrowAssoc
    Annotations
    @deprecated
    Deprecated

    (Since version 2.10.0) Use leftOfArrow instead

  2. def x: CassandraPartitionedRDD[T]

    Implicit information
    This member is added by an implicit conversion from CassandraPartitionedRDD[T] to Ensuring[CassandraPartitionedRDD[T]] performed by method any2Ensuring in scala.Predef. This conversion will take place only if T is a superclass of Any and a subclass of (Nothing, Nothing) with Double (T >: Any <: (Nothing, Nothing) with Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraPartitionedRDD: Ensuring[CassandraPartitionedRDD[T]]).x
    Definition Classes
    Ensuring
    Annotations
    @deprecated
    Deprecated

    (Since version 2.10.0) Use resultOfEnsuring instead

Inherited from RDD[T]

Inherited from Logging

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any

Inherited by implicit conversion rddToPairRDDFunctions from CassandraPartitionedRDD[T] to org.apache.spark.rdd.PairRDDFunctions[K, V]

Inherited by implicit conversion numericRDDToDoubleRDDFunctions from CassandraPartitionedRDD[T] to DoubleRDDFunctions

Inherited by implicit conversion doubleRDDToDoubleRDDFunctions from CassandraPartitionedRDD[T] to DoubleRDDFunctions

Inherited by implicit conversion rddToOrderedRDDFunctions from CassandraPartitionedRDD[T] to OrderedRDDFunctions[K, V, (K, V)]

Inherited by implicit conversion rddToSequenceFileRDDFunctions from CassandraPartitionedRDD[T] to SequenceFileRDDFunctions[K, V]

Inherited by implicit conversion rddToAsyncRDDActions from CassandraPartitionedRDD[T] to AsyncRDDActions[T]

Inherited by implicit conversion toPairRDDFunctions from CassandraPartitionedRDD[T] to PairRDDFunctions[K, V]

Inherited by implicit conversion toRDDFunctions from CassandraPartitionedRDD[T] to RDDFunctions[T]

Inherited by implicit conversion any2stringadd from CassandraPartitionedRDD[T] to StringAdd

Inherited by implicit conversion any2stringfmt from CassandraPartitionedRDD[T] to StringFormat

Inherited by implicit conversion any2ArrowAssoc from CassandraPartitionedRDD[T] to ArrowAssoc[CassandraPartitionedRDD[T]]

Inherited by implicit conversion any2Ensuring from CassandraPartitionedRDD[T] to Ensuring[CassandraPartitionedRDD[T]]

Ungrouped