com.basho.riak.spark.rdd

ReadConf

object ReadConf extends Serializable

Linear Supertypes
Serializable, Serializable, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final val DefaultFetchSize: Int(1000)
  5. final val DefaultSplitCount: Int(10)
  6. final val DefaultTsTimestampBinding: UseTimestamp.type
  7. final val DefaultUseStreamingValues4FBRead: Boolean
  8. def apply(conf: SparkConf, options: Map[String, String]): ReadConf

    Creates a ReadConf based on an externally provided map of properties, overriding those of SparkConf.

    conf

    SparkConf of the Spark context, taken as defaults

    options

    externally provided map of properties

  9. def apply(conf: SparkConf): ReadConf

    Creates a ReadConf based on properties provided to SparkConf.

    conf

    SparkConf of the Spark context with Riak-related properties

  10. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  11. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  12. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  14. final val fetchSizePropName: String("spark.riak.input.fetch-size")
  15. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  16. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  17. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  18. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  19. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  20. final def notify(): Unit

    Definition Classes
    AnyRef
  21. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  22. final val smartSplitMultiplier: Int(3)

    The Spark docs recommend setting the number of partitions to 3 or 4 times the number of CPUs in the cluster so that work is distributed more evenly among them. With only one partition per core, the job has to wait for the single longest-running task to complete; breaking the work down further lets fast and slow tasks even out across the cluster.

    Since there is not enough information about available Spark resources, such as the real number of cores, a 3x multiplier is used.

  23. final val splitCountPropName: String("spark.riak.input.split.count")
  24. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  25. def toString(): String

    Definition Classes
    AnyRef → Any
  26. final val tsBindingsTimestamp: String("spark.riakts.bindings.timestamp")
  27. final val tsQuantumPropName: String("spark.riak.partitioning.ts-quantum")
  28. final val tsRangeFieldPropName: String("spark.riak.partitioning.ts-range-field-name")
  29. final val useStreamingValuesPropName: String("spark.riak.fullbucket.use-streaming-values")
  30. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  32. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
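The property-name vals and the two apply overloads above can be combined: read settings are placed on the Spark context's SparkConf and then materialized as a ReadConf, with an optional map overriding individual properties. A minimal sketch, assuming the Spark and Riak connector dependencies are on the classpath; the application name and property values shown are illustrative:

```scala
import org.apache.spark.SparkConf
import com.basho.riak.spark.rdd.ReadConf

// Riak-related read settings on the SparkConf, keyed by the
// property names this object exposes (fetchSizePropName, etc.)
val conf = new SparkConf()
  .setAppName("riak-read-example")
  .set("spark.riak.input.fetch-size", "1000")
  .set("spark.riak.input.split.count", "10")
  .set("spark.riak.fullbucket.use-streaming-values", "true")

// ReadConf built from SparkConf alone
val readConf = ReadConf(conf)

// Or override selected properties with an externally provided map;
// map entries take precedence over the SparkConf defaults
val overridden = ReadConf(conf, Map("spark.riak.input.split.count" -> "30"))
```

Unset properties fall back to the defaults above (DefaultFetchSize, DefaultSplitCount, and so on).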
