Wrapper around a SparkContext that is also a Hadoop Configuration object (and Serializable).
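A minimal sketch of how such a wrapper can work, assuming the standard Spark and Hadoop APIs; the class name `Context` is illustrative, not necessarily the library's. A SparkContext itself is not serializable, so it is marked `@transient`, and only the Hadoop-Configuration key/values travel across the wire, using Configuration's own Writable encoding:

```scala
import java.io.{ ObjectInputStream, ObjectOutputStream }
import org.apache.hadoop.conf.Configuration
import org.apache.spark.SparkContext

// Hypothetical wrapper: usable anywhere a Hadoop Configuration is expected,
// and safe to capture in Spark closures because it serializes only the
// Configuration contents, not the (transient) SparkContext.
class Context(@transient val sc: SparkContext)
    extends Configuration(sc.hadoopConfiguration)
       with Serializable {

  private def writeObject(out: ObjectOutputStream): Unit = {
    out.defaultWriteObject()
    write(out)        // Configuration implements Writable; DataOutput-compatible
  }

  private def readObject(in: ObjectInputStream): Unit = {
    in.defaultReadObject()
    readFields(in)    // repopulate this Configuration from the stream
  }
}
```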
Minimal trait for objects that can serve as their own org.apache.spark.serializer.KryoRegistrators, making it easy to encapsulate Kryo-registration information alongside the org.hammerlab.kryo.Registrar.register syntax.
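For context, a sketch using only plain Spark's KryoRegistrator API (not the hammerlab `Registrar.register` syntax, whose details aren't shown here); `Sample` and `SampleRegistrator` are hypothetical names:

```scala
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.SparkConf
import org.apache.spark.serializer.KryoRegistrator

// Hypothetical domain class that needs Kryo registration.
case class Sample(id: Int, name: String)

// Plain Spark requires a zero-arg class; the trait described above lets an
// object fill this role directly, keeping registration next to the type.
class SampleRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(classOf[Sample])
    kryo.register(classOf[Array[Sample]])
  }
}

// Wiring it into a SparkConf via the standard configuration keys:
val conf =
  new SparkConf()
    .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .set("spark.kryo.registrator", classOf[SampleRegistrator].getName)
```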
Interface for applications to register fall-back default Spark-configuration values, using the SparkConfBase.sparkConf method below.
Configs are added to an org.apache.spark.SparkConf after it has been instantiated and other defaults have been applied to it, and are only written to keys that don't already have a value.
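The "only write keys with no value" behavior matches SparkConf's standard `setIfMissing` hook; a minimal sketch (the helper name and keys are illustrative):

```scala
import org.apache.spark.SparkConf

// Hypothetical helper: apply fall-back defaults without clobbering anything
// the user (or spark-submit) has already set.
def withDefaults(conf: SparkConf): SparkConf = {
  conf.setIfMissing("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  conf.setIfMissing("spark.kryo.referenceTracking", "false")
  conf
}

val conf = new SparkConf(loadDefaults = false).set("spark.serializer", "x.y.Custom")
withDefaults(conf)
// "spark.serializer" keeps its explicit value; only the missing key is defaulted.
```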
Convenience method for loading a SparkConf with initial values taken from a comma-delimited list of files given in the SPARK_PROPERTIES_FILES environment variable (in addition to system properties, as usual).
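A sketch of the file-loading half of this, using only the Scala/Java standard libraries; the function name `propertiesFromEnv` is hypothetical, and the resulting pairs could be applied to a SparkConf with `setAll`. The environment is passed as a parameter so the logic is testable:

```scala
import java.io.FileInputStream
import java.util.Properties
import scala.collection.JavaConverters._

// Read key/value pairs from each .properties file named in the comma-delimited
// SPARK_PROPERTIES_FILES environment variable; missing variable ⇒ no pairs.
def propertiesFromEnv(env: Map[String, String] = sys.env): Seq[(String, String)] =
  env
    .get("SPARK_PROPERTIES_FILES")
    .toSeq
    .flatMap(_.split(","))
    .filter(_.nonEmpty)
    .flatMap { path =>
      val props = new Properties()
      val in = new FileInputStream(path)
      try props.load(in) finally in.close()
      props.asScala.toSeq
    }
```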
Generate a Spark org.apache.spark.Partitioner that maps each element to the partition given by an Int: either the key itself, or the first element of a tuple key.
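A minimal sketch of such a partitioner against the standard `org.apache.spark.Partitioner` API; the class name `KeyPartitioner` is illustrative:

```scala
import org.apache.spark.Partitioner

// Hypothetical partitioner: the target partition is read directly from the
// key, whether the key is the Int itself or a tuple whose first element is
// the Int.
case class KeyPartitioner(numPartitions: Int) extends Partitioner {
  override def getPartition(key: Any): Int =
    key match {
      case i: Int      => i
      case (i: Int, _) => i
      case other       => throw new IllegalArgumentException(s"Unexpected key: $other")
    }
}
```

Typically used via `rdd.partitionBy(KeyPartitioner(n))` on a pair-RDD whose keys carry the desired partition index.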
Convenience methods for creating Spark Partitioners from (partial) functions.
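One way such a convenience method could look, assuming the standard Partitioner API; `partitionerBy` and its hash-based fallback for keys outside the partial function's domain are assumptions, not the library's actual signatures:

```scala
import org.apache.spark.Partitioner

// Hypothetical factory: build a Partitioner from a partial function, falling
// back to non-negative hash-partitioning for keys the function doesn't cover.
def partitionerBy(n: Int)(pf: PartialFunction[Any, Int]): Partitioner =
  new Partitioner {
    override def numPartitions: Int = n
    override def getPartition(key: Any): Int =
      pf.applyOrElse(key, (k: Any) => ((k.hashCode % n) + n) % n)
  }

// Route all keys starting with "error" to partition 0; hash everything else:
val p = partitionerBy(4) { case s: String if s.startsWith("error") => 0 }
```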