Trait/Object

io.univalence.sparktest

SparkTest

Related Docs: object SparkTest | package sparktest

trait SparkTest extends SparkTestSQLImplicits with ReadOps

Linear Supertypes

ReadOps, HasSparkSession, SparkTestSQLImplicits, AnyRef, Any

Type Members

  1. case class SchemaError(modifications: Seq[SchemaModification]) extends Exception with SparkTestError with Product with Serializable
  2. implicit class SparkTestDfOps extends AnyRef
  3. implicit class SparkTestDsOps[T] extends AnyRef
  4. sealed trait SparkTestError extends Exception
  5. implicit class SparkTestRDDOps[T] extends AnyRef
  6. implicit class StringToColumn extends AnyRef

    Converts $"col name" into a Column.

    Definition Classes
    SparkTestSQLImplicits
    Since
    2.0.0

  7. case class ValueError(modifications: Seq[Seq[ObjectModification]], thisDf: DataFrame, otherDf: DataFrame) extends Exception with SparkTestError with Product with Serializable
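A minimal sketch of how the StringToColumn interpolator above is typically used; the spec class and column names here are hypothetical, and the SparkSession is assumed to be provided by the trait:

```scala
// Hypothetical spec mixing in SparkTest; `$"…"` comes from the
// inherited StringToColumn implicit, as in Spark's own SQLImplicits.
class MySpec extends io.univalence.sparktest.SparkTest {
  val df = dataframe("{a:1, b:true}", "{a:2, b:false}")

  // Interpolated column names also handle names containing spaces:
  val onlyA = df.select($"a")
}
```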

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. def _sqlContext: SQLContext

    Attributes
    protected
    Definition Classes
    SparkTest → SparkTestSQLImplicits
  5. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  6. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  7. def dataframe(json: String*): DataFrame

    Creates a DataFrame from JSON strings.

    Example:

      val df = dataframe("{a:1, b:true}", "{a:2, b:false}")
      // +---+-------+
      // |  a|      b|
      // +---+-------+
      // |  1|   true|
      // |  2|  false|
      // +---+-------+

    json
    JSON arguments; each argument represents one row of the DataFrame

    returns
    a DataFrame

    Definition Classes
    ReadOps
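A hedged sketch of how dataframe pairs with the comparison helpers this trait provides; assertEquals is assumed here to be supplied by the implicit SparkTestDfOps class, with the exact signature left unspecified:

```scala
// Hypothetical spec mixing in SparkTest.
class RoundTripSpec extends io.univalence.sparktest.SparkTest {
  // Each JSON argument becomes one row of the resulting DataFrame.
  val expected = dataframe("{a:1, b:true}", "{a:2, b:false}")
  val actual   = dataframe("{a:1, b:true}", "{a:2, b:false}")

  // Raises a SparkTestError subtype if schemas or values differ.
  actual.assertEquals(expected)
}
```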
  8. def dataset[T](value: T*)(implicit arg0: Encoder[T], arg1: ClassTag[T]): Dataset[T]

    Definition Classes
    ReadOps
  9. def dfFromJsonFile(path: String): DataFrame

    Definition Classes
    ReadOps
  10. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  11. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  12. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  14. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  15. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  16. def loadJson(filenames: String*): DataFrame

    Definition Classes
    ReadOps
  17. implicit def localSeqToDatasetHolder[T](s: Seq[T])(implicit arg0: Encoder[T]): DatasetHolder[T]

    Creates a Dataset from a local Seq.

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.0
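A minimal sketch of the Seq conversion above, assuming the trait's SparkSession has been started; toDS and toDF are the methods DatasetHolder exposes in Spark's own SQLImplicits:

```scala
// Hypothetical spec mixing in SparkTest.
class SeqSpec extends io.univalence.sparktest.SparkTest {
  // localSeqToDatasetHolder applies implicitly to both calls below:
  val ds = Seq(1, 2, 3).toDS()
  val df = Seq(("a", 1), ("b", 2)).toDF("letter", "count")
}
```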

  18. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  19. implicit def newBooleanArrayEncoder: Encoder[Array[Boolean]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  20. implicit def newBooleanEncoder: Encoder[Boolean]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.0
  21. implicit def newBooleanSeqEncoder: Encoder[Seq[Boolean]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  22. implicit def newBoxedBooleanEncoder: Encoder[Boolean]

    Definition Classes
    SparkTestSQLImplicits
    Since
    2.0.0
  23. implicit def newBoxedByteEncoder: Encoder[Byte]

    Definition Classes
    SparkTestSQLImplicits
    Since
    2.0.0
  24. implicit def newBoxedDoubleEncoder: Encoder[Double]

    Definition Classes
    SparkTestSQLImplicits
    Since
    2.0.0
  25. implicit def newBoxedFloatEncoder: Encoder[Float]

    Definition Classes
    SparkTestSQLImplicits
    Since
    2.0.0
  26. implicit def newBoxedIntEncoder: Encoder[Integer]

    Definition Classes
    SparkTestSQLImplicits
    Since
    2.0.0
  27. implicit def newBoxedLongEncoder: Encoder[Long]

    Definition Classes
    SparkTestSQLImplicits
    Since
    2.0.0
  28. implicit def newBoxedShortEncoder: Encoder[Short]

    Definition Classes
    SparkTestSQLImplicits
    Since
    2.0.0
  29. implicit def newByteArrayEncoder: Encoder[Array[Byte]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  30. implicit def newByteEncoder: Encoder[Byte]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.0
  31. implicit def newByteSeqEncoder: Encoder[Seq[Byte]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  32. implicit def newDoubleArrayEncoder: Encoder[Array[Double]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  33. implicit def newDoubleEncoder: Encoder[Double]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.0
  34. implicit def newDoubleSeqEncoder: Encoder[Seq[Double]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  35. implicit def newFloatArrayEncoder: Encoder[Array[Float]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  36. implicit def newFloatEncoder: Encoder[Float]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.0
  37. implicit def newFloatSeqEncoder: Encoder[Seq[Float]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  38. implicit def newIntArrayEncoder: Encoder[Array[Int]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  39. implicit def newIntEncoder: Encoder[Int]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.0
  40. implicit def newIntSeqEncoder: Encoder[Seq[Int]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  41. implicit def newLongArrayEncoder: Encoder[Array[Long]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  42. implicit def newLongEncoder: Encoder[Long]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.0
  43. implicit def newLongSeqEncoder: Encoder[Seq[Long]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  44. implicit def newProductArrayEncoder[A <: Product](implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): Encoder[Array[A]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  45. implicit def newProductEncoder[T <: Product](implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Encoder[T]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.0
  46. implicit def newProductSeqEncoder[A <: Product](implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): Encoder[Seq[A]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  47. implicit def newShortArrayEncoder: Encoder[Array[Short]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  48. implicit def newShortEncoder: Encoder[Short]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.0
  49. implicit def newShortSeqEncoder: Encoder[Seq[Short]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  50. implicit def newStringArrayEncoder: Encoder[Array[String]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  51. implicit def newStringEncoder: Encoder[String]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.0
  52. implicit def newStringSeqEncoder: Encoder[Seq[String]]

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.1
  53. final def notify(): Unit

    Definition Classes
    AnyRef
  54. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  55. implicit def rddToDatasetHolder[T](rdd: RDD[T])(implicit arg0: Encoder[T]): DatasetHolder[T]

    Creates a Dataset from an RDD.

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.6.0
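Similarly, a sketch of the RDD conversion above; the RDD is built from the trait's lazily initialized SparkSession ss, and the spec class is hypothetical:

```scala
// Hypothetical spec mixing in SparkTest.
class RddSpec extends io.univalence.sparktest.SparkTest {
  val rdd = ss.sparkContext.parallelize(Seq(1, 2, 3))

  // rddToDatasetHolder applies because an Encoder[Int] is in scope:
  val ds = rdd.toDS()
}
```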

  56. lazy val ss: SparkSession

    Definition Classes
    SparkTest → HasSparkSession
  57. implicit def symbolToColumn(s: Symbol): ColumnName

    An implicit conversion that turns a Scala Symbol into a Column.

    Definition Classes
    SparkTestSQLImplicits
    Since
    1.3.0
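A sketch of the Symbol conversion, mirroring Spark's own implicit; the column name is hypothetical:

```scala
// Hypothetical spec mixing in SparkTest.
class SymbolSpec extends io.univalence.sparktest.SparkTest {
  val df = dataframe("{a:1}", "{a:2}")

  // symbolToColumn turns 'a into a ColumnName, so comparison
  // operators such as > are available directly:
  val filtered = df.filter('a > 1)
}
```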

  58. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  59. def toString(): String

    Definition Classes
    AnyRef → Any
  60. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  61. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  62. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  63. def withConfiguration(failOnMissingOriginalCol: Boolean = ..., failOnChangedDataTypeExpectedCol: Boolean = ..., failOnMissingExpectedCol: Boolean = ..., failOnNullable: Boolean = configuration.failOnNullable, maxRowError: Int = configuration.maxRowError)(body: ⇒ Unit): Unit

    Wraps a Spark-Test function with a configuration that customizes its behaviour. Each function that can be modified through the configuration carries the tag @configuration.

    Example:

      withConfiguration(failOnMissingExpectedCol = false, failOnMissingOriginalCol = false)({
        df1.assertEquals(df2)
      })
      // result: extra columns in df1 or df2 are ignored

    failOnMissingOriginalCol
    if true, throw an exception when a column appears in the original DataFrame but not in the expected one

    failOnChangedDataTypeExpectedCol
    if true, throw an exception when the two columns do not have the same DataType

    failOnMissingExpectedCol
    if true, throw an exception when a column appears in the expected DataFrame but not in the original one

    maxRowError
    if > 0, print at most maxRowError rows during error handling; otherwise print every row's errors
