com.holdenkarau.spark.testing

StructuredStreamingBase

trait StructuredStreamingBase extends DataFrameSuiteBase with StructuredStreamingBaseLike

Early Experimental Structured Streaming Base.
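Example usage (a minimal sketch, not taken from the library's documentation): the suite below mixes StructuredStreamingBase into a ScalaTest suite and uses testSimpleStreamEndState (listed under Concrete Value Members) to run a query over two micro-batches of input and assert on the end state. The suite name, the sample data, and the "append" output-mode string are illustrative assumptions, and the FunSuite import may differ with your ScalaTest version.

  import com.holdenkarau.spark.testing.StructuredStreamingBase
  import org.apache.spark.sql.Dataset
  import org.scalatest.FunSuite

  // Hypothetical suite: the self type requires a ScalaTest Suite, which FunSuite satisfies.
  class MyStreamingSuite extends FunSuite with StructuredStreamingBase {

    test("doubling each element of a stream") {
      import spark.implicits._ // supplies the Encoder[Int] instances required implicitly

      // Two micro-batches of input and the expected end state once the query has run.
      val input    = Seq(Seq(1, 2), Seq(3))
      val expected = Seq(2, 4, 6)

      // "append" is assumed here to be an accepted output-mode string.
      testSimpleStreamEndState(spark, input, expected, "append",
        (ds: Dataset[Int]) => ds.map(_ * 2))
    }
  }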

Self Type
StructuredStreamingBase with Suite
Linear Supertypes
StructuredStreamingBaseLike, DataFrameSuiteBase, DataFrameSuiteBaseLike, Serializable, Serializable, SharedSparkContext, SparkContextProvider, BeforeAndAfterAll, SuiteMixin, TestSuite, TestSuiteLike, AnyRef, Any

Abstract Value Members

  1. abstract def expectedTestCount(filter: Filter): Int

    Definition Classes
    SuiteMixin
  2. abstract def nestedSuites: IndexedSeq[Suite]

    Definition Classes
    SuiteMixin
  3. abstract def rerunner: Option[String]

    Definition Classes
    SuiteMixin
  4. abstract def runNestedSuites(args: Args): Status

    Attributes
    protected
    Definition Classes
    SuiteMixin
  5. abstract def runTest(testName: String, args: Args): Status

    Attributes
    protected
    Definition Classes
    SuiteMixin
  6. abstract def runTests(testName: Option[String], args: Args): Status

    Attributes
    protected
    Definition Classes
    SuiteMixin
  7. abstract val styleName: String

    Definition Classes
    SuiteMixin
  8. abstract def suiteId: String

    Definition Classes
    SuiteMixin
  9. abstract def suiteName: String

    Definition Classes
    SuiteMixin
  10. abstract def tags: Map[String, Set[String]]

    Definition Classes
    SuiteMixin
  11. abstract def testDataFor(testName: String, theConfigMap: ConfigMap): TestData

    Definition Classes
    SuiteMixin
  12. abstract def testNames: Set[String]

    Definition Classes
    SuiteMixin

Concrete Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def afterAll(): Unit

    Definition Classes
    DataFrameSuiteBase → SharedSparkContext → BeforeAndAfterAll
  7. def appID: String

    Definition Classes
    SparkContextProvider
  8. def approxEquals(r1: Row, r2: Row, tol: Double): Boolean

    Definition Classes
    DataFrameSuiteBaseLike
  9. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  10. def assert[U](message: String, expected: U, actual: U)(implicit CT: ClassTag[U]): Unit

    Definition Classes
    TestSuite → TestSuiteLike
  11. def assert[U](expected: U, actual: U)(implicit CT: ClassTag[U]): Unit

    Definition Classes
    TestSuite → TestSuiteLike
  12. def assertDataFrameApproximateEquals(expected: DataFrame, result: DataFrame, tol: Double): Unit

    Compares two DataFrames for equality, first checking that the schemas are the same. Inexact (floating point) fields are compared using tol. See the usage sketch after this member list.

    tol

    max acceptable tolerance; should be less than 1.

    Definition Classes
    DataFrameSuiteBaseLike
  13. def assertDataFrameEquals(expected: DataFrame, result: DataFrame): Unit

    Compares two DataFrames for equality: checks that the schemas match and, if so, that the rows are equal. See the usage sketch after this member list.

    Definition Classes
    DataFrameSuiteBaseLike
  14. def assertEmpty[U](arr: Array[U])(implicit CT: ClassTag[U]): Unit

    Definition Classes
    TestSuite → TestSuiteLike
  15. def assertTrue(expected: Boolean): Unit

    Definition Classes
    TestSuite → TestSuiteLike
  16. def beforeAll(): Unit

    Definition Classes
    DataFrameSuiteBase → SharedSparkContext → BeforeAndAfterAll
  17. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  18. def conf: SparkConf

    Definition Classes
    SparkContextProvider
  19. var count: Int

    Definition Classes
    StructuredStreamingBaseLike
  20. implicit def enableHiveSupport: Boolean

    Attributes
    protected
    Definition Classes
    DataFrameSuiteBaseLike
  21. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  22. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  23. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  24. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  25. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  26. implicit def impSqlContext: SQLContext

    Attributes
    protected
    Definition Classes
    DataFrameSuiteBaseLike
  27. val invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected: Boolean

    Definition Classes
    BeforeAndAfterAll
  28. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  29. val maxUnequalRowsToShow: Int

    Definition Classes
    DataFrameSuiteBaseLike
  30. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  31. final def notify(): Unit

    Definition Classes
    AnyRef
  32. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  33. implicit def reuseContextIfPossible: Boolean

    Attributes
    protected
    Definition Classes
    SharedSparkContext
  34. def run(testName: Option[String], args: Args): Status

    Definition Classes
    BeforeAndAfterAll → SuiteMixin
  35. def sc: SparkContext

  36. def setup(sc: SparkContext): Unit

    Setup work to be called when creating a new SparkContext. The default implementation currently sets a checkpoint directory. This should be called by the context provider automatically; see the override sketch after this member list.

    Definition Classes
    SparkContextProvider
  37. lazy val spark: SparkSession

    Definition Classes
    DataFrameSuiteBaseLike
  38. def sqlBeforeAllTestCases(): Unit

    Definition Classes
    DataFrameSuiteBaseLike
  39. lazy val sqlContext: SQLContext

    Definition Classes
    DataFrameSuiteBaseLike
  40. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  41. def testSimpleStreamEndState[T, R](spark: SparkSession, input: Seq[Seq[T]], expected: Seq[R], mode: String, queryFunction: (Dataset[T]) ⇒ Dataset[R])(implicit arg0: Encoder[T], arg1: Encoder[R]): Assertion

    Tests a simple stream's end state: runs queryFunction over the given micro-batches of input and asserts that the resulting output matches expected. See the usage sketch in the class description above.

  42. def toString(): String

    Definition Classes
    AnyRef → Any
  43. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  44. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  45. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
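A usage sketch for the DataFrame equality helpers (assertDataFrameEquals and assertDataFrameApproximateEquals, items 12 and 13 above). They are defined in DataFrameSuiteBaseLike, so they are available to any suite that mixes in DataFrameSuiteBase or StructuredStreamingBase; the suite name and sample data below are illustrative only.

  import com.holdenkarau.spark.testing.DataFrameSuiteBase
  import org.scalatest.FunSuite

  class DataFrameComparisonSuite extends FunSuite with DataFrameSuiteBase {

    test("exact and approximate DataFrame equality") {
      import spark.implicits._

      val expected = Seq(("a", 1.0), ("b", 2.0)).toDF("key", "value")
      val same     = Seq(("a", 1.0), ("b", 2.0)).toDF("key", "value")
      val close    = Seq(("a", 1.0000001), ("b", 2.0)).toDF("key", "value")

      // Schemas must match and every row must be exactly equal.
      assertDataFrameEquals(expected, same)

      // Schemas must still match; inexact (floating point) fields may differ by at most tol (< 1).
      assertDataFrameApproximateEquals(expected, close, 0.001)
    }
  }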
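A minimal sketch of customising setup (item 36 above), under the assumption that the default implementation remains overridable. The context provider calls setup(sc) automatically when it creates the SparkContext; the override below keeps the default behaviour (setting a checkpoint directory) via super.setup(sc) and adds a purely illustrative extra step.

  import com.holdenkarau.spark.testing.SharedSparkContext
  import org.apache.spark.SparkContext
  import org.scalatest.FunSuite

  class CustomSetupSuite extends FunSuite with SharedSparkContext {

    override def setup(sc: SparkContext): Unit = {
      // Keep the default behaviour, which sets a checkpoint directory.
      super.setup(sc)
      // Hypothetical extra per-context configuration.
      sc.setLogLevel("WARN")
    }

    test("the shared context is available") {
      assert(sc.parallelize(1 to 3).count() == 3)
    }
  }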
