Package zio.spark.sql

package sql

Linear Supertypes: AnyRef, Any

Type Members

  1. type DataFrame = Dataset[Row]

  2. final case class DataFrameNaFunctions(underlying: org.apache.spark.sql.DataFrameNaFunctions) extends Product with Serializable

  3. final case class DataFrameReader[State <: SchemaState] extends Product with Serializable

  4. final case class DataFrameStatFunctions(underlying: org.apache.spark.sql.DataFrameStatFunctions) extends Product with Serializable

  5. final case class DataFrameWriter[T] extends Product with Serializable

  6. final case class Dataset[T](underlying: org.apache.spark.sql.Dataset[T]) extends Product with Serializable

  7. implicit final class DatasetConversionOps[T] extends AnyVal

  8. abstract class ExtraSparkSessionFeature extends AnyRef

  9. final case class KeyValueGroupedDataset[K, V](underlying: org.apache.spark.sql.KeyValueGroupedDataset[K, V]) extends Product with Serializable

  10. trait LowPrioritySQLImplicits extends AnyRef

  11. final case class RelationalGroupedDataset(underlying: org.apache.spark.sql.RelationalGroupedDataset) extends Product with Serializable

  12. type SIO[A] = ZIO[SparkSession, Throwable, A]

  13. type SRIO[R, A] = ZIO[R with SparkSession, Throwable, A]

  14. final case class SparkSession(underlyingSparkSession: org.apache.spark.sql.SparkSession) extends ExtraSparkSessionFeature with Product with Serializable

  15. sealed trait Statistics extends AnyRef

  16. sealed trait TryAnalysis[+T] extends AnyRef


    A Try-like structure describing a transformation that can fail with an AnalysisException.

    Generally speaking, when you apply certain transformations to your dataset, Spark throws an AnalysisException; these transformations are wrapped into a TryAnalysis.

    You can ignore the TryAnalysis wrapper (and make Spark fail as usual when an impossible transformation is being built, such as selecting a column that doesn't exist) using the following import (see the sketch just below):

    scala> import zio.spark.sql.TryAnalysis.syntax.throwAnalysisException
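    For example, a minimal sketch (assuming that filter with a SQL expression string is one of the TryAnalysis-wrapped transformations; the column name and predicate are purely illustrative):

      import zio.spark.sql._
      import zio.spark.sql.TryAnalysis.syntax.throwAnalysisException

      // With the implicit above in scope, the TryAnalysis wrapper is unwrapped
      // on the spot and any AnalysisException is thrown as in vanilla Spark.
      def keepAdults(people: DataFrame): DataFrame =
        people.filter("age >= 18")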

Value Members

  1. object DataFrameReader extends Serializable

  2. object DataFrameWriter extends Serializable

  3. object SchemaFromCaseClass

  4. object SparkSession extends Serializable

  5. object Statistics

  6. object TryAnalysis

  7. def fromSpark[Out](f: (org.apache.spark.sql.SparkSession) ⇒ Out)(implicit trace: Trace): SIO[Out]


    Wraps an effectful Spark job into zio-spark (see the sketch after this list).

  8. object implicits extends LowPrioritySQLImplicits

  9. package streaming

  10. object syntax

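As an illustration of fromSpark and the SIO alias above, a minimal sketch (the parquet path is purely illustrative, and the resulting effects still need a SparkSession provided to their environment before they can be run):

    import zio._
    import zio.spark.sql._

    // Escape hatch to the vanilla Spark API: the function receives the
    // underlying org.apache.spark.sql.SparkSession and its result is
    // suspended into an SIO, i.e. ZIO[SparkSession, Throwable, _].
    val sparkVersion: SIO[String] = fromSpark(_.version)

    val rowCount: SIO[Long] =
      fromSpark(underlying => underlying.read.parquet("people.parquet").count())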
