Package org.apache.spark.sql.catalyst.encoders

package encoders

Linear Supertypes
AnyRef, Any

Type Members

  1. case class ExpressionEncoder[T](schema: StructType, flat: Boolean, serializer: Seq[Expression], deserializer: Expression, clsTag: ClassTag[T]) extends Encoder[T] with Product with Serializable

    A generic encoder for JVM objects.

    schema

    The schema after converting T to a Spark SQL row.

    serializer

A set of expressions, one for each top-level field, that can be used to extract values from a raw object and write them into an InternalRow.

    deserializer

    An expression that will construct an object given an InternalRow.

    clsTag

    A classtag for T.
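A minimal sketch of constructing an ExpressionEncoder for a case class via runtime reflection, using the Spark 2.x API described on this page (the case class `Person` is a hypothetical example type, not part of the API):

```scala
import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder

// Hypothetical example type; any Product / case class works.
case class Person(name: String, age: Int)

object ExpressionEncoderExample extends App {
  // Derive an encoder for Person from its type via reflection.
  val enc: ExpressionEncoder[Person] = ExpressionEncoder[Person]()

  // schema describes the Spark SQL row layout produced for Person.
  println(enc.schema.fieldNames.mkString(", "))  // name, age

  // flat is false here: Person is encoded as multiple columns.
  println(enc.flat)

  // clsTag carries the runtime class of T.
  println(enc.clsTag.runtimeClass.getSimpleName)
}
```

The `serializer` and `deserializer` members of the resulting encoder are the catalyst expressions described above; they are typically consumed by Spark internals rather than called directly.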

Value Members

  1. object ExpressionEncoder extends Serializable

    A factory for constructing encoders that convert objects and primitives to and from the internal row format using catalyst expressions and code generation. By default, the expressions used to retrieve values from an input row when producing an object will be created as follows:

    • Classes will have their sub fields extracted by name using UnresolvedAttribute expressions and UnresolvedExtractValue expressions.
    • Tuples will have their subfields extracted by position using BoundReference expressions.
    • Primitives will have their values extracted from the first ordinal, with a schema that defaults to the field name "value".
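The factory's output can be used to round-trip an object through the internal row format. A sketch under the Spark 2.x API (where ExpressionEncoder exposes toRow/fromRow; later versions replaced these with createSerializer/createDeserializer), with `Point` as a hypothetical example type:

```scala
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder

// Hypothetical example type for illustration only.
case class Point(x: Double, y: Double)

object RoundTripExample extends App {
  val enc = ExpressionEncoder[Point]()

  // Serialize the object into Spark's internal row format.
  val row: InternalRow = enc.toRow(Point(1.0, 2.0))

  // Deserialization requires a resolved and bound deserializer
  // expression, hence resolveAndBind() before fromRow.
  val bound = enc.resolveAndBind()
  val back: Point = bound.fromRow(row)

  println(back)
}
```

Note that the unbound encoder can serialize directly, but deserializing requires binding the deserializer expression to a concrete schema first, as `encoderFor` below also assumes.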
  2. object OuterScopes

  3. object RowEncoder

    A factory for constructing encoders that convert external rows to/from the Spark SQL internal binary representation.

    The following is a mapping between Spark SQL types and their allowed external types:

    BooleanType -> java.lang.Boolean
    ByteType -> java.lang.Byte
    ShortType -> java.lang.Short
    IntegerType -> java.lang.Integer
    FloatType -> java.lang.Float
    DoubleType -> java.lang.Double
    StringType -> String
    DecimalType -> java.math.BigDecimal or scala.math.BigDecimal or Decimal
    
    DateType -> java.sql.Date
    TimestampType -> java.sql.Timestamp
    
    BinaryType -> byte array
    ArrayType -> scala.collection.Seq or Array
    MapType -> scala.collection.Map
    StructType -> org.apache.spark.sql.Row
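A minimal sketch of the factory in use under the Spark 2.x API: RowEncoder is applied to an explicit StructType and yields an ExpressionEncoder[Row] that follows the type mapping above (field names and values here are illustrative only):

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.encoders.RowEncoder
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

object RowEncoderExample extends App {
  // StringType maps to String, IntegerType to java.lang.Integer,
  // per the table above.
  val schema = StructType(Seq(
    StructField("name", StringType),
    StructField("age", IntegerType)))

  // An ExpressionEncoder[Row] for external rows with this schema.
  val enc = RowEncoder(schema)

  // External Row -> internal binary representation -> external Row.
  val internal = enc.toRow(Row("Alice", 42))
  val external = enc.resolveAndBind().fromRow(internal)

  println(external)
}
```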
  4. def encoderFor[A](implicit arg0: Encoder[A]): ExpressionEncoder[A]

    Returns an internal encoder object that can be used to serialize / deserialize JVM objects into Spark SQL rows. The implicit encoder should always be unresolved (i.e. have no attribute references from a specific schema.) This requirement allows us to preserve whether a given object type is being bound by name or by ordinal when doing resolution.
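A sketch of how encoderFor is typically reached: with an implicit Encoder in scope (here obtained from the public Encoders factory rather than spark.implicits._, to stay self-contained), it returns the underlying unresolved ExpressionEncoder:

```scala
import org.apache.spark.sql.{Encoder, Encoders}
import org.apache.spark.sql.catalyst.encoders.{encoderFor, ExpressionEncoder}

object EncoderForExample extends App {
  // Bring an implicit Encoder[Long] into scope.
  implicit val longEncoder: Encoder[Long] = Encoders.scalaLong

  // Resolve it to the internal ExpressionEncoder representation.
  val internal: ExpressionEncoder[Long] = encoderFor[Long]

  // A primitive encoder is flat: a single top-level column.
  println(internal.flat)
}
```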
