package encoders
Type Members
- trait AgnosticEncoder[T] extends Encoder[T]
A non-implementation-specific encoder. This encoder contains all the information needed to generate an implementation-specific encoder (e.g. InternalRow <=> Custom Object).
The input of the serialization does not need to match the external type of the encoder. This is called lenient serialization. An example is lenient date serialization, where both java.sql.Date and java.time.LocalDate are accepted. Deserialization is never lenient; it always produces an instance of the external type.
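To make the lenient-date case concrete, here is a plain-Scala sketch of the idea (the helper functions are hypothetical, not Spark's internal serializer): serialization accepts either external date type and maps both to the internal representation of DateType (days since the Unix epoch), while deserialization always produces the encoder's external type.

  import java.time.LocalDate

  // Hypothetical helpers illustrating lenient date serialization; Spark
  // stores DateType internally as days since the Unix epoch.
  def serializeDate(value: Any): Int = value match {
    case d: LocalDate     => d.toEpochDay.toInt                // the encoder's external type
    case d: java.sql.Date => d.toLocalDate.toEpochDay.toInt    // also accepted (lenient)
  }

  // Deserialization is never lenient: it always yields the external type.
  def deserializeDate(days: Int): LocalDate = LocalDate.ofEpochDay(days.toLong)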
- trait Codec[I, O] extends Serializable
Codec for doing conversions between two representations.
- I
input type (typically the external representation of the data).
- O
output type (typically the internal representation of the data).
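A minimal sketch of a custom Codec (assuming the trait exposes paired encode/decode methods; the method names are an assumption here), converting between an external String and an internal UTF-8 byte representation:

  import java.nio.charset.StandardCharsets
  import org.apache.spark.sql.catalyst.encoders.Codec // assuming this encoders package

  // I = String (external representation), O = Array[Byte] (internal).
  class Utf8Codec extends Codec[String, Array[Byte]] {
    override def encode(in: String): Array[Byte] =
      in.getBytes(StandardCharsets.UTF_8)

    override def decode(out: Array[Byte]): String =
      new String(out, StandardCharsets.UTF_8)
  }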
- class JavaSerializationCodec[I] extends Codec[I, Array[Byte]]
A codec that uses Java Serialization as its output format.
- trait ToAgnosticEncoder[T] extends AnyRef
Extract an AgnosticEncoder from an Encoder.
Value Members
- object AgnosticEncoders
- object JavaSerializationCodec extends () => Codec[Any, Array[Byte]] with Serializable
- object KryoSerializationCodec extends () => Codec[Any, Array[Byte]]
A codec that uses Kryo to (de)serialize arbitrary objects to and from a byte array.
Please note that this is currently only supported for Classic Spark applications. The reason for this is that Connect applications can have a significantly different classpath than the driver or executor. This makes having the same Kryo configuration on both the client and the server (driver & executors) very tricky. As a workaround, a user can define their own Codec which internalizes the Kryo configuration, as sketched below.
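A sketch of that workaround (the class name and configuration are illustrative, and the encode/decode shape of Codec is assumed as above): a codec that builds its own Kryo instance, so the configuration travels with the codec instead of depending on the environment.

  import java.io.ByteArrayOutputStream
  import com.esotericsoftware.kryo.Kryo
  import com.esotericsoftware.kryo.io.{Input, Output}
  import org.apache.spark.sql.catalyst.encoders.Codec // assuming this encoders package

  class PinnedKryoCodec extends Codec[Any, Array[Byte]] {
    @transient private lazy val kryo: Kryo = {
      val k = new Kryo()
      // Register application classes here explicitly so serialization does
      // not depend on classpath differences between client and server.
      k.setRegistrationRequired(false)
      k
    }

    override def encode(in: Any): Array[Byte] = {
      val bytes = new ByteArrayOutputStream()
      val output = new Output(bytes)
      kryo.writeClassAndObject(output, in)
      output.close()
      bytes.toByteArray
    }

    override def decode(out: Array[Byte]): Any =
      kryo.readClassAndObject(new Input(out))
  }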
- object OuterScopes
- object RowEncoder extends DataTypeErrorsBase
A factory for constructing encoders that convert external rows to/from the Spark SQL internal binary representation.
The following is a mapping between Spark SQL types and their allowed external types:
BooleanType -> java.lang.Boolean
ByteType -> java.lang.Byte
ShortType -> java.lang.Short
IntegerType -> java.lang.Integer
FloatType -> java.lang.Float
DoubleType -> java.lang.Double
StringType -> String
DecimalType -> java.math.BigDecimal or scala.math.BigDecimal or Decimal
DateType -> java.sql.Date if spark.sql.datetime.java8API.enabled is false
DateType -> java.time.LocalDate if spark.sql.datetime.java8API.enabled is true
TimestampType -> java.sql.Timestamp if spark.sql.datetime.java8API.enabled is false
TimestampType -> java.time.Instant if spark.sql.datetime.java8API.enabled is true
TimestampNTZType -> java.time.LocalDateTime
DayTimeIntervalType -> java.time.Duration
YearMonthIntervalType -> java.time.Period
BinaryType -> byte array
ArrayType -> scala.collection.Seq or Array
MapType -> scala.collection.Map
StructType -> org.apache.spark.sql.Row
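To illustrate the mapping, here is a minimal sketch (schema, field names, and values are made up for illustration) pairing a StructType with a Row whose field values use the allowed external types:

  import org.apache.spark.sql.Row
  import org.apache.spark.sql.types._

  // Each field value below uses an allowed external type for its SQL type.
  val schema = StructType(Seq(
    StructField("active", BooleanType),          // java.lang.Boolean
    StructField("name", StringType),             // String
    StructField("balance", DecimalType(10, 2)),  // java.math.BigDecimal
    StructField("joined", DateType),             // java.sql.Date (java8API disabled)
    StructField("tags", ArrayType(StringType))   // scala.collection.Seq
  ))

  val row = Row(
    java.lang.Boolean.TRUE,
    "Ada",
    new java.math.BigDecimal("12.50"),
    java.sql.Date.valueOf("2024-01-15"),
    Seq("a", "b"))

On Spark versions that provide it (3.5 and later), org.apache.spark.sql.Encoders.row(schema) returns an Encoder[Row] for such a schema.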