:: DeveloperApi ::
The data type for collections of multiple values. Internally these are represented as columns that contain a scala.collection.Seq. Please use DataTypes.createArrayType() to create a specific instance.
An ArrayType object comprises two fields, elementType: DataType and containsNull: Boolean. The field elementType specifies the type of array elements. The field containsNull specifies whether the array can have null values.
The data type of values.
Indicates if array elements can have null values.
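The construction described above can be sketched in Scala. This is a minimal sketch, assuming spark-sql is on the classpath; the types live in org.apache.spark.sql.types in recent releases, so adjust the import for your Spark version.

```scala
import org.apache.spark.sql.types._

object ArrayTypeSketch {
  def main(args: Array[String]): Unit = {
    // From Scala, construct the type directly ...
    val arr = ArrayType(IntegerType, containsNull = true)
    assert(arr.elementType == IntegerType)
    assert(arr.containsNull)

    // ... or use the Java-friendly factory method:
    val sameArr = DataTypes.createArrayType(IntegerType, true)
    assert(arr == sameArr)
  }
}
```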
An internal type used to represent everything that is not null, UDTs, arrays, structs, and maps.
:: DeveloperApi ::
The data type representing Array[Byte] values.
Please use the singleton DataTypes.BinaryType.
:: DeveloperApi ::
The data type representing Boolean values. Please use the singleton DataTypes.BooleanType.
:: DeveloperApi ::
The data type representing Byte values. Please use the singleton DataTypes.ByteType.
:: DeveloperApi :: The base type of all Spark SQL data types.
:: DeveloperApi ::
The data type representing java.sql.Date values.
Please use the singleton DataTypes.DateType.
A mutable implementation of BigDecimal that can hold a Long if values are small enough.
The semantics of the fields are as follows:
- _precision and _scale represent the SQL precision and scale we are looking for
- If decimalVal is set, it represents the whole decimal value
- Otherwise, the decimal value is longVal / (10 ** _scale)
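The field semantics above can be illustrated with a small self-contained sketch. This is plain Scala, not Spark's actual Decimal class: the names longVal, decimalVal, _precision, and _scale mirror the fields described above, but the class itself is hypothetical.

```scala
// A minimal sketch of the stated semantics: small values live in a Long
// scaled by 10^_scale; larger values fall back to a BigDecimal.
case class DecimalSketch(
    longVal: Long,                   // used when decimalVal is empty
    decimalVal: Option[BigDecimal],  // if set, holds the whole decimal value
    _precision: Int,
    _scale: Int) {

  // Recover the decimal value according to the semantics above.
  def toBigDecimal: BigDecimal = decimalVal match {
    case Some(d) => d  // whole value stored directly
    case None    => BigDecimal(longVal) / BigDecimal(10).pow(_scale)
  }
}

object DecimalSketch {
  def main(args: Array[String]): Unit = {
    // 123.45 with precision 5, scale 2, stored compactly as longVal = 12345
    val small = DecimalSketch(12345L, None, 5, 2)
    assert(small.toBigDecimal == BigDecimal("123.45"))

    // A value too large for a Long is kept as a BigDecimal directly.
    val big = DecimalSketch(0L, Some(BigDecimal("12345678901234567890.12")), 22, 2)
    assert(big.toBigDecimal == BigDecimal("12345678901234567890.12"))
  }
}
```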
:: DeveloperApi ::
The data type representing java.math.BigDecimal values. A Decimal that might have fixed precision and scale, or unlimited values for these.
Please use DataTypes.createDecimalType() to create a specific instance.
:: DeveloperApi ::
The data type representing Double values. Please use the singleton DataTypes.DoubleType.
:: DeveloperApi ::
The data type representing Float values. Please use the singleton DataTypes.FloatType.
:: DeveloperApi ::
The data type representing Int values. Please use the singleton DataTypes.IntegerType.
:: DeveloperApi ::
The data type representing Long values. Please use the singleton DataTypes.LongType.
:: DeveloperApi ::
The data type for Maps. Keys in a map are not allowed to have null values.
Please use DataTypes.createMapType() to create a specific instance.
The data type of map keys.
The data type of map values.
Indicates if map values can have null values.
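A map type with the three fields above can be constructed as follows. This is a minimal sketch, assuming spark-sql is on the classpath; the import path may differ across Spark versions.

```scala
import org.apache.spark.sql.types._

object MapTypeSketch {
  def main(args: Array[String]): Unit = {
    // A map from String keys to Int values; values may be null.
    val m = DataTypes.createMapType(StringType, IntegerType, true)
    assert(m.keyType == StringType)
    assert(m.valueType == IntegerType)
    assert(m.valueContainsNull)
  }
}
```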
:: DeveloperApi ::
Metadata is a wrapper over Map[String, Any] that limits the value type to simple ones: Boolean, Long, Double, String, Metadata, Array[Boolean], Array[Long], Array[Double], Array[String], and Array[Metadata]. JSON is used for serialization.
The default constructor is private. Users should use either MetadataBuilder or Metadata.fromJson() to create Metadata instances.
:: DeveloperApi ::
Builder for Metadata. If there is a key collision, the latter will overwrite the former.
:: DeveloperApi ::
The data type representing NULL values. Please use the singleton DataTypes.NullType.
:: DeveloperApi :: Numeric data types.
Precision parameters for a Decimal
:: DeveloperApi ::
The data type representing Short values. Please use the singleton DataTypes.ShortType.
:: DeveloperApi ::
The data type representing String values. Please use the singleton DataTypes.StringType.
A field inside a StructType.
The name of this field.
The data type of this field.
Indicates if values of this field can be null.
The metadata of this field. The metadata should be preserved during transformation if the content of the column is not modified, e.g., in selection.
:: DeveloperApi ::
A StructType object can be constructed by StructType(fields: Seq[StructField]).
For a StructType object, one or multiple StructFields can be extracted by names. If multiple StructFields are extracted, a StructType object will be returned. If a provided name does not have a matching field, it will be ignored. For the case of extracting a single StructField, a null will be returned.
Example:
import org.apache.spark.sql._

val struct = StructType(
  StructField("a", IntegerType, true) ::
  StructField("b", LongType, false) ::
  StructField("c", BooleanType, false) :: Nil)

// Extract a single StructField.
val singleField = struct("b")
// singleField: StructField = StructField(b,LongType,false)

// This struct does not have a field called "d". null will be returned.
val nonExisting = struct("d")
// nonExisting: StructField = null

// Extract multiple StructFields. Field names are provided in a set.
// A StructType object will be returned.
val twoFields = struct(Set("b", "c"))
// twoFields: StructType =
//   StructType(List(StructField(b,LongType,false), StructField(c,BooleanType,false)))

// Any names without matching fields will be ignored.
// For the case shown below, "d" will be ignored and
// it is treated as struct(Set("b", "c")).
val ignoreNonExisting = struct(Set("b", "c", "d"))
// ignoreNonExisting: StructType =
//   StructType(List(StructField(b,LongType,false), StructField(c,BooleanType,false)))
An org.apache.spark.sql.Row object is used as a value of the StructType. Example:
import org.apache.spark.sql._

val innerStruct = StructType(
  StructField("f1", IntegerType, true) ::
  StructField("f2", LongType, false) ::
  StructField("f3", BooleanType, false) :: Nil)

val struct = StructType(
  StructField("a", innerStruct, true) :: Nil)

// Create a Row with the schema defined by struct
val row = Row(Row(1, 2, true))
// row: Row = [[1,2,true]]
:: DeveloperApi ::
The data type representing java.sql.Timestamp values.
Please use the singleton DataTypes.TimestampType.
:: DeveloperApi ::
A UTF-8 String, used as the internal representation of StringType in Spark SQL.
A String encoded in UTF-8 as an Array[Byte], which can be used for comparison and search; see http://en.wikipedia.org/wiki/UTF-8 for details.
Note: This is not designed for general use cases and should not be used outside SQL.
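The property that makes a raw Array[Byte] encoding usable for comparison can be shown with a small self-contained sketch (plain Scala, not Spark's actual UTF8String class): comparing UTF-8 bytes as unsigned values, lexicographically, agrees with Unicode code point order.

```scala
object Utf8OrderSketch {
  // Compare two UTF-8 byte arrays as unsigned values, lexicographically.
  def compareUtf8(a: Array[Byte], b: Array[Byte]): Int = {
    val len = math.min(a.length, b.length)
    var i = 0
    while (i < len) {
      // Mask to 0..255 so that multi-byte sequences (high bit set)
      // compare as unsigned, matching code point order.
      val diff = (a(i) & 0xff) - (b(i) & 0xff)
      if (diff != 0) return diff
      i += 1
    }
    a.length - b.length
  }

  def main(args: Array[String]): Unit = {
    val ascii = "abc".getBytes("UTF-8")
    val accented = "ab\u00e9".getBytes("UTF-8")  // 'é' encodes as two bytes
    // Byte order agrees with the String (code point) order:
    assert(compareUtf8(ascii, accented) < 0)
    assert("abc".compareTo("ab\u00e9") < 0)
    assert(compareUtf8(ascii, ascii) == 0)
  }
}
```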
:: DeveloperApi ::
The data type for User Defined Types (UDTs).
This interface allows a user to make their own classes more interoperable with Spark SQL; e.g., by creating a UserDefinedType for a class X, it becomes possible to create a DataFrame which has class X in the schema.
For Spark SQL to recognize UDTs, the UDT must be annotated with SQLUserDefinedType.
The conversion via serialize occurs when instantiating a DataFrame from another RDD. The conversion via deserialize occurs when reading from a DataFrame.
Extra factory methods and pattern matchers for Decimals
:: DeveloperApi ::
Contains a type system for attributes produced by relations, including complex types like structs, arrays and maps.