Schema Converter for getting schema in json format into a spark Structure
The schema given to Spark performs almost no validity checks, so it makes sense
to combine this with the schema-validator. When loading data with a schema, each
value is converted to the type given in the schema. If this is not possible, the
whole row becomes null (!). Fields can always be null, regardless of whether the
schema sets nullable to true or false. The converted schema does not check 'enum'
fields, i.e. fields limited to a given set of values. It also does not check for
required fields or whether additional properties are allowed. If a field is
specified in the schema, then you can select it and it will be null when missing.
If a field is not in the schema, it cannot be selected even if it is present in
the dataset.
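The conversion described above can be sketched as a simple type mapping. This is an illustrative sketch only, not the library's actual API: the function and mapping names below are hypothetical, and real converters build Spark SQL StructType objects rather than type-name strings.

```python
# Hypothetical sketch of JSON-schema-to-Spark type conversion.
# The mapping and function names are illustrative assumptions.
JSON_TO_SPARK = {
    "string": "StringType",
    "integer": "LongType",
    "number": "DoubleType",
    "boolean": "BooleanType",
}

def convert_field(name, json_field):
    """Return a (name, spark_type, nullable) triple for one schema field.

    Note: nullable is carried over from the schema but, as described
    above, Spark does not enforce it when reading data.
    """
    spark_type = JSON_TO_SPARK[json_field["type"]]
    nullable = json_field.get("nullable", True)
    return (name, spark_type, nullable)

def convert_object(json_schema):
    """Convert a flat JSON-schema 'object' into a list of field triples."""
    return [convert_field(name, field)
            for name, field in json_schema["properties"].items()]

schema = {
    "type": "object",
    "properties": {
        "id": {"type": "integer"},
        "name": {"type": "string", "nullable": False},
    },
}
```

Calling `convert_object(schema)` here yields one triple per declared property; fields present in the data but absent from the schema are simply never produced, which mirrors the selection behavior described above.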