Inherited from: Serializable, Command, LogicalPlan, Logging, QueryPlan[LogicalPlan], TreeNode[LogicalPlan], Product, Equals, AnyRef, Any
A command for writing data to a HadoopFsRelation. It supports both overwriting and appending, as well as writing to dynamic partitions. Each InsertIntoHadoopFsRelationCommand issues a single write job and owns a UUID that identifies this job. Each concrete implementation of HadoopFsRelation should use this UUID together with the task id to generate a unique file path for each task output file. The UUID is passed to the executor side via a property named `spark.sql.sources.writeJobUUID`. Different writer containers, DefaultWriterContainer and DynamicPartitionWriterContainer, are used to write to normal tables and to tables with dynamic partitions, respectively.
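To illustrate the naming scheme described above, here is a minimal sketch of how a write task might combine the job UUID with its task id to build a collision-free output file name. The object and method names (`UniqueOutputPath`, `forTask`) and the `part-` prefix are illustrative assumptions, not Spark's actual internals:

```scala
import java.util.UUID

// Hypothetical helper: derives a per-task output file name from the
// write job's UUID and the task's id. Because the UUID is unique per
// job and the task id is unique within the job, the combination is
// unique across all files the job produces.
object UniqueOutputPath {
  def forTask(jobUUID: String, taskId: Int, extension: String): String =
    s"part-$jobUUID-$taskId$extension"
}

object Demo extends App {
  // One UUID per write job, shared by all of that job's tasks.
  val jobUUID = UUID.randomUUID().toString
  println(UniqueOutputPath.forTask(jobUUID, 3, ".parquet"))
}
```

In Spark itself the UUID travels from driver to executors through the `spark.sql.sources.writeJobUUID` property, so every task of the job sees the same value while still producing distinct file names.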
The basic workflow of this command is: