object DataWritingCommand
Linear Supertypes: AnyRef, Any
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##(): Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def assertEmptyRootPath(tablePath: URI, saveMode: SaveMode, hadoopConf: Configuration): Unit
When executing a CTAS operator, throws AnalysisException if the target location is not empty. For CTAS, the SaveMode is always ErrorIfExists. A usage sketch follows the parameter list.
- tablePath
Table location.
- saveMode
Save mode of the table.
- hadoopConf
Configuration.
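A minimal usage sketch, assuming Spark's internal org.apache.spark.sql.execution.command classes are on the compile classpath; the table path below is hypothetical:

  import java.net.URI
  import org.apache.hadoop.conf.Configuration
  import org.apache.spark.sql.SaveMode
  import org.apache.spark.sql.execution.command.DataWritingCommand

  // Hypothetical CTAS target location; the check throws AnalysisException
  // if the path already exists and is non-empty.
  val tablePath = new URI("hdfs://namenode/warehouse/db.db/new_table")
  val hadoopConf = new Configuration()

  // For CTAS the SaveMode is always ErrorIfExists.
  DataWritingCommand.assertEmptyRootPath(tablePath, SaveMode.ErrorIfExists, hadoopConf)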
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- def logicalPlanOutputWithNames(query: LogicalPlan, names: Seq[String]): Seq[Attribute]
Returns the output attributes with the provided names. The number of provided names must equal the length of LogicalPlan.output.
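For illustration, a minimal sketch, assuming a local SparkSession and a hypothetical two-column DataFrame whose analyzed plan supplies the LogicalPlan:

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.execution.command.DataWritingCommand

  val spark = SparkSession.builder().master("local[*]").appName("example").getOrCreate()
  import spark.implicits._

  // The analyzed plan has two output attributes, so exactly two names are provided.
  val plan = Seq((1, "a"), (2, "b")).toDF("id", "value").queryExecution.analyzed
  val renamed = DataWritingCommand.logicalPlanOutputWithNames(plan, Seq("col1", "col2"))
  // renamed: Seq[Attribute] whose names are "col1" and "col2"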
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def propogateMetrics(sparkContext: SparkContext, command: DataWritingCommand, metrics: Map[String, SQLMetric]): Unit
When executing CTAS operators, Spark uses InsertIntoHadoopFsRelationCommand or InsertIntoHiveTable to write the data; both inherit metrics from DataWritingCommand. After one of those commands runs, only its own metrics are updated through BasicWriteJobStatsTracker, so the metrics must also be propagated to the command that actually invoked InsertIntoHadoopFsRelationCommand or InsertIntoHiveTable. A usage sketch follows the parameter list.
- sparkContext
Current SparkContext.
- command
Command to execute writing data.
- metrics
Metrics of the real DataWritingCommand.
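A minimal sketch under the semantics described above; the helper name copyWriteMetrics and the variable names are hypothetical, not Spark's actual CTAS source:

  import org.apache.spark.SparkContext
  import org.apache.spark.sql.execution.command.DataWritingCommand
  import org.apache.spark.sql.execution.metric.SQLMetric

  // `innerCommand` stands for the InsertIntoHadoopFsRelationCommand or
  // InsertIntoHiveTable that actually wrote the data, and `callerMetrics` for
  // the metric map of the command that invoked it (e.g. a CTAS command).
  def copyWriteMetrics(
      sc: SparkContext,
      innerCommand: DataWritingCommand,
      callerMetrics: Map[String, SQLMetric]): Unit = {
    // After the inner command has run, surface its write metrics on the caller.
    DataWritingCommand.propogateMetrics(sc, innerCommand, callerMetrics)
  }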
- final def synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()