org.apache.spark.sql.execution.command
CreateUserDefinedFunctionCommand
Companion class CreateUserDefinedFunctionCommand
object CreateUserDefinedFunctionCommand
Linear Supertypes: AnyRef, Any
Value Members
- def apply(
    name: FunctionIdentifier,
    inputParamText: Option[String],
    returnTypeText: String,
    exprText: Option[String],
    queryText: Option[String],
    comment: Option[String],
    isDeterministic: Option[Boolean],
    containsSQL: Option[Boolean],
    language: RoutineLanguage,
    isTableFunc: Boolean,
    isTemp: Boolean,
    ignoreIfExists: Boolean,
    replace: Boolean): CreateUserDefinedFunctionCommand
This factory method serves as a central place to verify required inputs and return the CREATE command for the parsed user-defined function.
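As illustration, a hedged sketch of calling this factory for a temporary SQL scalar function; the identifier, parameter text, and body are made-up inputs, and the name of the SQL RoutineLanguage token is assumed:

```scala
import org.apache.spark.sql.catalyst.FunctionIdentifier

// Sketch for: CREATE TEMPORARY FUNCTION area(width DOUBLE, height DOUBLE)
//             RETURNS DOUBLE RETURN width * height
val cmd = CreateUserDefinedFunctionCommand(
  name = FunctionIdentifier("area"),                // hypothetical function name
  inputParamText = Some("width DOUBLE, height DOUBLE"),
  returnTypeText = "DOUBLE",
  exprText = Some("width * height"),                // scalar body: RETURN <expr>
  queryText = None,                                 // only set for table functions
  comment = Some("computes a rectangle's area"),
  isDeterministic = Some(true),
  containsSQL = Some(true),
  language = LanguageSQL,                           // assumed name of the SQL language token
  isTableFunc = false,
  isTemp = true,
  ignoreIfExists = false,
  replace = false)
```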
- def checkDefaultsTrailing(param: StructType, name: FunctionIdentifier): Unit
Check whether the function parameters contain non-trailing defaults. For languages that support default values for input parameters, this check ensures that once a parameter is given a default value, all subsequent parameters also have defaults, and throws an error otherwise.
Perform this check on the function's input parameters while registering the function, so that invalid definitions fail early; the check does not need to run the function itself.
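A minimal sketch of the trailing-defaults rule, under the assumption that a parameter's default is recorded in its field metadata (the key name here is hypothetical):

```scala
import org.apache.spark.sql.types.StructType

// Hypothetical metadata key under which a parameter's default expression is stored.
val DefaultKey = "default"

// Once one parameter has a default, every later parameter must have one too.
def hasNonTrailingDefaults(params: StructType): Boolean = {
  val defaults = params.fields.map(_.metadata.contains(DefaultKey))
  // A defaulted parameter followed by a non-defaulted one violates the rule.
  defaults.zip(defaults.drop(1)).exists { case (prev, next) => prev && !next }
}
```

Under this rule, (a INT DEFAULT 0, b INT) would be rejected, while (a INT, b INT DEFAULT 0) passes.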
- def checkParameterNameDuplication(param: StructType, conf: SQLConf, name: FunctionIdentifier): Unit
Check whether the function parameters contain duplicated column names. It takes the function's input-parameter struct and verifies that there are no duplicates among the parameter column names; if any are found, it throws an exception with information that helps users fix the parameters.
Perform this check while registering the function, so that invalid definitions fail early; the check does not need to run the function itself.
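A sketch of the underlying comparison, assuming names are normalized according to spark.sql.caseSensitive as read from the given SQLConf:

```scala
import java.util.Locale
import org.apache.spark.sql.internal.SQLConf
import org.apache.spark.sql.types.StructType

// Collect parameter names that occur more than once, respecting case sensitivity.
def duplicatedNames(params: StructType, conf: SQLConf): Seq[String] = {
  val names =
    if (conf.caseSensitiveAnalysis) params.fieldNames.toSeq
    else params.fieldNames.toSeq.map(_.toLowerCase(Locale.ROOT))
  names.groupBy(identity).collect { case (n, occurrences) if occurrences.size > 1 => n }.toSeq
}
```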
- def checkParameterNotNull(param: StructType, input: String): Unit
Check whether the function input or return columns (for the TABLE return type) have NOT NULL specified, and throw an exception if NOT NULL is found.
Perform this check on the function's input and return parameters while registering the function, so that invalid definitions fail early; the check does not need to run the function itself.
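Since a NOT NULL column surfaces as nullable = false on its StructField, the check reduces to scanning nullability; a sketch (the exact error raised by the real command differs):

```scala
import org.apache.spark.sql.types.StructType

// Reject any input/return column declared NOT NULL (i.e. nullable = false).
def assertNoNotNull(params: StructType, input: String): Unit =
  params.fields.filterNot(_.nullable).foreach { field =>
    // The real command raises a Spark analysis error; this is a stand-in.
    throw new IllegalArgumentException(
      s"Column '${field.name}' in $input must not be declared NOT NULL")
  }
```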
- def checkReturnsColumnDuplication(columns: StructType, conf: SQLConf, name: FunctionIdentifier): Unit
Check whether the function has duplicate column names in the RETURNS clause.
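This mirrors the input-parameter check. Reusing the duplicatedNames sketch above, a RETURNS TABLE column list with a repeated name would be caught like so:

```scala
import org.apache.spark.sql.internal.SQLConf
import org.apache.spark.sql.types.StructType

// RETURNS TABLE (id INT, id STRING) -- rejected: "id" appears twice
val returnsColumns = StructType.fromDDL("id INT, id STRING")
assert(duplicatedNames(returnsColumns, SQLConf.get).nonEmpty)
```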
- def sqlConfigsToProps(conf: SQLConf): Map[String, String]
Convert SQL configs to properties by prefixing every config key with a common prefix. When converting a function to org.apache.spark.sql.catalyst.catalog.CatalogFunction or org.apache.spark.sql.catalyst.expressions.ExpressionInfo, all SQL configs and other function properties (such as the function parameters and the function return type) are saved together in a single property map.
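A minimal sketch of the conversion, assuming every captured conf key simply gains a fixed prefix; the prefix value here is hypothetical, and the real command may capture only a relevant subset of the session confs:

```scala
import org.apache.spark.sql.internal.SQLConf

// Hypothetical prefix; the actual constant is internal to the command.
val SqlConfigPrefix = "sqlConfig."

def sqlConfigsToProps(conf: SQLConf): Map[String, String] =
  conf.getAllConfs.map { case (key, value) => (SqlConfigPrefix + key) -> value }
```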