abstract class FileTable extends Table with SupportsRead with SupportsWrite
Linear supertypes: FileTable, SupportsWrite, SupportsRead, Table, AnyRef, Any
Instance Constructors
- new FileTable(sparkSession: SparkSession, options: CaseInsensitiveStringMap, paths: Seq[String], userSpecifiedSchema: Option[StructType])
Abstract Value Members
- abstract def fallbackFileFormat: Class[_ <: FileFormat]
  Returns a V1 FileFormat class of the same file data source. This is a solution for the following cases: 1. File data source V2 implementations cause a regression, in which case users can disable the problematic data source via SQL configuration and fall back to FileFormat. 2. Catalog support is required, which is still under development for data source V2.
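As an illustrative sketch (not an authoritative implementation), a V2 table backed by the ORC format could point the fallback at Spark's built-in V1 OrcFileFormat:

```scala
// Sketch only: assumes Spark's built-in V1 OrcFileFormat is on the classpath.
override def fallbackFileFormat: Class[_ <: FileFormat] =
  classOf[org.apache.spark.sql.execution.datasources.orc.OrcFileFormat]
```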
- abstract def formatName: String
  The string that represents the format that this data source provider uses. This is overridden by children to provide a nice alias for the data source. For example:
  override def formatName: String = "ORC"
- abstract def inferSchema(files: Seq[FileStatus]): Option[StructType]
  When possible, this method should return the schema of the given files. When the format does not support schema inference, or no valid files are given, it should return None; in that case Spark will require the user to specify the schema manually.
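A hedged sketch of an inferSchema override, where MyFormatUtils.readSchema is a hypothetical helper that reads schemas from file footers:

```scala
// Sketch: return None when there is nothing to infer from, so Spark
// asks the user for an explicit schema; otherwise delegate to a
// hypothetical footer-reading helper.
override def inferSchema(files: Seq[FileStatus]): Option[StructType] =
  if (files.isEmpty) None
  else MyFormatUtils.readSchema(sparkSession, files) // hypothetical helper
```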
- abstract def name(): String
  Definition Classes: Table
- abstract def newScanBuilder(arg0: CaseInsensitiveStringMap): ScanBuilder
  Definition Classes: SupportsRead
- abstract def newWriteBuilder(arg0: LogicalWriteInfo): WriteBuilder
  Definition Classes: SupportsWrite
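Putting the abstract members together, a minimal concrete subclass might look like the following sketch. MyV1FileFormat, MyScanBuilder, and MyWriteBuilder are hypothetical names used only for illustration:

```scala
// Sketch of a minimal concrete FileTable; all My* types are hypothetical.
case class MyFileTable(
    sparkSession: SparkSession,
    options: CaseInsensitiveStringMap,
    paths: Seq[String],
    userSpecifiedSchema: Option[StructType])
  extends FileTable(sparkSession, options, paths, userSpecifiedSchema) {

  override def formatName: String = "MYFMT"

  override def fallbackFileFormat: Class[_ <: FileFormat] =
    classOf[MyV1FileFormat] // hypothetical V1 implementation

  override def inferSchema(files: Seq[FileStatus]): Option[StructType] =
    files.headOption.map(_ => new StructType().add("value", StringType))

  override def name(): String = formatName + " " + paths.mkString(",")

  override def newScanBuilder(options: CaseInsensitiveStringMap): ScanBuilder =
    new MyScanBuilder(sparkSession, fileIndex, schema, options) // hypothetical

  override def newWriteBuilder(info: LogicalWriteInfo): WriteBuilder =
    new MyWriteBuilder(paths, formatName, info) // hypothetical
}
```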
Concrete Value Members
- final def !=(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- final def ##(): Int
  Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  Definition Classes: Any
- def capabilities(): Set[TableCapability]
  Definition Classes: FileTable → Table
- def clone(): AnyRef
  Attributes: protected[lang]
  Definition Classes: AnyRef
  Annotations: @throws( ... ) @native()
- lazy val dataSchema: StructType
- final def eq(arg0: AnyRef): Boolean
  Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- lazy val fileIndex: PartitioningAwareFileIndex
- def finalize(): Unit
  Attributes: protected[lang]
  Definition Classes: AnyRef
  Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  Definition Classes: AnyRef → Any
  Annotations: @native()
- def hashCode(): Int
  Definition Classes: AnyRef → Any
  Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  Definition Classes: AnyRef
- final def notify(): Unit
  Definition Classes: AnyRef
  Annotations: @native()
- final def notifyAll(): Unit
  Definition Classes: AnyRef
  Annotations: @native()
- def partitioning(): Array[Transform]
  Definition Classes: FileTable → Table
- def properties(): Map[String, String]
  Definition Classes: FileTable → Table
- lazy val schema: StructType
  Definition Classes: FileTable → Table
- def supportsDataType(dataType: DataType): Boolean
  Returns whether this format supports the given DataType in the read/write path. By default all data types are supported.
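A format that cannot store nested values might narrow the default with a sketch like this (the atomic-only policy is an assumed example, not a recommendation):

```scala
// Sketch: accept only atomic types, rejecting nested types such as
// structs, arrays, and maps.
override def supportsDataType(dataType: DataType): Boolean = dataType match {
  case _: AtomicType => true
  case _             => false
}
```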
- final def synchronized[T0](arg0: ⇒ T0): T0
  Definition Classes: AnyRef
- def toString(): String
  Definition Classes: AnyRef → Any
- final def wait(): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... ) @native()