A B C D E F G H I J L M N O P R S T U V W
A
- AGGREGATION_ALLOW_DISK_USE_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
Allow disk use when running the aggregation.
- AGGREGATION_PIPELINE_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
Provide a custom aggregation pipeline.
- AGGREGATION_PIPELINE_DEFAULT - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
- alterNamespace(String[], NamespaceChange...) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Altering databases is currently not supported, so alterNamespace will always throw an exception.
- alterTable(Identifier, TableChange...) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Altering collections is currently not supported, so alterTable will always throw an exception.
- ANY - com.mongodb.spark.sql.connector.config.WriteConfig.ConvertJson
-
Try to parse any string as a json value
- asJava(Map<K, V>) - Static method in class com.mongodb.spark.sql.connector.interop.JavaScala
-
Wrapper to convert a scala map to a java map
- asJava(Seq<A>) - Static method in class com.mongodb.spark.sql.connector.interop.JavaScala
-
Wrapper to convert a scala seq to a java list
- asScala(List<A>) - Static method in class com.mongodb.spark.sql.connector.interop.JavaScala
-
Wrapper to convert a java list to a scala seq
- asScala(Map<K, V>) - Static method in class com.mongodb.spark.sql.connector.interop.JavaScala
-
Wrapper to convert a java map to a scala map
- Assertions - Class in com.mongodb.spark.sql.connector.assertions
-
Assertions to validate inputs
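The aggregation entries above are normally supplied as read options. Below is a minimal Java sketch of building a ReadConfig with a custom pipeline and allowDiskUse enabled; the connection URI, database, collection and pipeline contents are placeholder assumptions, and the keys are written in their fully prefixed spark.mongodb.read.* form.

    import java.util.HashMap;
    import java.util.Map;

    import com.mongodb.spark.sql.connector.config.MongoConfig;
    import com.mongodb.spark.sql.connector.config.ReadConfig;

    public final class AggregationReadConfigSketch {
      public static void main(String[] args) {
        Map<String, String> options = new HashMap<>();
        // Placeholder connection details.
        options.put(MongoConfig.READ_PREFIX + MongoConfig.CONNECTION_STRING_CONFIG, "mongodb://localhost:27017");
        options.put(MongoConfig.READ_PREFIX + MongoConfig.DATABASE_NAME_CONFIG, "test");
        options.put(MongoConfig.READ_PREFIX + MongoConfig.COLLECTION_NAME_CONFIG, "events");
        // Custom aggregation pipeline (assumed example) and the allow disk use flag.
        options.put(MongoConfig.READ_PREFIX + ReadConfig.AGGREGATION_PIPELINE_CONFIG,
            "[{\"$match\": {\"status\": \"active\"}}]");
        options.put(MongoConfig.READ_PREFIX + ReadConfig.AGGREGATION_ALLOW_DISK_USE_CONFIG, "true");

        ReadConfig readConfig = MongoConfig.readConfig(options);
        System.out.println(readConfig.getAggregationPipeline());
      }
    }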
B
- BsonDocumentToRowConverter - Class in com.mongodb.spark.sql.connector.schema
-
The helper for conversion of BsonDocuments to GenericRowWithSchema instances.
- BsonDocumentToRowConverter(StructType, boolean) - Constructor for class com.mongodb.spark.sql.connector.schema.BsonDocumentToRowConverter
-
Create a new instance
- build() - Method in class com.mongodb.spark.sql.connector.read.MongoScanBuilder
- buildForBatch() - Method in class com.mongodb.spark.sql.connector.write.MongoWriteBuilder
-
Returns a MongoBatchWrite to write data to batch source.
- buildForStreaming() - Method in class com.mongodb.spark.sql.connector.write.MongoWriteBuilder
-
Returns a MongoStreamingWrite to write data to streaming source.
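A minimal sketch of the BsonDocumentToRowConverter listed above; the schema, the sample document, and the value of the boolean constructor flag are assumptions for illustration.

    import org.apache.spark.sql.catalyst.InternalRow;
    import org.apache.spark.sql.types.DataTypes;
    import org.apache.spark.sql.types.StructType;
    import org.bson.BsonDocument;

    import com.mongodb.spark.sql.connector.schema.BsonDocumentToRowConverter;

    public final class BsonToRowSketch {
      public static void main(String[] args) {
        StructType schema = new StructType()
            .add("name", DataTypes.StringType)
            .add("age", DataTypes.IntegerType);

        // The second argument is the boolean flag from the constructor signature above;
        // false is an assumed value for this sketch.
        BsonDocumentToRowConverter converter = new BsonDocumentToRowConverter(schema, false);
        InternalRow row = converter.toInternalRow(BsonDocument.parse("{\"name\": \"Ada\", \"age\": 36}"));
        System.out.println(row.numFields());
      }
    }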
C
- CLIENT_FACTORY_CONFIG - Static variable in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
The MongoClientFactory configuration key
- CLIENT_FACTORY_DEFAULT - Static variable in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
The default MongoClientFactory configuration value
- COLLECTION_NAME_CONFIG - Static variable in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
The collection name config
- com.mongodb.spark.connector - package com.mongodb.spark.connector
- com.mongodb.spark.sql.connector - package com.mongodb.spark.sql.connector
- com.mongodb.spark.sql.connector.annotations - package com.mongodb.spark.sql.connector.annotations
- com.mongodb.spark.sql.connector.assertions - package com.mongodb.spark.sql.connector.assertions
- com.mongodb.spark.sql.connector.config - package com.mongodb.spark.sql.connector.config
- com.mongodb.spark.sql.connector.connection - package com.mongodb.spark.sql.connector.connection
- com.mongodb.spark.sql.connector.exceptions - package com.mongodb.spark.sql.connector.exceptions
- com.mongodb.spark.sql.connector.interop - package com.mongodb.spark.sql.connector.interop
- com.mongodb.spark.sql.connector.read - package com.mongodb.spark.sql.connector.read
-
This package contains the sql connector read support.
- com.mongodb.spark.sql.connector.read.partitioner - package com.mongodb.spark.sql.connector.read.partitioner
- com.mongodb.spark.sql.connector.schema - package com.mongodb.spark.sql.connector.schema
- com.mongodb.spark.sql.connector.write - package com.mongodb.spark.sql.connector.write
-
This package contains the sql connector write support.
- COMMENT_CONFIG - Static variable in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Add a comment to mongodb operations
- ConfigException - Exception in com.mongodb.spark.sql.connector.exceptions
-
A class for all config exceptions
- ConfigException(String) - Constructor for exception com.mongodb.spark.sql.connector.exceptions.ConfigException
-
Constructs a new instance.
- ConfigException(String, Object, String) - Constructor for exception com.mongodb.spark.sql.connector.exceptions.ConfigException
-
Constructs a new instance.
- ConfigException(String, Throwable) - Constructor for exception com.mongodb.spark.sql.connector.exceptions.ConfigException
-
Constructs a new instance.
- ConfigException(Throwable) - Constructor for exception com.mongodb.spark.sql.connector.exceptions.ConfigException
-
Constructs a new instance.
- CONNECTION_STRING_CONFIG - Static variable in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
The connection string configuration key
- CONNECTION_STRING_DEFAULT - Static variable in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
The default connection string configuration value
- containsKey(String) - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Returns true if this map contains a mapping for the specified key.
- CONVERT_JSON_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.WriteConfig
-
Convert JSON and extended JSON values into their BSON equivalent.
- ConverterHelper - Class in com.mongodb.spark.sql.connector.schema
-
Shared converter helper methods and statics
- convertJson() - Method in class com.mongodb.spark.sql.connector.config.WriteConfig
- create() - Method in class com.mongodb.spark.sql.connector.connection.DefaultMongoClientFactory
- create() - Method in interface com.mongodb.spark.sql.connector.connection.MongoClientFactory
- createConfig(Map<String, String>) - Static method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Create a Mongo Configuration that does not yet have a fixed use case
- createNamespace(String[], Map<String, String>) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Create a database.
- createObjectToBsonValue(DataType, WriteConfig.ConvertJson, boolean) - Static method in class com.mongodb.spark.sql.connector.schema.RowToBsonDocumentConverter
-
Returns a conversion function that converts an object to a BsonValue
- createPartitionBounds(BsonValue, BsonValue) - Static method in class com.mongodb.spark.sql.connector.read.partitioner.PartitionerHelper
-
Creates the upper and lower boundary query
- createPartitionPipeline(BsonDocument, List<BsonDocument>) - Static method in class com.mongodb.spark.sql.connector.read.partitioner.PartitionerHelper
-
Creates the aggregation pipeline for a partition.
- createTable(Identifier, StructType, Transform[], Map<String, String>) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Create a collection.
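The configuration entries above can be combined into a use-case-neutral MongoConfig and then narrowed to read or write views. A minimal sketch, assuming the general spark.mongodb.* prefix is shared by both views and using placeholder connection values:

    import java.util.HashMap;
    import java.util.Map;

    import com.mongodb.spark.sql.connector.config.MongoConfig;
    import com.mongodb.spark.sql.connector.config.ReadConfig;
    import com.mongodb.spark.sql.connector.config.WriteConfig;

    public final class MongoConfigSketch {
      public static void main(String[] args) {
        Map<String, String> options = new HashMap<>();
        // General spark.mongodb.* keys (placeholder values) shared by reads and writes.
        options.put(MongoConfig.PREFIX + MongoConfig.CONNECTION_STRING_CONFIG, "mongodb://localhost:27017");
        options.put(MongoConfig.PREFIX + MongoConfig.DATABASE_NAME_CONFIG, "test");
        options.put(MongoConfig.PREFIX + MongoConfig.COLLECTION_NAME_CONFIG, "people");

        MongoConfig config = MongoConfig.createConfig(options);
        ReadConfig readConfig = config.toReadConfig();
        WriteConfig writeConfig = config.toWriteConfig();
        System.out.println(readConfig.getNamespace() + " / " + writeConfig.getNamespace());
      }
    }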
D
- DATABASE_NAME_CONFIG - Static variable in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
The database name config
- DataException - Exception in com.mongodb.spark.sql.connector.exceptions
-
A class for all data exceptions.
- DataException(String) - Constructor for exception com.mongodb.spark.sql.connector.exceptions.DataException
-
Constructs a new instance.
- DataException(String, Throwable) - Constructor for exception com.mongodb.spark.sql.connector.exceptions.DataException
-
Constructs a new instance.
- DataException(Throwable) - Constructor for exception com.mongodb.spark.sql.connector.exceptions.DataException
-
Constructs a new instance.
- DefaultMongoClientFactory - Class in com.mongodb.spark.sql.connector.connection
-
The default MongoClientFactory implementation.
- DefaultMongoClientFactory(MongoConfig) - Constructor for class com.mongodb.spark.sql.connector.connection.DefaultMongoClientFactory
-
Create a new instance of MongoClientFactory
- dropNamespace(String[]) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Drop a database.
- dropTable(Identifier) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Drop a collection.
E
- ensureArgument(Supplier<Boolean>, Supplier<String>) - Static method in class com.mongodb.spark.sql.connector.assertions.Assertions
-
Ensures the validity of arguments
- ensureState(Supplier<Boolean>, Supplier<String>) - Static method in class com.mongodb.spark.sql.connector.assertions.Assertions
-
Ensures the validity of state
- equals(Object) - Method in class com.mongodb.spark.sql.connector.connection.DefaultMongoClientFactory
- equals(Object) - Method in class com.mongodb.spark.sql.connector.read.MongoInputPartition
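A small sketch of the Assertions helpers above; the value being validated and the messages are hypothetical. Both checks take suppliers, so the condition and the error message are evaluated lazily.

    import com.mongodb.spark.sql.connector.assertions.Assertions;

    public final class AssertionsSketch {
      public static void main(String[] args) {
        int partitionSizeMB = 64; // hypothetical input being validated

        Assertions.ensureArgument(
            () -> partitionSizeMB > 0,
            () -> "Partition size must be positive but was: " + partitionSizeMB);
        Assertions.ensureState(
            () -> partitionSizeMB <= 1024,
            () -> "Partition size unexpectedly large: " + partitionSizeMB);
      }
    }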
F
- FALSE - com.mongodb.spark.sql.connector.config.WriteConfig.ConvertJson
-
No conversion of string types
- fromRow(InternalRow) - Method in class com.mongodb.spark.sql.connector.schema.RowToBsonDocumentConverter
-
Converts an InternalRow to a BsonDocument.
- fromRow(Row) - Method in class com.mongodb.spark.sql.connector.schema.RowToBsonDocumentConverter
-
Converts a Row to a BsonDocument.
G
- generatePartitions(ReadConfig) - Method in class com.mongodb.spark.sql.connector.read.partitioner.PaginateBySizePartitioner
- generatePartitions(ReadConfig) - Method in class com.mongodb.spark.sql.connector.read.partitioner.PaginateIntoPartitionsPartitioner
- generatePartitions(ReadConfig) - Method in interface com.mongodb.spark.sql.connector.read.partitioner.Partitioner
-
Generate the partitions for the collection based upon the read configuration
- generatePartitions(ReadConfig) - Method in class com.mongodb.spark.sql.connector.read.partitioner.SamplePartitioner
- generatePartitions(ReadConfig) - Method in class com.mongodb.spark.sql.connector.read.partitioner.ShardedPartitioner
- generatePartitions(ReadConfig) - Method in class com.mongodb.spark.sql.connector.read.partitioner.SinglePartitionPartitioner
- get(String) - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Returns the value to which the specified key is mapped
- getAggregationAllowDiskUse() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- getAggregationPipeline() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- getBoolean(String, boolean) - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Returns the boolean value to which the specified key is mapped, or defaultValue if there is no mapping for the key.
- getCollectionName() - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
- getComment() - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
- getConnectionString() - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
- getDatabaseName() - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
- getDouble(String, double) - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Returns the double value to which the specified key is mapped, or defaultValue if there is no mapping for the key.
- getIdFields() - Method in class com.mongodb.spark.sql.connector.config.WriteConfig
- getInferSchemaMapTypeMinimumKeySize() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- getInferSchemaSampleSize() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- getInt(String, int) - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Returns the int value to which the specified key is mapped, or defaultValue if there is no mapping for the key.
- getList(String, List<String>) - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Returns a list of strings from a comma delimited string to which the specified key is mapped, or defaultValue if there is no mapping for the key.
- getLong(String, long) - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Returns the long value to which the specified key is mapped, or defaultValue if there is no mapping for the key.
- getMaxBatchSize() - Method in class com.mongodb.spark.sql.connector.config.WriteConfig
- getMicroBatchMaxPartitionCount() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- getMongoClient(MongoClientFactory) - Static method in class com.mongodb.spark.sql.connector.connection.LazyMongoClientCache
-
Returns a MongoClient from the cache.
- getNamespace() - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
- getOperationType() - Method in class com.mongodb.spark.sql.connector.config.WriteConfig
- getOptions() - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
- getOrDefault(String, String) - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Returns the value to which the specified key is mapped, or defaultValue if this config contains no mapping for the key.
- getOriginals() - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
- getPartitioner() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- getPartitionerOptions() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- getPartitionId() - Method in class com.mongodb.spark.sql.connector.read.MongoInputPartition
- getPipeline() - Method in class com.mongodb.spark.sql.connector.read.MongoInputPartition
- getPreferredLocations() - Method in class com.mongodb.spark.sql.connector.read.MongoInputPartition
- getPreferredLocations(ReadConfig) - Static method in class com.mongodb.spark.sql.connector.read.partitioner.PartitionerHelper
- getSchema() - Method in class com.mongodb.spark.sql.connector.schema.BsonDocumentToRowConverter
- getStreamFullDocument() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- getStreamInitialBsonTimestamp() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
-
Returns the initial start at operation time for a stream
- getTable(StructType, Transform[], Map<String, String>) - Method in class com.mongodb.spark.sql.connector.MongoTableProvider
-
Return a Table instance with the specified table schema, partitioning and properties to do read/write.
- getWriteConcern() - Method in class com.mongodb.spark.sql.connector.config.WriteConfig
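The typed getters above fall back to the supplied default when no mapping exists. A minimal sketch using an empty config and hypothetical keys:

    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;

    import com.mongodb.spark.sql.connector.config.MongoConfig;

    public final class TypedGettersSketch {
      public static void main(String[] args) {
        // An empty, use-case-neutral config; the keys queried below are hypothetical,
        // so every call falls back to its default value.
        MongoConfig config = MongoConfig.createConfig(Collections.emptyMap());

        int sampleSize = config.getInt("custom.sampleSize", 1000);
        boolean verbose = config.getBoolean("custom.verbose", false);
        List<String> fields = config.getList("custom.fields", Arrays.asList("a", "b"));
        String mode = config.getOrDefault("custom.mode", "default");

        System.out.println(sampleSize + " " + verbose + " " + fields + " " + mode);
      }
    }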
H
- hashCode() - Method in class com.mongodb.spark.sql.connector.connection.DefaultMongoClientFactory
- hashCode() - Method in class com.mongodb.spark.sql.connector.read.MongoInputPartition
I
- ID_FIELD_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.WriteConfig
-
A comma delimited field list used to identify a document
- IGNORE_NULL_VALUES_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.WriteConfig
-
Ignore null values, even those within arrays or documents.
- ignoreNullValues() - Method in class com.mongodb.spark.sql.connector.config.WriteConfig
- INFER_SCHEMA_MAP_TYPE_ENABLED_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
Enable Map Types when inferring the schema.
- INFER_SCHEMA_MAP_TYPE_MINIMUM_KEY_SIZE_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
The minimum size of a StructType before it is inferred to be a MapType instead.
- INFER_SCHEMA_SAMPLE_SIZE_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
The size of the sample of documents from the collection to use when inferring the schema
- INFERRED_METADATA - Static variable in class com.mongodb.spark.sql.connector.schema.InferSchema
-
Inferred schema metadata
- inferSchema(CaseInsensitiveStringMap) - Method in class com.mongodb.spark.sql.connector.MongoTableProvider
-
Infer the schema of the table identified by the given options.
- inferSchema(CaseInsensitiveStringMap) - Static method in class com.mongodb.spark.sql.connector.schema.InferSchema
-
Infer the schema for the collection
- InferSchema - Class in com.mongodb.spark.sql.connector.schema
-
A helper that determines the StructType for a BsonDocument and finds the common StructType for a list of BsonDocuments.
- inferSchemaMapType() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- initialize(String, CaseInsensitiveStringMap) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Initializes the MongoCatalog.
- INSERT - com.mongodb.spark.sql.connector.config.WriteConfig.OperationType
-
Insert the whole document
- isInferred(StructType) - Static method in class com.mongodb.spark.sql.connector.schema.InferSchema
- isOrdered() - Method in class com.mongodb.spark.sql.connector.config.WriteConfig
- isUpsert() - Method in class com.mongodb.spark.sql.connector.config.WriteConfig
J
- JavaScala - Class in com.mongodb.spark.sql.connector.interop
-
Utility object for converting between Java and Scala collections, to enable cross builds.
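A sketch of the JavaScala helpers above, round-tripping a Java map and list through their Scala counterparts; it assumes the Scala type returned by asScala is accepted by the matching asJava overload.

    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    import com.mongodb.spark.sql.connector.interop.JavaScala;

    public final class JavaScalaSketch {
      public static void main(String[] args) {
        Map<String, String> javaMap = new HashMap<>();
        javaMap.put("database", "test");

        // Round trip: Java -> Scala -> Java. Assumes the intermediate Scala types
        // returned by asScala match the parameters expected by asJava.
        Map<String, String> mapCopy = JavaScala.asJava(JavaScala.asScala(javaMap));
        List<String> listCopy = JavaScala.asJava(JavaScala.asScala(Arrays.asList("a", "b", "c")));

        System.out.println(mapCopy + " " + listCopy);
      }
    }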
L
- LazyMongoClientCache - Class in com.mongodb.spark.sql.connector.connection
-
A lazily initialized MongoClientCache.
- listNamespaces() - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
List namespaces (databases).
- listNamespaces(String[]) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
List namespaces (databases).
- listTables(String[]) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
List the collections in a namespace (database).
- loadNamespaceMetadata(String[]) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Currently only returns an empty map if the namespace (database) exists.
- loadTable(Identifier) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Load collection.
- LOGGER - Static variable in interface com.mongodb.spark.sql.connector.read.partitioner.Partitioner
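A sketch of obtaining a cached client via LazyMongoClientCache and DefaultMongoClientFactory; the connection string is a placeholder and the snippet only illustrates the call shape.

    import com.mongodb.client.MongoClient;

    import com.mongodb.spark.sql.connector.config.MongoConfig;
    import com.mongodb.spark.sql.connector.config.ReadConfig;
    import com.mongodb.spark.sql.connector.connection.DefaultMongoClientFactory;
    import com.mongodb.spark.sql.connector.connection.LazyMongoClientCache;

    public final class ClientCacheSketch {
      public static void main(String[] args) {
        // Placeholder connection string; database and collection are omitted for brevity.
        ReadConfig readConfig = MongoConfig.readConfig(java.util.Collections.singletonMap(
            MongoConfig.READ_PREFIX + MongoConfig.CONNECTION_STRING_CONFIG, "mongodb://localhost:27017"));

        // getMongoClient returns a client from the shared cache, creating it on demand
        // via the supplied factory.
        MongoClient client = LazyMongoClientCache.getMongoClient(new DefaultMongoClientFactory(readConfig));
        for (String name : client.listDatabaseNames()) {
          System.out.println(name);
        }
        client.close(); // releases this reference (assumed cache semantics)
      }
    }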
M
- matchQuery(List<BsonDocument>) - Static method in class com.mongodb.spark.sql.connector.read.partitioner.PartitionerHelper
-
Returns the head $match aggregation stage or an empty document.
- MAX_BATCH_SIZE_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.WriteConfig
-
The maximum batch size for the batch in the bulk operation.
- MAX_NUMBER_OF_PARTITIONS_CONFIG - Static variable in class com.mongodb.spark.sql.connector.read.partitioner.PaginateIntoPartitionsPartitioner
- MongoCatalog - Class in com.mongodb.spark.sql.connector
-
Spark Catalog methods for working with namespaces (databases) and tables (collections).
- MongoCatalog() - Constructor for class com.mongodb.spark.sql.connector.MongoCatalog
- MongoClientFactory - Interface in com.mongodb.spark.sql.connector.connection
-
A factory for creating MongoClients
- MongoConfig - Interface in com.mongodb.spark.sql.connector.config
-
The MongoConfig interface.
- MongoInputPartition - Class in com.mongodb.spark.sql.connector.read
-
The MongoInputPartition.
- MongoInputPartition(int, List<BsonDocument>) - Constructor for class com.mongodb.spark.sql.connector.read.MongoInputPartition
-
Construct a new instance
- MongoInputPartition(int, List<BsonDocument>, List<String>) - Constructor for class com.mongodb.spark.sql.connector.read.MongoInputPartition
-
Construct a new instance
- MongoScanBuilder - Class in com.mongodb.spark.sql.connector.read
-
A builder for a MongoScan.
- MongoScanBuilder(StructType, ReadConfig) - Constructor for class com.mongodb.spark.sql.connector.read.MongoScanBuilder
-
Construct a new instance
- MongoSparkException - Exception in com.mongodb.spark.sql.connector.exceptions
-
A base class for all mongo spark exceptions.
- MongoSparkException(String) - Constructor for exception com.mongodb.spark.sql.connector.exceptions.MongoSparkException
-
Constructs a new instance.
- MongoSparkException(String, Throwable) - Constructor for exception com.mongodb.spark.sql.connector.exceptions.MongoSparkException
-
Constructs a new instance.
- MongoSparkException(Throwable) - Constructor for exception com.mongodb.spark.sql.connector.exceptions.MongoSparkException
-
Constructs a new instance.
- MongoTableProvider - Class in com.mongodb.spark.sql.connector
-
The MongoDB collection provider
- MongoTableProvider() - Constructor for class com.mongodb.spark.sql.connector.MongoTableProvider
-
Construct a new instance
- MongoWriteBuilder - Class in com.mongodb.spark.sql.connector.write
-
MongoWriteBuilder handles the creation of batch or streaming writers.
- MongoWriteBuilder(LogicalWriteInfo, WriteConfig) - Constructor for class com.mongodb.spark.sql.connector.write.MongoWriteBuilder
-
Construct a new instance
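A minimal end-to-end sketch of reading and writing through MongoTableProvider, assuming the provider's registered short name "mongodb" and the documented spark.mongodb.read/write.connection.uri settings; URIs, database and collection names are placeholders.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public final class MongoTableProviderSketch {
      public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .master("local")
            .appName("mongo-table-provider-sketch")
            // Placeholder URIs.
            .config("spark.mongodb.read.connection.uri", "mongodb://localhost:27017")
            .config("spark.mongodb.write.connection.uri", "mongodb://localhost:27017")
            .getOrCreate();

        // Read a collection through the provider's short name.
        Dataset<Row> people = spark.read()
            .format("mongodb")
            .option("database", "test")
            .option("collection", "people")
            .load();

        // Write it back out to another collection.
        people.write()
            .format("mongodb")
            .mode("append")
            .option("database", "test")
            .option("collection", "people_copy")
            .save();

        spark.stop();
      }
    }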
N
- name() - Method in class com.mongodb.spark.sql.connector.MongoCatalog
- NAME - Static variable in class com.mongodb.spark.connector.Versions
- namespaceExists(String[]) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Test whether a namespace (database) exists.
- NotThreadSafe - Annotation Type in com.mongodb.spark.sql.connector.annotations
-
The class to which this annotation is applied is not thread-safe.
O
- OBJECT_OR_ARRAY_ONLY - com.mongodb.spark.sql.connector.config.WriteConfig.ConvertJson
-
Only try to parse strings that are potentially JSON objects or arrays.
- OPERATION_TYPE_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.WriteConfig
-
The write operation type to perform
- ORDERED_BULK_OPERATION_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.WriteConfig
-
Use ordered bulk operations
- OUTPUT_EXTENDED_JSON_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
Output extended JSON for any String types.
- outputExtendedJson() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
P
- PaginateBySizePartitioner - Class in com.mongodb.spark.sql.connector.read.partitioner
-
Paginate by size partitioner.
- PaginateBySizePartitioner() - Constructor for class com.mongodb.spark.sql.connector.read.partitioner.PaginateBySizePartitioner
-
Construct an instance
- PaginateIntoPartitionsPartitioner - Class in com.mongodb.spark.sql.connector.read.partitioner
-
Paginate into partitions partitioner.
- PaginateIntoPartitionsPartitioner() - Constructor for class com.mongodb.spark.sql.connector.read.partitioner.PaginateIntoPartitionsPartitioner
-
Construct an instance
- PARTITION_SIZE_MB_CONFIG - Static variable in class com.mongodb.spark.sql.connector.read.partitioner.PaginateBySizePartitioner
- PARTITION_SIZE_MB_CONFIG - Static variable in class com.mongodb.spark.sql.connector.read.partitioner.SamplePartitioner
- Partitioner - Interface in com.mongodb.spark.sql.connector.read.partitioner
-
The Partitioner provides the logic to partition a collection into individual processable partitions.
- PARTITIONER_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
The partitioner full class name.
- PARTITIONER_DEFAULT - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
The default partitioner if none is set: "com.mongodb.spark.sql.connector.read.partitioner.SamplePartitioner"
- PARTITIONER_OPTIONS_PREFIX - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
The prefix for specific partitioner based configuration.
- PartitionerHelper - Class in com.mongodb.spark.sql.connector.read.partitioner
-
Partitioner helper class containing various utility methods used by the partitioner instances.
- preferredLocations() - Method in class com.mongodb.spark.sql.connector.read.MongoInputPartition
-
The preferred locations for the read.
- PREFIX - Static variable in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
The prefix for all general Spark MongoDB configurations.
- pruneColumns(StructType) - Method in class com.mongodb.spark.sql.connector.read.MongoScanBuilder
- pushedFilters() - Method in class com.mongodb.spark.sql.connector.read.MongoScanBuilder
- pushFilters(Filter[]) - Method in class com.mongodb.spark.sql.connector.read.MongoScanBuilder
-
Processes filters on the dataset.
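A sketch of selecting a partitioner and passing it options through the partitioner options prefix; connection details and the partition size are placeholders, and the prefix is assumed to include its trailing separator.

    import java.util.HashMap;
    import java.util.Map;

    import com.mongodb.spark.sql.connector.config.MongoConfig;
    import com.mongodb.spark.sql.connector.config.ReadConfig;
    import com.mongodb.spark.sql.connector.read.partitioner.SamplePartitioner;

    public final class PartitionerConfigSketch {
      public static void main(String[] args) {
        Map<String, String> options = new HashMap<>();
        // Placeholder connection details.
        options.put(MongoConfig.READ_PREFIX + MongoConfig.CONNECTION_STRING_CONFIG, "mongodb://localhost:27017");
        options.put(MongoConfig.READ_PREFIX + MongoConfig.DATABASE_NAME_CONFIG, "test");
        options.put(MongoConfig.READ_PREFIX + MongoConfig.COLLECTION_NAME_CONFIG, "events");
        // Select the partitioner by class name and pass it an option via the
        // partitioner options prefix.
        options.put(MongoConfig.READ_PREFIX + ReadConfig.PARTITIONER_CONFIG, SamplePartitioner.class.getName());
        options.put(MongoConfig.READ_PREFIX + ReadConfig.PARTITIONER_OPTIONS_PREFIX
            + SamplePartitioner.PARTITION_SIZE_MB_CONFIG, "64");

        ReadConfig readConfig = MongoConfig.readConfig(options);
        System.out.println(readConfig.getPartitioner().getClass().getSimpleName());
      }
    }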
R
- READ_PREFIX - Static variable in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
The prefix for specific read based configurations.
- readConfig(Map<String, String>) - Static method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Create a Read Configuration
- ReadConfig - Class in com.mongodb.spark.sql.connector.config
-
The Read Configuration
- renameTable(Identifier, Identifier) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Renames a collection.
- REPLACE - com.mongodb.spark.sql.connector.config.WriteConfig.OperationType
-
Replace the whole document or insert a new one if it doesn't exist
- RowToBsonDocumentConverter - Class in com.mongodb.spark.sql.connector.schema
-
The helper for conversion of GenericRowWithSchema instances to BsonDocuments.
- RowToBsonDocumentConverter(StructType, WriteConfig.ConvertJson, boolean) - Constructor for class com.mongodb.spark.sql.connector.schema.RowToBsonDocumentConverter
-
Construct a new instance
- RowToBsonDocumentConverter.ObjectToBsonValue - Interface in com.mongodb.spark.sql.connector.schema
-
A serializable Function<Object, BsonValue> interface.
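A minimal sketch of the RowToBsonDocumentConverter above; the schema, the row values, and the trailing boolean flag are assumptions for illustration.

    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema;
    import org.apache.spark.sql.types.DataTypes;
    import org.apache.spark.sql.types.StructType;
    import org.bson.BsonDocument;

    import com.mongodb.spark.sql.connector.config.WriteConfig;
    import com.mongodb.spark.sql.connector.schema.RowToBsonDocumentConverter;

    public final class RowToBsonSketch {
      public static void main(String[] args) {
        StructType schema = new StructType()
            .add("name", DataTypes.StringType)
            .add("age", DataTypes.IntegerType);

        // ConvertJson.FALSE disables JSON string conversion; the trailing boolean is the
        // flag from the constructor signature above, with an assumed value of false.
        RowToBsonDocumentConverter converter =
            new RowToBsonDocumentConverter(schema, WriteConfig.ConvertJson.FALSE, false);

        Row row = new GenericRowWithSchema(new Object[] {"Ada", 36}, schema);
        BsonDocument document = converter.fromRow(row);
        System.out.println(document.toJson());
      }
    }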
S
- SamplePartitioner - Class in com.mongodb.spark.sql.connector.read.partitioner
-
Sample Partitioner
- SamplePartitioner() - Constructor for class com.mongodb.spark.sql.connector.read.partitioner.SamplePartitioner
-
Construct an instance
- ShardedPartitioner - Class in com.mongodb.spark.sql.connector.read.partitioner
-
Sharded Partitioner
- ShardedPartitioner() - Constructor for class com.mongodb.spark.sql.connector.read.partitioner.ShardedPartitioner
-
Construct an instance
- shortName() - Method in class com.mongodb.spark.sql.connector.MongoTableProvider
- SINGLE_PARTITIONER - Static variable in class com.mongodb.spark.sql.connector.read.partitioner.PartitionerHelper
- SinglePartitionPartitioner - Class in com.mongodb.spark.sql.connector.read.partitioner
-
Single Partition Partitioner
- SinglePartitionPartitioner() - Constructor for class com.mongodb.spark.sql.connector.read.partitioner.SinglePartitionPartitioner
-
Construct an instance
- storageStats(ReadConfig) - Static method in class com.mongodb.spark.sql.connector.read.partitioner.PartitionerHelper
- STREAM_LOOKUP_FULL_DOCUMENT_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
Streaming full document configuration.
- STREAM_MICRO_BATCH_MAX_PARTITION_COUNT_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
Configures the maximum number of partitions per micro batch.
- STREAM_PUBLISH_FULL_DOCUMENT_ONLY_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
Publish Full Document only when streaming.
- STREAMING_STARTUP_MODE_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
The start up behavior when there is no stored offset available.
- STREAMING_STARTUP_MODE_TIMESTAMP_START_AT_OPERATION_TIME_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.ReadConfig
-
The `startAtOperationTime` configuration.
- streamPublishFullDocumentOnly() - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- subConfiguration(String) - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Gets all configurations starting with a prefix.
- supportsExternalMetadata() - Method in class com.mongodb.spark.sql.connector.MongoTableProvider
-
Returns true if the source can accept external table metadata when getting tables.
T
- tableExists(Identifier) - Method in class com.mongodb.spark.sql.connector.MongoCatalog
-
Test whether a collection exists.
- ThreadSafe - Annotation Type in com.mongodb.spark.sql.connector.annotations
-
The class to which this annotation is applied is thread-safe.
- toInternalRow(BsonDocument) - Method in class com.mongodb.spark.sql.connector.schema.BsonDocumentToRowConverter
-
Converts a BsonDocument into an InternalRow.
- toJson(BsonValue) - Static method in class com.mongodb.spark.sql.connector.schema.ConverterHelper
-
Converts a bson value into its extended JSON form
- toReadConfig() - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
- toString() - Method in enum com.mongodb.spark.sql.connector.config.WriteConfig.ConvertJson
- toString() - Method in enum com.mongodb.spark.sql.connector.config.WriteConfig.OperationType
- toString() - Method in class com.mongodb.spark.sql.connector.read.MongoInputPartition
- toString() - Method in class com.mongodb.spark.sql.connector.schema.BsonDocumentToRowConverter
- toWriteConfig() - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
- truncate() - Method in class com.mongodb.spark.sql.connector.write.MongoWriteBuilder
U
- UPDATE - com.mongodb.spark.sql.connector.config.WriteConfig.OperationType
-
Update the document or insert a new one if it doesn't exist
- UPSERT_DOCUMENT_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.WriteConfig
-
Upsert documents, when using replace or update operations.
V
- validateConfig(Supplier<T>, Supplier<String>) - Static method in class com.mongodb.spark.sql.connector.assertions.Assertions
-
Checks the validity of a value
- validateConfig(T, Predicate<T>, Supplier<String>) - Static method in class com.mongodb.spark.sql.connector.assertions.Assertions
-
Checks the validity of a value
- valueOf(String) - Static method in enum com.mongodb.spark.sql.connector.config.WriteConfig.ConvertJson
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum com.mongodb.spark.sql.connector.config.WriteConfig.OperationType
-
Returns the enum constant of this type with the specified name.
- values() - Static method in enum com.mongodb.spark.sql.connector.config.WriteConfig.ConvertJson
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum com.mongodb.spark.sql.connector.config.WriteConfig.OperationType
-
Returns an array containing the constants of this enum type, in the order they are declared.
- VERSION - Static variable in class com.mongodb.spark.connector.Versions
- Versions - Class in com.mongodb.spark.connector
W
- withOption(String, String) - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Return a MongoConfig instance with the extra options applied.
- withOption(String, String) - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- withOption(String, String) - Method in class com.mongodb.spark.sql.connector.config.WriteConfig
- withOptions(Map<String, String>) - Method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Return a MongoConfig instance with the extra options applied.
- withOptions(Map<String, String>) - Method in class com.mongodb.spark.sql.connector.config.ReadConfig
- withOptions(Map<String, String>) - Method in class com.mongodb.spark.sql.connector.config.WriteConfig
- WRITE_CONCERN_JOURNAL_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.WriteConfig
-
The optional WriteConcern journal property.
- WRITE_CONCERN_W_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.WriteConfig
-
The optional WriteConcern w property.
- WRITE_CONCERN_W_TIMEOUT_MS_CONFIG - Static variable in class com.mongodb.spark.sql.connector.config.WriteConfig
-
The optional WriteConcern wTimeout property in milliseconds.
- WRITE_PREFIX - Static variable in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
The prefix for specific write based configurations.
- writeConfig(Map<String, String>) - Static method in interface com.mongodb.spark.sql.connector.config.MongoConfig
-
Create a Write Configuration
- WriteConfig - Class in com.mongodb.spark.sql.connector.config
-
The Write Configuration
- WriteConfig.ConvertJson - Enum in com.mongodb.spark.sql.connector.config
-
The convert json configuration.
- WriteConfig.OperationType - Enum in com.mongodb.spark.sql.connector.config
-
The operation type for the write.
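A sketch pulling the write-side entries above together into a WriteConfig; the connection details and the chosen operation type, id field and write concern values are placeholder assumptions, with keys written in their fully prefixed spark.mongodb.write.* form.

    import java.util.HashMap;
    import java.util.Map;

    import com.mongodb.spark.sql.connector.config.MongoConfig;
    import com.mongodb.spark.sql.connector.config.WriteConfig;

    public final class WriteConfigSketch {
      public static void main(String[] args) {
        Map<String, String> options = new HashMap<>();
        // Placeholder connection details.
        options.put(MongoConfig.WRITE_PREFIX + MongoConfig.CONNECTION_STRING_CONFIG, "mongodb://localhost:27017");
        options.put(MongoConfig.WRITE_PREFIX + MongoConfig.DATABASE_NAME_CONFIG, "test");
        options.put(MongoConfig.WRITE_PREFIX + MongoConfig.COLLECTION_NAME_CONFIG, "people");
        // Assumed values: update by the "name" field with a majority write concern.
        options.put(MongoConfig.WRITE_PREFIX + WriteConfig.OPERATION_TYPE_CONFIG, "update");
        options.put(MongoConfig.WRITE_PREFIX + WriteConfig.ID_FIELD_CONFIG, "name");
        options.put(MongoConfig.WRITE_PREFIX + WriteConfig.WRITE_CONCERN_W_CONFIG, "majority");

        WriteConfig writeConfig = MongoConfig.writeConfig(options);
        System.out.println(writeConfig.getOperationType());
      }
    }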