:: Evolving ::
:: Evolving ::
Apply an alias to the DeltaTable. This is similar to Dataset.as(alias) or the SQL syntax tableName AS alias.
0.3.0
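A minimal Scala sketch of aliasing, assuming an existing DeltaTable instance deltaTable whose table has an id column (both names are illustrative):

```scala
// Alias the table; this is most useful for disambiguating target
// columns from source columns in a later merge condition.
val aliased = deltaTable.as("t")
aliased.toDF.select("t.id").show()
```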
:: Evolving ::
:: Evolving ::
Apply an alias to the DeltaTable. This is similar to Dataset.as(alias) or the SQL syntax tableName AS alias.
0.3.0
:: Evolving ::
:: Evolving ::
Delete all data from the table.
0.3.0
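A minimal Scala sketch, assuming an existing DeltaTable instance deltaTable; calling delete() with no condition removes every row:

```scala
// Remove all rows from the Delta table (the table itself remains).
deltaTable.delete()
```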
:: Evolving ::
:: Evolving ::
Delete data from the table that matches the given condition.
Boolean SQL expression
0.3.0
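A minimal Scala sketch using a SQL-formatted condition string, assuming the table has a date column (the column name and cutoff are illustrative):

```scala
// Delete only the rows matching the condition.
deltaTable.delete("date < '2017-01-01'")
```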
:: Evolving ::
:: Evolving ::
Delete data from the table that matches the given condition.
Boolean SQL expression
0.3.0
:: Evolving ::
:: Evolving ::
Generate a manifest for the given Delta table.
Specifies the mode for the generation of the manifest. The valid modes are as follows (not case sensitive):
- symlink_format_manifest: Generate manifest files for the table that can be used by Presto and Amazon Athena to read it.
0.5.0
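A minimal Scala sketch, assuming an existing DeltaTable instance deltaTable; symlink_format_manifest is the manifest mode used for Presto and Athena integration:

```scala
// Write symlink-format manifest files into the table directory.
deltaTable.generate("symlink_format_manifest")
```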
:: Evolving ::
:: Evolving ::
Get the information of the available commits on this table as a Spark DataFrame. The information is in reverse chronological order.
0.3.0
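A minimal Scala sketch; since the history is returned as a DataFrame, ordinary DataFrame operations apply (the selected column names are common history columns and are assumptions here):

```scala
val fullHistory = deltaTable.history()
fullHistory.select("version", "timestamp", "operation").show()
```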
:: Evolving ::
:: Evolving ::
Get the information of the latest limit commits on this table as a Spark DataFrame. The information is in reverse chronological order.
The number of previous commits to get history for
0.3.0
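A minimal Scala sketch, assuming an existing DeltaTable instance deltaTable:

```scala
// Fetch only the most recent commit.
deltaTable.history(1).show()
```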
:: Evolving ::
:: Evolving ::
Merge data from the source DataFrame based on the given merge condition. This returns a DeltaMergeBuilder object that can be used to specify the update, delete, or insert actions to be performed on rows based on whether the rows matched the condition or not. See the DeltaMergeBuilder for a full description of this operation and what combinations of update, delete and insert operations are allowed.
Scala example to update a key-value Delta table with new key-values from a source DataFrame:
deltaTable
  .as("target")
  .merge(
    source.as("source"),
    "target.key = source.key")
  .whenMatched
  .updateExpr(Map(
    "value" -> "source.value"))
  .whenNotMatched
  .insertExpr(Map(
    "key" -> "source.key",
    "value" -> "source.value"))
  .execute()
Java example to update a key-value Delta table with new key-values from a source DataFrame:
deltaTable
  .as("target")
  .merge(
    source.as("source"),
    "target.key = source.key")
  .whenMatched
  .updateExpr(
    new HashMap<String, String>() {{
      put("value", "source.value");
    }})
  .whenNotMatched
  .insertExpr(
    new HashMap<String, String>() {{
      put("key", "source.key");
      put("value", "source.value");
    }})
  .execute();
Source DataFrame to be merged.
Boolean expression as a Column object
0.3.0
:: Evolving ::
:: Evolving ::
Merge data from the source DataFrame based on the given merge condition. This returns a DeltaMergeBuilder object that can be used to specify the update, delete, or insert actions to be performed on rows based on whether the rows matched the condition or not. See the DeltaMergeBuilder for a full description of this operation and what combinations of update, delete and insert operations are allowed.
Scala example to update a key-value Delta table with new key-values from a source DataFrame:
deltaTable
  .as("target")
  .merge(
    source.as("source"),
    "target.key = source.key")
  .whenMatched
  .updateExpr(Map(
    "value" -> "source.value"))
  .whenNotMatched
  .insertExpr(Map(
    "key" -> "source.key",
    "value" -> "source.value"))
  .execute()
Java example to update a key-value Delta table with new key-values from a source DataFrame:
deltaTable
  .as("target")
  .merge(
    source.as("source"),
    "target.key = source.key")
  .whenMatched
  .updateExpr(
    new HashMap<String, String>() {{
      put("value", "source.value");
    }})
  .whenNotMatched
  .insertExpr(
    new HashMap<String, String>() {{
      put("key", "source.key");
      put("value", "source.value");
    }})
  .execute();
Source DataFrame to be merged.
Boolean expression as a SQL formatted string
0.3.0
:: Evolving ::
:: Evolving ::
Get a DataFrame (that is, Dataset[Row]) representation of this Delta table.
0.3.0
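A minimal Scala sketch; once converted with toDF, the table can be queried like any other Dataset (the date column is illustrative):

```scala
import org.apache.spark.sql.functions.col

val df = deltaTable.toDF
df.filter(col("date") > "2018-01-01").count()
```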
:: Evolving ::
:: Evolving ::
Update data from the table on the rows that match the given condition based on the rules defined by set.
Java example to increment the column data.
import org.apache.spark.sql.Column;
import org.apache.spark.sql.functions;

deltaTable.update(
  functions.col("date").gt("2018-01-01"),
  new HashMap<String, Column>() {{
    put("data", functions.col("data").plus(1));
  }}
);
boolean expression as Column object specifying which rows to update.
rules to update a row as a Java map between target column names and corresponding update expressions as Column objects.
0.3.0
:: Evolving ::
:: Evolving ::
Update data from the table on the rows that match the given condition based on the rules defined by set.
Scala example to increment the column data.
import org.apache.spark.sql.functions._

deltaTable.update(
  col("date") > "2018-01-01",
  Map("data" -> col("data") + 1))
boolean expression as Column object specifying which rows to update.
rules to update a row as a Scala map between target column names and corresponding update expressions as Column objects.
0.3.0
:: Evolving ::
:: Evolving ::
Update rows in the table based on the rules defined by set.
Java example to increment the column data.
import org.apache.spark.sql.Column;
import org.apache.spark.sql.functions;

deltaTable.update(
  new HashMap<String, Column>() {{
    put("data", functions.col("data").plus(1));
  }}
);
rules to update a row as a Java map between target column names and corresponding update expressions as Column objects.
0.3.0
:: Evolving ::
:: Evolving ::
Update rows in the table based on the rules defined by set.
Scala example to increment the column data.
import org.apache.spark.sql.functions._

deltaTable.update(Map("data" -> col("data") + 1))
rules to update a row as a Scala map between target column names and corresponding update expressions as Column objects.
0.3.0
:: Evolving ::
:: Evolving ::
Update data from the table on the rows that match the given condition, based on the rules defined by set.
Java example to increment the column data.
deltaTable.updateExpr(
  "date > '2018-01-01'",
  new HashMap<String, String>() {{
    put("data", "data + 1");
  }}
);
boolean expression as SQL formatted string object specifying which rows to update.
rules to update a row as a Java map between target column names and corresponding update expressions as SQL formatted strings.
0.3.0
:: Evolving ::
:: Evolving ::
Update data from the table on the rows that match the given condition, based on the rules defined by set.
Scala example to increment the column data.
deltaTable.updateExpr(
  "date > '2018-01-01'",
  Map("data" -> "data + 1"))
boolean expression as SQL formatted string object specifying which rows to update.
rules to update a row as a Scala map between target column names and corresponding update expressions as SQL formatted strings.
0.3.0
:: Evolving ::
:: Evolving ::
Update rows in the table based on the rules defined by set.
Java example to increment the column data.
deltaTable.updateExpr(
  new HashMap<String, String>() {{
    put("data", "data + 1");
  }}
);
rules to update a row as a Java map between target column names and corresponding update expressions as SQL formatted strings.
0.3.0
:: Evolving ::
:: Evolving ::
Update rows in the table based on the rules defined by set.
Scala example to increment the column data.
deltaTable.updateExpr(Map("data" -> "data + 1"))
rules to update a row as a Scala map between target column names and corresponding update expressions as SQL formatted strings.
0.3.0
:: Evolving ::
:: Evolving ::
Recursively delete files and directories in the table that are not needed by the table for maintaining older versions up to the given retention threshold. This method will return an empty DataFrame on successful completion.
Note: This will use the default retention period of 7 days.
0.3.0
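A minimal Scala sketch of a vacuum with the default 7-day retention, assuming an existing DeltaTable instance deltaTable:

```scala
// Delete files no longer needed for versions within the last 7 days.
deltaTable.vacuum()
```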
:: Evolving ::
:: Evolving ::
Recursively delete files and directories in the table that are not needed by the table for maintaining older versions up to the given retention threshold. This method will return an empty DataFrame on successful completion.
The retention threshold in hours. Files required by the table for reading versions earlier than this will be preserved and the rest of them will be deleted.
0.3.0
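A minimal Scala sketch with an explicit retention threshold:

```scala
// Keep files needed to read the last 168 hours (7 days) of versions;
// the rest are deleted.
deltaTable.vacuum(168)
```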
:: Evolving ::
Main class for programmatically interacting with Delta tables. You can create DeltaTable instances using the static methods.
0.3.0