
Commit d4fd5e2

lzlfredvkorukanti authored and committed
[Spark] Support writing partition columns to data files for Delta
This adds support for writing partition columns to data files for Delta. The feature is required by Uniform Iceberg, as the Iceberg spec defines this behavior. The approach is to copy FileFormatWriter from Spark as DeltaFileFormatWriter and add the option and logic there to support writing partition columns.

Closes #2367

GitOrigin-RevId: 77657bb422ce93b924f3cb25548e845477f8632f
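For context, a minimal usage sketch (not taken from the commit itself), assuming the new key is passed as a regular DataFrameWriter option on a partitioned Delta write; the table path, column names, and the surrounding SparkSession are made-up example values:

  import org.apache.spark.sql.functions.col

  // Assumes a SparkSession named `spark` is already in scope (e.g. spark-shell).
  spark.range(100)
    .withColumn("part", col("id") % 10)
    .write
    .format("delta")
    .partitionBy("part")
    // "writePartitionColumns" is the option key added by this commit;
    // the value and everything else here are illustrative.
    .option("writePartitionColumns", "true")
    .save("/tmp/delta/uniform_example")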
1 parent: 75dba07 · commit: d4fd5e2

File tree

3 files changed (+490, -4 lines)


spark/src/main/scala/org/apache/spark/sql/delta/DeltaOptions.scala

Lines changed: 6 additions & 1 deletion
@@ -289,6 +289,10 @@ object DeltaOptions extends DeltaLogging {
    */
   val STREAMING_SOURCE_TRACKING_ID = "streamingSourceTrackingId"
 
+  /**
+   * An option to control if delta will write partition columns to data files
+   */
+  val WRITE_PARTITION_COLUMNS = "writePartitionColumns"
 
   val validOptionKeys : Set[String] = Set(
     REPLACE_WHERE_OPTION,
@@ -323,7 +327,8 @@ object DeltaOptions extends DeltaLogging {
     "checkpointLocation",
     "path",
     VERSION_AS_OF,
-    TIMESTAMP_AS_OF
+    TIMESTAMP_AS_OF,
+    WRITE_PARTITION_COLUMNS
   )
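As a simplified illustration of what the flag controls (this is a hedged sketch, not the actual logic added to DeltaFileFormatWriter; the helper name and signature are hypothetical): when the option is enabled, partition column values are kept in the physical data files, which is what Uniform Iceberg needs; otherwise Delta's default of dropping them from the files applies.

  import org.apache.spark.sql.types.StructType

  // Hypothetical helper, for illustration only: choose the physical file schema
  // depending on whether partition columns should be materialized in data files.
  def physicalDataSchema(
      fullSchema: StructType,
      partitionColumns: Seq[String],
      writePartitionColumns: Boolean): StructType = {
    if (writePartitionColumns) {
      // Keep partition columns inside the data files.
      fullSchema
    } else {
      // Default Delta behavior: partition values live in file paths and the
      // transaction log, not in the data files themselves.
      StructType(fullSchema.filterNot(f => partitionColumns.contains(f.name)))
    }
  }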