hours

Partition transform function: partitions timestamp data by hour. Supports Spark Connect.

warning

Deprecated in 4.0.0. Use partitioning.hours instead.

Syntax

Python
from pyspark.databricks.sql import functions as dbf

dbf.hours(col=<col>)

Parameters

Parameter   Type                        Description
col         pyspark.sql.Column or str   Target date or timestamp column to work on.

Returns

pyspark.sql.Column: Partition transform expression for data partitioned by hours.

Examples

Python
from pyspark.databricks.sql import functions as dbf

df.writeTo("catalog.db.table").partitionedBy(
    dbf.hours("ts")
).createOrReplace()
note

This function can only be used in combination with the partitionedBy method of DataFrameWriterV2.