try_make_timestamp_ltz

Tries to create a timestamp with local time zone from the years, months, days, hours, mins, secs, and timezone fields. The function returns NULL on invalid inputs instead of raising an error.

Syntax

Python
from pyspark.databricks.sql import functions as dbf

dbf.try_make_timestamp_ltz(years=<years>, months=<months>, days=<days>, hours=<hours>, mins=<mins>, secs=<secs>, timezone=<timezone>)
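
The keyword-argument form above can be called as follows. This is an illustrative sketch only; it assumes a DataFrame df whose columns are named year, month, day, hour, min, sec, and tz, like the one used in the Examples section.

Python
from pyspark.databricks.sql import functions as dbf

# Keyword-argument form of the call; column names are passed as strings.
df.select(
    dbf.try_make_timestamp_ltz(
        years='year', months='month', days='day',
        hours='hour', mins='min', secs='sec', timezone='tz')
).show(truncate=False)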

Parameters

years: pyspark.sql.Column or str
    The year to represent, from 1 to 9999.

months: pyspark.sql.Column or str
    The month-of-year to represent, from 1 (January) to 12 (December).

days: pyspark.sql.Column or str
    The day-of-month to represent, from 1 to 31.

hours: pyspark.sql.Column or str
    The hour-of-day to represent, from 0 to 23.

mins: pyspark.sql.Column or str
    The minute-of-hour to represent, from 0 to 59.

secs: pyspark.sql.Column or str
    The second-of-minute and its micro-fraction to represent, from 0 to 60. The value can be either an integer like 13, or a fraction like 13.123. If the secs argument equals 60, the seconds field is set to 0 and 1 minute is added to the final timestamp (illustrated in the sketch after this table).

timezone: pyspark.sql.Column or str, optional
    The time zone identifier, for example CET or UTC.
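
The secs rollover behavior can be seen directly. The following is an illustrative sketch (the values and column names are arbitrary): with secs set to 60, the seconds field of the result is 0 and the minute field is advanced by one.

Python
from pyspark.databricks.sql import functions as dbf

# secs=60: the seconds field becomes 0 and one minute is added, so the
# wall-clock time below is 12:31:00 before conversion to the session
# time zone for display.
df = spark.createDataFrame([[2024, 1, 1, 12, 30, 60, 'UTC']],
                           ['year', 'month', 'day', 'hour', 'min', 'sec', 'tz'])
df.select(
    dbf.try_make_timestamp_ltz('year', 'month', 'day', 'hour', 'min', 'sec', 'tz')
).show(truncate=False)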

Returns

pyspark.sql.Column: A new column that contains the constructed timestamp with local time zone, or NULL if the inputs are invalid.
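
Because invalid inputs produce NULL rather than an error, callers usually check the result before relying on it. The following is an illustrative sketch, assuming a DataFrame df with the column names used in the Examples section:

Python
from pyspark.databricks.sql import functions as dbf

ts = dbf.try_make_timestamp_ltz('year', 'month', 'day', 'hour', 'min', 'sec', 'tz')

# Keep only rows whose fields combined into a valid timestamp ...
valid_rows = df.filter(ts.isNotNull())

# ... or count the rows that failed, to surface data-quality problems.
failed_count = df.filter(ts.isNull()).count()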

Examples

Python
spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
Python
from pyspark.databricks.sql import functions as dbf
# Timestamp built from all fields, including an explicit time zone ('CET');
# the result is displayed in the session time zone set above.
df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887, 'CET']],
                           ['year', 'month', 'day', 'hour', 'min', 'sec', 'tz'])
df.select(
    dbf.try_make_timestamp_ltz('year', 'month', df.day, df.hour, df.min, df.sec, 'tz')
).show(truncate=False)

# Same fields without the timezone argument: the session time zone is used.
df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887, 'CET']],
                           ['year', 'month', 'day', 'hour', 'min', 'sec', 'tz'])
df.select(
    dbf.try_make_timestamp_ltz('year', 'month', df.day, df.hour, df.min, df.sec)
).show(truncate=False)

# Invalid input (month 13): the function returns NULL instead of failing.
df = spark.createDataFrame([[2014, 13, 28, 6, 30, 45.887, 'CET']],
                           ['year', 'month', 'day', 'hour', 'min', 'sec', 'tz'])
df.select(
    dbf.try_make_timestamp_ltz('year', 'month', df.day, df.hour, df.min, df.sec)
).show(truncate=False)
spark.conf.unset("spark.sql.session.timeZone")
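
The same logic is also exposed as a SQL function of the same name, so the first example can be reproduced with literal values. This sketch assumes the try_make_timestamp_ltz SQL builtin is available on your runtime:

Python
# Illustrative SQL equivalent of the first example; the result is still
# displayed in the session time zone.
spark.sql(
    "SELECT try_make_timestamp_ltz(2014, 12, 28, 6, 30, 45.887, 'CET') AS ts"
).show(truncate=False)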