exp

Computes the exponential of the given value. Supports Spark Connect.

For the corresponding Databricks SQL function, see exp function.

Syntax

Python
from pyspark.databricks.sql import functions as dbf

dbf.exp(col=<col>)

Parameters

Parameter | Type                              | Description
col       | pyspark.sql.Column or column name | Column to calculate the exponential for.
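
The col argument can be given either as a pyspark.sql.Column or as a column name string; both forms resolve to the same column. A minimal sketch (the toy DataFrame below is illustrative and not part of this page's examples):

Python
from pyspark.databricks.sql import functions as dbf

df = spark.range(3).toDF("value")

# Passing the Column object and passing the name string yield the same result column.
df.select(dbf.exp(df.value), dbf.exp("value")).show()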

Returns

pyspark.sql.Column: exponential of the given value.

Examples

Python
from pyspark.databricks.sql import functions as dbf
df = spark.sql("SELECT id AS value FROM RANGE(5)")
df.select("*", dbf.exp(df.value)).show() # doctest: +SKIP
Output
+-----+------------------+
|value|        EXP(value)|
+-----+------------------+
|    0|               1.0|
|    1|2.7182818284590...|
|    2|  7.38905609893...|
|    3|20.085536923187...|
|    4|54.598150033144...|
+-----+------------------+
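
The result column is named EXP(value) by default, as shown above; like any Column it can be renamed with .alias(). A minimal sketch reusing the same DataFrame (the name exp_value is illustrative):

Python
from pyspark.databricks.sql import functions as dbf
df = spark.sql("SELECT id AS value FROM RANGE(5)")
# Give the result column a friendlier name than the default EXP(value).
df.select("*", dbf.exp(df.value).alias("exp_value")).show()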

Python
from pyspark.databricks.sql import functions as dbf
spark.sql(
    "SELECT * FROM VALUES (FLOAT('NAN')), (NULL) AS TAB(value)"
).select("*", dbf.exp("value")).show()
Output
+-----+----------+
|value|EXP(value)|
+-----+----------+
|  NaN|       NaN|
| NULL|      NULL|
+-----+----------+
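
Because exp is the inverse of the natural logarithm, applying a natural log to the result recovers the input values. The sketch below uses log from the standard pyspark.sql.functions module (with a single argument it computes the natural logarithm); whether pyspark.databricks.sql exposes an equivalent log is an assumption not covered on this page, so the standard module is imported explicitly.

Python
from pyspark.databricks.sql import functions as dbf
from pyspark.sql import functions as F

df = spark.sql("SELECT id AS value FROM RANGE(5)")
# LOG(EXP(value)) returns value (up to floating-point rounding).
df.select("*", F.log(dbf.exp(df.value)).alias("roundtrip")).show()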