try_ip_host

Applies to: Databricks Runtime 18.2 and above

Beta

This feature is in Beta. Workspace admins can control access to this feature from the Previews page. See Manage Databricks previews.

Returns the canonical representation of an IPv4 or IPv6 address. If the input cannot be parsed as an IP address, the function returns NULL (None in Python) instead of raising an error.

For the corresponding SQL function, see try_ip_host function.
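To illustrate the canonicalization and null-handling semantics described above, here is a minimal pure-Python sketch using the standard-library ipaddress module. The helper name try_ip_host_py is hypothetical and this is not the Databricks implementation; it only mirrors the documented behavior: canonical form on valid input, None on invalid or None input.

```python
import ipaddress

def try_ip_host_py(s):
    # Illustrative analogue (assumption, not the Databricks implementation):
    # return the canonical text form of an IPv4/IPv6 address, or None if
    # the input is None or not a valid address.
    if s is None:
        return None
    try:
        return str(ipaddress.ip_address(s))
    except ValueError:
        return None

# Canonicalization compresses IPv6 (e.g. '2001:0db8::1' -> '2001:db8::1'),
# while invalid strings yield None rather than an exception.
```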

Syntax

Python
from pyspark.databricks.sql import functions as dbf

dbf.try_ip_host(col=<col>)

Parameters

col (pyspark.sql.Column or str): A STRING or BINARY value representing a valid IPv4 or IPv6 address.

Examples

Example 1: Validate an IPv4 address.

Python
from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([('192.168.1.5',)], ['ip'])
df.select(dbf.try_ip_host('ip').alias('result')).collect()
Output
[Row(result='192.168.1.5')]

Example 2: Canonicalize an IPv6 address.

Python
from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([('2001:0db8::1',)], ['ip'])
df.select(dbf.try_ip_host('ip').alias('result')).collect()
Output
[Row(result='2001:db8::1')]

Example 3: Invalid input returns None.

Python
from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([('invalid.ip',)], ['ip'])
df.select(dbf.try_ip_host('ip').alias('result')).collect()
Output
[Row(result=None)]

Example 4: None input returns None.

Python
from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([(None,)], 'ip: string')
df.select(dbf.try_ip_host('ip').alias('result')).collect()
Output
[Row(result=None)]