split
Splits str around matches of the given pattern.
For the corresponding Databricks SQL function, see split function.
Syntax
from pyspark.databricks.sql import functions as dbf
dbf.split(str=<str>, pattern=<pattern>, limit=<limit>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| str | Column or str | A string expression to split. |
| pattern | Column or str | A string representing a regular expression. The regex string should be a Java regular expression. A plain str value remains accepted as a regular expression representation, for backwards compatibility. |
| limit | Column or int | An integer that controls the number of times pattern is applied. If limit > 0, the resulting array contains at most limit entries and the last entry contains all input beyond the last matched pattern; if limit <= 0, pattern is applied as many times as possible and the resulting array can be of any size. Defaults to -1. In addition to int, a column is accepted. See the short example after this table. |
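A minimal sketch of the limit behavior, assuming an active SparkSession named spark (the sample DataFrame below is illustrative, not part of the reference):
from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([('a,b,c,',)], ['s'])
# limit > 0 caps the array length; the last entry keeps the unsplit remainder: [a, b,c,]
df.select(dbf.split('s', ',', 2)).show()
# limit <= 0 applies the pattern as many times as possible, keeping trailing empty strings: [a, b, c, ]
df.select(dbf.split('s', ',', -1)).show()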
Returns
pyspark.sql.Column: array of separated strings.
Examples
from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame([('oneAtwoBthreeC',)], ['s',])
# Default limit (-1): the pattern is applied as many times as possible -> [one, two, three, ]
df.select('*', dbf.split(df.s, '[ABC]')).show()
# limit=2: at most two entries; the last entry keeps the remaining input -> [one, twoBthreeC]
df.select('*', dbf.split(df.s, '[ABC]', 2)).show()
# A non-positive limit behaves like the default -> [one, two, three, ]
df.select('*', dbf.split('s', '[ABC]', -2)).show()

# The pattern and the limit can also be taken from columns.
df = spark.createDataFrame([
    ('oneAtwoBthreeC', '[ABC]', 2),
    ('1A2B3C', '[1-9]+', 1),
    ('aa2bb3cc4', '[1-9]+', -1)], ['s', 'p', 'l'])
# Per-row patterns read from column p, default limit.
df.select('*', dbf.split(df.s, df.p)).show()
# Per-row patterns from column p and per-row limits from column l.
df.select(dbf.split('s', df.p, 'l')).show()
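Because split returns an array Column (see Returns above), the result can be post-processed with standard column operations. A minimal sketch; the DataFrame, column names, and aliases below are illustrative assumptions:
from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([('2024-01-15',)], ['d'])
parts = dbf.split('d', '-')
# getItem picks individual elements out of the returned array column.
df.select(parts.getItem(0).alias('year'), parts.getItem(2).alias('day')).show()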