array_append

Returns a new array column with the given value appended to the end of the existing array.

Syntax

Python
from pyspark.sql import functions as sf

sf.array_append(col, value)

Parameters

col : pyspark.sql.Column or str
    The array column to append to, given either as a Column or as the name of the column containing the array.

value : Any
    A literal value, or a Column expression, to be appended to the array.

Returns

pyspark.sql.Column
    A new array column with value appended to the original array.
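
As the parameter descriptions note, the array column can be passed either as a Column or by name, and the value can be given as a plain Python literal or as an explicit Column expression such as sf.lit. The following is a minimal sketch of these alternate forms, assuming an active SparkSession named spark as in the examples below.

Python
from pyspark.sql import functions as sf
df = spark.createDataFrame([([1, 2, 3],)], ['data'])
# Column referenced by name, value passed as an explicit Column via sf.lit;
# this produces the same [1, 2, 3, 4] array as passing the bare literal 4.
df.select(sf.array_append('data', sf.lit(4))).show()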

Examples

Example 1: Appending a column value to an array column

Python
from pyspark.sql import Row, functions as sf
df = spark.createDataFrame([Row(c1=["b", "a", "c"], c2="c")])
df.select(sf.array_append(df.c1, df.c2)).show()
Output
+--------------------+
|array_append(c1, c2)|
+--------------------+
|        [b, a, c, c]|
+--------------------+

Example 2: Appending a numeric value to an array column

Python
from pyspark.sql import functions as sf
df = spark.createDataFrame([([1, 2, 3],)], ['data'])
df.select(sf.array_append(df.data, 4)).show()
Output
+---------------------+
|array_append(data, 4)|
+---------------------+
|         [1, 2, 3, 4]|
+---------------------+

Example 3: Appending a null value to an array column

Python
from pyspark.sql import functions as sf
df = spark.createDataFrame([([1, 2, 3],)], ['data'])
df.select(sf.array_append(df.data, None)).show()
Output
+------------------------+
|array_append(data, NULL)|
+------------------------+
|         [1, 2, 3, NULL]|
+------------------------+

Example 4: Appending a value to a NULL array column

Python
from pyspark.sql import functions as sf
from pyspark.sql.types import ArrayType, IntegerType, StructType, StructField
schema = StructType([
    StructField("data", ArrayType(IntegerType()), True)
])
df = spark.createDataFrame([(None,)], schema=schema)
df.select(sf.array_append(df.data, 4)).show()
Output
+---------------------+
|array_append(data, 4)|
+---------------------+
|                 NULL|
+---------------------+

Example 5: Appending a value to an empty array

Python
from pyspark.sql import functions as sf
from pyspark.sql.types import ArrayType, IntegerType, StructType, StructField
schema = StructType([
    StructField("data", ArrayType(IntegerType()), True)
])
df = spark.createDataFrame([([],)], schema=schema)
df.select(sf.array_append(df.data, 1)).show()
Output
+---------------------+
|array_append(data, 1)|
+---------------------+
|                  [1]|
+---------------------+
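
In recent Spark versions (3.4 and later), array_append is also available as a Spark SQL function, so the same operation can be expressed through the SQL interface. A minimal sketch, assuming an active SparkSession named spark:

Python
# Equivalent call through Spark SQL; the expected result is [1, 2, 3, 4].
spark.sql("SELECT array_append(array(1, 2, 3), 4) AS appended").show()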