
array_size

Returns the total number of elements in the array. The function returns null for null input.

Syntax

Python
from pyspark.sql import functions as sf

sf.array_size(col)

Parameters

Parameter | Type | Description
col | pyspark.sql.Column or str | The name of the column or an expression that represents the array.

Returns

pyspark.sql.Column: A new column that contains the size of each array.
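
Because col accepts either a Column or a column name, the function can also be called with a plain string. A minimal sketch, assuming an active SparkSession bound to spark as in the examples below:

Python
from pyspark.sql import functions as sf

# Passing the column by name instead of as a Column object
df = spark.createDataFrame([([10, 20, 30],)], ['data'])
df.select(sf.array_size('data')).show()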

Examples

Example 1: Basic usage with integer array

Python
from pyspark.sql import functions as sf
df = spark.createDataFrame([([2, 1, 3],), (None,)], ['data'])
df.select(sf.array_size(df.data)).show()
Output
+----------------+
|array_size(data)|
+----------------+
|               3|
|            NULL|
+----------------+

Example 2: Usage with string array

Python
from pyspark.sql import functions as sf
df = spark.createDataFrame([(['apple', 'banana', 'cherry'],)], ['data'])
df.select(sf.array_size(df.data)).show()
Output
+----------------+
|array_size(data)|
+----------------+
|               3|
+----------------+

Example 3: Usage with mixed type array

Python
from pyspark.sql import functions as sf
df = spark.createDataFrame([(['apple', 1, 'cherry'],)], ['data'])
df.select(sf.array_size(df.data)).show()
Output
+----------------+
|array_size(data)|
+----------------+
|               3|
+----------------+

Example 4: Usage with array of arrays

Python
from pyspark.sql import functions as sf
df = spark.createDataFrame([([[2, 1], [3, 4]],)], ['data'])
df.select(sf.array_size(df.data)).show()
Output
+----------------+
|array_size(data)|
+----------------+
|               2|
+----------------+

Example 5: Usage with empty array

Python
from pyspark.sql import functions as sf
from pyspark.sql.types import ArrayType, IntegerType, StructType, StructField
schema = StructType([
    StructField("data", ArrayType(IntegerType()), True)
])
df = spark.createDataFrame([([],)], schema=schema)
df.select(sf.array_size(df.data)).show()
Output
+----------------+
|array_size(data)|
+----------------+
|               0|
+----------------+
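
Since the return value is a Column, it can also be used directly in other DataFrame expressions. A minimal sketch (not one of the official examples, assuming the same spark session) that keeps only rows whose array has more than one element:

Python
from pyspark.sql import functions as sf

df = spark.createDataFrame([([2, 1, 3],), ([7],)], ['data'])
# Filter on the computed array size
df.where(sf.array_size(df.data) > 1).show()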