
count

Aggregate function: returns the number of items in a group. When applied to a specific column, null values are not counted; use count("*") to count all rows.

Syntax

Python
from pyspark.sql import functions as sf

sf.count(col)

Parameters

Parameter | Type | Description
col | pyspark.sql.Column or column name | Target column to compute on.

Returns

pyspark.sql.Column: a column containing the computed count.

Examples

Example 1: Count all rows in a DataFrame

Python
from pyspark.sql import functions as sf
df = spark.createDataFrame([(None,), ("a",), ("b",), ("c",)], schema=["alphabets"])
df.select(sf.count(sf.expr("*"))).show()
Output
+--------+
|count(1)|
+--------+
|       4|
+--------+

Example 2: Count non-null values in a specific column

Python
from pyspark.sql import functions as sf
df.select(sf.count(df.alphabets)).show()
Output
+----------------+
|count(alphabets)|
+----------------+
|               3|
+----------------+

Example 3: Count all rows in a DataFrame with multiple columns

Python
from pyspark.sql import functions as sf
df = spark.createDataFrame(
    [(1, "apple"), (2, "banana"), (3, None)], schema=["id", "fruit"])
df.select(sf.count(sf.expr("*"))).show()
Output
+--------+
|count(1)|
+--------+
|       3|
+--------+

Example 4: Count non-null values in multiple columns

Python
from pyspark.sql import functions as sf
df.select(sf.count(df.id), sf.count(df.fruit)).show()
Output
+---------+------------+
|count(id)|count(fruit)|
+---------+------------+
|        3|           2|
+---------+------------+