map_entries
Returns an unordered array of all entries in the given map.
Syntax
Python
from pyspark.sql import functions as sf
sf.map_entries(col)
Parameters
| Parameter | Type | Description |
|---|---|---|
| col | Column or str | Name of column or expression |
Returns
pyspark.sql.Column: An array of key-value pairs, where each entry is a struct with `key` and `value` fields.
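Because each entry is a struct, its fields can be read directly once the entries array is exploded. A minimal sketch, assuming an active SparkSession named `spark`; the alias `entry` is an illustrative name, not part of the API:
Python
from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a', 2, 'b') as data")
# Explode the entries array into one row per entry, then read the
# struct fields of the illustrative `entry` alias.
df.select(sf.explode(sf.map_entries("data")).alias("entry")) \
    .select("entry.key", "entry.value").show()
Since the entries array is unordered, add an orderBy on the key column if deterministic row order is needed.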
Examples
Example 1: Extracting entries from a simple map
Python
from pyspark.sql import functions as sf
df = spark.sql("SELECT map(1, 'a', 2, 'b') as data")
df.select(sf.sort_array(sf.map_entries("data"))).show()
Output
+-----------------------------------+
|sort_array(map_entries(data), true)|
+-----------------------------------+
| [{1, a}, {2, b}]|
+-----------------------------------+
Example 2: Extracting entries from a map with complex keys and values
Python
from pyspark.sql import functions as sf
df = spark.sql("SELECT map(array(1, 2), array('a', 'b'), "
"array(3, 4), array('c', 'd')) as data")
df.select(sf.sort_array(sf.map_entries("data"))).show(truncate=False)
Output
+------------------------------------+
|sort_array(map_entries(data), true) |
+------------------------------------+
|[{[1, 2], [a, b]}, {[3, 4], [c, d]}]|
+------------------------------------+
Example 3: Extracting entries from an empty map
Python
from pyspark.sql import functions as sf
df = spark.sql("SELECT map() as data")
df.select(sf.map_entries("data")).show()
Output
+-----------------+
|map_entries(data)|
+-----------------+
| []|
+-----------------+