h3_polyfillash3

Returns an array of H3 cell IDs, represented as long integers, corresponding to the hexagons or pentagons of the specified resolution that are contained by the input areal geography. Containment is determined by the cell centroids: a cell is considered to be contained by the geography if the cell's centroid lies inside the areal geography. The expression emits an error if the geography is not areal (a polygon or multipolygon) or if an error occurs while parsing the input representation of the geography. The accepted input representations are WKT, GeoJSON, and WKB. For the first two, the input is expected to be of type STRING; for WKB, the input is expected to be of type BINARY. Supports Spark Connect.

For the corresponding Databricks SQL function, see h3_polyfillash3 function.
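
As an illustration of the accepted input representations, the following sketch passes the same polygon as a GeoJSON string instead of WKT. It assumes a preconfigured spark session, as in the example further below.

Python
from pyspark.databricks.sql import functions as dbf

# The same triangle as the WKT example below, expressed as a GeoJSON Polygon
# (coordinates are [longitude, latitude] pairs in WGS84).
geojson = ('{"type":"Polygon","coordinates":'
           '[[[-122.4194,37.7749],[-118.2437,34.0522],'
           '[-74.0060,40.7128],[-122.4194,37.7749]]]}')
df = spark.createDataFrame([(geojson, 2)], ['geojson', 'res'])
df.select(dbf.h3_polyfillash3('geojson', 'res').alias('result')).collect()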

Syntax

Python
from pyspark.databricks.sql import functions as dbf

dbf.h3_polyfillash3(col1=<col1>, col2=<col2>)

Parameters

col1 (pyspark.sql.Column or str): A string representing a geography in the WGS84 coordinate reference system, in WKT or GeoJSON format, or a BINARY value representing a geography in the WGS84 coordinate reference system, in WKB format.

col2 (pyspark.sql.Column, str, or int): The resolution of the H3 cell IDs that cover the geography.
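
Because col2 accepts a plain Python int and col1 accepts a Column, the resolution can be passed as a literal rather than as a column name. A minimal sketch, again assuming a preconfigured spark session:

Python
from pyspark.databricks.sql import functions as dbf

df = spark.createDataFrame(
    [('POLYGON((-122.4194 37.7749,-118.2437 34.0522,-74.0060 40.7128,-122.4194 37.7749))',)],
    ['wkt'])
# col1 passed as a Column object, col2 as an int literal instead of a column name.
df.select(dbf.h3_polyfillash3(df.wkt, 2).alias('result')).collect()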

Examples

Python
from pyspark.databricks.sql import functions as dbf

# Polyfill a triangular polygon (San Francisco, Los Angeles, New York)
# with H3 cells at resolution 2.
df = spark.createDataFrame(
    [('POLYGON((-122.4194 37.7749,-118.2437 34.0522,-74.0060 40.7128,-122.4194 37.7749))', 2)],
    ['wkt', 'res'])
df.select(dbf.h3_polyfillash3('wkt', 'res').alias('result')).collect()
Output
[Row(result=[586146350232502271, 586147449744130047, 586198577034821631, 586152397546455039, 586199676546449407, 586153497058082815, 586142501941805055, 586201325813891071])]
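
For WKB input, the column must be of type BINARY. The following sketch builds the WKB bytes with shapely; this is only one way to produce WKB and assumes shapely is installed on the cluster, but any WKB-producing library works.

Python
from shapely import wkt as shapely_wkt, wkb as shapely_wkb
from pyspark.databricks.sql import functions as dbf

poly = 'POLYGON((-122.4194 37.7749,-118.2437 34.0522,-74.0060 40.7128,-122.4194 37.7749))'
# Convert the WKT polygon to WKB bytes; a bytearray maps to Spark's BINARY type.
wkb_bytes = bytearray(shapely_wkb.dumps(shapely_wkt.loads(poly)))
df = spark.createDataFrame([(wkb_bytes, 2)], ['wkb', 'res'])
df.select(dbf.h3_polyfillash3('wkb', 'res').alias('result')).collect()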