h3_try_coverash3

Returns an array of H3 cell IDs, represented as long integers, corresponding to hexagons or pentagons of the specified resolution that minimally cover the input linear or areal geography. The expression returns None if the geography is not linear (linestring or multilinestring) or areal (polygon or multipolygon), or if an error is found when parsing the input representation of the geography. The expression returns an error if the input resolution is invalid.

Acceptable input representations are WKT, GeoJSON, and WKB. In the first two cases the input is expected to be of type STRING, whereas in the last case the input is expected to be of type BINARY.
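For illustration, here is a minimal sketch of invoking the function on a GeoJSON input instead of WKT, assuming a Databricks runtime where these H3 expressions and a spark session are available (the column names are arbitrary):

Python
from pyspark.databricks.sql import functions as dbf

# The same triangle as in the WKT examples below, expressed as GeoJSON.
geojson = (
    '{"type": "Polygon", "coordinates": '
    '[[[-122.4194, 37.7749], [-118.2437, 34.0522], '
    '[-74.0060, 40.7128], [-122.4194, 37.7749]]]}'
)
df = spark.createDataFrame([(geojson, 1)], ['geojson', 'res'])
df.select(dbf.h3_try_coverash3('geojson', 'res').alias('result')).collect()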

For the corresponding Databricks SQL function, see h3_try_coverash3 function.

Syntax

Python
from pyspark.databricks.sql import functions as dbf

dbf.h3_try_coverash3(col1=<col1>, col2=<col2>)

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| col1 | pyspark.sql.Column or str | A STRING representing a linear or areal geography in the WGS84 coordinate reference system, in WKT or GeoJSON format, or a BINARY representing a linear or areal geography in the WGS84 coordinate reference system, in WKB format. |
| col2 | pyspark.sql.Column, str, or int | The resolution, between 0 and 15, of the H3 cell IDs that cover the geography. |
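Because col2 also accepts a plain Python int, the resolution can be passed as a literal rather than as a column. A minimal sketch, assuming the same runtime as above (the LINESTRING input also illustrates coverage of a linear geography):

Python
from pyspark.databricks.sql import functions as dbf

# Resolution passed as an int literal; the input here is a linear geography.
df = spark.createDataFrame([('LINESTRING(-122.4194 37.7749, -118.2437 34.0522)',)], ['wkt'])
df.select(dbf.h3_try_coverash3('wkt', 2).alias('result')).collect()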

Examples

Python
from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([('POLYGON((-122.4194 37.7749,-118.2437 34.0522,-74.0060 40.7128,-122.4194 37.7749))', 1),],['wkt', 'res'])
df.select(dbf.h3_try_coverash3('wkt', 'res').alias('result')).collect()
Output
[Row(result=[581650447186526207, 581672437419081727, 581698825698148351, 581707621791170559, 581716417884192767, 582248581512036351, 581637253046992895, 581641651093503999, 581646049140015103])]
Python
df_invalid = spark.createDataFrame([('invalid input', 1),], ['wkt', 'res'])
df_invalid.select(dbf.h3_try_coverash3('wkt', 'res').alias('result')).collect()
Output
[Row(result=None)]
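The None result is not limited to unparseable input. Per the description above, a geography that parses correctly but is neither linear nor areal, such as a point, should also yield None rather than an error. A sketch of the expected behavior:

Python
df_point = spark.createDataFrame([('POINT(-122.4194 37.7749)', 1),], ['wkt', 'res'])
# A point is neither linear nor areal, so the try variant returns None.
# Expected: [Row(result=None)]
df_point.select(dbf.h3_try_coverash3('wkt', 'res').alias('result')).collect()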