h3_try_coverash3
Returns an array of H3 cell IDs, represented as long integers, corresponding to hexagons or pentagons of the specified resolution that minimally cover the input linear or areal geography. The expression returns None if the geography is not linear (linestring or multilinestring) or areal (polygon or multipolygon), or if an error is found when parsing the input representation of the geography. The expression returns an error if the input resolution is invalid.
Acceptable input representations are WKT, GeoJSON, and WKB. In the first two cases the input is expected to be of type STRING, whereas in the last case the input is expected to be of type BINARY.
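As an illustration of the BINARY path, a WKB payload for a simple linestring can be assembled by hand. This is a minimal sketch using only the Python standard library; in practice a geometry library such as shapely would normally produce the WKB bytes:

```python
import struct

# Little-endian WKB encoding of
# LINESTRING(-122.4194 37.7749, -74.0060 40.7128)
points = [(-122.4194, 37.7749), (-74.0060, 40.7128)]

wkb = struct.pack('<BI', 1, 2)            # byte order = 1 (little-endian), geometry type = 2 (LineString)
wkb += struct.pack('<I', len(points))     # number of points
for lon, lat in points:
    wkb += struct.pack('<dd', lon, lat)   # each point as two 8-byte doubles

# 1 (byte order) + 4 (type) + 4 (count) + 2 points x 16 bytes = 41 bytes
print(len(wkb))  # 41
```

A BINARY column holding such bytes can then be passed to h3_try_coverash3 in place of a WKT or GeoJSON string.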
For the corresponding Databricks SQL function, see h3_try_coverash3 function.
Syntax
from pyspark.databricks.sql import functions as dbf
dbf.h3_try_coverash3(col1=<col1>, col2=<col2>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| `col1` | STRING or BINARY | A string representing a linear or areal geography in the WGS84 coordinate reference system in WKT or GeoJSON format, or a BINARY representing a linear or areal geography in the WGS84 coordinate reference system in WKB format. |
| `col2` | INT | The resolution of the H3 cell IDs that cover the geography. |
Examples
from pyspark.databricks.sql import functions as dbf
df = spark.createDataFrame([('POLYGON((-122.4194 37.7749,-118.2437 34.0522,-74.0060 40.7128,-122.4194 37.7749))', 1),],['wkt', 'res'])
df.select(dbf.h3_try_coverash3('wkt', 'res').alias('result')).collect()
[Row(result=[581650447186526207, 581672437419081727, 581698825698148351, 581707621791170559, 581716417884192767, 582248581512036351, 581637253046992895, 581641651093503999, 581646049140015103])]
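The long integers in the result are standard 64-bit H3 indexes and can be rendered in the more familiar hexadecimal string form, as a quick check in plain Python (Databricks SQL also provides the h3_h3tostring function for this conversion):

```python
# First two cell IDs from the example output above, shown as hex strings
cells = [581650447186526207, 581672437419081727]
as_hex = [format(c, 'x') for c in cells]
print(as_hex)  # ['8126fffffffffff', '81283ffffffffff']
```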
df_invalid = spark.createDataFrame([('invalid input', 1),], ['wkt', 'res'])
df_invalid.select(dbf.h3_try_coverash3('wkt', 'res').alias('result')).collect()
[Row(result=None)]