PySpark data types
This page lists the PySpark data types available on Databricks, with a brief description of each; see the corresponding class reference documentation for full details.
| Data type | Description |
|---|---|
| `ArrayType` | Array data type |
| `BinaryType` | Binary (byte array) data type |
| `BooleanType` | Boolean data type |
| `ByteType` | Byte data type, representing signed 8-bit integers |
| `CalendarIntervalType` | Calendar intervals |
| `CharType` | Char data type |
| `DataType` | Base class for data types |
| `DateType` | Date (datetime.date) data type |
| `DayTimeIntervalType` | DayTimeIntervalType (datetime.timedelta) |
| `DecimalType` | Decimal (decimal.Decimal) data type |
| `DoubleType` | Double data type, representing double precision floats |
| `FloatType` | Float data type, representing single precision floats |
| `GeographyType` | Geography data type |
| `GeometryType` | Geometry data type |
| `IntegerType` | Int data type, representing signed 32-bit integers |
| `LongType` | Long data type, representing signed 64-bit integers |
| `MapType` | Map data type |
| `NullType` | Null type |
| `ShortType` | Short data type, representing signed 16-bit integers |
| `StringType` | String data type |
| `StructField` | A field in StructType |
| `StructType` | Struct type, consisting of a list of StructField |
| `TimestampType` | Timestamp (datetime.datetime) data type |
| `TimestampNTZType` | Timestamp (datetime.datetime) data type without timezone information |
| `VarcharType` | Varchar data type |
| `VariantType` | Variant data type, representing semi-structured values |
| `YearMonthIntervalType` | YearMonthIntervalType, represents year-month intervals of the SQL standard |
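As a quick illustration, the sketch below shows how several of these types can be combined into a DataFrame schema. It assumes an existing `SparkSession` named `spark` (as provided in a Databricks notebook); the column names and sample row are hypothetical.

```python
# Minimal sketch: building a schema from the data types listed above.
# Assumes `spark` is an existing SparkSession (e.g., in a Databricks notebook).
import datetime
from decimal import Decimal

from pyspark.sql.types import (
    StructType, StructField, StringType, IntegerType,
    DecimalType, DateType, ArrayType, MapType,
)

# Each column is a StructField wrapping one of the data types above.
schema = StructType([
    StructField("id", IntegerType(), nullable=False),
    StructField("name", StringType(), nullable=True),
    StructField("price", DecimalType(10, 2), nullable=True),
    StructField("created", DateType(), nullable=True),
    StructField("tags", ArrayType(StringType()), nullable=True),
    StructField("attributes", MapType(StringType(), StringType()), nullable=True),
])

# Create a small DataFrame against the schema (hypothetical sample row).
rows = [
    (1, "widget", Decimal("9.99"), datetime.date(2024, 1, 1),
     ["a", "b"], {"color": "red"}),
]
df = spark.createDataFrame(rows, schema)
df.printSchema()
```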