Databricks Runtime 11.1 (EoS)
Note
Support for this Databricks Runtime version has ended. For the end-of-support date, see End-of-support history. For all supported Databricks Runtime versions, see Databricks Runtime release notes versions and compatibility.
The following release notes provide information about Databricks Runtime 11.1, powered by Apache Spark 3.3.0. Databricks released this version in July 2022.
New features and improvements
Change data feed can now automatically handle out-of-range timestamps
Describe and show SQL functions now show Unity Catalog names in their output (Public Preview)
Schema inference and evolution for Parquet files in Auto Loader (Public Preview)
Information schema support for objects created in Unity Catalog
Informational constraints on Delta Lake tables with Unity Catalog (Public Preview)
Photon is GA
Photon is now generally available, beginning with Databricks Runtime 11.1. Photon is the native vectorized query engine on Databricks, written to be directly compatible with Apache Spark APIs so it works with your existing code. Photon is developed in C++ to take advantage of modern hardware, and uses the latest techniques in vectorized query processing to capitalize on data- and instruction-level parallelism in CPUs, enhancing performance on real-world data and applications—all natively on your data lake.
Photon is part of a high-performance runtime that runs your existing SQL and DataFrame API calls faster and reduces your total cost per workload. Photon is used by default in Databricks SQL warehouses.
New features and improvements include:
New vectorized sort operator
New vectorized window functions
New instance types and sizes across all clouds
Limitations:
Scala/Python UDFs are not supported by Photon
RDD is not supported by Photon
Structured Streaming is not supported by Photon
For more information, see the following Photon announcements.
Photon: New vectorized sort operator
Photon now supports a vectorized sort when a query contains SORT_BY, CLUSTER_BY, or a window function with an ORDER BY.
Limitations: Photon does not support a global ORDER BY clause. Sorts for window evaluation are photonized, but the global sort continues to run in Spark.
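To make the distinction concrete, here is a minimal sketch using a hypothetical `sales` table. It only illustrates the query shapes involved; whether a given plan photonizes also depends on the rest of the query.

```python
# Per-partition sort: SORT BY orders rows within each partition only,
# so Photon's vectorized sort can apply.
spark.sql("SELECT id, category, amount FROM sales SORT BY amount")

# Window ORDER BY: the sort for window evaluation is photonized.
spark.sql("""
  SELECT id, amount,
         row_number() OVER (PARTITION BY category ORDER BY amount) AS rn
  FROM sales
""")

# Global ORDER BY: not supported by Photon; this sort continues to run in Spark.
spark.sql("SELECT id, amount FROM sales ORDER BY amount")
```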
Photon: New vectorized window functions
Photon now supports vectorized window function evaluation for many frame types and functions. New window functions include: row_number, rank, dense_rank, lag, lead, percent_rank, ntile, and nth_value. Supported window frame types: running (UNBOUNDED PRECEDING AND CURRENT ROW), unbounded (UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING), growing (UNBOUNDED PRECEDING AND <OFFSET> FOLLOWING), and shrinking (<OFFSET> PRECEDING AND UNBOUNDED FOLLOWING).
Limitations:
Photon supports only ROWS versions of all the frame types.
Photon does not yet support the sliding frame type (<OFFSET> PRECEDING AND <OFFSET> FOLLOWING).
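The frame taxonomy above maps directly onto ROWS BETWEEN clauses. The sketch below, against a hypothetical `sales` table, shows the frame shapes only; it is not a guarantee that every function and frame combination photonizes.

```python
spark.sql("""
  SELECT
    category,
    amount,
    rank() OVER (PARTITION BY category ORDER BY amount)               AS rnk,
    -- running frame
    sum(amount) OVER (PARTITION BY category ORDER BY amount
      ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW)               AS running_total,
    -- unbounded frame
    sum(amount) OVER (PARTITION BY category ORDER BY amount
      ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)       AS category_total,
    -- growing frame
    sum(amount) OVER (PARTITION BY category ORDER BY amount
      ROWS BETWEEN UNBOUNDED PRECEDING AND 1 FOLLOWING)               AS growing_sum,
    -- shrinking frame
    sum(amount) OVER (PARTITION BY category ORDER BY amount
      ROWS BETWEEN 1 PRECEDING AND UNBOUNDED FOLLOWING)               AS shrinking_sum
    -- a sliding frame (1 PRECEDING AND 1 FOLLOWING) is not yet photonized
  FROM sales
""")
```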
Photon: Supported instance types
Photon is available on the following instance types: i3, i3en, i4i, im4gn, is4gen, m5ad, m5d, m5dn, m6gd, r5d, r5dn, r6gd, and x2gd.
Change data feed can now automatically handle out-of-range timestamps
Change data feed (CDF) now has a new mode that lets you provide timestamps or versions past the latest commit's version without throwing errors. This mode is disabled by default. You can enable it by setting the configuration spark.databricks.delta.changeDataFeed.timestampOutOfRange.enabled to true.
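A minimal sketch, assuming a Delta table named `events` that has change data feed enabled; with the flag set, a start timestamp past the latest commit no longer causes the read to error.

```python
# Allow out-of-range timestamps/versions for CDF reads in this session.
spark.conf.set(
    "spark.databricks.delta.changeDataFeed.timestampOutOfRange.enabled", "true"
)

# Read the change feed starting from a timestamp that is intentionally in the future.
changes = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingTimestamp", "2030-01-01 00:00:00")
    .table("events")
)
```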
Describe and show SQL functions now show Unity Catalog names in their output (Public Preview)
The commands DESC TABLE, DESC DATABASE, DESC SCHEMA, DESC NAMESPACE, DESC FUNCTION, EXPLAIN, and SHOW CREATE TABLE now always show the catalog name in their output.
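For example, against a hypothetical Unity Catalog table `main.default.trips`, the output of these commands now includes the catalog-qualified name:

```python
# Both outputs now report the table as main.default.trips rather than default.trips.
spark.sql("DESCRIBE TABLE EXTENDED main.default.trips").show(truncate=False)
spark.sql("SHOW CREATE TABLE main.default.trips").show(truncate=False)
```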
Schema inference and evolution for Parquet files in Auto Loader (Public Preview)
Auto Loader now supports schema inference and evolution for Parquet files. Just like the JSON, CSV, and Avro formats, you can now use the rescued data column to rescue unexpected data that may appear in your Parquet files. This includes data that cannot be parsed with the expected data type, columns that have a different casing, and additional columns that are not part of the expected schema. You can configure Auto Loader to evolve the schema automatically when it encounters new columns in the incoming data. See Configure schema inference and evolution in Auto Loader.
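A minimal sketch with hypothetical source, schema, checkpoint, and table names; Auto Loader infers the Parquet schema, stores it at the schema location, evolves it when new columns arrive, and routes unexpected data into the rescued data column.

```python
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "parquet")
    .option("cloudFiles.schemaLocation", "/tmp/schemas/events")        # hypothetical path
    .option("cloudFiles.schemaEvolutionMode", "addNewColumns")         # evolve on new columns
    .option("rescuedDataColumn", "_rescued_data")                      # capture unexpected data
    .load("/tmp/landing/events")                                       # hypothetical source path
)

(df.writeStream
   .option("checkpointLocation", "/tmp/checkpoints/events")            # hypothetical path
   .trigger(availableNow=True)
   .toTable("bronze.events"))                                          # hypothetical target table
```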
Auto Loader now supports schema evolution for Avro (GA)
See Configure schema inference and evolution in Auto Loader.
Delta Lake support for dynamic partition overwrites
Delta Lake now supports dynamic partition overwrite mode, which overwrites all existing data in each logical partition for which the write commits new data. See Selectively overwrite data with Delta Lake.
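A minimal sketch, assuming a hypothetical DataFrame `updates` and an existing partitioned Delta table `sales_by_date`; only the partitions present in `updates` are replaced, and all other partitions are left untouched.

```python
(updates.write
    .format("delta")
    .mode("overwrite")
    .option("partitionOverwriteMode", "dynamic")   # per-write dynamic partition overwrite
    .saveAsTable("sales_by_date"))

# Equivalently, enable it for the whole session:
# spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
```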
Information schema support for objects created in Unity Catalog
Information schema provides a SQL-based, self-describing API to the metadata of various database objects, including tables, views, constraints, and routines.
Within the information schema, you find a set of views describing the objects known to the schema's catalog that you are privileged to see.
The information schema of the SYSTEM catalog returns information about objects across all catalogs within the metastore.
See Information schema.
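For example, the queries below (using a hypothetical catalog named `main`) list the tables you can see in one catalog, and then count tables across all catalogs through the SYSTEM catalog's information schema:

```python
# Tables visible to the current user in one catalog's default schema.
spark.sql("""
  SELECT table_catalog, table_schema, table_name, table_type
  FROM main.information_schema.tables
  WHERE table_schema = 'default'
""").show()

# The SYSTEM catalog's information schema spans every catalog in the metastore.
spark.sql("""
  SELECT table_catalog, count(*) AS table_count
  FROM system.information_schema.tables
  GROUP BY table_catalog
""").show()
```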
Informational constraints on Delta Lake tables with Unity Catalog (Public Preview)
You can now define informational primary key and foreign key constraints on Delta Lake tables with Unity Catalog. Informational constraints are not enforced. See CONSTRAINT clause.
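A minimal sketch with hypothetical table names in a Unity Catalog catalog `main`; the primary key and foreign key declarations are informational only and are not enforced on write.

```python
spark.sql("""
  CREATE TABLE main.default.customers (
    customer_id INT NOT NULL,
    name        STRING,
    CONSTRAINT customers_pk PRIMARY KEY (customer_id)
  )
""")

spark.sql("""
  CREATE TABLE main.default.orders (
    order_id    INT NOT NULL,
    customer_id INT,
    CONSTRAINT orders_pk PRIMARY KEY (order_id),
    CONSTRAINT orders_customers_fk FOREIGN KEY (customer_id)
      REFERENCES main.default.customers
  )
""")
```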
Unity Catalog is GA
Unity Catalog is now generally available beginning with Databricks Runtime 11.1. See What is Unity Catalog?.
Delta Sharing is GA
Delta Sharing is now generally available beginning with Databricks Runtime 11.1.
Databricks-to-Databricks Delta Sharing is fully managed without the need to exchange tokens. You can create and manage providers, recipients, and shares in the UI or with SQL and REST APIs.
Some features include restricting recipient access, querying data with IP access lists and region restrictions, and delegating Delta Sharing management to non-admins. You can also query changes to data or share incremental versions with Change Data Feeds. See What is Delta Sharing?.
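The SQL surface for managing shares and recipients looks roughly like the following sketch, with hypothetical share, table, and recipient names; the sharing identifier placeholder must come from the recipient's workspace, and the same operations are available in the UI and REST APIs.

```python
spark.sql("CREATE SHARE IF NOT EXISTS quarterly_results")
spark.sql("ALTER SHARE quarterly_results ADD TABLE main.finance.q2_revenue")

# Databricks-to-Databricks recipient, identified by the recipient's sharing identifier.
spark.sql("CREATE RECIPIENT IF NOT EXISTS partner_org USING ID '<sharing-identifier>'")

spark.sql("GRANT SELECT ON SHARE quarterly_results TO RECIPIENT partner_org")
```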
Behavior changes
Sensitive properties redaction for DESCRIBE TABLE and SHOW TABLE PROPERTIES
Sensitive properties are redacted in DataFrames and the output of the
DESCRIBE TABLE
and SHOW TABLE PROPERTIES
commands.
Job clusters default to single user access mode with Databricks Runtime 11.1 and higher
To be Unity Catalog capable, job clusters on Databricks Runtime 11.1 and higher that are created through the Jobs UI or Jobs API default to single user access mode. Single user access mode supports most programming languages, cluster features, and data governance features. You can still configure shared access mode through the UI or API, but languages or features might be limited.
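As a hedged illustration, a job's `new_cluster` specification in the Jobs/Clusters API expresses the access mode through the `data_security_mode` field; the node type and user name below are placeholders.

```python
# Sketch of a Jobs API new_cluster payload defaulting to single user access mode.
new_cluster = {
    "spark_version": "11.1.x-scala2.12",
    "node_type_id": "i3.xlarge",           # hypothetical node type
    "num_workers": 2,
    "data_security_mode": "SINGLE_USER",   # "USER_ISOLATION" selects shared access mode
    "single_user_name": "user@example.com" # hypothetical user
}
```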
Library upgrades
Upgraded Python libraries:
filelock from 3.6.0 to 3.7.1
plotly from 5.6.0 to 5.8.2
protobuf from 3.20.1 to 4.21.2
Upgraded R libraries:
chron from 2.3-56 to 2.3-57
DBI from 1.1.2 to 1.1.3
dbplyr from 2.1.1 to 2.2.0
e1071 from 1.7-9 to 1.7-11
future from 1.25.0 to 1.26.1
globals from 0.14.0 to 0.15.1
hardhat from 0.2.0 to 1.1.0
ipred from 0.9-12 to 0.9-13
openssl from 2.0.0 to 2.0.2
parallelly from 1.31.1 to 1.32.0
processx from 3.5.3 to 3.6.1
progressr from 0.10.0 to 0.10.1
proxy from 0.4-26 to 0.4-27
ps from 1.7.0 to 1.7.1
randomForest from 4.7-1 to 4.7-1.1
roxygen2 from 7.1.2 to 7.2.0
Rserve from 1.8-10 to 1.8-11
RSQLite from 2.2.13 to 2.2.14
sparklyr from 1.7.5 to 1.7.7
tinytex from 0.38 to 0.40
usethis from 2.1.5 to 2.1.6
xfun from 0.30 to 0.31
Upgraded Java libraries:
io.delta.delta-sharing-spark_2.12 from 0.4.0 to 0.5.0
Apache Spark
Databricks Runtime 11.1 includes Apache Spark 3.3.0. This release includes all Spark fixes and improvements included in Databricks Runtime 11.0 (EoS), as well as the following additional bug fixes and improvements made to Spark:
[SPARK-40054] [SQL] Restore the error handling syntax of try_cast()
[SPARK-39489] [CORE] Improve event logging JsonProtocol performance by using Jackson instead of Json4s
[SPARK-39319] [CORE][SQL] Make query contexts as a part of SparkThrowable
[SPARK-40085] [SQL] Use INTERNAL_ERROR error class instead of IllegalStateException to indicate bugs
[SPARK-40001] [SQL] Make NULL writes to JSON DEFAULT columns write ‘null’ to storage
[SPARK-39635] [SQL] Support driver metrics in DS v2 custom metric API
[SPARK-39184] [SQL] Handle undersized result array in date and timestamp sequences
[SPARK-40019] [SQL] Refactor comment of ArrayType’s containsNull and refactor the misunderstanding logics in collectionOperator’s expression about containsNull
[SPARK-39989] [SQL] Support estimate column statistics if it is foldable expression
[SPARK-39926] [SQL] Fix bug in column DEFAULT support for non-vectorized Parquet scans
[SPARK-40052] [SQL] Handle direct byte buffers in VectorizedDeltaBinaryPackedReader
[SPARK-40044] [SQL] Fix the target interval type in cast overflow errors
[SPARK-39835] [SQL] Fix EliminateSorts remove global sort below the local sort
[SPARK-40002] [SQL] Don’t push down limit through window using ntile
[SPARK-39976] [SQL] ArrayIntersect should handle null in left expression correctly
[SPARK-39985] [SQL] Enable implicit DEFAULT column values in inserts from DataFrames
[SPARK-39776] [SQL] JOIN verbose string should add Join type
[SPARK-38901] [SQL] DS V2 supports push down misc functions
[SPARK-40028] [SQL][FollowUp] Improve examples of string functions
[SPARK-39983] [CORE][SQL] Do not cache unserialized broadcast relations on the driver
[SPARK-39812] [SQL] Simplify code which construct AggregateExpression with toAggregateExpression
[SPARK-40028] [SQL] Add binary examples for string expressions
[SPARK-39981] [SQL] Throw the exception QueryExecutionErrors.castingCauseOverflowErrorInTableInsert in Cast
[SPARK-40007] [PYTHON][SQL] Add ‘mode’ to functions
[SPARK-40008] [SQL] Support casting of integrals to ANSI intervals
[SPARK-40003] [PYTHON][SQL] Add ‘median’ to functions
[SPARK-39952] [SQL] SaveIntoDataSourceCommand should recache result relation
[SPARK-39951] [SQL] Update Parquet V2 columnar check for nested fields
[SPARK-39775] [CORE][AVRO] Disable validate default values when parsing Avro schemas
[SPARK-33236] [shuffle] Backport to DBR 11.x: Enable Push-based shuffle service to store state in NM level DB for work preserving restart
[SPARK-39836] [SQL] Simplify V2ExpressionBuilder by extract common method.
[SPARK-39867] [SQL] Global limit should not inherit OrderPreservingUnaryNode
[SPARK-39873] [SQL] Remove OptimizeLimitZero and merge it into EliminateLimits
[SPARK-39961] [SQL] DS V2 push-down translate Cast if the cast is safe
[SPARK-39872] [SQL] Change to use BytePackerForLong#unpack8Values with Array input api in VectorizedDeltaBinaryPackedReader
[SPARK-39858] [SQL] Remove unnecessary AliasHelper or PredicateHelper for some rules
[SPARK-39962] [WARMFIX][ES-393486][PYTHON][SQL] Apply projection when group attributes are empty
[SPARK-39900] [SQL] Address partial or negated condition in binary format’s predicate pushdown
[SPARK-39904] [SQL] Rename inferDate to prefersDate and clarify semantics of the option in CSV data source
[SPARK-39958] [SQL] Add warning log when unable to load custom metric object
[SPARK-39936] [SQL] Store schema in properties for Spark Views
[SPARK-39932] [SQL] WindowExec should clear the final partition buffer
[SPARK-37194] [SQL] Avoid unnecessary sort in v1 write if it’s not dynamic partition
[SPARK-39902] [SQL] Add Scan details to spark plan scan node in SparkUI
[SPARK-39865] [SQL] Show proper error messages on the overflow errors of table insert
[SPARK-39940] [SS] Refresh catalog table on streaming query with DSv1 sink
[SPARK-39827] [SQL] Use the error class ARITHMETIC_OVERFLOW on int overflow in add_months()
[SPARK-39914] [SQL] Add DS V2 Filter to V1 Filter conversion
[SPARK-39857] [SQL] Manual DBR 11.x backport; V2ExpressionBuilder uses the wrong LiteralValue data type for In predicate #43454
[SPARK-39840] [SQL][PYTHON] Factor PythonArrowInput out as a symmetry to PythonArrowOutput
[SPARK-39651] [SQL] Prune filter condition if compare with rand is deterministic
[SPARK-39877] [PYTHON] Add unpivot to PySpark DataFrame API
[SPARK-39847] [WARMFIX][SS] Fix race condition in RocksDBLoader.loadLibrary() if caller thread is interrupted
[SPARK-39909] [SQL] Organize the check of push down information for JDBCV2Suite
[SPARK-39834] [SQL][SS] Include the origin stats and constraints for LogicalRDD if it comes from DataFrame
[SPARK-39849] [SQL] Dataset.as(StructType) fills missing new columns with null value
[SPARK-39860] [SQL] More expressions should extend Predicate
[SPARK-39823] [SQL][PYTHON] Rename Dataset.as as Dataset.to and add DataFrame.to in PySpark
[SPARK-39918] [SQL][MINOR] Replace the wording “un-comparable” with “incomparable” in error message
[SPARK-39857] [SQL][3.3] V2ExpressionBuilder uses the wrong LiteralValue data type for In predicate
[SPARK-39862] [SQL] Manual backport for PR 43654 targeting DBR 11.x: Update SQLConf.DEFAULT_COLUMN_ALLOWED_PROVIDERS to allow/deny ALTER TABLE … ADD COLUMN commands separately.
[SPARK-39844] [SQL] Manual backport for PR 43652 targeting DBR 11.x
[SPARK-39899] [SQL] Fix passing of message parameters to InvalidUDFClassException
[SPARK-39890] [SQL] Make TakeOrderedAndProjectExec inherit AliasAwareOutputOrdering
[SPARK-39809] [PYTHON] Support CharType in PySpark
[SPARK-38864] [SQL] Add unpivot / melt to Dataset
[SPARK-39864] [SQL] Lazily register ExecutionListenerBus
[SPARK-39808] [SQL] Support aggregate function MODE
[SPARK-39839] [SQL] Handle special case of null variable-length Decimal with non-zero offsetAndSize in UnsafeRow structural integrity check
[SPARK-39875] [SQL] Change protected method in final class to private or package-visible
[SPARK-39731] [SQL] Fix issue in CSV and JSON data sources when parsing dates in “yyyyMMdd” format with CORRECTED time parser policy
[SPARK-39805] [SS] Deprecate Trigger.Once and Promote Trigger.AvailableNow
[SPARK-39784] [SQL] Put Literal values on the right side of the data source filter after translating Catalyst Expression to data source filter
[SPARK-39672] [SQL][3.1] Fix removing project before filter with correlated subquery
[SPARK-39552] [SQL] Unify v1 and v2 DESCRIBE TABLE
[SPARK-39806] [SQL] Accessing _metadata on partitioned table can crash a query
[SPARK-39810] [SQL] Catalog.tableExists should handle nested namespace
[SPARK-37287] [SQL] Pull out dynamic partition and bucket sort from FileFormatWriter
[SPARK-39469] [SQL] Infer date type for CSV schema inference
[SPARK-39148] [SQL] DS V2 aggregate push down can work with OFFSET or LIMIT
[SPARK-39818] [SQL] Fix bug in ARRAY, STRUCT, MAP types with DEFAULT values with NULL field(s)
[SPARK-39792] [SQL] Add DecimalDivideWithOverflowCheck for decimal average
[SPARK-39798] [SQL] Replace toSeq.toArray with .toArray[Any] in constructor of GenericArrayData
[SPARK-39759] [SQL] Implement listIndexes in JDBC (H2 dialect)
[SPARK-39385] [SQL] Supports push down REGR_AVGX and REGR_AVGY
[SPARK-39787] [SQL] Use error class in the parsing error of function to_timestamp
[SPARK-39760] [PYTHON] Support Varchar in PySpark
[SPARK-39557] [SQL] Manual backport to DBR 11.x: Support ARRAY, STRUCT, MAP types as DEFAULT values
[SPARK-39758] [SQL][3.3] Fix NPE from the regexp functions on invalid patterns
[SPARK-39749] [SQL] ANSI SQL mode: Use plain string representation on casting Decimal to String
[SPARK-39704] [SQL] Implement createIndex & dropIndex & indexExists in JDBC (H2 dialect)
[SPARK-39803] [SQL] Use LevenshteinDistance instead of StringUtils.getLevenshteinDistance
[SPARK-39339] [SQL] Support TimestampNTZ type in JDBC data source
[SPARK-39781] [SS] Add support for providing max_open_files to rocksdb state store provider
[SPARK-39719] [R] Implement databaseExists/getDatabase in SparkR support 3L namespace
[SPARK-39751] [SQL] Rename hash aggregate key probes metric
[SPARK-39772] [SQL] namespace should be null when database is null in the old constructors
[SPARK-39625] [SPARK-38904][SQL] Add Dataset.as(StructType)
[SPARK-39384] [SQL] Compile built-in linear regression aggregate functions for JDBC dialect
[SPARK-39720] [R] Implement tableExists/getTable in SparkR for 3L namespace
[SPARK-39744] [SQL] Add the REGEXP_INSTR function
[SPARK-39716] [R] Make currentDatabase/setCurrentDatabase/listCatalogs in SparkR support 3L namespace
[SPARK-39788] [SQL] Rename catalogName to dialectName for JdbcUtils
[SPARK-39647] [CORE] Register the executor with ESS before registering the BlockManager
[SPARK-39754] [CORE][SQL] Remove unused import or unnecessary {}
[SPARK-39706] [SQL] Set missing column with defaultValue as constant in ParquetColumnVector
[SPARK-39699] [SQL] Make CollapseProject smarter about collection creation expressions
[SPARK-39737] [SQL] PERCENTILE_CONT and PERCENTILE_DISC should support aggregate filter
[SPARK-39579] [SQL][PYTHON][R] Make ListFunctions/getFunction/functionExists compatible with 3 layer namespace
[SPARK-39627] [SQL] JDBC V2 pushdown should unify the compile API
[SPARK-39748] [SQL][SS] Include the origin logical plan for LogicalRDD if it comes from DataFrame
[SPARK-39385] [SQL] Translate linear regression aggregate functions for pushdown
[SPARK-39695] [SQL] Add the REGEXP_SUBSTR function
[SPARK-39667] [SQL] Add another workaround when there is not enough memory to build and broadcast the table
[SPARK-39666] [ES-337834][SQL] Use UnsafeProjection.create to respect spark.sql.codegen.factoryMode in ExpressionEncoder
[SPARK-39643] [SQL] Prohibit subquery expressions in DEFAULT values
[SPARK-38647] [SQL] Add SupportsReportOrdering mix in interface for Scan (DataSourceV2)
[SPARK-39497] [SQL] Improve the analysis exception of missing map key column
[SPARK-39661] [SQL] Avoid creating unnecessary SLF4J Logger
[SPARK-39713] [SQL] ANSI mode: add suggestion of using try_element_at for INVALID_ARRAY_INDEX error
[SPARK-38899] [SQL] DS V2 supports push down datetime functions
[SPARK-39638] [SQL] Change to use ConstantColumnVector to store partition columns in OrcColumnarBatchReader
[SPARK-39653] [SQL] Clean up ColumnVectorUtils#populate(WritableColumnVector, InternalRow, int) from ColumnVectorUtils
[SPARK-39231] [SQL] Use ConstantColumnVector instead of On/OffHeapColumnVector to store partition columns in VectorizedParquetRecordReader
[SPARK-39547] [SQL] V2SessionCatalog should not throw NoSuchDatabaseException in loadNamspaceMetadata
[SPARK-39447] [SQL] Avoid AssertionError in AdaptiveSparkPlanExec.doExecuteBroadcast
[SPARK-39492] [SQL] Rework MISSING_COLUMN
[SPARK-39679] [SQL] TakeOrderedAndProjectExec should respect child output ordering
[SPARK-39606] [SQL] Use child stats to estimate order operator
[SPARK-39611] [PYTHON][PS] Fix wrong aliases in array_ufunc
[SPARK-39656] [SQL][3.3] Fix wrong namespace in DescribeNamespaceExec
[SPARK-39675] [SQL] Switch ‘spark.sql.codegen.factoryMode’ configuration from testing purpose to internal purpose
[SPARK-39139] [SQL] DS V2 supports push down DS V2 UDF
[SPARK-39434] [SQL] Provide runtime error query context when array index is out of bounding
[SPARK-39479] [SQL] DS V2 supports push down math functions(non ANSI)
[SPARK-39618] [SQL] Add the REGEXP_COUNT function
[SPARK-39553] [CORE] Multi-thread unregister shuffle shouldn’t throw NPE when using Scala 2.13
[SPARK-38755] [PYTHON][3.3] Add file to address missing pandas general functions
[SPARK-39444] [SQL] Add OptimizeSubqueries into nonExcludableRules list
[SPARK-39316] [SQL] Merge PromotePrecision and CheckOverflow into decimal binary arithmetic
[SPARK-39505] [UI] Escape log content rendered in UI
[SPARK-39448] [SQL] Add ReplaceCTERefWithRepartition into nonExcludableRules list
[SPARK-37961] [SQL] Override maxRows/maxRowsPerPartition for some logical operators
[SPARK-35223] Revert Add IssueNavigationLink
[SPARK-39633] [SQL] Support timestamp in seconds for TimeTravel using Dataframe options
[SPARK-38796] [SQL] Update documentation for number format strings with the {try_}to_number functions
[SPARK-39650] [SS] Fix incorrect value schema in streaming deduplication with backward compatibility
[SPARK-39636] [CORE][UI] Fix multiple bugs in JsonProtocol, impacting off heap StorageLevels and Task/Executor ResourceRequests
[SPARK-39432] [SQL] Return ELEMENT_AT_BY_INDEX_ZERO from element_at(*, 0)
[SPARK-39349] Add a centralized CheckError method for QA of error path
[SPARK-39453] [SQL] DS V2 supports push down misc non-aggregate functions(non ANSI)
[SPARK-38978] [SQL] DS V2 supports push down OFFSET operator
[SPARK-39567] [SQL] Support ANSI intervals in the percentile functions
[SPARK-39383] [SQL] Support DEFAULT columns in ALTER TABLE ALTER COLUMNS to V2 data sources
[SPARK-39396] [SQL] Fix LDAP login exception ‘error code 49 - invalid credentials’
[SPARK-39548] [SQL] CreateView Command with a window clause query hit a wrong window definition not found issue
[SPARK-39575] [AVRO] add ByteBuffer#rewind after ByteBuffer#get in Avr…
[SPARK-39543] The option of DataFrameWriterV2 should be passed to storage properties if fallback to v1
[SPARK-39564] [SS] Expose the information of catalog table to the logical plan in streaming query
[SPARK-39582] [SQL] Fix “Since” marker for array_agg
[SPARK-39388] [SQL] Reuse orcSchema when push down Orc predicates
[SPARK-39511] [SQL] Enhance push down local limit 1 for right side of left semi/anti join if join condition is empty
[SPARK-38614] [SQL] Don’t push down limit through window that’s using percent_rank
[SPARK-39551] [SQL] Add AQE invalid plan check
[SPARK-39383] [SQL] Support DEFAULT columns in ALTER TABLE ADD COLUMNS to V2 data sources
[SPARK-39538] [SQL] Avoid creating unnecessary SLF4J Logger
[SPARK-39383] [SQL] Manual backport to DBR 11.x: Refactor DEFAULT column support to skip passing the primary Analyzer around
[SPARK-39397] [SQL] Relax AliasAwareOutputExpression to support alias with expression
[SPARK-39496] [SQL] Handle null struct in Inline.eval
[SPARK-39545] [SQL] Override concat method for ExpressionSet in Scala 2.13 to improve the performance
[SPARK-39340] [SQL] DS v2 agg pushdown should allow dots in the name of top-level columns
[SPARK-39488] [SQL] Simplify the error handling of TempResolvedColumn
[SPARK-38846] [SQL] Add explicit data mapping between Teradata Numeric Type and Spark DecimalType
[SPARK-39520] [SQL] Override -- method for ExpressionSet in Scala 2.13
[SPARK-39470] [SQL] Support cast of ANSI intervals to decimals
[SPARK-39477] [SQL] Remove “Number of queries” info from the golden files of SQLQueryTestSuite
[SPARK-39419] [SQL] Fix ArraySort to throw an exception when the comparator returns null
[SPARK-39061] [SQL] Set nullable correctly for Inline output attributes
[SPARK-39320] [SQL] Support aggregate function MEDIAN
[SPARK-39261] [CORE] Improve newline formatting for error messages
[SPARK-39355] [SQL] Single column uses quoted to construct UnresolvedAttribute
[SPARK-39351] [SQL] SHOW CREATE TABLE should redact properties
[SPARK-37623] [SQL] Support ANSI Aggregate Function: regr_intercept
[SPARK-39374] [SQL] Improve error message for user specified column list
[SPARK-39255] [SQL][3.3] Improve error messages
[SPARK-39321] [SQL] Refactor TryCast to use RuntimeReplaceable
[SPARK-39406] [PYTHON] Accept NumPy array in createDataFrame
[SPARK-39267] [SQL] Clean up dsl unnecessary symbol
[SPARK-39171] [SQL] Unify the Cast expression
[SPARK-28330] [SQL] Support ANSI SQL: result offset clause in query expression
[SPARK-39203] [SQL] Rewrite table location to absolute URI based on database URI
[SPARK-39313] [SQL] toCatalystOrdering should fail if V2Expression can not be translated
[SPARK-39301] [SQL][PYTHON] Leverage LocalRelation and respect Arrow batch size in createDataFrame with Arrow optimization
[SPARK-39400] [SQL] spark-sql should remove hive resource dir in all case
System environment
Operating System: Ubuntu 20.04.4 LTS
Java: Zulu 8.56.0.21-CA-linux64
Scala: 2.12.14
Python: 3.9.5
R: 4.1.3
Delta Lake: 1.2.1
Installed Python libraries
Library |
Version |
Library |
Version |
Library |
Version |
---|---|---|---|---|---|
Antergos Linux |
2015.10 (ISO-Rolling) |
argon2-cffi |
20.1.0 |
async-generator |
1.10 |
attrs |
21.2.0 |
backcall |
0.2.0 |
backports.entry-points-selectable |
1.1.1 |
black |
22.3.0 |
bleach |
4.0.0 |
boto3 |
1.21.18 |
botocore |
1.24.18 |
certifi |
2021.10.8 |
cffi |
1.14.6 |
chardet |
4.0.0 |
charset-normalizer |
2.0.4 |
click |
8.0.3 |
cryptography |
3.4.8 |
cycler |
0.10.0 |
Cython |
0.29.24 |
dbus-python |
1.2.16 |
debugpy |
1.4.1 |
decorator |
5.1.0 |
defusedxml |
0.7.1 |
distlib |
0.3.5 |
distro-info |
0.23ubuntu1 |
entrypoints |
0.3 |
facets-overview |
1.0.0 |
filelock |
3.8.0 |
idna |
3.2 |
ipykernel |
6.12.1 |
ipython |
7.32.0 |
ipython-genutils |
0.2.0 |
ipywidgets |
7.7.0 |
jedi |
0.18.0 |
Jinja2 |
2.11.3 |
jmespath |
0.10.0 |
joblib |
1.0.1 |
jsonschema |
3.2.0 |
jupyter-client |
6.1.12 |
jupyter-core |
4.8.1 |
jupyterlab-pygments |
0.1.2 |
jupyterlab-widgets |
1.0.0 |
kiwisolver |
1.3.1 |
MarkupSafe |
2.0.1 |
matplotlib |
3.4.3 |
matplotlib-inline |
0.1.2 |
mistune |
0.8.4 |
mypy-extensions |
0.4.3 |
nbclient |
0.5.3 |
nbconvert |
6.1.0 |
nbformat |
5.1.3 |
nest-asyncio |
1.5.1 |
notebook |
6.4.5 |
numpy |
1.20.3 |
packaging |
21.0 |
pandas |
1.3.4 |
pandocfilters |
1.4.3 |
parso |
0.8.2 |
pathspec |
0.9.0 |
patsy |
0.5.2 |
pexpect |
4.8.0 |
pickleshare |
0.7.5 |
Pillow |
8.4.0 |
pip |
21.2.4 |
platformdirs |
2.5.2 |
plotly |
5.9.0 |
prometheus-client |
0.11.0 |
prompt-toolkit |
3.0.20 |
protobuf |
4.21.5 |
psutil |
5.8.0 |
psycopg2 |
2.9.3 |
ptyprocess |
0.7.0 |
pyarrow |
7.0.0 |
pycparser |
2.20 |
Pygments |
2.10.0 |
PyGObject |
3.36.0 |
pyodbc |
4.0.31 |
pyparsing |
3.0.4 |
pyrsistent |
0.18.0 |
python-apt |
2.0.0+ubuntu0.20.4.7 |
python-dateutil |
2.8.2 |
pytz |
2021.3 |
pyzmq |
22.2.1 |
requests |
2.26.0 |
requests-unixsocket |
0.2.0 |
s3transfer |
0.5.2 |
scikit-learn |
0.24.2 |
scipy |
1.7.1 |
seaborn |
0.11.2 |
Send2Trash |
1.8.0 |
setuptools |
58.0.4 |
six |
1.16.0 |
ssh-import-id |
5.10 |
statsmodels |
0.12.2 |
tenacity |
8.0.1 |
terminado |
0.9.4 |
testpath |
0.5.0 |
threadpoolctl |
2.2.0 |
tokenize-rt |
4.2.1 |
tomli |
2.0.1 |
tornado |
6.1 |
traitlets |
5.1.0 |
typing-extensions |
3.10.0.2 |
unattended-upgrades |
0.1 |
urllib3 |
1.26.7 |
virtualenv |
20.8.0 |
wcwidth |
0.2.5 |
webencodings |
0.5.1 |
wheel |
0.37.0 |
widgetsnbextension |
3.6.0 |
Installed R libraries
R libraries are installed from the Microsoft CRAN snapshot on 2022-08-15.
Library |
Version |
Library |
Version |
Library |
Version |
---|---|---|---|---|---|
askpass |
1.1 |
assertthat |
0.2.1 |
backports |
1.4.1 |
base |
4.1.3 |
base64enc |
0.1-3 |
bit |
4.0.4 |
bit64 |
4.0.5 |
blob |
1.2.3 |
boot |
1.3-28 |
brew |
1.0-7 |
brio |
1.1.3 |
broom |
1.0.0 |
bslib |
0.4.0 |
cachem |
1.0.6 |
callr |
3.7.1 |
caret |
6.0-93 |
cellranger |
1.1.0 |
chron |
2.3-57 |
class |
7.3-20 |
cli |
3.3.0 |
clipr |
0.8.0 |
cluster |
2.1.3 |
codetools |
0.2-18 |
colorspace |
2.0-3 |
commonmark |
1.8.0 |
compiler |
4.1.3 |
config |
0.3.1 |
cpp11 |
0.4.2 |
crayon |
1.5.1 |
credentials |
1.3.2 |
curl |
4.3.2 |
data.table |
1.14.2 |
datasets |
4.1.3 |
DBI |
1.1.3 |
dbplyr |
2.2.1 |
desc |
1.4.1 |
devtools |
2.4.4 |
diffobj |
0.3.5 |
digest |
0.6.29 |
downlit |
0.4.2 |
dplyr |
1.0.9 |
dtplyr |
1.2.1 |
e1071 |
1.7-11 |
ellipsis |
0.3.2 |
evaluate |
0.16 |
fansi |
1.0.3 |
farver |
2.1.1 |
fastmap |
1.1.0 |
fontawesome |
0.3.0 |
forcats |
0.5.1 |
foreach |
1.5.2 |
foreign |
0.8-82 |
forge |
0.2.0 |
fs |
1.5.2 |
future |
1.27.0 |
future.apply |
1.9.0 |
gargle |
1.2.0 |
generics |
0.1.3 |
gert |
1.7.0 |
ggplot2 |
3.3.6 |
gh |
1.3.0 |
gitcreds |
0.1.1 |
glmnet |
4.1-4 |
globals |
0.16.0 |
glue |
1.6.2 |
googledrive |
2.0.0 |
googlesheets4 |
1.0.1 |
gower |
1.0.0 |
graphics |
4.1.3 |
grDevices |
4.1.3 |
grid |
4.1.3 |
gridExtra |
2.3 |
gsubfn |
0.7 |
gtable |
0.3.0 |
hardhat |
1.2.0 |
haven |
2.5.0 |
highr |
0.9 |
hms |
1.1.1 |
htmltools |
0.5.3 |
htmlwidgets |
1.5.4 |
httpuv |
1.6.5 |
httr |
1.4.3 |
ids |
1.0.1 |
ini |
0.3.1 |
ipred |
0.9-13 |
isoband |
0.2.5 |
iterators |
1.0.14 |
jquerylib |
0.1.4 |
jsonlite |
1.8.0 |
KernSmooth |
2.23-20 |
knitr |
1.39 |
labeling |
0.4.2 |
later |
1.3.0 |
lattice |
0.20-45 |
lava |
1.6.10 |
lifecycle |
1.0.1 |
listenv |
0.8.0 |
lubridate |
1.8.0 |
magrittr |
2.0.3 |
markdown |
1.1 |
MASS |
7.3-56 |
Matrix |
1.4-1 |
memoise |
2.0.1 |
methods |
4.1.3 |
mgcv |
1.8-40 |
mime |
0.12 |
miniUI |
0.1.1.1 |
ModelMetrics |
1.2.2.2 |
modelr |
0.1.8 |
munsell |
0.5.0 |
nlme |
3.1-157 |
nnet |
7.3-17 |
numDeriv |
2016.8-1.1 |
openssl |
2.0.2 |
parallel |
4.1.3 |
parallelly |
1.32.1 |
pillar |
1.8.0 |
pkgbuild |
1.3.1 |
pkgconfig |
2.0.3 |
pkgdown |
2.0.6 |
pkgload |
1.3.0 |
plogr |
0.2.0 |
plyr |
1.8.7 |
praise |
1.0.0 |
prettyunits |
1.1.1 |
pROC |
1.18.0 |
processx |
3.7.0 |
prodlim |
2019.11.13 |
profvis |
0.3.7 |
progress |
1.2.2 |
progressr |
0.10.1 |
promises |
1.2.0.1 |
proto |
1.0.0 |
proxy |
0.4-27 |
ps |
1.7.1 |
purrr |
0.3.4 |
r2d3 |
0.2.6 |
R6 |
2.5.1 |
ragg |
1.2.2 |
randomForest |
4.7-1.1 |
rappdirs |
0.3.3 |
rcmdcheck |
1.4.0 |
RColorBrewer |
1.1-3 |
Rcpp |
1.0.9 |
RcppEigen |
0.3.3.9.2 |
readr |
2.1.2 |
readxl |
1.4.0 |
recipes |
1.0.1 |
rematch |
1.0.1 |
rematch2 |
2.1.2 |
remotes |
2.4.2 |
reprex |
2.0.1 |
reshape2 |
1.4.4 |
rlang |
1.0.4 |
rmarkdown |
2.14 |
RODBC |
1.3-19 |
roxygen2 |
7.2.1 |
rpart |
4.1.16 |
rprojroot |
2.0.3 |
Rserve |
1.8-11 |
RSQLite |
2.2.15 |
rstudioapi |
0.13 |
rversions |
2.1.1 |
rvest |
1.0.2 |
sass |
0.4.2 |
scales |
1.2.0 |
selectr |
0.4-2 |
sessioninfo |
1.2.2 |
shape |
1.4.6 |
shiny |
1.7.2 |
sourcetools |
0.1.7 |
sparklyr |
1.7.7 |
SparkR |
3.3.0 |
spatial |
7.3-11 |
splines |
4.1.3 |
sqldf |
0.4-11 |
SQUAREM |
2021.1 |
stats |
4.1.3 |
stats4 |
4.1.3 |
stringi |
1.7.8 |
stringr |
1.4.0 |
survival |
3.2-13 |
sys |
3.4 |
systemfonts |
1.0.4 |
tcltk |
4.1.3 |
testthat |
3.1.4 |
textshaping |
0.3.6 |
tibble |
3.1.8 |
tidyr |
1.2.0 |
tidyselect |
1.1.2 |
tidyverse |
1.3.2 |
timeDate |
4021.104 |
tinytex |
0.40 |
tools |
4.1.3 |
tzdb |
0.3.0 |
urlchecker |
1.0.1 |
usethis |
2.1.6 |
utf8 |
1.2.2 |
utils |
4.1.3 |
uuid |
1.1-0 |
vctrs |
0.4.1 |
viridisLite |
0.4.0 |
vroom |
1.5.7 |
waldo |
0.4.0 |
whisker |
0.4 |
withr |
2.5.0 |
xfun |
0.32 |
xml2 |
1.3.3 |
xopen |
1.0.0 |
xtable |
1.8-4 |
yaml |
2.3.5 |
zip |
2.2.0 |
Installed Java and Scala libraries (Scala 2.12 cluster version)
Group ID |
Artifact ID |
Version |
---|---|---|
antlr |
antlr |
2.7.7 |
com.amazonaws |
amazon-kinesis-client |
1.12.0 |
com.amazonaws |
aws-java-sdk-autoscaling |
1.12.189 |
com.amazonaws |
aws-java-sdk-cloudformation |
1.12.189 |
com.amazonaws |
aws-java-sdk-cloudfront |
1.12.189 |
com.amazonaws |
aws-java-sdk-cloudhsm |
1.12.189 |
com.amazonaws |
aws-java-sdk-cloudsearch |
1.12.189 |
com.amazonaws |
aws-java-sdk-cloudtrail |
1.12.189 |
com.amazonaws |
aws-java-sdk-cloudwatch |
1.12.189 |
com.amazonaws |
aws-java-sdk-cloudwatchmetrics |
1.12.189 |
com.amazonaws |
aws-java-sdk-codedeploy |
1.12.189 |
com.amazonaws |
aws-java-sdk-cognitoidentity |
1.12.189 |
com.amazonaws |
aws-java-sdk-cognitosync |
1.12.189 |
com.amazonaws |
aws-java-sdk-config |
1.12.189 |
com.amazonaws |
aws-java-sdk-core |
1.12.189 |
com.amazonaws |
aws-java-sdk-datapipeline |
1.12.189 |
com.amazonaws |
aws-java-sdk-directconnect |
1.12.189 |
com.amazonaws |
aws-java-sdk-directory |
1.12.189 |
com.amazonaws |
aws-java-sdk-dynamodb |
1.12.189 |
com.amazonaws |
aws-java-sdk-ec2 |
1.12.189 |
com.amazonaws |
aws-java-sdk-ecs |
1.12.189 |
com.amazonaws |
aws-java-sdk-efs |
1.12.189 |
com.amazonaws |
aws-java-sdk-elasticache |
1.12.189 |
com.amazonaws |
aws-java-sdk-elasticbeanstalk |
1.12.189 |
com.amazonaws |
aws-java-sdk-elasticloadbalancing |
1.12.189 |
com.amazonaws |
aws-java-sdk-elastictranscoder |
1.12.189 |
com.amazonaws |
aws-java-sdk-emr |
1.12.189 |
com.amazonaws |
aws-java-sdk-glacier |
1.12.189 |
com.amazonaws |
aws-java-sdk-glue |
1.12.189 |
com.amazonaws |
aws-java-sdk-iam |
1.12.189 |
com.amazonaws |
aws-java-sdk-importexport |
1.12.189 |
com.amazonaws |
aws-java-sdk-kinesis |
1.12.189 |
com.amazonaws |
aws-java-sdk-kms |
1.12.189 |
com.amazonaws |
aws-java-sdk-lambda |
1.12.189 |
com.amazonaws |
aws-java-sdk-logs |
1.12.189 |
com.amazonaws |
aws-java-sdk-machinelearning |
1.12.189 |
com.amazonaws |
aws-java-sdk-opsworks |
1.12.189 |
com.amazonaws |
aws-java-sdk-rds |
1.12.189 |
com.amazonaws |
aws-java-sdk-redshift |
1.12.189 |
com.amazonaws |
aws-java-sdk-route53 |
1.12.189 |
com.amazonaws |
aws-java-sdk-s3 |
1.12.189 |
com.amazonaws |
aws-java-sdk-ses |
1.12.189 |
com.amazonaws |
aws-java-sdk-simpledb |
1.12.189 |
com.amazonaws |
aws-java-sdk-simpleworkflow |
1.12.189 |
com.amazonaws |
aws-java-sdk-sns |
1.12.189 |
com.amazonaws |
aws-java-sdk-sqs |
1.12.189 |
com.amazonaws |
aws-java-sdk-ssm |
1.12.189 |
com.amazonaws |
aws-java-sdk-storagegateway |
1.12.189 |
com.amazonaws |
aws-java-sdk-sts |
1.12.189 |
com.amazonaws |
aws-java-sdk-support |
1.12.189 |
com.amazonaws |
aws-java-sdk-swf-libraries |
1.11.22 |
com.amazonaws |
aws-java-sdk-workspaces |
1.12.189 |
com.amazonaws |
jmespath-java |
1.12.189 |
com.chuusai |
shapeless_2.12 |
2.3.3 |
com.clearspring.analytics |
stream |
2.9.6 |
com.databricks |
Rserve |
1.8-3 |
com.databricks |
jets3t |
0.7.1-0 |
com.databricks.scalapb |
compilerplugin_2.12 |
0.4.15-10 |
com.databricks.scalapb |
scalapb-runtime_2.12 |
0.4.15-10 |
com.esotericsoftware |
kryo-shaded |
4.0.2 |
com.esotericsoftware |
minlog |
1.3.0 |
com.fasterxml |
classmate |
1.3.4 |
com.fasterxml.jackson.core |
jackson-annotations |
2.13.3 |
com.fasterxml.jackson.core |
jackson-core |
2.13.3 |
com.fasterxml.jackson.core |
jackson-databind |
2.13.3 |
com.fasterxml.jackson.dataformat |
jackson-dataformat-cbor |
2.13.3 |
com.fasterxml.jackson.datatype |
jackson-datatype-joda |
2.13.3 |
com.fasterxml.jackson.datatype |
jackson-datatype-jsr310 |
2.13.3 |
com.fasterxml.jackson.module |
jackson-module-paranamer |
2.13.3 |
com.fasterxml.jackson.module |
jackson-module-scala_2.12 |
2.13.3 |
com.github.ben-manes.caffeine |
caffeine |
2.3.4 |
com.github.fommil |
jniloader |
1.1 |
com.github.fommil.netlib |
core |
1.1.2 |
com.github.fommil.netlib |
native_ref-java |
1.1 |
com.github.fommil.netlib |
native_ref-java-natives |
1.1 |
com.github.fommil.netlib |
native_system-java |
1.1 |
com.github.fommil.netlib |
native_system-java-natives |
1.1 |
com.github.fommil.netlib |
netlib-native_ref-linux-x86_64-natives |
1.1 |
com.github.fommil.netlib |
netlib-native_system-linux-x86_64-natives |
1.1 |
com.github.luben |
zstd-jni |
1.5.2-1 |
com.github.wendykierp |
JTransforms |
3.1 |
com.google.code.findbugs |
jsr305 |
3.0.0 |
com.google.code.gson |
gson |
2.8.6 |
com.google.crypto.tink |
tink |
1.6.1 |
com.google.flatbuffers |
flatbuffers-java |
1.12.0 |
com.google.guava |
guava |
15.0 |
com.google.protobuf |
protobuf-java |
2.6.1 |
com.h2database |
h2 |
2.0.204 |
com.helger |
profiler |
1.1.1 |
com.jcraft |
jsch |
0.1.50 |
com.jolbox |
bonecp |
0.8.0.RELEASE |
com.lihaoyi |
sourcecode_2.12 |
0.1.9 |
com.microsoft.azure |
azure-data-lake-store-sdk |
2.3.9 |
com.ning |
compress-lzf |
1.1 |
com.sun.mail |
javax.mail |
1.5.2 |
com.tdunning |
json |
1.8 |
com.thoughtworks.paranamer |
paranamer |
2.8 |
com.trueaccord.lenses |
lenses_2.12 |
0.4.12 |
com.twitter |
chill-java |
0.10.0 |
com.twitter |
chill_2.12 |
0.10.0 |
com.twitter |
util-app_2.12 |
7.1.0 |
com.twitter |
util-core_2.12 |
7.1.0 |
com.twitter |
util-function_2.12 |
7.1.0 |
com.twitter |
util-jvm_2.12 |
7.1.0 |
com.twitter |
util-lint_2.12 |
7.1.0 |
com.twitter |
util-registry_2.12 |
7.1.0 |
com.twitter |
util-stats_2.12 |
7.1.0 |
com.typesafe |
config |
1.2.1 |
com.typesafe.scala-logging |
scala-logging_2.12 |
3.7.2 |
com.uber |
h3 |
3.7.0 |
com.univocity |
univocity-parsers |
2.9.1 |
com.zaxxer |
HikariCP |
4.0.3 |
commons-cli |
commons-cli |
1.5.0 |
commons-codec |
commons-codec |
1.15 |
commons-collections |
commons-collections |
3.2.2 |
commons-dbcp |
commons-dbcp |
1.4 |
commons-fileupload |
commons-fileupload |
1.3.3 |
commons-httpclient |
commons-httpclient |
3.1 |
commons-io |
commons-io |
2.11.0 |
commons-lang |
commons-lang |
2.6 |
commons-logging |
commons-logging |
1.1.3 |
commons-pool |
commons-pool |
1.5.4 |
dev.ludovic.netlib |
arpack |
2.2.1 |
dev.ludovic.netlib |
blas |
2.2.1 |
dev.ludovic.netlib |
lapack |
2.2.1 |
hadoop3 |
jets3t-0.7 |
liball_deps_2.12 |
info.ganglia.gmetric4j |
gmetric4j |
1.0.10 |
io.airlift |
aircompressor |
0.21 |
io.delta |
delta-sharing-spark_2.12 |
0.5.0 |
io.dropwizard.metrics |
metrics-core |
4.1.1 |
io.dropwizard.metrics |
metrics-graphite |
4.1.1 |
io.dropwizard.metrics |
metrics-healthchecks |
4.1.1 |
io.dropwizard.metrics |
metrics-jetty9 |
4.1.1 |
io.dropwizard.metrics |
metrics-jmx |
4.1.1 |
io.dropwizard.metrics |
metrics-json |
4.1.1 |
io.dropwizard.metrics |
metrics-jvm |
4.1.1 |
io.dropwizard.metrics |
metrics-servlets |
4.1.1 |
io.netty |
netty-all |
4.1.74.Final |
io.netty |
netty-buffer |
4.1.74.Final |
io.netty |
netty-codec |
4.1.74.Final |
io.netty |
netty-common |
4.1.74.Final |
io.netty |
netty-handler |
4.1.74.Final |
io.netty |
netty-resolver |
4.1.74.Final |
io.netty |
netty-tcnative-classes |
2.0.48.Final |
io.netty |
netty-transport |
4.1.74.Final |
io.netty |
netty-transport-classes-epoll |
4.1.74.Final |
io.netty |
netty-transport-classes-kqueue |
4.1.74.Final |
io.netty |
netty-transport-native-epoll-linux-aarch_64 |
4.1.74.Final |
io.netty |
netty-transport-native-epoll-linux-x86_64 |
4.1.74.Final |
io.netty |
netty-transport-native-kqueue-osx-aarch_64 |
4.1.74.Final |
io.netty |
netty-transport-native-kqueue-osx-x86_64 |
4.1.74.Final |
io.netty |
netty-transport-native-unix-common |
4.1.74.Final |
io.prometheus |
simpleclient |
0.7.0 |
io.prometheus |
simpleclient_common |
0.7.0 |
io.prometheus |
simpleclient_dropwizard |
0.7.0 |
io.prometheus |
simpleclient_pushgateway |
0.7.0 |
io.prometheus |
simpleclient_servlet |
0.7.0 |
io.prometheus.jmx |
collector |
0.12.0 |
jakarta.annotation |
jakarta.annotation-api |
1.3.5 |
jakarta.servlet |
jakarta.servlet-api |
4.0.3 |
jakarta.validation |
jakarta.validation-api |
2.0.2 |
jakarta.ws.rs |
jakarta.ws.rs-api |
2.1.6 |
javax.activation |
activation |
1.1.1 |
javax.annotation |
javax.annotation-api |
1.3.2 |
javax.el |
javax.el-api |
2.2.4 |
javax.jdo |
jdo-api |
3.0.1 |
javax.transaction |
jta |
1.1 |
javax.transaction |
transaction-api |
1.1 |
javax.xml.bind |
jaxb-api |
2.2.11 |
javolution |
javolution |
5.5.1 |
jline |
jline |
2.14.6 |
joda-time |
joda-time |
2.10.13 |
mvn |
hadoop3 |
liball_deps_2.12 |
net.java.dev.jna |
jna |
5.8.0 |
net.razorvine |
pickle |
1.2 |
net.sf.jpam |
jpam |
1.1 |
net.sf.opencsv |
opencsv |
2.3 |
net.sf.supercsv |
super-csv |
2.2.0 |
net.snowflake |
snowflake-ingest-sdk |
0.9.6 |
net.snowflake |
snowflake-jdbc |
3.13.14 |
net.snowflake |
spark-snowflake_2.12 |
2.10.0-spark_3.2 |
net.sourceforge.f2j |
arpack_combined_all |
0.1 |
org.acplt.remotetea |
remotetea-oncrpc |
1.1.2 |
org.antlr |
ST4 |
4.0.4 |
org.antlr |
antlr-runtime |
3.5.2 |
org.antlr |
antlr4-runtime |
4.8 |
org.antlr |
stringtemplate |
3.2.1 |
org.apache.ant |
ant |
1.9.2 |
org.apache.ant |
ant-jsch |
1.9.2 |
org.apache.ant |
ant-launcher |
1.9.2 |
org.apache.arrow |
arrow-format |
7.0.0 |
org.apache.arrow |
arrow-memory-core |
7.0.0 |
org.apache.arrow |
arrow-memory-netty |
7.0.0 |
org.apache.arrow |
arrow-vector |
7.0.0 |
org.apache.avro |
avro |
1.11.0 |
org.apache.avro |
avro-ipc |
1.11.0 |
org.apache.avro |
avro-mapred |
1.11.0 |
org.apache.commons |
commons-collections4 |
4.4 |
org.apache.commons |
commons-compress |
1.21 |
org.apache.commons |
commons-crypto |
1.1.0 |
org.apache.commons |
commons-lang3 |
3.12.0 |
org.apache.commons |
commons-math3 |
3.6.1 |
org.apache.commons |
commons-text |
1.9 |
org.apache.curator |
curator-client |
2.13.0 |
org.apache.curator |
curator-framework |
2.13.0 |
org.apache.curator |
curator-recipes |
2.13.0 |
org.apache.derby |
derby |
10.14.2.0 |
org.apache.hadoop |
hadoop-client-api |
3.3.2-databricks |
org.apache.hadoop |
hadoop-client-runtime |
3.3.2 |
org.apache.hive |
hive-beeline |
2.3.9 |
org.apache.hive |
hive-cli |
2.3.9 |
org.apache.hive |
hive-jdbc |
2.3.9 |
org.apache.hive |
hive-llap-client |
2.3.9 |
org.apache.hive |
hive-llap-common |
2.3.9 |
org.apache.hive |
hive-serde |
2.3.9 |
org.apache.hive |
hive-shims |
2.3.9 |
org.apache.hive |
hive-storage-api |
2.7.2 |
org.apache.hive.shims |
hive-shims-0.23 |
2.3.9 |
org.apache.hive.shims |
hive-shims-common |
2.3.9 |
org.apache.hive.shims |
hive-shims-scheduler |
2.3.9 |
org.apache.httpcomponents |
httpclient |
4.5.13 |
org.apache.httpcomponents |
httpcore |
4.4.14 |
org.apache.ivy |
ivy |
2.5.0 |
org.apache.logging.log4j |
log4j-1.2-api |
2.17.2 |
org.apache.logging.log4j |
log4j-api |
2.17.2 |
org.apache.logging.log4j |
log4j-core |
2.17.2 |
org.apache.logging.log4j |
log4j-slf4j-impl |
2.17.2 |
org.apache.mesos |
mesos-shaded-protobuf |
1.4.0 |
org.apache.orc |
orc-core |
1.7.5 |
org.apache.orc |
orc-mapreduce |
1.7.5 |
org.apache.orc |
orc-shims |
1.7.5 |
org.apache.parquet |
parquet-column |
1.12.0-databricks-0004 |
org.apache.parquet |
parquet-common |
1.12.0-databricks-0004 |
org.apache.parquet |
parquet-encoding |
1.12.0-databricks-0004 |
org.apache.parquet |
parquet-format-structures |
1.12.0-databricks-0004 |
org.apache.parquet |
parquet-hadoop |
1.12.0-databricks-0004 |
org.apache.parquet |
parquet-jackson |
1.12.0-databricks-0004 |
org.apache.thrift |
libfb303 |
0.9.3 |
org.apache.thrift |
libthrift |
0.12.0 |
org.apache.xbean |
xbean-asm9-shaded |
4.20 |
org.apache.yetus |
audience-annotations |
0.5.0 |
org.apache.zookeeper |
zookeeper |
3.6.2 |
org.apache.zookeeper |
zookeeper-jute |
3.6.2 |
org.checkerframework |
checker-qual |
3.5.0 |
org.codehaus.jackson |
jackson-core-asl |
1.9.13 |
org.codehaus.jackson |
jackson-mapper-asl |
1.9.13 |
org.codehaus.janino |
commons-compiler |
3.0.16 |
org.codehaus.janino |
janino |
3.0.16 |
org.datanucleus |
datanucleus-api-jdo |
4.2.4 |
org.datanucleus |
datanucleus-core |
4.1.17 |
org.datanucleus |
datanucleus-rdbms |
4.1.19 |
org.datanucleus |
javax.jdo |
3.2.0-m3 |
org.eclipse.jetty |
jetty-client |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-continuation |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-http |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-io |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-jndi |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-plus |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-proxy |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-security |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-server |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-servlet |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-servlets |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-util |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-util-ajax |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-webapp |
9.4.46.v20220331 |
org.eclipse.jetty |
jetty-xml |
9.4.46.v20220331 |
org.eclipse.jetty.websocket |
websocket-api |
9.4.46.v20220331 |
org.eclipse.jetty.websocket |
websocket-client |
9.4.46.v20220331 |
org.eclipse.jetty.websocket |
websocket-common |
9.4.46.v20220331 |
org.eclipse.jetty.websocket |
websocket-server |
9.4.46.v20220331 |
org.eclipse.jetty.websocket |
websocket-servlet |
9.4.46.v20220331 |
org.fusesource.leveldbjni |
leveldbjni-all |
1.8 |
org.glassfish.hk2 |
hk2-api |
2.6.1 |
org.glassfish.hk2 |
hk2-locator |
2.6.1 |
org.glassfish.hk2 |
hk2-utils |
2.6.1 |
org.glassfish.hk2 |
osgi-resource-locator |
1.0.3 |
org.glassfish.hk2.external |
aopalliance-repackaged |
2.6.1 |
org.glassfish.hk2.external |
jakarta.inject |
2.6.1 |
org.glassfish.jersey.containers |
jersey-container-servlet |
2.34 |
org.glassfish.jersey.containers |
jersey-container-servlet-core |
2.34 |
org.glassfish.jersey.core |
jersey-client |
2.34 |
org.glassfish.jersey.core |
jersey-common |
2.34 |
org.glassfish.jersey.core |
jersey-server |
2.34 |
org.glassfish.jersey.inject |
jersey-hk2 |
2.34 |
org.hibernate.validator |
hibernate-validator |
6.1.0.Final |
org.javassist |
javassist |
3.25.0-GA |
org.jboss.logging |
jboss-logging |
3.3.2.Final |
org.jdbi |
jdbi |
2.63.1 |
org.jetbrains |
annotations |
17.0.0 |
org.joda |
joda-convert |
1.7 |
org.jodd |
jodd-core |
3.5.2 |
org.json4s |
json4s-ast_2.12 |
3.7.0-M11 |
org.json4s |
json4s-core_2.12 |
3.7.0-M11 |
org.json4s |
json4s-jackson_2.12 |
3.7.0-M11 |
org.json4s |
json4s-scalap_2.12 |
3.7.0-M11 |
org.lz4 |
lz4-java |
1.8.0 |
org.mariadb.jdbc |
mariadb-java-client |
2.7.4 |
org.mlflow |
mlflow-spark |
1.27.0 |
org.objenesis |
objenesis |
2.5.1 |
org.postgresql |
postgresql |
42.3.3 |
org.roaringbitmap |
RoaringBitmap |
0.9.25 |
org.roaringbitmap |
shims |
0.9.25 |
org.rocksdb |
rocksdbjni |
6.24.2 |
org.rosuda.REngine |
REngine |
2.1.0 |
org.scala-lang |
scala-compiler_2.12 |
2.12.14 |
org.scala-lang |
scala-library_2.12 |
2.12.14 |
org.scala-lang |
scala-reflect_2.12 |
2.12.14 |
org.scala-lang.modules |
scala-collection-compat_2.12 |
2.4.3 |
org.scala-lang.modules |
scala-parser-combinators_2.12 |
1.1.2 |
org.scala-lang.modules |
scala-xml_2.12 |
1.2.0 |
org.scala-sbt |
test-interface |
1.0 |
org.scalacheck |
scalacheck_2.12 |
1.14.2 |
org.scalactic |
scalactic_2.12 |
3.0.8 |
org.scalanlp |
breeze-macros_2.12 |
1.2 |
org.scalanlp |
breeze_2.12 |
1.2 |
org.scalatest |
scalatest_2.12 |
3.0.8 |
org.slf4j |
jcl-over-slf4j |
1.7.36 |
org.slf4j |
jul-to-slf4j |
1.7.36 |
org.slf4j |
slf4j-api |
1.7.36 |
org.spark-project.spark |
unused |
1.0.0 |
org.threeten |
threeten-extra |
1.5.0 |
org.tukaani |
xz |
1.8 |
org.typelevel |
algebra_2.12 |
2.0.1 |
org.typelevel |
cats-kernel_2.12 |
2.1.1 |
org.typelevel |
macro-compat_2.12 |
1.1.1 |
org.typelevel |
spire-macros_2.12 |
0.17.0 |
org.typelevel |
spire-platform_2.12 |
0.17.0 |
org.typelevel |
spire-util_2.12 |
0.17.0 |
org.typelevel |
spire_2.12 |
0.17.0 |
org.wildfly.openssl |
wildfly-openssl |
1.0.7.Final |
org.xerial |
sqlite-jdbc |
3.8.11.2 |
org.xerial.snappy |
snappy-java |
1.1.8.4 |
org.yaml |
snakeyaml |
1.24 |
oro |
oro |
2.0.8 |
pl.edu.icm |
JLargeArrays |
1.5 |
software.amazon.ion |
ion-java |
1.0.2 |
stax |
stax-api |
1.0.1 |