October 2023
These features and Databricks platform improvements were released in October 2023.
Note
Releases are staged. Your Databricks workspace might not be updated until a week or more after the initial release date.
Published partner OAuth applications are easier to use by default
October 31, 2023
Previously, account admins had to run one-off commands to explicitly enable individual published partner OAuth applications for their Databricks accounts. This meant that users couldn’t seamlessly log in from their favorite BI tools or Databricks utilities without using a personal access token. Published partner OAuth applications are now enabled by default for all Databricks accounts. Account admins can disable published partner OAuth applications for their accounts. See Databricks sign-on from partner solutions.
View the YAML source for a Databricks job
October 30, 2023
You can now view and copy the YAML source for a job by clicking on the job details page and selecting View YAML/JSON. You can use the YAML source to create CI/CD workflows with Databricks Asset Bundles. See What are Databricks Asset Bundles?.
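As a sketch of what the copied YAML source can look like, the following is a minimal, hypothetical job definition (the resource name, task key, and notebook path are all illustrative, not from the product) that could be dropped into a Databricks Asset Bundle configuration:

```yaml
# Hypothetical job definition; names and paths are illustrative.
resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ./notebooks/ingest.py
```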
Add conditional logic to your Databricks workflows
October 30, 2023
You can now use the If/else condition task to conditionally run tasks in a Databricks job based on the results of a boolean expression. See Add branching logic to a job with the If/else task.
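A hedged sketch of how an If/else condition task might appear in a job definition follows; the task keys and the task-value reference are hypothetical, and the condition operators and `outcome` field are assumed to match the Jobs configuration schema:

```yaml
# Illustrative If/else condition task; task names and values are hypothetical.
tasks:
  - task_key: check_row_count
    condition_task:
      op: "GREATER_THAN"
      left: "{{tasks.ingest.values.row_count}}"
      right: "0"
  - task_key: process
    depends_on:
      - task_key: check_row_count
        outcome: "true"   # runs only when the condition evaluates to true
```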
Configure parameters on a Databricks job that can be referenced by all job tasks
October 30, 2023
You can now add parameters to your Databricks jobs that are automatically passed to all job tasks that accept key-value pairs. See Configure job parameters. Additionally, you can now use an expanded set of value references to pass context and state between job tasks. See What is a dynamic value reference?.
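As a minimal sketch (the parameter name and default are hypothetical), a job-level parameter can be declared once and then referenced from any task with a dynamic value reference such as `{{job.parameters.env}}`:

```yaml
# Illustrative job-level parameter, passed to all tasks that accept key-value pairs.
parameters:
  - name: env
    default: "dev"
# Tasks can then reference the value with the dynamic value reference
# {{job.parameters.env}}
```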
Support for new GPU instance types
October 30, 2023
Databricks has added support for P4de and P5 instance types. See GPU-enabled compute.
Auto-enable deletion vectors
October 30, 2023
You can now enable deletion vectors on all new Delta tables in Databricks Runtime 14.0 and above with the workspace admin setting Auto-Enable Deletion Vectors. This behavior is currently opt-in but will change to opt-out in the future. Databricks recommends manually configuring an option for this setting. See Auto-enable deletion vectors.
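Independently of the workspace-level setting, deletion vectors can also be enabled per table through the Delta table property `delta.enableDeletionVectors`; a minimal sketch, with an illustrative table name:

```sql
-- Enable deletion vectors on an existing Delta table (table name is illustrative).
ALTER TABLE main.default.events
SET TBLPROPERTIES ('delta.enableDeletionVectors' = true);
```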
Unity Catalog support for UNDROP TABLE is GA
October 25, 2023
You can undrop a dropped managed or external table in an existing schema within seven days of dropping. Requires Databricks Runtime 12.1 and above. See UNDROP TABLE and SHOW TABLES DROPPED.
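A short sketch of the workflow, with illustrative schema and table names: first list the recently dropped tables in a schema, then restore one by name:

```sql
-- List recently dropped tables in a schema, then restore one (names illustrative).
SHOW TABLES DROPPED IN main.default;
UNDROP TABLE main.default.orders;
```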
Partner Connect supports Dataiku
October 25, 2023
You can now use Partner Connect to connect your Databricks workspace to Dataiku. See Connect to Dataiku.
Mosaic AutoML generated notebooks are now saved as MLflow artifacts
October 24, 2023
Mosaic AutoML generated notebooks are now saved as MLflow artifacts in all Databricks Runtime for Machine Learning versions.
Predictive optimization (Public Preview)
October 24, 2023
Predictive optimization removes the need to manually manage maintenance operations for Delta tables. Maintenance operations are run only as necessary, eliminating both unnecessary maintenance runs and the burden of tracking and troubleshooting maintenance performance. See Predictive optimization for Unity Catalog managed tables.
Compute system tables are now available (Public Preview)
October 23, 2023
The system.compute schema contains two new tables you can use to monitor the compute resources in your account: clusters and node_types.
The clusters table is a slow-changing dimension table that contains the full history of cluster configurations over time for all-purpose and jobs clusters. The node_types table captures the currently available node types with their basic hardware information. If you don’t have access to these system tables, ensure you have enabled the compute schema in your account (see Enable system table schemas).
For more information on compute system tables, see Compute system tables reference.
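As a starting point, the two tables can be queried directly with SQL; these exploratory queries avoid assuming specific column names:

```sql
-- Sample exploratory queries against the new compute system tables.
SELECT * FROM system.compute.node_types;
SELECT * FROM system.compute.clusters LIMIT 10;
```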
On-demand feature computation is GA
October 19, 2023
Machine learning features can now be computed on-demand at inference time. This enables models to compute features using inputs that are only available at inference time, such as a user’s current location, or to use features that are cost-prohibitive to precompute, store, and refresh. Model training code can define an arbitrary graph of feature lookups and computations that are executed during model training and inference.
Feature computation logic, models, and data are all governed by Unity Catalog. For more information, see Compute features on demand using Python user-defined functions.
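Because on-demand features are defined as Python user-defined functions governed by Unity Catalog, the computation logic can be expressed as a UC function; a minimal sketch, where the function name, catalog, and schema are hypothetical:

```sql
-- Illustrative on-demand feature function (names are hypothetical).
CREATE OR REPLACE FUNCTION main.default.distance_km(
  lat1 DOUBLE, lon1 DOUBLE, lat2 DOUBLE, lon2 DOUBLE
)
RETURNS DOUBLE
LANGUAGE PYTHON
AS $$
import math
# Haversine distance between two points, computable at inference time
# from inputs such as a user's current location.
r = 6371.0
p1, p2 = math.radians(lat1), math.radians(lat2)
dp = math.radians(lat2 - lat1)
dl = math.radians(lon2 - lon1)
a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
return 2 * r * math.asin(math.sqrt(a))
$$;
```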
Feature Engineering in Unity Catalog is GA
October 19, 2023
With Feature Engineering in Unity Catalog, Unity Catalog becomes your feature store. You can use any Delta table with a primary key as a feature table for model training or inference. Unity Catalog provides feature discovery and governance.
AI-generated table comments (Public Preview)
October 18, 2023
As part of the initiative to use AI to assist you as you work with Databricks, Databricks is introducing AI-generated table and column comments to Public Preview. In Catalog Explorer, you can view, edit, and add an AI-generated comment for any table or table column managed by Unity Catalog. Comments are powered by a large language model (LLM) that takes into account the table metadata, such as the table schema and column names. In compliance security profile workspaces, AI-generated comments may use external model partners to provide responses. Data sent to these services is not used for model training. For all other workspaces on AWS, AI-generated comments use an internal model.
Compliance security profile works with serverless SQL warehouses in ap-southeast-2 region (Public Preview)
October 18, 2023
If your workspace has the compliance security profile enabled, you can use serverless SQL warehouses to process regulated data under HIPAA, PCI-DSS, and FedRAMP Moderate. In addition to the us-east-1 region, this Public Preview is now available in the ap-southeast-2 region. See Which compute resources get enhanced security.
Serverless SQL warehouse support for the compliance security profile will be incrementally rolled out to all customers in the two regions over several weeks. To prioritize the enrollment of your account for using the preview, contact your Databricks account team.
Models in Unity Catalog is GA
October 17, 2023
ML Models in Unity Catalog are now generally available. Unity Catalog provides centralized access control, auditing, lineage, model sharing across workspaces, and better MLOps deployment workflows. Databricks recommends using Models in Unity Catalog instead of the Workspace Model Registry. See Manage model lifecycle in Unity Catalog for details.
Libraries now supported in compute policies (Public Preview)
October 17, 2023
Workspace admins can now add libraries to compute policies. Compute resources that use the policy automatically install those libraries. Users can’t install or uninstall compute-scoped libraries on compute that uses the policy, and previously installed libraries are uninstalled.
Databricks recommends migrating all libraries installed with init scripts to use compute policies. See Add libraries to a policy.
Partner Connect supports Monte Carlo
October 16, 2023
You can now use Partner Connect to connect your Databricks workspace to Monte Carlo. For more information, see Connect Databricks to Monte Carlo.
Semantic search (Public Preview)
October 16, 2023
You can now use natural language to search Unity Catalog tables in the advanced Search dialog. See Semantic search.
Enable Databricks Assistant at the workspace level
October 11, 2023
A workspace admin can now enable or disable Databricks Assistant for an individual workspace if the account admin has allowed it. For details, see How do I enable Databricks Assistant?.
IP access lists for the account console is GA
October 11, 2023
IP access lists for the account console are now generally available. This feature allows you to control access to the account console by IP address ranges. See Configure IP access lists for the account console.
New Photon defaults
October 11, 2023
When you create a new cluster through the UI, Photon is enabled by default as the Databricks Runtime engine. This applies to all-purpose and job clusters.
New clusters created with a Photon-compatible cluster policy have Photon enabled by default. A cluster policy is Photon-compatible if the Databricks Runtime version supports Photon, the node type is supported, and runtime_engine is not explicitly set to STANDARD.
Databricks Runtime 14.1 is GA
October 11, 2023
Databricks Runtime 14.1 and Databricks Runtime 14.1 ML are now generally available.
See Databricks Runtime 14.1 and Databricks Runtime 14.1 for Machine Learning.
Developer tools release notes have moved
October 10, 2023
Release notes for Databricks developer tools after October 10, 2023 are now posted in the Databricks developer tools and SDKs release notes instead of the Databricks platform release notes.
Databricks extension for Visual Studio Code updated to version 1.1.5
October 9, 2023
The Databricks extension for Visual Studio Code version 1.1.5 contains a few minor fixes. For details, see the changelog for version 1.1.5.
Predictive I/O for updates is GA
October 9, 2023
Predictive I/O for updates is now generally available on Databricks Runtime 14.0 and above. See What is predictive I/O?.
Deletion vectors are GA
October 9, 2023
Deletion vectors are now generally available on Databricks Runtime 14.0 and above. See What are deletion vectors?.
Automatic enablement of Unity Catalog for new workspaces
November 8, 2023
Databricks has begun to enable Unity Catalog automatically for new workspaces. This removes the need for account admins to configure Unity Catalog after a workspace is created. Rollout will proceed gradually across accounts and regions. At first, only new workspaces on new accounts will be enabled. See Automatic enablement of Unity Catalog.
Infosec Registered Assessors Program (IRAP) compliance controls
October 9, 2023
Infosec Registered Assessors Program (IRAP) provides high-quality information and communications technology (ICT) security assessment services to the Australian government. IRAP compliance controls are available only in the ap-southeast-2 region. See IRAP compliance controls.
Partner Connect supports RudderStack
October 5, 2023
You can now use Partner Connect to connect your Databricks workspace to RudderStack. For more information, see Connect to RudderStack.
Databricks CLI updated to version 0.207.0 (Public Preview)
October 4, 2023
The Databricks command-line interface (Databricks CLI) has been updated to version 0.207.0. This release contains feature updates and fixes for Databricks Asset Bundles, makes additions and changes to several command groups and commands, and more. For details, see the changelog for version 0.207.0.
Run selected cells in a notebook
October 4, 2023
You can now run only selected cells in a notebook. See Run selected cells.
Use workspace-catalog binding to give read-only access to a catalog
October 4, 2023
When you use workspace-catalog binding to limit catalog access to specific workspaces in your account, you can now make that access read-only. Read-only workspace-catalog binding is helpful for scenarios like giving users read-only access to production data from a developer workspace to enable development and testing.
This update also deprecates the /api/2.1/unity-catalog/workspace-bindings/ API endpoint and replaces it with /api/2.1/unity-catalog/bindings/.
New in-product Help experience (Public Preview)
October 4, 2023
The new in-product Help experience is now in Public Preview. See Get help.
Databricks extension for Visual Studio Code updated to version 1.1.4
October 2, 2023
The Databricks extension for Visual Studio Code version 1.1.4 adds support for custom Databricks workspace URLs, and more. For details, see the changelog for version 1.1.4.
Databricks SDK for Python updated to version 0.10.0 (Beta)
October 3, 2023
Databricks SDK for Python version 0.10.0 introduces 7 breaking changes and adds 10 dataclasses, 6 fields, and one service. For details, see the changelog for version 0.10.0.
Databricks SDK for Go updated to version 0.22.0 (Beta)
October 3, 2023
Databricks SDK for Go version 0.22.0 introduces one breaking API change and adds one API. For details, see the changelog for version 0.22.0.