January 2026

These features and Databricks platform improvements were released in January 2026.

note

Releases are staged. Your Databricks account might not be updated until a week or more after the initial release date.

Compute sizing for Databricks Apps is now generally available

January 28, 2026

Compute sizing for Databricks Apps is now generally available. Choose between Medium (2 vCPUs, 6 GB of memory) and Large (4 vCPUs, 12 GB of memory) compute sizes to match your workload requirements.

See Configure the compute size for a Databricks app.

Databricks Runtime maintenance updates (01/27)

January 27, 2026

New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see:

Agent Bricks: Knowledge Assistant is now available in more regions

January 27, 2026

Agent Bricks: Knowledge Assistant is now available in the following AWS regions: us-east-2, ca-central-1, eu-central-1, eu-west-1, ap-southeast-1 (requires cross-geo processing), and ap-southeast-2 (requires cross-geo processing).

Users in these regions can now use Agent Bricks: Knowledge Assistant to create a production-grade AI agent that can answer questions about their documents and provide high-quality responses with citations.

See Use Agent Bricks: Knowledge Assistant to create a high-quality chatbot over your documents.

Managed MCP servers are now in Public Preview

January 27, 2026

Databricks Managed MCP servers are now in Public Preview. Managed MCP servers allow your AI agents to securely connect to Databricks resources and external APIs. See Model Context Protocol (MCP) on Databricks.
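
As a minimal sketch of connecting an agent tool client to a managed MCP server, the following uses the MCP Python SDK over streamable HTTP. The server URL pattern, catalog and schema names, and the token value are assumptions for illustration; use the URLs listed in the Databricks MCP documentation for your workspace.

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

# Assumed URL pattern and credentials; replace with the managed MCP server
# URL and token from your workspace.
SERVER_URL = "https://<workspace-hostname>/api/2.0/mcp/functions/main/default"
HEADERS = {"Authorization": "Bearer <databricks-token>"}


async def main() -> None:
    # Open a streamable HTTP connection to the managed MCP server and list
    # the tools it exposes.
    async with streamablehttp_client(SERVER_URL, headers=HEADERS) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```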

Store and query MLflow traces in Unity Catalog (Beta)

January 26, 2026

You can now store MLflow traces in Unity Catalog tables using OpenTelemetry format and query them using Databricks SQL. This provides several benefits:

  • Store unlimited traces in Delta tables for long-term retention and analysis
  • Query trace data directly using SQL through a Databricks SQL warehouse
  • Manage access control through Unity Catalog schema and table permissions
  • Ensure compatibility with other OpenTelemetry clients and tools

See Store MLflow traces in Unity Catalog and Query MLflow traces using Databricks SQL.
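
As a minimal sketch of querying trace data from a notebook, the following assumes a trace table at main.ml_observability.mlflow_traces with OpenTelemetry-style columns such as trace_id, status, execution_duration_ms, and request_time; the table and column names are placeholders, so check the schema of your own trace table.

```python
# Minimal sketch: query MLflow traces stored in a Unity Catalog table.
# The table name and column names below are placeholders.
traces = spark.sql("""
    SELECT
        trace_id,
        status,
        execution_duration_ms,
        request_time
    FROM main.ml_observability.mlflow_traces
    WHERE request_time >= current_date() - INTERVAL 7 DAYS
    ORDER BY execution_duration_ms DESC
    LIMIT 20
""")
display(traces)
```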

Google Drive connector (Beta)

January 23, 2026

The standard Google Drive connector in Lakeflow Connect allows you to ingest Google Drive files into Databricks. You can use read_files, spark.read, COPY INTO, and Auto Loader to create Spark DataFrames, materialized views, and streaming tables, enabling you to build custom pipelines for common file ingestion use cases. See Ingest Google Drive files into Databricks.
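
As a minimal sketch, assuming the connector exposes ingested Google Drive files at a Unity Catalog volume path, the following reads CSV files with the read_files function; the path, file format, and options are placeholders, so use the location format described in the connector documentation.

```python
# Minimal sketch: read ingested Google Drive files with read_files.
# The volume path and file format below are placeholders.
df = spark.sql("""
    SELECT *
    FROM read_files(
        '/Volumes/main/ingest/google_drive/reports/',
        format => 'csv',
        header => true
    )
""")
df.display()
```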

AWS front-end PrivateLink for performance-intensive services

January 22, 2026

You can now use AWS front-end PrivateLink for private connectivity to performance-intensive services like Zerobus Ingest and Lakebase Autoscaling. See Configure front-end PrivateLink for performance-intensive services.

Lakebase is now generally available

January 22, 2026

Lakebase is now generally available. Lakebase Autoscaling and Lakebase Provisioned are now unified in a single UI. GA includes autoscaling, scale-to-zero, instant branching, automated backups, point-in-time recovery, storage up to 8 TB, and expanded region availability.

See Get started with Lakebase Postgres.
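
As a minimal sketch of getting started, the following connects to a Lakebase Postgres instance with a standard Postgres client; the host, database, user, and credential values are placeholders, so take the actual connection details from your instance page.

```python
import psycopg2

# Minimal sketch: connect to a Lakebase Postgres instance with a standard
# Postgres client. All connection values below are placeholders.
conn = psycopg2.connect(
    host="<instance-hostname>",
    port=5432,
    dbname="<database-name>",
    user="<databricks-user>",
    password="<oauth-token-or-password>",
    sslmode="require",
)

with conn, conn.cursor() as cur:
    cur.execute("SELECT version()")
    print(cur.fetchone()[0])
```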

Serverless egress IPs available via public JSON endpoint (Public Preview)

January 22, 2026

Starting mid-February 2026, you can retrieve serverless egress IP addresses from a public JSON endpoint. This replaces the existing stable IPs shared with customers enrolled in the Public Preview. See Configure a firewall for serverless compute access.
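
As a minimal sketch of consuming the endpoint from a script, the following fetches and prints the published IP ranges; the URL and the assumed response shape are placeholders, since the actual endpoint and schema are documented on the firewall configuration page.

```python
import requests

# Minimal sketch: retrieve serverless egress IPs from the public JSON endpoint.
# The URL and the assumed {region: [CIDR, ...]} response shape are placeholders.
ENDPOINT = "https://example.databricks.com/serverless-egress-ips.json"

resp = requests.get(ENDPOINT, timeout=10)
resp.raise_for_status()

for region, cidrs in resp.json().items():
    print(region, ", ".join(cidrs))
```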

You can now use custom base environments for Python, Python wheel, and notebook tasks in serverless jobs

January 20, 2026

Serverless jobs now support custom base environments defined with YAML files for Python, Python wheel, and notebook tasks. For notebook tasks, you can either select a custom base environment in the job’s environment configuration or use the notebook’s own environment settings, which support both workspace environments and custom base environments.

For more information, see Manage serverless base environments.

Unified Lakebase interface

January 15, 2026

Lakebase Provisioned instances are now accessible through the Lakebase App (via the apps switcher in the Databricks UI). The new unified interface consolidates Lakebase Provisioned and Lakebase Autoscaling management in one location, replacing the previous workflow of navigating to the Compute tab in the Lakehouse UI.

Databricks Runtime 18.0 is now GA

January 15, 2026

Databricks Runtime 18.0 is now generally available. See Databricks Runtime 18.0 and Databricks Runtime 18.0 for Machine Learning.

Ingest Salesforce formula fields incrementally (Beta)

January 13, 2026

By default, Lakeflow Connect doesn't ingest Salesforce formula fields incrementally. Instead, it takes a snapshot of these fields on each pipeline run, then joins them with the rest of the table. However, you can now enable incremental formula field ingestion, which often significantly improves performance and reduces costs. See Ingest Salesforce formula fields incrementally.

Agent Bricks: Knowledge Assistant is now generally available

January 13, 2026

Agent Bricks provides a streamlined approach to operationalize data into production-grade AI agents. Use Agent Bricks: Knowledge Assistant to create a chatbot that can answer questions about your documents and provide high-quality responses with citations.

Knowledge Assistant is now generally available in select US regions for workspaces without Enhanced Security and Compliance features. For workspaces with Enhanced Security and Compliance, Knowledge Assistant will be generally available soon. See What's coming.

Row filtering for managed ingestion connectors (Beta)

January 13, 2026

Lakeflow Connect now offers row filtering for managed ingestion connectors to improve performance and minimize data duplication. Row filtering applies conditions similar to a SQL WHERE clause, allowing you to ingest only the data you need from your source systems.

This feature is available for the Google Analytics, Salesforce, and ServiceNow connectors.

See Select rows to ingest.

OpenAI GPT-5.1 Codex Max and Codex Mini now available as Databricks-hosted models

January 12, 2026

Mosaic AI Model Serving now supports OpenAI GPT-5.1 Codex Max and GPT-5.1 Codex Mini as Databricks-hosted models. These code-specialized models excel at code generation, refactoring, and software engineering tasks. You can access these models using Foundation Model APIs pay-per-token.

Customers are responsible for ensuring their compliance with the terms of OpenAI's Acceptable Use Policy.
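
As a minimal sketch, the following queries one of these models through the OpenAI-compatible interface for Foundation Model APIs; the endpoint name databricks-gpt-5-1-codex-max, the workspace hostname, and the token value are assumptions, so check the Serving page in your workspace for the exact pay-per-token endpoint name.

```python
from openai import OpenAI

# Minimal sketch: call a Databricks-hosted model through the
# OpenAI-compatible Foundation Model APIs (pay-per-token).
# The endpoint name below is an assumption; check the Serving UI.
client = OpenAI(
    api_key="<databricks-personal-access-token>",
    base_url="https://<workspace-hostname>/serving-endpoints",
)

response = client.chat.completions.create(
    model="databricks-gpt-5-1-codex-max",  # assumed endpoint name
    messages=[
        {"role": "user", "content": "Refactor this function to remove the nested loop."}
    ],
    max_tokens=512,
)
print(response.choices[0].message.content)
```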

Databricks Runtime maintenance updates (01/09)

January 9, 2026

New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see:

Create skills for Databricks Assistant

January 6, 2026

You can now create skills to extend Databricks Assistant in agent mode with specialized capabilities for domain-specific tasks. User skills follow the open Agent Skills standard and are automatically loaded when relevant.

See Extend the Assistant with agent skills.

Automatic email notifications for expiring personal access tokens (GA)

January 6, 2026

Automatic email notifications for expiring personal access tokens are now generally available. For more information, see Set the maximum lifetime of new personal access tokens.