Feature Store and Model Serving
Lakebase Autoscaling is the latest version of Lakebase, with autoscaling compute, scale-to-zero, branching, and instant restore. For supported regions, see Region availability. If you are a Lakebase Provisioned user, see Lakebase Provisioned.
Databricks Online Feature Stores are powered by Lakebase Autoscaling. When you create an online store using the Feature Engineering client, Databricks provisions a Lakebase Autoscaling project as the underlying storage backend, giving you low-latency access to feature data for real-time ML inference.
Use cases
- Real-time model inference: Serve the latest feature values to model serving endpoints with low latency. Models trained with Databricks Feature Engineering automatically track lineage to their features and use Unity Catalog to locate the appropriate online store at serving time.
- Feature serving endpoints: Serve features directly to external applications and services without a model, using Feature Serving Endpoints.
- Recommendation systems, fraud detection, personalization: Any application requiring consistent, low-latency, high-throughput lookups of feature values synced from offline feature tables.
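For the feature-serving use case, external applications query the endpoint over its REST `invocations` route with a `dataframe_records` payload containing the feature table's primary-key columns. The sketch below builds such a request; the workspace URL, endpoint name, and `user_id` key column are placeholders, and actually sending the request (commented out) requires a real workspace and access token.

```python
import json


def build_invocation_request(workspace_url: str, endpoint_name: str,
                             token: str, records: list) -> tuple:
    """Build the URL, headers, and JSON body for querying a serving endpoint.

    Serving endpoints accept POST requests at
    /serving-endpoints/<name>/invocations with a `dataframe_records` payload.
    """
    url = f"{workspace_url}/serving-endpoints/{endpoint_name}/invocations"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    # Each record carries the primary-key columns of the feature table;
    # the endpoint looks up the remaining feature values in the online store.
    body = json.dumps({"dataframe_records": records})
    return url, headers, body


# Placeholder values -- substitute your workspace URL, endpoint, and token.
url, headers, body = build_invocation_request(
    "https://my-workspace.cloud.databricks.com",  # hypothetical workspace
    "user-features",                              # hypothetical endpoint name
    "<personal-access-token>",
    [{"user_id": 1001}],                          # hypothetical key column
)
# import requests
# response = requests.post(url, headers=headers, data=body)
```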
How it works
Online Feature Stores sync data from offline Unity Catalog feature tables into a Lakebase Autoscaling project. You control how often data syncs using publish modes:
- TRIGGERED (default): Incremental sync on a schedule or via API.
- CONTINUOUS: Streaming pipeline that updates the online store as new data is written to the offline table.
- SNAPSHOT: One-time full copy, efficient for bulk updates.
Because the online store is a Lakebase Autoscaling project, it benefits from automatic compute scaling, scale-to-zero during inactivity, and Unity Catalog governance.
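The three publish modes above can be summarized in a small helper. This is an illustrative sketch only: the dict it returns is not the Feature Engineering client's actual spec schema, and the mode names and the TRIGGERED default come from the list above.

```python
# Publish modes for syncing an offline feature table to the online store,
# with the semantics described above. The spec shape is illustrative only,
# not the Feature Engineering client's real API.
PUBLISH_MODES = {
    "TRIGGERED": "incremental sync on a schedule or via API (default)",
    "CONTINUOUS": "streaming pipeline that updates the store as data lands",
    "SNAPSHOT": "one-time full copy, efficient for bulk updates",
}


def make_publish_spec(source_table: str, mode: str = "TRIGGERED") -> dict:
    """Return an illustrative publish configuration for an online-store sync."""
    if mode not in PUBLISH_MODES:
        raise ValueError(f"unknown publish mode: {mode!r}")
    return {
        "source_table": source_table,  # offline Unity Catalog feature table
        "publish_mode": mode,
        "description": PUBLISH_MODES[mode],
    }
```

For example, `make_publish_spec("main.ml.user_features")` would describe the default incremental sync, while passing `mode="SNAPSHOT"` would describe a one-time bulk copy.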
New Online Feature Stores are created as Lakebase Autoscaling projects. If you have existing Lakebase Provisioned online stores, see Autoscaling by default for migration details.
Implementation
For full setup instructions, API reference, and notebook examples, see Databricks Online Feature Stores.