Working with pub/sub and message queues on Databricks

Databricks can integrate with stream messaging services for near-real-time data ingestion into the Databricks Lakehouse. It can also sync enriched and transformed data in the lakehouse with other streaming systems.

Ingesting streaming messages to Delta Lake lets you retain messages indefinitely, so you can replay data streams without fear of losing data to the retention thresholds of the message bus.
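As a rough sketch of this pattern, the Structured Streaming Kafka source can feed a Delta sink; setting `startingOffsets` to `earliest` replays whatever history the topic still retains, while the Delta table keeps everything written to it. The function below is illustrative: the parameter names are placeholders, and an active `SparkSession` on a Databricks cluster (or with the Kafka connector installed) is assumed.

```python
def ingest_kafka_to_delta(spark, bootstrap_servers, topic,
                          table_path, checkpoint_path):
    """Sketch: continuously append raw Kafka records to a Delta table.

    All arguments are hypothetical placeholders; `spark` is an existing
    SparkSession with the Kafka connector available.
    """
    raw = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", bootstrap_servers)
        .option("subscribe", topic)
        # Read from the start of the topic's retained history.
        .option("startingOffsets", "earliest")
        .load()
    )
    # Kafka delivers key/value as binary; cast to strings before persisting.
    messages = raw.selectExpr(
        "CAST(key AS STRING) AS key",
        "CAST(value AS STRING) AS value",
        "topic", "partition", "offset", "timestamp",
    )
    return (
        messages.writeStream
        .format("delta")
        # The checkpoint lets the stream recover exactly-once after restarts.
        .option("checkpointLocation", checkpoint_path)
        .outputMode("append")
        .start(table_path)
    )
```

Once the messages land in Delta, downstream jobs can re-read the table as a batch or as a new stream, independent of the original topic's retention window.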

Databricks has specific features for working with semi-structured data fields contained in Avro and JSON data payloads. To learn more, see:
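One common shape of this work is parsing a JSON string column (such as a Kafka `value`) into typed fields with `from_json`. The sketch below assumes a DataFrame with a string column named `value` and a hypothetical two-field payload; real payload schemas will differ.

```python
def parse_json_payload(df):
    """Sketch: expand a JSON string column `value` into typed columns.

    The schema here is a hypothetical example payload
    ({"device_id": "...", "reading": 1.0}); replace it with the real one.
    """
    # Imports kept inside the function so the sketch stays self-contained.
    from pyspark.sql import functions as F
    from pyspark.sql.types import (
        DoubleType, StringType, StructField, StructType,
    )

    payload_schema = StructType([
        StructField("device_id", StringType()),
        StructField("reading", DoubleType()),
    ])
    # from_json returns null for rows that do not match the schema,
    # so malformed messages surface as null structs rather than errors.
    return (
        df.withColumn("parsed", F.from_json("value", payload_schema))
          .select("parsed.device_id", "parsed.reading")
    )
```

Avro payloads follow the same pattern with `from_avro` in place of `from_json`, typically with the schema fetched from a schema registry.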

To learn more about specific configurations for streaming from or to message queues, see: