What are workspace files?
A workspace file is any file in the Databricks workspace that is not a Databricks notebook. Workspace files can be any file type. Common examples include:

- .py files used in custom modules.
- .md files, such as README.md.
- .csv or other small data files.
- .txt files.
- .whl libraries.
- Log files.
Workspace files include files formerly referred to as “Files in Repos.”
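For example, a small data file stored as a workspace file can be read directly by path. The following is a minimal sketch that assumes Databricks Runtime 11.3 LTS or above, where workspace files are exposed under the /Workspace path prefix; the file location is hypothetical.

```python
import pandas as pd

# Hypothetical workspace file path; replace it with the location of your own file.
# On Databricks Runtime 11.3 LTS and above, workspace files are accessible
# under the /Workspace filesystem prefix.
csv_path = "/Workspace/Users/someone@example.com/data/sales.csv"

df = pd.read_csv(csv_path)
print(df.head())
```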
Important
Workspace files are enabled everywhere by default in Databricks Runtime 11.2 and above, but admins can disable them using the REST API. For production workloads, use Databricks Runtime 11.3 LTS or above. Contact your workspace administrator if you cannot access this functionality.
What you can do with workspace files
Databricks provides functionality similar to local development for many workspace file types, including a built-in file editor. Not all use cases for all file types are supported. For example, while you can include images in an imported directory or repository, you cannot embed images in notebooks.
You can create, edit, and manage access to workspace files using familiar patterns from notebook interactions. You can use relative paths for library imports from workspace files, similar to local development, as in the sketch below.
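The following is a minimal sketch of a relative import, assuming a hypothetical layout in which the notebook sits next to a utils/ folder containing helpers.py; the folder name, module name, and add() function are all assumptions for illustration.

```python
import os
import sys

# Hypothetical workspace layout:
#   my_project/
#   ├── notebook            (the notebook being run)
#   └── utils/helpers.py    (a workspace file defining add())

# In Databricks Runtime 14.0 and above, the current working directory is the
# notebook's directory, so a relative path resolves next to the notebook.
sys.path.append(os.path.abspath("./utils"))

from helpers import add  # assumes helpers.py defines add(a, b)

print(add(2, 3))  # 5
```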
Init scripts stored in workspace files have special behavior. You can use workspace files to store and reference init scripts in any Databricks Runtime version. See Store init scripts in workspace files.
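As a rough sketch of referencing such a script, the cluster specification below points its init script at a workspace file through the Clusters REST API. The host, token, node type, Spark version, and script path are placeholders, and the endpoint version is an assumption; adapt them to your workspace.

```python
import requests

# Placeholder workspace URL and personal access token.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

payload = {
    "cluster_name": "workspace-init-demo",
    "spark_version": "13.3.x-scala2.12",   # example runtime string
    "node_type_id": "i3.xlarge",           # example node type
    "num_workers": 1,
    # Reference an init script stored as a workspace file.
    "init_scripts": [
        {"workspace": {"destination": "/Users/someone@example.com/init/install-deps.sh"}}
    ],
}

resp = requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())  # the response includes the new cluster_id
```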
Note
In Databricks Runtime 14.0 and above, the default current working directory (CWD) for code executed locally is the directory containing the notebook or script being run. This is a change in behavior from Databricks Runtime 13.3 LTS and below. See What is the default current working directory in Databricks Runtime 14.0 and above?.
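A quick way to see this behavior is to print the working directory from a notebook cell; the output path and the config.json file below are purely illustrative.

```python
import os

# On Databricks Runtime 14.0 and above, this prints the directory that
# contains the notebook or script being run.
print(os.getcwd())
# Illustrative output: /Workspace/Users/someone@example.com/my_project

# A relative path therefore resolves next to the notebook
# (config.json is a hypothetical sibling workspace file).
config_path = os.path.join(os.getcwd(), "config.json")
print(config_path)
```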
Databricks Runtime versions for files in Repos on clusters with Databricks Container Services
On clusters running Databricks Runtime 11.3 LTS and above, the default settings allow you to use workspace files in Repos with Databricks Container Services (DCS).
On clusters running Databricks Runtime 10.4 LTS or 9.1 LTS with DCS, you must configure the Dockerfile to access workspace files in Repos. Refer to the following Dockerfiles for the desired Databricks Runtime version: