Follow the guidance below to respond to common error messages or to troubleshoot issues with Databricks Repos.
If a Git operation fails with an invalid credentials error, try the following:

- Confirm that the Git integration settings (**User Settings** > **Linked accounts**) are correct. You must enter both your Git provider username and token. Legacy Git integrations did not require a username, so you might need to add a username to work with Databricks Repos.
- Confirm that you have selected the correct Git provider in **User Settings** > **Linked accounts**.
- Ensure that your personal access token or app password has the correct repo access.
- If SSO is enabled on your Git provider, authorize your tokens for SSO.
- Test your token with the Git command line, replacing the text strings in angle brackets:

  ```
  git clone https://<username>:<personal-access-token>@github.com/<org>/<repo-name>.git
  ```
`<link>: Secure connection to <link> could not be established because of SSL problems`

This error occurs if your Git server is not accessible from Databricks. To access a private Git server, get in touch with your Databricks representative.
Expensive operations such as cloning a large repo or checking out a large branch might result in timeout errors, but the operation might complete in the background. You can also try again later if the workspace was under heavy load at the time.
To work with a large repo, try sparse checkout.
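Outside Databricks, the underlying Git feature works as in the following self-contained sketch; it builds a throwaway local repository (all names here are demo placeholders) and then clones only one of its directories:

```shell
# Build a throwaway repo with two directories to stand in for a large remote.
git init -q -b main demo-src
cd demo-src
mkdir big small
echo data > big/large.dat
echo code > small/main.py
git add .
git -c user.email=demo@example.com -c user.name=demo commit -qm "init"
cd ..

# Clone without checking files out, then restrict the checkout to small/.
git clone -q --no-checkout demo-src demo-sparse
cd demo-sparse
git sparse-checkout init --cone
git sparse-checkout set small
git checkout -q main
ls   # only small/ is present; big/ was never materialized
cd ..
```

In cone mode, only the listed directories (plus top-level files) are checked out, which keeps the working tree small even when the repository history is large.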
If you get a 404 error when you try to open a non-notebook file, wait a few minutes and then try again. There is a delay of a few minutes between when the workspace is enabled and when the webapp picks up the configuration flag.
Different notebooks with similar or identical filenames can cause an error when you create a repo or pull request:

`Cannot perform Git operation due to conflicting names...`

A folder cannot contain a notebook with the same name as another notebook, file, or folder in that folder (file extensions are excluded from the comparison). A naming conflict can therefore occur even when the file extensions differ: for example, a notebook named `report` and a file named `report.sql` conflict, because the comparison drops the extension.
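Because names are compared with extensions stripped, one way to spot potential conflicts before pushing is to look for duplicate base names. This is a hypothetical local check, not a Databricks command; the directory and file names are demo placeholders:

```shell
# Create a demo directory where report.py and report.sql share a base name.
mkdir conflict-demo
touch conflict-demo/report.py conflict-demo/report.sql conflict-demo/util.py

# Strip the extension from each name and print any base name that repeats.
ls conflict-demo | sed 's/\.[^.]*$//' | sort | uniq -d   # prints: report
```

Renaming one of the flagged files (for example, `report.sql` to `report_queries.sql`) resolves the conflict.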
If you get `Resource not found` errors after pulling non-notebook files into Databricks Repos, you might not be using Databricks Runtime 8.4 or above. A cluster running Databricks Runtime 8.4 or above is required to work with non-notebook files in a repo.
`There was a problem with deleting folders. The repo could be in an inconsistent state and re-cloning is recommended.`
This error indicates that a problem occurred while deleting folders from the repo. This could leave the repo in an inconsistent state, where folders that should have been deleted still exist. If this error occurs, Databricks recommends deleting and re-cloning the repo to reset its state.
`Unable to set repo to most recent state. This may be due to force pushes overriding commit history on the remote repo. Repo may be out of sync and re-cloning is recommended.`
This error indicates that the local and remote Git state have diverged. This can happen when a force push on the remote overrides recent commits that still exist on the local repo. Databricks does not support a hard reset within Repos and recommends deleting and re-cloning the repo if this error occurs.
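The divergence can be reproduced and detected with plain Git. This self-contained sketch fabricates a local bare "remote" and two clones (all names are demo placeholders), force-pushes rewritten history from one clone, and then checks from the other whether its HEAD is still reachable from the remote branch:

```shell
# A bare repo stands in for the remote; main is its default branch.
git init -q --bare -b main remote.git

# clone-a pushes the original commit.
git clone -q remote.git clone-a
cd clone-a
echo one > file.txt
git add file.txt
git -c user.email=demo@example.com -c user.name=demo commit -qm "first"
git push -q origin HEAD:main
cd ..

# clone-b rewrites that commit and force-pushes, overriding the history.
git clone -q remote.git clone-b
cd clone-b
git -c user.email=demo@example.com -c user.name=demo commit -q --amend -m "rewritten"
git push -qf origin HEAD:main
cd ..

# clone-a's HEAD is no longer an ancestor of origin/main: it has diverged.
cd clone-a
git fetch -q origin
git merge-base --is-ancestor HEAD origin/main && echo in-sync || echo diverged
cd ..
```

The final check prints `diverged`, which is the state the Databricks error message describes; since Repos does not support a hard reset, re-cloning is the way back to a consistent state.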
Files do not appear after you clone a remote repo or pull files into an existing one. If you know your workspace admin enabled Databricks Repos and support for arbitrary files, try the following:
- Confirm that your cluster is running Databricks Runtime 8.4 or above.
- Refresh your browser and restart your cluster to pick up the new configuration.
You might see a Databricks error message `No experiment for node found` or an error in MLflow when you work on an MLflow notebook experiment last logged to before the 3.72 platform release.
To resolve the error, log a new run in the notebook associated with that experiment.
This applies only to notebook experiments. Creation of new experiments in Repos is unsupported.