Developer tools and guidance
Learn about tools and guidance you can use to work with Databricks resources and data, and to develop Databricks applications.
| Section | Use this section when you want to… |
|---|---|
| Authentication | Authenticate with Databricks from your tools, scripts, and apps. You must authenticate with Databricks before you can work with Databricks resources and data. |
| IDEs | Connect to Databricks by using popular integrated development environments (IDEs) such as Visual Studio Code, PyCharm, IntelliJ IDEA, Eclipse, and RStudio, as well as automate Databricks by using IDE plugins. |
| SDKs | Automate Databricks from code libraries written for popular languages such as Go. |
| SQL drivers and connectors | Run SQL commands on Databricks from code written in popular languages such as Python, Go, JavaScript, and TypeScript. Connect tools and clients to Databricks through ODBC and JDBC connections. |
| CLIs | Automate Databricks by using command-line interfaces (CLIs). |
| Databricks Utilities | Use Databricks Utilities from within notebooks to do things such as work with object storage efficiently, chain and parameterize notebooks, and work with sensitive credential information. |
| REST API reference | Look up reference information for the Databricks REST APIs. |
| Infrastructure as code (IaC) | Automate the provisioning and maintenance of Databricks infrastructure and resources by using popular infrastructure-as-code (IaC) products such as Terraform, the Cloud Development Kit for Terraform, and Pulumi. |
| CI/CD | Implement industry-standard continuous integration and continuous delivery (CI/CD) practices for Databricks by using popular systems such as GitHub Actions, Azure Pipelines, GitLab CI/CD, Jenkins, and Apache Airflow. |
| SQL tools | Run SQL commands and scripts in Databricks by using Databricks CLIs, as well as popular tools such as DataGrip, DBeaver, and SQL Workbench/J. |
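The sketches below illustrate what working with several of the tools in this table can look like; they all use Python, and hostnames, paths, IDs, and other specific values are placeholders. For authentication, a minimal sketch using the Databricks SDK for Python is shown first. It assumes that the `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variables are set, or that a default profile exists in `.databrickscfg`.

```python
# Minimal authentication check with the Databricks SDK for Python.
# Assumes DATABRICKS_HOST and DATABRICKS_TOKEN are set in the environment,
# or that a default profile exists in ~/.databrickscfg.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up credentials from the environment or config file

me = w.current_user.me()
print(f"Authenticated to {w.config.host} as {me.user_name}")
```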
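From an IDE, one common pattern is Databricks Connect, which lets code that you run locally execute Spark workloads on a remote Databricks cluster. A rough sketch, assuming the databricks-connect package is installed and that authentication and a target cluster are already configured:

```python
# Run a Spark query on a remote Databricks cluster from local IDE code.
# Assumes the databricks-connect package is installed and authentication
# (including the target cluster) is configured via environment variables or a profile.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()

df = spark.sql("SELECT 1 AS sanity_check")
df.show()
```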
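To automate Databricks from a code library, the SDKs expose most workspace APIs; the Databricks SDK for Python is used here to match the other sketches, although SDKs for other languages such as Go work similarly. A sketch that lists the clusters in a workspace:

```python
# List clusters in the workspace by using the Databricks SDK for Python.
# Credentials are resolved the same way as in the authentication sketch above.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

for cluster in w.clusters.list():
    print(cluster.cluster_name, cluster.state)
```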
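To run SQL from application code, the Databricks SQL Connector for Python is one option. The sketch below assumes a SQL warehouse's server hostname and HTTP path, plus an access token, are available through environment variables (the variable names are placeholders):

```python
# Run a SQL statement against a Databricks SQL warehouse from Python.
# Assumes the databricks-sql-connector package is installed and that the
# hostname, HTTP path, and access token are provided via environment variables.
import os
from databricks import sql

with sql.connect(
    server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
    http_path=os.environ["DATABRICKS_HTTP_PATH"],
    access_token=os.environ["DATABRICKS_TOKEN"],
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_schema()")
        for row in cursor.fetchall():
            print(row)
```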
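Inside a notebook, Databricks Utilities is available as the predefined `dbutils` object. A brief sketch of the kinds of calls the table row refers to; the storage path, widget name, secret scope, and notebook path are placeholders:

```python
# Databricks Utilities calls from inside a notebook (dbutils is predefined there).
# The storage path, widget name, secret scope, and notebook path are placeholders.

# Work with object storage.
for entry in dbutils.fs.ls("/databricks-datasets"):
    print(entry.path)

# Parameterize the notebook.
dbutils.widgets.text("run_date", "2024-01-01")
run_date = dbutils.widgets.get("run_date")

# Read sensitive credential information from a secret scope.
api_key = dbutils.secrets.get(scope="my-scope", key="api-key")

# Chain to another notebook, passing parameters and a timeout in seconds.
result = dbutils.notebook.run("./child_notebook", 600, {"run_date": run_date})
```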
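The REST APIs can also be called directly from any HTTP client. A sketch using the requests library against the Clusters API; the workspace URL and token come from environment variables:

```python
# Call the Databricks REST API directly with an HTTP client.
# Assumes DATABRICKS_HOST (the workspace URL) and DATABRICKS_TOKEN are set.
import os
import requests

host = os.environ["DATABRICKS_HOST"].rstrip("/")
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

response = requests.get(f"{host}/api/2.0/clusters/list", headers=headers)
response.raise_for_status()

for cluster in response.json().get("clusters", []):
    print(cluster["cluster_id"], cluster["state"])
```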
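For infrastructure as code written in Python specifically, Pulumi's Databricks provider is one of the listed options (Terraform uses its own HCL configuration language instead). A rough sketch, assuming the pulumi and pulumi-databricks packages are installed and the provider is configured with workspace credentials; the cluster settings are illustrative only:

```python
# Declare a Databricks cluster with Pulumi's Databricks provider (a sketch).
# Assumes pulumi and pulumi-databricks are installed and the provider is
# configured with workspace credentials; the values below are placeholders.
import pulumi
import pulumi_databricks as databricks

cluster = databricks.Cluster(
    "example-cluster",
    cluster_name="example-cluster",
    spark_version="14.3.x-scala2.12",
    node_type_id="i3.xlarge",  # cloud-specific node type; placeholder
    num_workers=1,
    autotermination_minutes=20,
)

pulumi.export("cluster_id", cluster.id)
```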
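Of the CI/CD and orchestration systems listed, Apache Airflow is the one that is itself configured in Python. A sketch of a DAG that triggers an existing Databricks job, assuming the apache-airflow-providers-databricks package is installed, an Airflow connection named `databricks_default` points at the workspace, and the job ID is a placeholder:

```python
# Trigger an existing Databricks job from Apache Airflow (a sketch).
# Assumes the apache-airflow-providers-databricks package is installed and an
# Airflow connection named "databricks_default" points at the workspace.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="run_databricks_job",
    start_date=datetime(2024, 1, 1),
    schedule=None,  # trigger manually; use a cron expression to run on a schedule
    catchup=False,
) as dag:
    run_job = DatabricksRunNowOperator(
        task_id="run_job",
        databricks_conn_id="databricks_default",
        job_id=123,  # placeholder: the ID of an existing Databricks job
    )
```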
Databricks recommends that you use service principals instead of users to authenticate automated scripts, tools, apps, and systems with Databricks workspaces and resources. |
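With the Databricks SDK for Python, for example, a service principal can authenticate through OAuth machine-to-machine (M2M) credentials rather than a user's personal access token. A sketch, assuming the service principal's OAuth client ID and secret are provided through environment variables (the variable names and host value are placeholders):

```python
# Authenticate as a service principal via OAuth M2M with the Databricks SDK for Python.
# Assumes the service principal's OAuth client ID and secret are stored in
# environment variables; the variable names and host value are placeholders.
import os
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host=os.environ["DATABRICKS_HOST"],
    client_id=os.environ["DATABRICKS_CLIENT_ID"],
    client_secret=os.environ["DATABRICKS_CLIENT_SECRET"],
)

print(w.current_user.me().user_name)  # shows the service principal's identity
```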
Tip
You can also connect many other popular third-party tools to clusters and SQL warehouses to access data in Databricks. See Technology partners.