
Notebooks

A notebook is a web-based interface to a document that contains runnable code, visualizations, and narrative text.

This section describes how to manage and use notebooks. It also contains articles on creating data visualizations, sharing visualizations as dashboards, parameterizing notebooks and dashboards with widgets, building complex pipelines using notebook workflows, and best practices for defining classes in Scala notebooks. A short widget example appears just below, and a notebook workflow sketch follows the outline.
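
As a quick illustration of two of the topics covered in this section, the following minimal Python sketch creates a text widget, reads its value, and renders the result with the display and displayHTML functions. It is only a sketch: the table name samples.nyctaxi.trips is a placeholder, and the cell assumes it runs in a notebook attached to a running cluster.

    # Minimal sketch: a widget plus the display and displayHTML functions.
    # The table name below is a placeholder; replace it with one of your own.

    # Create a text widget at the top of the notebook and read its value.
    dbutils.widgets.text("table_name", "samples.nyctaxi.trips", "Table")
    table_name = dbutils.widgets.get("table_name")

    # Load the selected table and render it as an interactive table or chart.
    df = spark.table(table_name)
    display(df.limit(100))

    # Render arbitrary HTML in the cell output.
    displayHTML(f"<b>Showing up to 100 rows from {table_name}</b>")

Re-running the cell after changing the widget value re-renders the output for the new table.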

  • Manage notebooks
    • Create a notebook
    • Open a notebook
    • Delete a notebook
    • Copy notebook path
    • Rename a notebook
    • Control access to a notebook
    • Notebook external formats
    • Notebooks and clusters
    • Schedule a notebook
    • Distribute notebooks
  • Use notebooks
    • Develop notebooks
    • Run notebooks
    • Share code in notebooks
    • Manage notebook state and results
    • Revision history
    • Version control with Git
  • Visualizations
    • display function
    • displayHTML function
    • Visualizations by language
  • Dashboards
    • Dashboards notebook
    • Create a scheduled job to refresh a dashboard
    • View a specific dashboard version
  • Widgets
    • Widget types
    • Widget API
    • Configure widget settings
    • Widgets in dashboards
    • Use widgets with %run
  • Notebook workflows
    • API
    • Example
    • Pass structured data
    • Handle errors
    • Run multiple notebooks concurrently
  • Package cells
    • Package Cells notebook
  • IPython kernel
    • Benefits of using the IPython kernel
    • How to use the IPython kernel with Databricks
    • Known issues
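
The notebook workflow articles listed above revolve around dbutils.notebook.run, which runs another notebook and returns the string that notebook passes to dbutils.notebook.exit. The following minimal Python sketch assumes a hypothetical child notebook at the relative path ./process-data that accepts a date parameter and exits with a JSON string; the structure, not the specific names, is the point.

    # Minimal notebook workflow sketch (Python).
    # "./process-data" and its "date" argument are hypothetical; the child
    # notebook is assumed to end with dbutils.notebook.exit(json.dumps(...)).
    import json

    try:
        # Run the child notebook with a 60-second timeout and one parameter.
        result = dbutils.notebook.run("./process-data", 60, {"date": "2022-05-13"})
    except Exception as e:
        # dbutils.notebook.run raises if the child notebook fails or times out.
        raise Exception(f"Child notebook failed: {e}")

    # Because the child exits with a JSON string, structured data can be
    # recovered in the parent.
    data = json.loads(result) if result else {}
    print(f"Child notebook returned: {data}")

The article on running multiple notebooks concurrently covers patterns such as issuing this call from several threads at once.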

