Use Shiny inside Databricks notebooks

Preview

This feature is in Public Preview.

You can develop, host, and share Shiny applications directly from a Databricks notebook.

To get started with Shiny, see the Shiny tutorials. You can run these tutorials in Databricks notebooks.

Requirements

  • Databricks Runtime 8.3 or above.

Use Shiny inside R notebooks

The Shiny package is included with Databricks Runtime. You can interactively develop and test Shiny applications inside Databricks R notebooks, similar to hosted RStudio.

Follow these steps to get started:

  1. Create an R notebook.

  2. Run this code:

      library(shiny)
      runExample("01_hello")
    
  3. When the app is ready, the output includes the Shiny app URL as a clickable link that opens a new tab. See Share Shiny app URL for information about sharing this app with other users.

    Example Shiny app

Note

  • Log messages appear in the command result, similar to the default log message (Listening on http://0.0.0.0:5150) shown in the example.
  • To stop the Shiny application, click Cancel.
  • The Shiny application uses the notebook's R process. If you detach the notebook from the cluster, or if you cancel the cell running the application, the Shiny application terminates. You cannot run other cells while the Shiny application is running.

Run Shiny applications from files

If your Shiny application code is part of a project managed by version control, you can run it inside the notebook.

Note

You must use an absolute path or set the working directory with setwd().

  1. Check out the code from a repository using code similar to:

      %sh git clone https://github.com/rstudio/shiny-examples.git
      Cloning into 'shiny-examples'...
    
  2. To run the application, enter code similar to the following code in another cell:

    library(shiny)
    runApp("/databricks/driver/shiny-examples/007-widgets/")
    
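The note above about paths can be sketched as follows. Both variants below start the same app; the repository path is the one cloned in step 1:

```r
library(shiny)

# Option 1: pass an absolute path to runApp()
runApp("/databricks/driver/shiny-examples/007-widgets/")

# Option 2: set the working directory first, then use a relative path
setwd("/databricks/driver/shiny-examples")
runApp("007-widgets")
```

Relative paths that are not anchored by setwd() fail because the notebook's R process does not start in the cloned repository's directory.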

Use Apache Spark inside Shiny applications

You can use Apache Spark inside Shiny applications with either SparkR or sparklyr. For more information, see Install the Shiny R package.

Use SparkR with Shiny in a notebook

library(shiny)
library(SparkR)
sparkR.session()

ui <- fluidPage(
  mainPanel(
    textOutput("value")
  )
)

server <- function(input, output) {
  output$value <- renderText({ nrow(createDataFrame(iris)) })
}

shinyApp(ui = ui, server = server)

Use sparklyr with Shiny in a notebook

library(shiny)
library(sparklyr)

sc <- spark_connect(method = "databricks")

ui <- fluidPage(
  mainPanel(
    textOutput("value")
  )
)

server <- function(input, output) {
  output$value <- renderText({
    df <- sdf_len(sc, 5, repartition = 1) %>%
      spark_apply(function(e) sum(e)) %>%
      collect()
    df$result
  })
}

shinyApp(ui = ui, server = server)

Share Shiny app URL

The Shiny app URL generated when you start an app can be shared with other users. Any Databricks user with Can Attach To permission on the cluster can view and interact with the app as long as both the app and the cluster are running.

If the cluster that the app is running on terminates, the app is no longer accessible. You can disable automatic termination in the cluster settings.

If you attach and run the notebook hosting the Shiny app on a different cluster, the Shiny URL changes. Also, if you restart the app on the same cluster, Shiny might pick a different random port. To ensure a stable URL, you can set the shiny.port option, or, when restarting the app on the same cluster, you can specify the port argument.
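A minimal sketch of both approaches to pinning the port; the port number 8888 is an arbitrary example, so substitute any free port on the driver:

```r
library(shiny)

# Option 1: set the shiny.port option so every app started in this
# R process uses the same port.
options(shiny.port = 8888)  # 8888 is an example port, not a requirement
runExample("01_hello")

# Option 2: pass the port argument directly when starting a specific app:
# runApp("/databricks/driver/shiny-examples/007-widgets/", port = 8888)
```

With a fixed port, restarting the app on the same cluster reproduces the same Shiny URL; moving the notebook to a different cluster still changes the URL, because the URL also depends on the cluster.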