
Configure access to resources from model serving endpoints

This notebook example demonstrates how you can securely store credentials in a Databricks secrets scope and reference those secrets in model serving. This allows credentials to be fetched at serving time from model serving endpoints.

Requirements

 

  • This functionality is currently supported only via the Databricks REST API.
  • To use this feature, you must store credentials like your API key or other tokens as a Databricks secret.
  • The endpoint creator must have Read access to the Databricks secrets being referenced in the endpoint configuration.
  • This notebook requires Databricks SDK version 0.6.0 or above.

Step 0: Install and upgrade any dependencies if necessary
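The cell below is a minimal sketch of this step; the exact package list (including faiss-cpu for the vector store used later) is an assumption and may need adjusting for your environment.

%pip install --upgrade "databricks-sdk>=0.6.0" mlflow langchain faiss-cpu

# Run this in a separate cell after the install so the upgraded packages are picked up.
dbutils.library.restartPython()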

 


Step 1: Add secrets to Databricks Secret Store


You can modify the following variables to assign your secret scope and its corresponding key and value.
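A minimal sketch of this step using the Databricks SDK; the scope name, key name, and placeholder credential value are illustrative assumptions.

from databricks.sdk import WorkspaceClient

# Placeholder names: replace with your own scope, key, and credential value.
secret_scope = "my-serving-secrets"
secret_key = "openai_api_key"
secret_value = "<your-api-key>"

w = WorkspaceClient()

# Create the scope if it does not already exist, then store the credential in it.
existing_scopes = [scope.name for scope in w.secrets.list_scopes()]
if secret_scope not in existing_scopes:
    w.secrets.create_scope(scope=secret_scope)

w.secrets.put_secret(scope=secret_scope, key=secret_key, string_value=secret_value)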


Step 2: Upload sample document data into DBFS
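The following sketch writes a small, hypothetical sample document to DBFS; the text and the path are assumptions, so substitute your own document data.

# Hypothetical sample document and DBFS target directory.
sample_text = (
    "MLflow is an open source platform for managing the end-to-end "
    "machine learning lifecycle, including experiment tracking, model "
    "packaging, and model serving."
)
docs_dir = "/tmp/serving-example/docs"

dbutils.fs.mkdirs(docs_dir)
dbutils.fs.put(f"{docs_dir}/sample.txt", sample_text, True)  # True overwrites any existing file

display(dbutils.fs.ls(docs_dir))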

         


Step 3: Log and register the LangChain model

This example model is adapted from https://github.com/mlflow/mlflow/blob/master/examples/langchain/retrieval_qa_chain.py.


In this section, you create a vector database and persist it to a local file store folder. You also create a RetrievalQA chain and log it as part of your model run.
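A condensed sketch of that flow, adapted from the MLflow example linked above; the file paths, the registered model name, and the older-style langchain imports are assumptions.

import os

import mlflow
from langchain.chains import RetrievalQA
from langchain.document_loaders import TextLoader
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.llms import OpenAI
from langchain.text_splitter import CharacterTextSplitter
from langchain.vectorstores import FAISS

# Make the credential from Step 1 available for building and testing the chain locally.
os.environ["OPENAI_API_KEY"] = dbutils.secrets.get(scope=secret_scope, key=secret_key)

# Build the vector database from the sample document and persist it to a local folder.
persist_dir = "/tmp/serving-example/faiss_index"
loader = TextLoader("/dbfs/tmp/serving-example/docs/sample.txt")
documents = loader.load()
docs = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0).split_documents(documents)
db = FAISS.from_documents(docs, OpenAIEmbeddings())
db.save_local(persist_dir)

# Create the RetrievalQA chain on top of the persisted vector store.
retrieval_qa = RetrievalQA.from_llm(llm=OpenAI(), retriever=db.as_retriever())


def load_retriever(persist_directory):
    # Called by MLflow at load time to rebuild the retriever from the persisted index.
    vectorstore = FAISS.load_local(persist_directory, OpenAIEmbeddings())
    return vectorstore.as_retriever()


# Log the chain and register it in the Model Registry under a placeholder name.
registered_model_name = "langchain-retrieval-qa"
with mlflow.start_run():
    logged_model = mlflow.langchain.log_model(
        retrieval_qa,
        artifact_path="retrieval_qa",
        loader_fn=load_retriever,
        persist_dir=persist_dir,
        registered_model_name=registered_model_name,
    )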


Next, you can load that logged model using MLflow pyfunc.
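A short sketch of that check; the sample question is illustrative and assumes the chain's "query" input key.

import mlflow

loaded_model = mlflow.pyfunc.load_model(logged_model.model_uri)

# Run a sample question through the locally loaded chain as a sanity check.
print(loaded_model.predict([{"query": "What is MLflow?"}]))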


Step 4: Create and query the serving endpoint

In this section, you create a serving endpoint to serve your model and then query it.
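The sketch below calls the REST API directly (per the requirements above) and injects the secret from Step 1 through the {{secrets/<scope>/<key>}} reference in environment_vars, so the credential is fetched at serving time; the endpoint name and model version are placeholder assumptions.

import requests

# The notebook context provides the workspace URL and an API token for REST calls.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
API_ROOT = ctx.apiUrl().getOrElse(None)
API_TOKEN = ctx.apiToken().getOrElse(None)
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

endpoint_name = "langchain-retrieval-qa-endpoint"  # placeholder endpoint name

endpoint_config = {
    "name": endpoint_name,
    "config": {
        "served_models": [
            {
                "model_name": registered_model_name,
                "model_version": "1",
                "workload_size": "Small",
                "scale_to_zero_enabled": True,
                # Reference the Databricks secret from Step 1; it is resolved at serving time.
                "environment_vars": {
                    "OPENAI_API_KEY": f"{{{{secrets/{secret_scope}/{secret_key}}}}}"
                },
            }
        ]
    },
}

response = requests.post(
    f"{API_ROOT}/api/2.0/serving-endpoints", headers=HEADERS, json=endpoint_config
)
response.raise_for_status()
print(response.json())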


You can use the following code to check the endpoint status and verify that it is ready.
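A minimal polling sketch, reusing API_ROOT, HEADERS, and endpoint_name from the previous cell:

import time

while True:
    endpoint = requests.get(
        f"{API_ROOT}/api/2.0/serving-endpoints/{endpoint_name}", headers=HEADERS
    ).json()
    state = endpoint.get("state", {})
    if state.get("ready") == "READY":
        print("Endpoint is ready.")
        break
    print(f"Endpoint not ready yet, current state: {state}")
    time.sleep(30)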


Finally, you can query the endpoint with sample data.
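A sample request against the endpoint's invocations URL, using the dataframe_records payload format; the question matches the hypothetical document from Step 2.

query_payload = {"dataframe_records": [{"query": "What is MLflow?"}]}

response = requests.post(
    f"{API_ROOT}/serving-endpoints/{endpoint_name}/invocations",
    headers=HEADERS,
    json=query_payload,
)
print(response.json())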
