Databricks KB - willsql4food/LakehouseToolkit GitHub Wiki

Databricks KB

Secrets & Scopes

Setup Example

Here's what I had to do to get a new Azure Databricks workspace reading from an Azure Storage account:

  • Create Az Databricks workspace
  • Generate a Personal Access Token and store in Azure Key Vault
  • Generate a Shared Access Signature for the storage account and store in Azure Key Vault
  • Install the Databricks CLI (I cloned the Git repo https://github.com/databricks/setup-cli.git)
    • Also monkeyed around with curl for installation and for querying secrets & scopes, YMMV
  • Set up a .databrickscfg file to hold the configuration for my workspace
    • ran the `databricks configure` command and filled in the two prompts (the workspace URL and the Personal Access Token from above)
  • Create a secret scope
    • `databricks secrets create-scope log_analytics --initial-manage-principal users`
  • Create a secret in that scope
    • `databricks secrets put-secret log_analytics sas_key_staab09289802`
    • then supply the SAS key for the storage account when prompted
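The CLI steps above can be sketched end to end. This is only a sketch of what I ran: it assumes the Databricks CLI is installed and you have a workspace URL and Personal Access Token handy; the scope and secret names are the examples from this page.

```shell
# 1. Configure the CLI: prompts for the workspace URL and the
#    Personal Access Token, then writes ~/.databrickscfg
databricks configure

# 2. Create a Databricks-backed secret scope;
#    --initial-manage-principal users lets all workspace users manage it
databricks secrets create-scope log_analytics --initial-manage-principal users

# 3. Store the storage account's SAS key in the scope
#    (the CLI prompts for the secret value)
databricks secrets put-secret log_analytics sas_key_staab09289802

# 4. Verify: list scopes, then the secrets in the new scope
databricks secrets list-scopes
databricks secrets list-secrets log_analytics
```

Note that `list-secrets` shows secret names and timestamps only; secret values are never printed by the CLI, they can only be read from inside the workspace (e.g. via `dbutils.secrets.get`).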

Organization

Scopes can be set up in Databricks itself or backed by Azure Key Vault.

  • For Databricks to access Azure Key Vault, the Key Vault's permission model must be set to Vault access policy (not Azure role-based access control)
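If the vault was created with the newer Azure RBAC permission model, it can be switched back from the Azure CLI. A minimal sketch, assuming you are logged in with `az login` and `<my-key-vault>` is a placeholder for your vault name:

```shell
# Switch an existing Key Vault to the "Vault access policy" permission
# model so Databricks can read it; <my-key-vault> is a placeholder.
az keyvault update --name <my-key-vault> --enable-rbac-authorization false
```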

In Azure Key Vault

  • The scope represents the entire Azure Key Vault and exposes all of its secrets to Databricks
    • Suggestion: give the scope the same name as the Key Vault itself.
    • If there are secrets that Databricks should not access, keep them in a separate Key Vault
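Once a scope exists (Databricks-backed or Key Vault-backed, the read path is the same), a notebook can pull the SAS key and hand it to Spark. A minimal sketch, assuming the placeholder storage account name `mystorageacct` and the scope/secret names from the setup example above; `spark` and `dbutils` are predefined only inside a Databricks notebook:

```python
# Runs inside a Databricks notebook, where `spark` and `dbutils` exist.
account = "mystorageacct"  # placeholder storage account name
sas = dbutils.secrets.get(scope="log_analytics", key="sas_key_staab09289802")

# Tell the ABFS driver to authenticate to this account with a fixed SAS token
spark.conf.set(f"fs.azure.account.auth.type.{account}.dfs.core.windows.net", "SAS")
spark.conf.set(
    f"fs.azure.sas.token.provider.type.{account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider",
)
spark.conf.set(f"fs.azure.sas.fixed.token.{account}.dfs.core.windows.net", sas)

# Paths on that account are now readable, e.g. (container name is a placeholder):
# df = spark.read.parquet(f"abfss://staging@{account}.dfs.core.windows.net/some/path")
```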