Databricks CLI - willsql4food/LakehouseToolkit GitHub Wiki

Databricks CLI

Usage: databricks [command]

Databricks Workspace
  git-credentials   Registers personal access tokens for Databricks to do operations on behalf of the user.
  repos             The Repos API allows users to manage their Git repos.
  secrets           The Secrets API allows you to manage secrets, secret scopes, and access permissions.
  workspace         The Workspace API allows you to list, import, export, and delete notebooks and folders.
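As a sketch of how the workspace command groups above are invoked (assuming the unified Databricks CLI v0.200+ and an already-configured authentication profile; the notebook path, scope name, and secret values here are hypothetical):

```shell
# List objects at the workspace root
databricks workspace list /

# Export a notebook to a local file (notebook path is hypothetical)
databricks workspace export /Users/someone@example.com/my-notebook --file my-notebook.py

# Create a secret scope and store a secret in it (names/values are hypothetical)
databricks secrets create-scope my-scope
databricks secrets put-secret my-scope my-key --string-value "s3cret"
```

These commands talk to a live workspace, so they require valid credentials (e.g. from `databricks configure`) before they will run.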

Compute
  cluster-policies      You can use cluster policies to control users' ability to configure clusters based on a set of rules.
  clusters              The Clusters API allows you to create, start, edit, list, terminate, and delete clusters.
  global-init-scripts   The Global Init Scripts API enables workspace administrators to configure global initialization scripts for their workspace.
  instance-pools        The Instance Pools API is used to create, edit, delete, and list instance pools backed by ready-to-use cloud instances, which reduces cluster start and auto-scaling times.
  instance-profiles     The Instance Profiles API allows admins to add, list, and remove instance profiles that users can launch clusters with.
  libraries             The Libraries API allows you to install and uninstall libraries and get the status of libraries on a cluster.
  policy-families       View available policy families.
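A minimal sketch of the compute groups in use, assuming a configured profile (the cluster ID shown is hypothetical):

```shell
# List clusters in the workspace, as JSON for scripting
databricks clusters list -o json

# Inspect a single cluster by ID (the ID here is hypothetical)
databricks clusters get 0123-456789-abcde123
```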

Jobs
  jobs   The Jobs API allows you to create, edit, and delete jobs.

Delta Live Tables
  pipelines   The Delta Live Tables API allows you to create, edit, delete, start, and view details about pipelines.

Machine Learning
  experiments      Experiments are the primary unit of organization in MLflow; all MLflow runs belong to an experiment.
  model-registry   Note: This API reference documents APIs for the Workspace Model Registry.

Real-time Serving
  serving-endpoints   The Serving Endpoints API allows you to create, update, and delete model serving endpoints.

Identity and Access Management
  current-user         This API allows retrieving information about the currently authenticated user or service principal.
  groups               Groups simplify identity management, making it easier to assign access to Databricks workspaces, data, and other securable objects.
  permissions          The Permissions API is used to create, read, write, edit, update, and manage access for various users on different objects and endpoints.
  service-principals   Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
  users                User identities recognized by Databricks and represented by email addresses.

Databricks SQL
  alerts          The alerts API can be used to perform CRUD operations on alerts.
  dashboards      In general, there is little need to modify dashboards using the API.
  data-sources    This API is provided to assist you in making new query objects.
  queries         These endpoints are used for CRUD operations on query definitions.
  query-history   Access the history of queries through SQL warehouses.
  warehouses      A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL.

Unity Catalog
  artifact-allowlists   In Databricks Runtime 13.3 and above, you can add libraries and init scripts to the allowlist in UC so that users can leverage these artifacts on compute configured with shared access mode.
  catalogs              A catalog is the first layer of Unity Catalog's three-level namespace.
  connections           Connections allow for creating a connection to an external data source.
  external-locations    An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path.
  functions             Functions implement User-Defined Functions (UDFs) in Unity Catalog.
  grants                In Unity Catalog, data is secure by default.
  metastores            A metastore is the top-level container of objects in Unity Catalog.
  model-versions        Databricks provides a hosted version of MLflow Model Registry in Unity Catalog.
  registered-models     Databricks provides a hosted version of MLflow Model Registry in Unity Catalog.
  schemas               A schema (also called a database) is the second layer of Unity Catalog's three-level namespace.
  storage-credentials   A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant.
  system-schemas        A system schema is a schema that lives within the system catalog.
  table-constraints     Primary key and foreign key constraints encode relationships between fields in tables.
  tables                A table resides in the third layer of Unity Catalog's three-level namespace.
  volumes               Volumes are a Unity Catalog (UC) capability for accessing, storing, governing, organizing, and processing files.
  workspace-bindings    A securable in Databricks can be configured as OPEN or ISOLATED.
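A sketch of walking the three-level namespace described above with the Unity Catalog command groups (assuming a configured profile; the `main` catalog and `default` schema names are hypothetical placeholders for your own):

```shell
# Catalogs, then schemas within a catalog, then tables within a schema
databricks catalogs list
databricks schemas list main
databricks tables list main default
```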

Delta Sharing
  providers              A data provider is an object representing the real-world organization that shares the data.
  recipient-activation   The Recipient Activation API is only applicable in the open sharing model, where the recipient object has the authentication type of TOKEN.
  recipients             A recipient is an object you create using :method:recipients/create to represent an organization to which you want to grant access to shares.
  shares                 A share is a container instantiated with :method:shares/create.

Settings
  ip-access-lists    IP access lists enable admins to restrict access to the workspace by IP address.
  settings           The default namespace setting API allows users to configure the default namespace for a Databricks workspace.
  token-management   Enables administrators to get all tokens and delete tokens for other users.
  tokens             The Token API allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs.
  workspace-conf     This API allows updating known workspace settings for advanced users.

Additional Commands:
  account      Databricks Account Commands
  api          Perform Databricks API call
  auth         Authentication related commands
  bundle       Databricks Asset Bundles
  completion   Generate the autocompletion script for the specified shell
  configure    Configure authentication
  fs           Filesystem related commands
  help         Help about any command
  labs         Manage Databricks Labs installations
  sync         Synchronize a local directory to a workspace directory
  version      Retrieve information about the current version of this CLI
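A typical first-run sequence using the additional commands, as a sketch (the workspace URL and paths are hypothetical; substitute your own):

```shell
# One-time setup: store host and token in ~/.databrickscfg
databricks configure

# Or authenticate interactively via OAuth (host is hypothetical)
databricks auth login --host https://example.cloud.databricks.com

# Browse DBFS, then push a local folder to the workspace (paths are hypothetical)
databricks fs ls dbfs:/
databricks sync ./src /Users/someone@example.com/src
```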

Flags:
      --debug            enable debug logging
  -h, --help             help for databricks
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
  -v, --version          version for databricks
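The global flags compose with any command; for example (the `DEV` profile name is a hypothetical entry in `~/.databrickscfg`):

```shell
# JSON output from a named profile, with debug logging of the underlying API calls
databricks clusters list --profile DEV --output json --debug
```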

Use "databricks [command] --help" for more information about a command.
