Databricks CLI

Usage:
  databricks [command]

Databricks Workspace
  fs                                     Filesystem related commands
  git-credentials                        Registers a personal access token that lets Databricks perform operations on behalf of the user.
  repos                                  The Repos API allows users to manage their Git repos.
  secrets                                The Secrets API allows you to manage secrets, secret scopes, and access permissions.
  workspace                              The Workspace API allows you to list, import, export, and delete notebooks and folders.
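
For example, to list objects at the workspace root and in DBFS (illustrative invocations; the paths are placeholders):
  databricks workspace list /
  databricks fs ls dbfs:/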

Compute
  cluster-policies                       You can use cluster policies to control users' ability to configure clusters based on a set of rules.
  clusters                               The Clusters API allows you to create, start, edit, list, terminate, and delete clusters.
  global-init-scripts                    The Global Init Scripts API enables Workspace administrators to configure global initialization scripts for their workspace.
  instance-pools                         The Instance Pools API is used to create, edit, delete, and list instance pools, which use ready-to-use cloud instances to reduce cluster start and auto-scaling times.
  instance-profiles                      The Instance Profiles API allows admins to add, list, and remove instance profiles that users can launch clusters with.
  libraries                              The Libraries API allows you to install and uninstall libraries and get the status of libraries on a cluster.
  policy-compliance-for-clusters         The policy compliance APIs allow you to view and manage the policy compliance status of clusters in your workspace.
  policy-families                        View available policy families.
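
For example, to enumerate clusters in the current workspace:
  databricks clusters list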

Workflows
  jobs                                   The Jobs API allows you to create, edit, and delete jobs.
  policy-compliance-for-jobs             The compliance APIs allow you to view and manage the policy compliance status of jobs in your workspace.
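
For example, to enumerate jobs:
  databricks jobs list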

Delta Live Tables
  pipelines                              The Delta Live Tables API allows you to create, edit, delete, start, and view details about pipelines.
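
For example, to enumerate pipelines (subcommand names mirror the generated API methods; confirm with "databricks pipelines --help"):
  databricks pipelines list-pipelines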

Machine Learning
  experiments                            Experiments are the primary unit of organization in MLflow; all MLflow runs belong to an experiment.
  model-registry                         Note: This API reference documents APIs for the Workspace Model Registry.

Real-time Serving
  serving-endpoints                      The Serving Endpoints API allows you to create, update, and delete model serving endpoints.
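
For example, to enumerate serving endpoints:
  databricks serving-endpoints list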

Identity and Access Management
  current-user                           This API allows retrieving information about the currently authenticated user or service principal.
  groups                                 Groups simplify identity management, making it easier to assign access to Databricks workspace, data, and other securable objects.
  permissions                            The Permissions API is used to create, read, write, edit, update, and manage access for various users on different objects and endpoints.
  service-principals                     Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
  users                                  User identities recognized by Databricks and represented by email addresses.
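
For example, to show the identity the CLI is currently authenticated as:
  databricks current-user me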

Databricks SQL
  alerts                                 The alerts API can be used to perform CRUD operations on alerts.
  alerts-legacy                          The legacy alerts API can be used to perform CRUD operations on alerts.
  dashboards                             In general, there is little need to modify dashboards using the API.
  data-sources                           This API is provided to assist you in making new query objects.
  queries                                The queries API can be used to perform CRUD operations on queries.
  queries-legacy                         These endpoints are used for CRUD operations on query definitions.
  query-history                          A service responsible for storing and retrieving the list of queries run against SQL endpoints and serverless compute.
  warehouses                             A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL.
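
For example, to enumerate SQL warehouses:
  databricks warehouses list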

Unity Catalog
  artifact-allowlists                    In Databricks Runtime 13.3 and above, you can add libraries and init scripts to the allowlist in UC so that users can leverage these artifacts on compute configured with shared access mode.
  catalogs                               A catalog is the first layer of Unity Catalog’s three-level namespace.
  connections                            Connections allow for creating a connection to an external data source.
  credentials                            A credential represents an authentication and authorization mechanism for accessing services on your cloud tenant.
  external-locations                     An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path.
  functions                              Functions implement User-Defined Functions (UDFs) in Unity Catalog.
  grants                                 In Unity Catalog, data is secure by default; the Grants API manages the privileges that control access to securable objects.
  metastores                             A metastore is the top-level container of objects in Unity Catalog.
  model-versions                         Databricks provides a hosted version of MLflow Model Registry in Unity Catalog.
  online-tables                          Online tables provide lower latency and higher QPS access to data from Delta tables.
  quality-monitors                       A monitor computes and monitors data or model quality metrics for a table over time.
  registered-models                      Databricks provides a hosted version of MLflow Model Registry in Unity Catalog.
  resource-quotas                        Unity Catalog enforces resource quotas on all securable objects, which limits the number of resources that can be created.
  schemas                                A schema (also called a database) is the second layer of Unity Catalog’s three-level namespace.
  storage-credentials                    A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant.
  system-schemas                         A system schema is a schema that lives within the system catalog.
  table-constraints                      Primary key and foreign key constraints encode relationships between fields in tables.
  tables                                 A table resides in the third layer of Unity Catalog’s three-level namespace.
  temporary-table-credentials            Temporary Table Credentials refer to short-lived, downscoped credentials used to access cloud storage locations where table data is stored in Databricks.
  volumes                                Volumes are a Unity Catalog (UC) capability for accessing, storing, governing, organizing, and processing files.
  workspace-bindings                     A securable in Databricks can be configured as __OPEN__ or __ISOLATED__.
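
For example, walking down the three-level namespace ("main" is a placeholder catalog name):
  databricks catalogs list
  databricks schemas list main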

Delta Sharing
  providers                              A data provider is an object representing the real-world organization that shares the data.
  recipient-activation                   The Recipient Activation API is only applicable in the open sharing model where the recipient object has the authentication type of TOKEN.
  recipients                             A recipient is an object you create using :method:recipients/create to represent an organization to which you want to grant access to shares.
  shares                                 A share is a container instantiated with :method:shares/create.
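
For example, to enumerate shares and recipients:
  databricks shares list
  databricks recipients list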

Settings
  ip-access-lists                        The IP Access Lists API enables admins to configure IP access lists.
  notification-destinations              The notification destinations API lets you programmatically manage a workspace's notification destinations.
  settings                               The Workspace Settings API allows users to manage settings at the workspace level.
  token-management                       Enables administrators to get all tokens and delete tokens for other users.
  tokens                                 The Token API allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs.
  workspace-conf                         This API allows updating known workspace settings for advanced users.
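
For example, to enumerate tokens for the current user:
  databricks tokens list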

Developer Tools
  bundle                                 Databricks Asset Bundles let you express data/AI/analytics projects as code.
  sync                                   Synchronize a local directory to a workspace directory.
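
For example, validating and deploying a bundle from its root directory, then syncing local files to a workspace path (the destination path is a placeholder):
  databricks bundle validate
  databricks bundle deploy
  databricks sync . /Workspace/Users/someone@example.com/my-project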

Vector Search
  vector-search-endpoints                **Endpoint**: Represents the compute resources to host vector search indexes.
  vector-search-indexes                  **Index**: An efficient representation of your embedding vectors that supports real-time and efficient approximate nearest neighbor (ANN) search queries.

Dashboards
  lakeview                               These APIs provide specific management operations for Lakeview dashboards.

Marketplace
  consumer-fulfillments                  Fulfillments are entities that allow consumers to preview installations.
  consumer-installations                 Installations are entities that allow consumers to interact with Databricks Marketplace listings.
  consumer-listings                      Listings are the core entities in the Marketplace.
  consumer-personalization-requests      Personalization Requests allow customers to interact with the individualized Marketplace listing flow.
  consumer-providers                     Providers are the entities that publish listings to the Marketplace.
  provider-exchange-filters              Marketplace exchanges filters curate which groups can access an exchange.
  provider-exchanges                     Marketplace exchanges allow providers to share their listings with a curated set of customers.
  provider-files                         Marketplace offers a set of file APIs for various purposes such as preview notebooks and provider icons.
  provider-listings                      Listings are the core entities in the Marketplace.
  provider-personalization-requests      Personalization requests are an alternative to instantly available listings.
  provider-provider-analytics-dashboards Manage the templated analytics solution for providers.
  provider-providers                     Providers are entities that manage assets in Marketplace.

Apps
  apps                                   Apps run directly on a customer’s Databricks instance, integrate with their data, use and extend Databricks services, and enable users to interact through single sign-on.
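
For example, to enumerate apps (confirm the subcommand with "databricks apps --help"):
  databricks apps list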

Clean Rooms
  clean-room-assets                      Clean room assets are data and code objects (tables, volumes, and notebooks) that are shared with the clean room.
  clean-room-task-runs                   Clean room task runs are the executions of notebooks in a clean room.
  clean-rooms                            A clean room uses Delta Sharing and serverless compute to provide a secure and privacy-protecting environment where multiple parties can work together on sensitive enterprise data without direct access to each other’s data.

Additional Commands:
  account                                Databricks Account Commands
  api                                    Perform Databricks API call
  auth                                   Authentication related commands
  completion                             Generate the autocompletion script for the specified shell
  configure                              Configure authentication
  help                                   Help about any command
  labs                                   Manage Databricks Labs installations
  version                                Retrieve information about the current version of this CLI
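
For example, authenticating via OAuth and calling a REST endpoint directly (the workspace URL is a placeholder):
  databricks auth login --host https://my-workspace.cloud.databricks.com
  databricks api get /api/2.0/clusters/list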

Flags:
      --debug            enable debug logging
  -h, --help             help for databricks
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
  -v, --version          version for databricks
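
For example, combining global flags to emit JSON using a named configuration profile ("DEFAULT" is a placeholder profile name):
  databricks clusters list --output json --profile DEFAULT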

Use "databricks [command] --help" for more information about a command.