Added OpenAPI command coverage (#357)
This PR adds the following command groups:
## Workspace-level command groups
* `bricks alerts` - The alerts API can be used to perform CRUD operations on alerts.
* `bricks catalogs` - A catalog is the first layer of Unity Catalog’s three-level namespace.
* `bricks cluster-policies` - A cluster policy limits the ability to configure clusters based on a set of rules.
* `bricks clusters` - The Clusters API allows you to create, start, edit, list, terminate, and delete clusters.
* `bricks current-user` - This API allows retrieving information about the currently authenticated user or service principal.
* `bricks dashboards` - In general, there is little need to modify dashboards using the API.
* `bricks data-sources` - This API is provided to assist you in making new query objects.
* `bricks experiments` - MLflow Experiment tracking.
* `bricks external-locations` - An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path.
* `bricks functions` - Functions implement User-Defined Functions (UDFs) in Unity Catalog.
* `bricks git-credentials` - Registers personal access tokens for Databricks to perform operations on behalf of the user.
* `bricks global-init-scripts` - The Global Init Scripts API enables Workspace administrators to configure global initialization scripts for their workspace.
* `bricks grants` - In Unity Catalog, data is secure by default.
* `bricks groups` - Groups simplify identity management, making it easier to assign access to Databricks Workspace, data, and other securable objects.
* `bricks instance-pools` - The Instance Pools API is used to create, edit, delete, and list instance pools, which hold ready-to-use cloud instances to reduce cluster start and auto-scaling times.
* `bricks instance-profiles` - The Instance Profiles API allows admins to add, list, and remove instance profiles that users can launch clusters with.
* `bricks ip-access-lists` - The IP Access List API enables admins to configure IP access lists.
* `bricks jobs` - The Jobs API allows you to create, edit, and delete jobs.
* `bricks libraries` - The Libraries API allows you to install and uninstall libraries and get the status of libraries on a cluster.
* `bricks metastores` - A metastore is the top-level container of objects in Unity Catalog.
* `bricks model-registry` - MLflow Model Registry commands.
* `bricks permissions` - The Permissions API is used to create, read, write, edit, update, and manage access for various users on different objects and endpoints.
* `bricks pipelines` - The Delta Live Tables API allows you to create, edit, delete, start, and view details about pipelines.
* `bricks policy-families` - View available policy families.
* `bricks providers` - Databricks Providers REST API.
* `bricks queries` - These endpoints are used for CRUD operations on query definitions.
* `bricks query-history` - Access the history of queries through SQL warehouses.
* `bricks recipient-activation` - Databricks Recipient Activation REST API.
* `bricks recipients` - Databricks Recipients REST API.
* `bricks repos` - The Repos API allows users to manage their git repos.
* `bricks schemas` - A schema (also called a database) is the second layer of Unity Catalog’s three-level namespace.
* `bricks secrets` - The Secrets API allows you to manage secrets, secret scopes, and access permissions.
* `bricks service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
* `bricks serving-endpoints` - The Serving Endpoints API allows you to create, update, and delete model serving endpoints.
* `bricks shares` - Databricks Shares REST API.
* `bricks storage-credentials` - A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant.
* `bricks table-constraints` - Primary key and foreign key constraints encode relationships between fields in tables.
* `bricks tables` - A table resides in the third layer of Unity Catalog’s three-level namespace.
* `bricks token-management` - Enables administrators to get all tokens and delete tokens for other users.
* `bricks tokens` - The Token API allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs.
* `bricks users` - User identities recognized by Databricks and represented by email addresses.
* `bricks volumes` - Volumes are a Unity Catalog (UC) capability for accessing, storing, governing, organizing, and processing files.
* `bricks warehouses` - A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL.
* `bricks workspace` - The Workspace API allows you to list, import, export, and delete notebooks and folders.
* `bricks workspace-conf` - This API allows updating known workspace settings for advanced users.
## Account-level command groups
* `bricks account billable-usage` - This API allows you to download billable usage logs for the specified account and date range.
* `bricks account budgets` - These APIs manage budget configurations, including notifications for exceeding a budget for a period.
* `bricks account credentials` - These APIs manage credential configurations for this workspace.
* `bricks account custom-app-integration` - These APIs enable administrators to manage custom OAuth app integrations, which is required for adding/using a Custom OAuth App Integration such as Tableau Cloud for Databricks in AWS cloud.
* `bricks account encryption-keys` - These APIs manage encryption key configurations for this workspace (optional).
* `bricks account groups` - Groups simplify identity management, making it easier to assign access to Databricks Account, data, and other securable objects.
* `bricks account ip-access-lists` - The Accounts IP Access List API enables account admins to configure IP access lists for access to the account console.
* `bricks account log-delivery` - These APIs manage log delivery configurations for this account.
* `bricks account metastore-assignments` - These APIs manage metastore assignments to a workspace.
* `bricks account metastores` - These APIs manage Unity Catalog metastores for an account.
* `bricks account networks` - These APIs manage network configurations for customer-managed VPCs (optional).
* `bricks account o-auth-enrollment` - These APIs enable administrators to enroll OAuth for their accounts, which is required for adding/using any OAuth published/custom application integration.
* `bricks account private-access` - These APIs manage private access settings for this account.
* `bricks account published-app-integration` - These APIs enable administrators to manage published OAuth app integrations, which is required for adding/using a Published OAuth App Integration such as Tableau Cloud for Databricks in AWS cloud.
* `bricks account service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
* `bricks account storage` - These APIs manage storage configurations for this workspace.
* `bricks account storage-credentials` - These APIs manage storage credentials for a particular metastore.
* `bricks account users` - User identities recognized by Databricks and represented by email addresses.
* `bricks account vpc-endpoints` - These APIs manage VPC endpoint configurations for this account.
* `bricks account workspace-assignment` - The Workspace Permission Assignment API allows you to manage workspace permissions for principals in your account.
* `bricks account workspaces` - These APIs manage workspaces for this account.

// Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT.

package workspace
|
Added OpenAPI command coverage (#357)
This PR adds the following command groups:
## Workspace-level command groups
* `bricks alerts` - The alerts API can be used to perform CRUD operations on alerts.
* `bricks catalogs` - A catalog is the first layer of Unity Catalog’s three-level namespace.
* `bricks cluster-policies` - Cluster policy limits the ability to configure clusters based on a set of rules.
* `bricks clusters` - The Clusters API allows you to create, start, edit, list, terminate, and delete clusters.
* `bricks current-user` - This API allows retrieving information about currently authenticated user or service principal.
* `bricks dashboards` - In general, there is little need to modify dashboards using the API.
* `bricks data-sources` - This API is provided to assist you in making new query objects.
* `bricks experiments` - MLflow Experiment tracking.
* `bricks external-locations` - An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path.
* `bricks functions` - Functions implement User-Defined Functions (UDFs) in Unity Catalog.
* `bricks git-credentials` - Registers personal access token for Databricks to do operations on behalf of the user.
* `bricks global-init-scripts` - The Global Init Scripts API enables Workspace administrators to configure global initialization scripts for their workspace.
* `bricks grants` - In Unity Catalog, data is secure by default.
* `bricks groups` - Groups simplify identity management, making it easier to assign access to Databricks Workspace, data, and other securable objects.
* `bricks instance-pools` - Instance Pools API are used to create, edit, delete and list instance pools by using ready-to-use cloud instances which reduces a cluster start and auto-scaling times.
* `bricks instance-profiles` - The Instance Profiles API allows admins to add, list, and remove instance profiles that users can launch clusters with.
* `bricks ip-access-lists` - IP Access List enables admins to configure IP access lists.
* `bricks jobs` - The Jobs API allows you to create, edit, and delete jobs.
* `bricks libraries` - The Libraries API allows you to install and uninstall libraries and get the status of libraries on a cluster.
* `bricks metastores` - A metastore is the top-level container of objects in Unity Catalog.
* `bricks model-registry` - MLflow Model Registry commands.
* `bricks permissions` - Permissions API are used to create read, write, edit, update and manage access for various users on different objects and endpoints.
* `bricks pipelines` - The Delta Live Tables API allows you to create, edit, delete, start, and view details about pipelines.
* `bricks policy-families` - View available policy families.
* `bricks providers` - Databricks Providers REST API.
* `bricks queries` - These endpoints are used for CRUD operations on query definitions.
* `bricks query-history` - Access the history of queries through SQL warehouses.
* `bricks recipient-activation` - Databricks Recipient Activation REST API.
* `bricks recipients` - Databricks Recipients REST API.
* `bricks repos` - The Repos API allows users to manage their git repos.
* `bricks schemas` - A schema (also called a database) is the second layer of Unity Catalog’s three-level namespace.
* `bricks secrets` - The Secrets API allows you to manage secrets, secret scopes, and access permissions.
* `bricks service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
* `bricks serving-endpoints` - The Serving Endpoints API allows you to create, update, and delete model serving endpoints.
* `bricks shares` - Databricks Shares REST API.
* `bricks storage-credentials` - A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant.
* `bricks table-constraints` - Primary key and foreign key constraints encode relationships between fields in tables.
* `bricks tables` - A table resides in the third layer of Unity Catalog’s three-level namespace.
* `bricks token-management` - Enables administrators to get all tokens and delete tokens for other users.
* `bricks tokens` - The Token API allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs.
* `bricks users` - User identities recognized by Databricks and represented by email addresses.
* `bricks volumes` - Volumes are a Unity Catalog (UC) capability for accessing, storing, governing, organizing and processing files.
* `bricks warehouses` - A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL.
* `bricks workspace` - The Workspace API allows you to list, import, export, and delete notebooks and folders.
* `bricks workspace-conf` - This API allows updating known workspace settings for advanced users.
## Account-level command groups
* `bricks account billable-usage` - This API allows you to download billable usage logs for the specified account and date range.
* `bricks account budgets` - These APIs manage budget configuration including notifications for exceeding a budget for a period.
* `bricks account credentials` - These APIs manage credential configurations for this workspace.
* `bricks account custom-app-integration` - These APIs enable administrators to manage custom oauth app integrations, which is required for adding/using Custom OAuth App Integration like Tableau Cloud for Databricks in AWS cloud.
* `bricks account encryption-keys` - These APIs manage encryption key configurations for this workspace (optional).
* `bricks account groups` - Groups simplify identity management, making it easier to assign access to Databricks Account, data, and other securable objects.
* `bricks account ip-access-lists` - The Accounts IP Access List API enables account admins to configure IP access lists for access to the account console.
* `bricks account log-delivery` - These APIs manage log delivery configurations for this account.
* `bricks account metastore-assignments` - These APIs manage metastore assignments to a workspace.
* `bricks account metastores` - These APIs manage Unity Catalog metastores for an account.
* `bricks account networks` - These APIs manage network configurations for customer-managed VPCs (optional).
* `bricks account o-auth-enrollment` - These APIs enable administrators to enroll OAuth for their accounts, which is required for adding/using any OAuth published/custom application integration.
* `bricks account private-access` - These APIs manage private access settings for this account.
* `bricks account published-app-integration` - These APIs enable administrators to manage published oauth app integrations, which is required for adding/using Published OAuth App Integration like Tableau Cloud for Databricks in AWS cloud.
* `bricks account service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
* `bricks account storage` - These APIs manage storage configurations for this workspace.
* `bricks account storage-credentials` - These APIs manage storage credentials for a particular metastore.
* `bricks account users` - User identities recognized by Databricks and represented by email addresses.
* `bricks account vpc-endpoints` - These APIs manage VPC endpoint configurations for this account.
* `bricks account workspace-assignment` - The Workspace Permission Assignment API allows you to manage workspace permissions for principals in your account.
* `bricks account workspaces` - These APIs manage workspaces for this account.
2023-04-26 11:06:16 +00:00
|
|
|
|
|
|
|
import (
|
2023-05-16 16:35:39 +00:00
|
|
|
alerts "github.com/databricks/cli/cmd/workspace/alerts"
|
2023-10-26 11:41:28 +00:00
|
|
|
apps "github.com/databricks/cli/cmd/workspace/apps"
|
2023-09-05 09:43:57 +00:00
|
|
|
artifact_allowlists "github.com/databricks/cli/cmd/workspace/artifact-allowlists"
|
2023-05-16 16:35:39 +00:00
|
|
|
catalogs "github.com/databricks/cli/cmd/workspace/catalogs"
|
2023-07-18 17:13:48 +00:00
|
|
|
clean_rooms "github.com/databricks/cli/cmd/workspace/clean-rooms"
|
2023-05-16 16:35:39 +00:00
|
|
|
cluster_policies "github.com/databricks/cli/cmd/workspace/cluster-policies"
|
|
|
|
clusters "github.com/databricks/cli/cmd/workspace/clusters"
|
2023-06-12 12:23:21 +00:00
|
|
|
connections "github.com/databricks/cli/cmd/workspace/connections"
|
2023-10-03 11:46:16 +00:00
|
|
|
credentials_manager "github.com/databricks/cli/cmd/workspace/credentials-manager"
|
2023-05-16 16:35:39 +00:00
|
|
|
current_user "github.com/databricks/cli/cmd/workspace/current-user"
|
2023-09-05 09:43:57 +00:00
|
|
|
dashboard_widgets "github.com/databricks/cli/cmd/workspace/dashboard-widgets"
|
2023-05-16 16:35:39 +00:00
|
|
|
dashboards "github.com/databricks/cli/cmd/workspace/dashboards"
|
|
|
|
data_sources "github.com/databricks/cli/cmd/workspace/data-sources"
|
|
|
|
experiments "github.com/databricks/cli/cmd/workspace/experiments"
|
|
|
|
external_locations "github.com/databricks/cli/cmd/workspace/external-locations"
|
|
|
|
functions "github.com/databricks/cli/cmd/workspace/functions"
|
|
|
|
git_credentials "github.com/databricks/cli/cmd/workspace/git-credentials"
|
|
|
|
global_init_scripts "github.com/databricks/cli/cmd/workspace/global-init-scripts"
|
|
|
|
grants "github.com/databricks/cli/cmd/workspace/grants"
|
|
|
|
groups "github.com/databricks/cli/cmd/workspace/groups"
|
|
|
|
instance_pools "github.com/databricks/cli/cmd/workspace/instance-pools"
|
|
|
|
instance_profiles "github.com/databricks/cli/cmd/workspace/instance-profiles"
|
|
|
|
ip_access_lists "github.com/databricks/cli/cmd/workspace/ip-access-lists"
|
|
|
|
jobs "github.com/databricks/cli/cmd/workspace/jobs"
|
2024-02-13 14:33:59 +00:00
|
|
|
lakehouse_monitors "github.com/databricks/cli/cmd/workspace/lakehouse-monitors"
|
2024-01-11 08:16:25 +00:00
|
|
|
lakeview "github.com/databricks/cli/cmd/workspace/lakeview"
|
2023-05-16 16:35:39 +00:00
|
|
|
libraries "github.com/databricks/cli/cmd/workspace/libraries"
|
|
|
|
metastores "github.com/databricks/cli/cmd/workspace/metastores"
|
|
|
|
model_registry "github.com/databricks/cli/cmd/workspace/model-registry"
|
2023-09-05 09:43:57 +00:00
|
|
|
model_versions "github.com/databricks/cli/cmd/workspace/model-versions"
|
2024-02-19 14:30:06 +00:00
|
|
|
online_tables "github.com/databricks/cli/cmd/workspace/online-tables"
|
2023-05-16 16:35:39 +00:00
|
|
|
permissions "github.com/databricks/cli/cmd/workspace/permissions"
|
|
|
|
pipelines "github.com/databricks/cli/cmd/workspace/pipelines"
|
|
|
|
policy_families "github.com/databricks/cli/cmd/workspace/policy-families"
|
|
|
|
providers "github.com/databricks/cli/cmd/workspace/providers"
|
|
|
|
queries "github.com/databricks/cli/cmd/workspace/queries"
|
|
|
|
query_history "github.com/databricks/cli/cmd/workspace/query-history"
|
2023-09-05 09:43:57 +00:00
|
|
|
query_visualizations "github.com/databricks/cli/cmd/workspace/query-visualizations"
|
2023-05-16 16:35:39 +00:00
|
|
|
recipient_activation "github.com/databricks/cli/cmd/workspace/recipient-activation"
|
|
|
|
recipients "github.com/databricks/cli/cmd/workspace/recipients"
|
2023-09-05 09:43:57 +00:00
|
|
|
registered_models "github.com/databricks/cli/cmd/workspace/registered-models"
|
2023-05-16 16:35:39 +00:00
|
|
|
repos "github.com/databricks/cli/cmd/workspace/repos"
|
|
|
|
schemas "github.com/databricks/cli/cmd/workspace/schemas"
|
|
|
|
secrets "github.com/databricks/cli/cmd/workspace/secrets"
|
|
|
|
service_principals "github.com/databricks/cli/cmd/workspace/service-principals"
|
|
|
|
serving_endpoints "github.com/databricks/cli/cmd/workspace/serving-endpoints"
|
2023-10-03 11:46:16 +00:00
|
|
|
settings "github.com/databricks/cli/cmd/workspace/settings"
|
2023-05-16 16:35:39 +00:00
|
|
|
shares "github.com/databricks/cli/cmd/workspace/shares"
|
|
|
|
storage_credentials "github.com/databricks/cli/cmd/workspace/storage-credentials"
|
2023-06-12 12:23:21 +00:00
|
|
|
system_schemas "github.com/databricks/cli/cmd/workspace/system-schemas"
|
2023-05-16 16:35:39 +00:00
|
|
|
table_constraints "github.com/databricks/cli/cmd/workspace/table-constraints"
|
|
|
|
tables "github.com/databricks/cli/cmd/workspace/tables"
|
|
|
|
token_management "github.com/databricks/cli/cmd/workspace/token-management"
|
|
|
|
tokens "github.com/databricks/cli/cmd/workspace/tokens"
|
|
|
|
users "github.com/databricks/cli/cmd/workspace/users"
|
2024-01-11 08:16:25 +00:00
|
|
|
vector_search_endpoints "github.com/databricks/cli/cmd/workspace/vector-search-endpoints"
|
|
|
|
vector_search_indexes "github.com/databricks/cli/cmd/workspace/vector-search-indexes"
|
2023-05-16 16:35:39 +00:00
|
|
|
volumes "github.com/databricks/cli/cmd/workspace/volumes"
|
|
|
|
warehouses "github.com/databricks/cli/cmd/workspace/warehouses"
|
|
|
|
workspace "github.com/databricks/cli/cmd/workspace/workspace"
|
2023-05-22 19:27:22 +00:00
|
|
|
workspace_bindings "github.com/databricks/cli/cmd/workspace/workspace-bindings"
|
2023-05-16 16:35:39 +00:00
|
|
|
workspace_conf "github.com/databricks/cli/cmd/workspace/workspace-conf"
|
2023-07-25 18:19:07 +00:00
|
|
|
"github.com/spf13/cobra"
|
Added OpenAPI command coverage (#357)
This PR adds the following command groups:
## Workspace-level command groups
* `bricks alerts` - The alerts API can be used to perform CRUD operations on alerts.
* `bricks catalogs` - A catalog is the first layer of Unity Catalog’s three-level namespace.
* `bricks cluster-policies` - Cluster policy limits the ability to configure clusters based on a set of rules.
* `bricks clusters` - The Clusters API allows you to create, start, edit, list, terminate, and delete clusters.
* `bricks current-user` - This API allows retrieving information about currently authenticated user or service principal.
* `bricks dashboards` - In general, there is little need to modify dashboards using the API.
* `bricks data-sources` - This API is provided to assist you in making new query objects.
* `bricks experiments` - MLflow Experiment tracking.
* `bricks external-locations` - An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path.
* `bricks functions` - Functions implement User-Defined Functions (UDFs) in Unity Catalog.
* `bricks git-credentials` - Registers personal access token for Databricks to do operations on behalf of the user.
* `bricks global-init-scripts` - The Global Init Scripts API enables Workspace administrators to configure global initialization scripts for their workspace.
* `bricks grants` - In Unity Catalog, data is secure by default.
* `bricks groups` - Groups simplify identity management, making it easier to assign access to Databricks Workspace, data, and other securable objects.
* `bricks instance-pools` - Instance Pools API are used to create, edit, delete and list instance pools by using ready-to-use cloud instances which reduces a cluster start and auto-scaling times.
* `bricks instance-profiles` - The Instance Profiles API allows admins to add, list, and remove instance profiles that users can launch clusters with.
* `bricks ip-access-lists` - IP Access List enables admins to configure IP access lists.
* `bricks jobs` - The Jobs API allows you to create, edit, and delete jobs.
* `bricks libraries` - The Libraries API allows you to install and uninstall libraries and get the status of libraries on a cluster.
* `bricks metastores` - A metastore is the top-level container of objects in Unity Catalog.
* `bricks model-registry` - MLflow Model Registry commands.
* `bricks permissions` - Permissions API are used to create read, write, edit, update and manage access for various users on different objects and endpoints.
* `bricks pipelines` - The Delta Live Tables API allows you to create, edit, delete, start, and view details about pipelines.
* `bricks policy-families` - View available policy families.
* `bricks providers` - Databricks Providers REST API.
* `bricks queries` - These endpoints are used for CRUD operations on query definitions.
* `bricks query-history` - Access the history of queries through SQL warehouses.
* `bricks recipient-activation` - Databricks Recipient Activation REST API.
* `bricks recipients` - Databricks Recipients REST API.
* `bricks repos` - The Repos API allows users to manage their git repos.
* `bricks schemas` - A schema (also called a database) is the second layer of Unity Catalog’s three-level namespace.
* `bricks secrets` - The Secrets API allows you to manage secrets, secret scopes, and access permissions.
* `bricks service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
* `bricks serving-endpoints` - The Serving Endpoints API allows you to create, update, and delete model serving endpoints.
* `bricks shares` - Databricks Shares REST API.
* `bricks storage-credentials` - A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant.
* `bricks table-constraints` - Primary key and foreign key constraints encode relationships between fields in tables.
* `bricks tables` - A table resides in the third layer of Unity Catalog’s three-level namespace.
* `bricks token-management` - Enables administrators to get all tokens and delete tokens for other users.
* `bricks tokens` - The Token API allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs.
* `bricks users` - User identities recognized by Databricks and represented by email addresses.
* `bricks volumes` - Volumes are a Unity Catalog (UC) capability for accessing, storing, governing, organizing and processing files.
* `bricks warehouses` - A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL.
* `bricks workspace` - The Workspace API allows you to list, import, export, and delete notebooks and folders.
* `bricks workspace-conf` - This API allows updating known workspace settings for advanced users.
## Account-level command groups
* `bricks account billable-usage` - This API allows you to download billable usage logs for the specified account and date range.
* `bricks account budgets` - These APIs manage budget configuration including notifications for exceeding a budget for a period.
* `bricks account credentials` - These APIs manage credential configurations for this workspace.
* `bricks account custom-app-integration` - These APIs enable administrators to manage custom oauth app integrations, which is required for adding/using Custom OAuth App Integration like Tableau Cloud for Databricks in AWS cloud.
* `bricks account encryption-keys` - These APIs manage encryption key configurations for this workspace (optional).
* `bricks account groups` - Groups simplify identity management, making it easier to assign access to Databricks Account, data, and other securable objects.
* `bricks account ip-access-lists` - The Accounts IP Access List API enables account admins to configure IP access lists for access to the account console.
* `bricks account log-delivery` - These APIs manage log delivery configurations for this account.
* `bricks account metastore-assignments` - These APIs manage metastore assignments to a workspace.
* `bricks account metastores` - These APIs manage Unity Catalog metastores for an account.
* `bricks account networks` - These APIs manage network configurations for customer-managed VPCs (optional).
* `bricks account o-auth-enrollment` - These APIs enable administrators to enroll OAuth for their accounts, which is required for adding/using any OAuth published/custom application integration.
* `bricks account private-access` - These APIs manage private access settings for this account.
* `bricks account published-app-integration` - These APIs enable administrators to manage published oauth app integrations, which is required for adding/using Published OAuth App Integration like Tableau Cloud for Databricks in AWS cloud.
* `bricks account service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
* `bricks account storage` - These APIs manage storage configurations for this workspace.
* `bricks account storage-credentials` - These APIs manage storage credentials for a particular metastore.
* `bricks account users` - User identities recognized by Databricks and represented by email addresses.
* `bricks account vpc-endpoints` - These APIs manage VPC endpoint configurations for this account.
* `bricks account workspace-assignment` - The Workspace Permission Assignment API allows you to manage workspace permissions for principals in your account.
* `bricks account workspaces` - These APIs manage workspaces for this account.
2023-04-26 11:06:16 +00:00
|
|
|
)
|
|
|
|
|
2023-07-25 18:19:07 +00:00
|
|
|
func All() []*cobra.Command {
|
|
|
|
var out []*cobra.Command
|
2023-06-15 14:47:24 +00:00
|
|
|
|
2023-07-25 18:19:07 +00:00
|
|
|
out = append(out, alerts.New())
|
2023-10-26 11:41:28 +00:00
|
|
|
out = append(out, apps.New())
|
2023-09-05 09:43:57 +00:00
|
|
|
out = append(out, artifact_allowlists.New())
|
2023-07-25 18:19:07 +00:00
|
|
|
out = append(out, catalogs.New())
|
|
|
|
out = append(out, clean_rooms.New())
|
|
|
|
out = append(out, cluster_policies.New())
|
|
|
|
out = append(out, clusters.New())
|
|
|
|
out = append(out, connections.New())
|
2023-10-03 11:46:16 +00:00
|
|
|
out = append(out, credentials_manager.New())
|
2023-07-25 18:19:07 +00:00
|
|
|
out = append(out, current_user.New())
|
2023-09-05 09:43:57 +00:00
|
|
|
out = append(out, dashboard_widgets.New())
|
2023-07-25 18:19:07 +00:00
|
|
|
out = append(out, dashboards.New())
|
|
|
|
out = append(out, data_sources.New())
|
|
|
|
out = append(out, experiments.New())
|
|
|
|
out = append(out, external_locations.New())
|
|
|
|
out = append(out, functions.New())
|
|
|
|
out = append(out, git_credentials.New())
|
|
|
|
out = append(out, global_init_scripts.New())
|
|
|
|
out = append(out, grants.New())
|
|
|
|
out = append(out, groups.New())
|
|
|
|
out = append(out, instance_pools.New())
|
|
|
|
out = append(out, instance_profiles.New())
|
|
|
|
out = append(out, ip_access_lists.New())
|
|
|
|
out = append(out, jobs.New())
|
2024-02-13 14:33:59 +00:00
|
|
|
out = append(out, lakehouse_monitors.New())
|
2024-01-11 08:16:25 +00:00
|
|
|
out = append(out, lakeview.New())
|
2023-07-25 18:19:07 +00:00
|
|
|
out = append(out, libraries.New())
|
|
|
|
out = append(out, metastores.New())
|
|
|
|
out = append(out, model_registry.New())
|
2023-09-05 09:43:57 +00:00
|
|
|
out = append(out, model_versions.New())
|
2024-02-19 14:30:06 +00:00
|
|
|
out = append(out, online_tables.New())
|
2023-07-25 18:19:07 +00:00
|
|
|
out = append(out, permissions.New())
|
|
|
|
out = append(out, pipelines.New())
|
|
|
|
out = append(out, policy_families.New())
|
|
|
|
out = append(out, providers.New())
|
|
|
|
out = append(out, queries.New())
|
|
|
|
out = append(out, query_history.New())
|
2023-09-05 09:43:57 +00:00
|
|
|
out = append(out, query_visualizations.New())
|
2023-07-25 18:19:07 +00:00
|
|
|
out = append(out, recipient_activation.New())
|
|
|
|
out = append(out, recipients.New())
|
2023-09-05 09:43:57 +00:00
|
|
|
out = append(out, registered_models.New())
|
2023-07-25 18:19:07 +00:00
|
|
|
out = append(out, repos.New())
|
|
|
|
out = append(out, schemas.New())
|
|
|
|
out = append(out, secrets.New())
|
|
|
|
out = append(out, service_principals.New())
|
|
|
|
out = append(out, serving_endpoints.New())
|
2023-10-03 11:46:16 +00:00
|
|
|
out = append(out, settings.New())
|
2023-07-25 18:19:07 +00:00
|
|
|
out = append(out, shares.New())
|
|
|
|
out = append(out, storage_credentials.New())
|
|
|
|
out = append(out, system_schemas.New())
|
|
|
|
out = append(out, table_constraints.New())
|
|
|
|
out = append(out, tables.New())
|
|
|
|
out = append(out, token_management.New())
out = append(out, tokens.New())
out = append(out, users.New())
out = append(out, vector_search_endpoints.New())
out = append(out, vector_search_indexes.New())
out = append(out, volumes.New())
out = append(out, warehouses.New())
out = append(out, workspace.New())
out = append(out, workspace_bindings.New())
out = append(out, workspace_conf.New())
return out
}
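
The registry above is plain Go: each command-group package exposes a `New()` constructor, and this function appends every group to a single slice for the caller to mount on the root command. A minimal stdlib-only sketch of that pattern (hypothetical `command` type and constructor names; the real constructors return CLI-framework command objects):

```go
package main

import "fmt"

// command is a hypothetical stand-in for the CLI command type;
// the real code returns framework-specific command objects.
type command struct{ name string }

// Each group package exposes a New() constructor; these mimic that shape.
func newClusters() *command { return &command{name: "clusters"} }
func newJobs() *command     { return &command{name: "jobs"} }

// All collects every registered command group into one slice,
// mirroring the append-and-return pattern in the function above.
func All() []*command {
	var out []*command
	out = append(out, newClusters())
	out = append(out, newJobs())
	return out
}

func main() {
	for _, c := range All() {
		fmt.Println(c.name)
	}
}
```

Registering a new group is then a one-line append, which is why the generated file reads as a flat list of `append` calls.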