mirror of https://github.com/databricks/cli.git
13 Commits
**shreyas-goenka** · `6002f49c87` · Move bundle schema update to an internal module (#1012)

## Changes
This PR:
1. Moves the code that loads bundle JSON Schema descriptions from the OpenAPI spec into an internal Go module.
2. Removes the command-line flags from the `bundle schema` command. These flags were meant for internal processes and were never intended for customer use.
3. Regenerates `bundle_descriptions.json`.
4. Adds support for `bundle: "deprecated"`. The `environments` field is tagged as deprecated in this PR and is consequently no longer part of the bundle schema.

## Tests
Tested by regenerating the CLI against its current OpenAPI spec (as defined in `__openapi_sha`). The `bundle_descriptions.json` in this PR was produced by the code generator. Manually checked that the autocompletion / descriptions from the new bundle schema are correct.
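The `bundle: "deprecated"` support lends itself to a Go struct tag that the schema generator checks before emitting a property. A minimal sketch of that technique, assuming a `bundle:"deprecated"` tag and illustrative field names (only `environments` is taken from the PR; the other fields and the helper are placeholders):

```go
package main

import (
	"fmt"
	"reflect"
	"strings"
)

// Root stands in for the bundle configuration root; the tag on
// Environments mirrors the deprecation described in the PR.
type Root struct {
	Bundle       map[string]any `json:"bundle,omitempty"`
	Workspace    map[string]any `json:"workspace,omitempty"`
	Environments map[string]any `json:"environments,omitempty" bundle:"deprecated"`
}

// schemaProperties lists the JSON property names a schema generator would
// emit, skipping any field tagged bundle:"deprecated".
func schemaProperties(t reflect.Type) []string {
	var props []string
	for i := 0; i < t.NumField(); i++ {
		f := t.Field(i)
		if f.Tag.Get("bundle") == "deprecated" {
			continue // deprecated fields are excluded from the schema
		}
		props = append(props, strings.Split(f.Tag.Get("json"), ",")[0])
	}
	return props
}

func main() {
	// Prints [bundle workspace]; "environments" is no longer emitted.
	fmt.Println(schemaProperties(reflect.TypeOf(Root{})))
}
```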
**shreyas-goenka** · `76840176e3` · Add documentation for positional args in commands generated from the Databricks OpenAPI specification (#1033)

## Changes
This PR adds documentation for positional arguments in commands that are generated from the OpenAPI spec.

Note: the changes to `.gitattributes` will be reverted / properly fixed in https://github.com/databricks/cli/pull/1012.
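Commands in this repository are built on `spf13/cobra`, where positional arguments are conventionally documented on the `Use` line and expanded in the long help text. A minimal sketch of the pattern (the `get WAREHOUSE_ID` command and its wording are hypothetical, not copied from the generated code):

```go
package main

import "github.com/spf13/cobra"

func newGetCmd() *cobra.Command {
	return &cobra.Command{
		// cobra renders everything after the command name in Use on the
		// usage line, which is where positional args get documented.
		Use:   "get WAREHOUSE_ID",
		Short: "Get a SQL warehouse.",
		Long: `Get a SQL warehouse.

  Arguments:
    WAREHOUSE_ID: Required. ID of the SQL warehouse.`,
		Args: cobra.ExactArgs(1),
		RunE: func(cmd *cobra.Command, args []string) error {
			id := args[0] // the positional argument
			_ = id
			return nil
		},
	}
}

func main() {
	root := &cobra.Command{Use: "cli"}
	root.AddCommand(newGetCmd())
	_ = root.Execute()
}
```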
**dependabot[bot]** · `c3ced68c60` · Bump github.com/databricks/databricks-sdk-go from 0.24.0 to 0.25.0 (#980)

Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.24.0 to 0.25.0.

From [github.com/databricks/databricks-sdk-go's changelog](https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md):

## 0.25.0
* Make sure path parameters are first in order in RequiredFields ([#669](https://redirect.github.com/databricks/databricks-sdk-go/pull/669)).
* Added Field.IsRequestBodyField method for code generation ([#670](https://redirect.github.com/databricks/databricks-sdk-go/pull/670)).
* Added regressions question to the issue template ([#676](https://redirect.github.com/databricks/databricks-sdk-go/pull/676)).
* Added telemetry for CI/CD platform to useragent ([#665](https://redirect.github.com/databricks/databricks-sdk-go/pull/665)).
* Skipped GCP Integration Tests using Statement Execution API ([#678](https://redirect.github.com/databricks/databricks-sdk-go/pull/678)).
* Added more detailed error message on default credentials not found error ([#679](https://redirect.github.com/databricks/databricks-sdk-go/pull/679)).
* Updated SDK to latest OpenAPI Spec ([#685](https://redirect.github.com/databricks/databricks-sdk-go/pull/685)).

API Changes:

* Changed `Create` method for the `w.Functions` and `w.Metastores` workspace-level services with new required argument order.
* Changed `InputParams` field for `catalog.CreateFunction` and `catalog.FunctionInfo` to `catalog.FunctionParameterInfos`.
* Changed `Properties` field for `catalog.CreateFunction` and `catalog.FunctionInfo` to `string`.
* Changed `ReturnParams` field for `catalog.CreateFunction` and `catalog.FunctionInfo` to `catalog.FunctionParameterInfos`.
* Changed `StorageRoot` field for `catalog.CreateMetastore` to no longer be required.
* Added `SkipValidation` field for `catalog.UpdateExternalLocation`.
* Added `Libraries` field for `compute.CreatePolicy`, `compute.EditPolicy` and `compute.Policy`.
* Added `compute.InitScriptEventDetails`.
* Added `InitScripts` field for `compute.EventDetails`.
* Added `File` field for `compute.InitScriptInfo`.
* Added `ZoneId` field for `compute.InstancePoolGcpAttributes`.
* Added `IncludeResolvedValues` field for `jobs.GetRunRequest`.
* Added `EditMode` field for `jobs.CreateJob` and `jobs.JobSettings`.
* Added `NetworkConnectivityConfigId` field for `provisioning.UpdateWorkspaceRequest`.
* Added `ContainerLogs` and `ExtraInfo` fields for `serving.DeploymentStatus`.
* Added `catalog.CreateFunctionRequest`, `catalog.DependencyList` and `catalog.FunctionParameterInfos`.
* Added `compute.InitScriptExecutionDetails`, `compute.InitScriptExecutionDetailsStatus`, `compute.InitScriptInfoAndExecutionDetails` and `compute.LocalFileInfo`.
* Added `jobs.CreateJobEditMode` and `jobs.JobSettingsEditMode`.
* Added `DeleteApp`, `GetApp`, `GetAppDeploymentStatus`, `GetApps` and `GetEvents` methods for the `w.Apps` workspace-level service.
* Added `serving.AppEvents`, `serving.AppServiceStatus`, `serving.DeleteAppResponse`, `serving.GetAppDeploymentStatusRequest`, `serving.GetAppResponse`, `serving.GetEventsRequest`, `serving.ListAppEventsResponse` and `serving.ListAppsResponse`.
* Added the `a.NetworkConnectivity` account-level service.
* Added `settings.CreateNetworkConnectivityConfigRequest`, `settings.CreatePrivateEndpointRuleRequest`, `settings.CreatePrivateEndpointRuleRequestGroupId`, `settings.DeleteNetworkConnectivityConfigurationRequest`, `settings.DeletePrivateEndpointRuleRequest`, `settings.GetNetworkConnectivityConfigurationRequest`, `settings.GetPrivateEndpointRuleRequest`, `settings.NccAzurePrivateEndpointRule`, `settings.NccAzurePrivateEndpointRuleConnectionState`, `settings.NccAzurePrivateEndpointRuleGroupId`, `settings.NccAzureServiceEndpointRule`, `settings.NccEgressConfig`, `settings.NccEgressDefaultRules`, `settings.NccEgressTargetRules` and `settings.NetworkConnectivityConfiguration`.
* Removed `Delete` and `Get` methods for the `w.Apps` workspace-level service.
* Removed `jobs.JobSettingsUiState` and `jobs.CreateJobUiState`.
* Removed the `a.OAuthEnrollment` account-level service.
* Removed `oauth2.CreateOAuthEnrollment` and `oauth2.OAuthEnrollmentStatus`.
* Removed `UiState` field for `jobs.CreateJob` and `jobs.JobSettings`.

OpenAPI SHA: e7b127cb07af8dd4d8c61c7cc045c8910cdbb02a, Date: 2023-11-08

Dependency updates:

* Bump google.golang.org/api from 0.146.0 to 0.150.0 ([#683](https://redirect.github.com/databricks/databricks-sdk-go/pull/683)).
* Bump golang.org/x/mod from 0.13.0 to 0.14.0 ([#681](https://redirect.github.com/databricks/databricks-sdk-go/pull/681)).
* Bump google.golang.org/grpc from 1.58.2 to 1.58.3 in /examples/slog ([#672](https://redirect.github.com/databricks/databricks-sdk-go/pull/672)).
* Bump google.golang.org/grpc to 1.58.3 in /examples/zerolog ([#684](https://redirect.github.com/databricks/databricks-sdk-go/pull/684)).
* Bump golang.org/x/time from 0.3.0 to 0.4.0 ([#680](https://redirect.github.com/databricks/databricks-sdk-go/pull/680)).
**shreyas-goenka** · `bb662fadbb` · Bump Terraform provider to v1.29.0 (#926)

This PR:
1. Regenerates Go structs using provider version 1.29.
2. Adds QOL autogenerated diff labels for GitHub.
3. Adds a small SOP for doing the Terraform provider bump for Go structs.
**Miles Yucht** · `9b16e9bd45` · Bump the Go SDK in the CLI (#919)

## Changes
Bump the Databricks Go SDK version from v0.23.0 to v0.24.0.
**hectorcast-db** · `36f30c8b47` · Update Go SDK to 0.23.0 and use custom marshaller (#772)

## Changes
Update Go SDK to 0.23.0 and use custom marshaller.

## Tests
* Run unit tests
* Run nightly
* Manual test:

```
./cli jobs create --json @myjob.json
```

with

```json
{
  "name": "my-job-marshal-test-go",
  "tasks": [{
    "task_key": "testgomarshaltask",
    "new_cluster": {
      "num_workers": 0,
      "spark_version": "10.4.x-scala2.12",
      "node_type_id": "Standard_DS3_v2"
    },
    "libraries": [
      { "jar": "dbfs:/max/jars/exampleJarTask.jar" }
    ],
    "spark_jar_task": {
      "main_class_name": "com.databricks.quickstart.exampleTask"
    }
  }]
}
```

Main branch:

```
Error: Cluster validation error: Missing required field: settings.cluster_spec.new_cluster.size
```

This branch:

```
{ "job_id": <jobid> }
```

Co-authored-by: Miles Yucht <miles@databricks.com>
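The failure mode this PR fixes comes from `encoding/json`'s `omitempty`: a required field that is deliberately zero, like `num_workers: 0`, is silently dropped from the request body. The sketch below shows the general technique of a `ForceSendFields` list consulted by a custom `MarshalJSON`; it is a simplified stand-in for the SDK's marshaller, not its actual code:

```go
package main

import (
	"encoding/json"
	"fmt"
	"reflect"
	"strings"
)

type NewCluster struct {
	NumWorkers   int    `json:"num_workers,omitempty"`
	SparkVersion string `json:"spark_version,omitempty"`

	// ForceSendFields names Go fields that must be serialized even when
	// they hold their zero value.
	ForceSendFields []string `json:"-"`
}

// MarshalJSON re-adds zero-valued fields listed in ForceSendFields after
// the standard library has applied omitempty.
func (c NewCluster) MarshalJSON() ([]byte, error) {
	type plain NewCluster // identical layout, no methods: avoids recursion
	raw, err := json.Marshal(plain(c))
	if err != nil {
		return nil, err
	}
	m := map[string]any{}
	if err := json.Unmarshal(raw, &m); err != nil {
		return nil, err
	}
	v := reflect.ValueOf(c)
	for _, name := range c.ForceSendFields {
		f, ok := v.Type().FieldByName(name)
		if !ok {
			continue
		}
		jsonName := strings.Split(f.Tag.Get("json"), ",")[0]
		m[jsonName] = v.FieldByName(name).Interface()
	}
	return json.Marshal(m)
}

func main() {
	c := NewCluster{NumWorkers: 0, SparkVersion: "10.4.x-scala2.12"}
	out, _ := json.Marshal(c)
	fmt.Println(string(out)) // omitempty drops num_workers here

	c.ForceSendFields = []string{"NumWorkers"}
	out, _ = json.Marshal(c)
	fmt.Println(string(out)) // num_workers:0 is kept
}
```

Without this, the serialized job spec loses `num_workers` and triggers the cluster validation error quoted above; with the custom marshaller the zero-valued field survives.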
**Andrew Nester** · `e1d1e95525` · Updated Go SDK to 0.22.0 (#831)

## Changes
Updated Go SDK to 0.22.0.
**Pieter Noordhuis** · `1752e29885` · Update Go SDK to v0.19.0 (#729)

## Changes
* Update Go SDK to v0.19.0
* Update commands per OpenAPI spec from Go SDK
* Incorporate `client.Do()` signature change to include a (nil) header map
* Update `workspace.WorkspaceService` mock with permissions methods
* Skip `files` service in codegen; already implemented under the `fs` command

## Tests
Unit and integration tests pass.
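For callers, the `client.Do()` change amounts to threading a header map (usually `nil`) through every request. A sketch of the call-site shape under that assumption; the interface below is a stand-in, and the exact parameter order in the real SDK is an assumption, not copied from it:

```go
package main

import (
	"context"
	"net/http"
)

// apiClient is a stand-in for the SDK client; only the shape of Do matters
// here. The header map is the argument added in v0.19.0.
type apiClient interface {
	Do(ctx context.Context, method, path string,
		headers map[string]string, request, response any) error
}

func listClusters(ctx context.Context, c apiClient) (map[string]any, error) {
	var resp map[string]any
	// Callers without custom headers pass nil for the new header map.
	err := c.Do(ctx, http.MethodGet, "/api/2.0/clusters/list", nil, nil, &resp)
	return resp, err
}

func main() {}
```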
**Miles Yucht** · `bb415ce6bb` · Bump OpenAPI specification & Go SDK Version (#624)

## Changes
Bump the OpenAPI specification and Go SDK version to the latest version.
**Serge Smertin** · `acf292da37` · Release v0.201.0 (#586)

* Add development runs ([#522](https://github.com/databricks/cli/pull/522)).
* Support tab completion for profiles ([#572](https://github.com/databricks/cli/pull/572)).
* Correctly use --profile flag passed for all bundle commands ([#571](https://github.com/databricks/cli/pull/571)).
* Disallow notebooks in paths where files are expected ([#573](https://github.com/databricks/cli/pull/573)).
* Improve auth login experience ([#570](https://github.com/databricks/cli/pull/570)).
* Remove base path checks during sync ([#576](https://github.com/databricks/cli/pull/576)).
* First look for databricks.yml before falling back to bundle.yml ([#580](https://github.com/databricks/cli/pull/580)).
* Integrate with auto-release infra ([#581](https://github.com/databricks/cli/pull/581)).

API Changes:

* Removed `databricks metastores maintenance` command.
* Added `databricks metastores enable-optimization` command.
* Added `databricks tables update` command.
* Changed `databricks account settings delete-personal-compute-setting` command with new required argument order.
* Changed `databricks account settings read-personal-compute-setting` command with new required argument order.
* Added `databricks clean-rooms` command group.

OpenAPI SHA: 850a075ed9758d21a6bc4409506b48c8b9f93ab4, Date: 2023-07-18

Dependency updates:

* Bump golang.org/x/term from 0.9.0 to 0.10.0 ([#567](https://github.com/databricks/cli/pull/567)).
* Bump golang.org/x/oauth2 from 0.9.0 to 0.10.0 ([#566](https://github.com/databricks/cli/pull/566)).
* Bump golang.org/x/mod from 0.11.0 to 0.12.0 ([#568](https://github.com/databricks/cli/pull/568)).
* Bump github.com/databricks/databricks-sdk-go from 0.12.0 to 0.13.0 ([#585](https://github.com/databricks/cli/pull/585)).
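The databricks.yml-first behavior from #580 is an ordered file probe. A minimal sketch of that lookup (not the CLI's actual implementation):

```go
package main

import (
	"errors"
	"fmt"
	"os"
	"path/filepath"
)

// findBundleConfig prefers databricks.yml and falls back to bundle.yml,
// matching the lookup order described in the release notes.
func findBundleConfig(dir string) (string, error) {
	for _, name := range []string{"databricks.yml", "bundle.yml"} {
		path := filepath.Join(dir, name)
		if _, err := os.Stat(path); err == nil {
			return path, nil
		}
	}
	return "", errors.New("no databricks.yml or bundle.yml found")
}

func main() {
	path, err := findBundleConfig(".")
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("using", path)
}
```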
**Serge Smertin** · `2aa61a7c1b` · Update with the latest Go SDK (#457)

## Changes
- Removed deprecated methods
- Regenerated with the latest OpenAPI spec
- Picked up the latest Go SDK version

## Tests
`make test`
**Pieter Noordhuis** · `46df551816` · Update to Go SDK v0.9.0 (#396)

## Changes
See https://github.com/databricks/databricks-sdk-go/releases/tag/v0.9.0.

## Tests
Ran integration tests manually.
**Serge Smertin** · `4c4a293015` · Added OpenAPI command coverage (#357)

This PR adds the following command groups:

## Workspace-level command groups

* `bricks alerts` - The alerts API can be used to perform CRUD operations on alerts.
* `bricks catalogs` - A catalog is the first layer of Unity Catalog’s three-level namespace.
* `bricks cluster-policies` - Cluster policy limits the ability to configure clusters based on a set of rules.
* `bricks clusters` - The Clusters API allows you to create, start, edit, list, terminate, and delete clusters.
* `bricks current-user` - This API allows retrieving information about currently authenticated user or service principal.
* `bricks dashboards` - In general, there is little need to modify dashboards using the API.
* `bricks data-sources` - This API is provided to assist you in making new query objects.
* `bricks experiments` - MLflow Experiment tracking.
* `bricks external-locations` - An external location is an object that combines a cloud storage path with a storage credential that authorizes access to the cloud storage path.
* `bricks functions` - Functions implement User-Defined Functions (UDFs) in Unity Catalog.
* `bricks git-credentials` - Registers personal access token for Databricks to do operations on behalf of the user.
* `bricks global-init-scripts` - The Global Init Scripts API enables Workspace administrators to configure global initialization scripts for their workspace.
* `bricks grants` - In Unity Catalog, data is secure by default.
* `bricks groups` - Groups simplify identity management, making it easier to assign access to Databricks Workspace, data, and other securable objects.
* `bricks instance-pools` - Instance Pools API are used to create, edit, delete and list instance pools by using ready-to-use cloud instances which reduces a cluster start and auto-scaling times.
* `bricks instance-profiles` - The Instance Profiles API allows admins to add, list, and remove instance profiles that users can launch clusters with.
* `bricks ip-access-lists` - IP Access List enables admins to configure IP access lists.
* `bricks jobs` - The Jobs API allows you to create, edit, and delete jobs.
* `bricks libraries` - The Libraries API allows you to install and uninstall libraries and get the status of libraries on a cluster.
* `bricks metastores` - A metastore is the top-level container of objects in Unity Catalog.
* `bricks model-registry` - MLflow Model Registry commands.
* `bricks permissions` - Permissions API are used to create read, write, edit, update and manage access for various users on different objects and endpoints.
* `bricks pipelines` - The Delta Live Tables API allows you to create, edit, delete, start, and view details about pipelines.
* `bricks policy-families` - View available policy families.
* `bricks providers` - Databricks Providers REST API.
* `bricks queries` - These endpoints are used for CRUD operations on query definitions.
* `bricks query-history` - Access the history of queries through SQL warehouses.
* `bricks recipient-activation` - Databricks Recipient Activation REST API.
* `bricks recipients` - Databricks Recipients REST API.
* `bricks repos` - The Repos API allows users to manage their git repos.
* `bricks schemas` - A schema (also called a database) is the second layer of Unity Catalog’s three-level namespace.
* `bricks secrets` - The Secrets API allows you to manage secrets, secret scopes, and access permissions.
* `bricks service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
* `bricks serving-endpoints` - The Serving Endpoints API allows you to create, update, and delete model serving endpoints.
* `bricks shares` - Databricks Shares REST API.
* `bricks storage-credentials` - A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant.
* `bricks table-constraints` - Primary key and foreign key constraints encode relationships between fields in tables.
* `bricks tables` - A table resides in the third layer of Unity Catalog’s three-level namespace.
* `bricks token-management` - Enables administrators to get all tokens and delete tokens for other users.
* `bricks tokens` - The Token API allows you to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs.
* `bricks users` - User identities recognized by Databricks and represented by email addresses.
* `bricks volumes` - Volumes are a Unity Catalog (UC) capability for accessing, storing, governing, organizing and processing files.
* `bricks warehouses` - A SQL warehouse is a compute resource that lets you run SQL commands on data objects within Databricks SQL.
* `bricks workspace` - The Workspace API allows you to list, import, export, and delete notebooks and folders.
* `bricks workspace-conf` - This API allows updating known workspace settings for advanced users.

## Account-level command groups

* `bricks account billable-usage` - This API allows you to download billable usage logs for the specified account and date range.
* `bricks account budgets` - These APIs manage budget configuration including notifications for exceeding a budget for a period.
* `bricks account credentials` - These APIs manage credential configurations for this workspace.
* `bricks account custom-app-integration` - These APIs enable administrators to manage custom OAuth app integrations, which is required for adding/using Custom OAuth App Integration like Tableau Cloud for Databricks in AWS cloud.
* `bricks account encryption-keys` - These APIs manage encryption key configurations for this workspace (optional).
* `bricks account groups` - Groups simplify identity management, making it easier to assign access to Databricks Account, data, and other securable objects.
* `bricks account ip-access-lists` - The Accounts IP Access List API enables account admins to configure IP access lists for access to the account console.
* `bricks account log-delivery` - These APIs manage log delivery configurations for this account.
* `bricks account metastore-assignments` - These APIs manage metastore assignments to a workspace.
* `bricks account metastores` - These APIs manage Unity Catalog metastores for an account.
* `bricks account networks` - These APIs manage network configurations for customer-managed VPCs (optional).
* `bricks account o-auth-enrollment` - These APIs enable administrators to enroll OAuth for their accounts, which is required for adding/using any OAuth published/custom application integration.
* `bricks account private-access` - These APIs manage private access settings for this account.
* `bricks account published-app-integration` - These APIs enable administrators to manage published OAuth app integrations, which is required for adding/using Published OAuth App Integration like Tableau Cloud for Databricks in AWS cloud.
* `bricks account service-principals` - Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
* `bricks account storage` - These APIs manage storage configurations for this workspace.
* `bricks account storage-credentials` - These APIs manage storage credentials for a particular metastore.
* `bricks account users` - User identities recognized by Databricks and represented by email addresses.
* `bricks account vpc-endpoints` - These APIs manage VPC endpoint configurations for this account.
* `bricks account workspace-assignment` - The Workspace Permission Assignment API allows you to manage workspace permissions for principals in your account.
* `bricks account workspaces` - These APIs manage workspaces for this account.
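All of these groups hang off a single cobra root command. A minimal sketch of how generated command groups typically register (assumed structure, not the generator's actual output; `alerts list` is a placeholder, and the real generated commands call into the Go SDK):

```go
package main

import "github.com/spf13/cobra"

func main() {
	root := &cobra.Command{
		Use:   "bricks",
		Short: "Databricks CLI",
	}

	// Each generated service becomes a command group; "alerts" and
	// "catalogs" stand in for the full list from the PR description.
	alerts := &cobra.Command{
		Use:   "alerts",
		Short: "The alerts API can be used to perform CRUD operations on alerts.",
	}
	alerts.AddCommand(&cobra.Command{
		Use:   "list",
		Short: "List alerts.",
		RunE: func(cmd *cobra.Command, args []string) error {
			// A real generated command would call the SDK here.
			return nil
		},
	})

	catalogs := &cobra.Command{
		Use:   "catalogs",
		Short: "A catalog is the first layer of Unity Catalog's three-level namespace.",
	}

	root.AddCommand(alerts, catalogs)
	_ = root.Execute()
}
```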