mirror of https://github.com/databricks/cli.git
1289 Commits
Author | SHA1 | Message | Date |
---|---|---|---|
Andrew Nester |
3f8036f2df
|
Fixed seg fault when specifying environment key for tasks (#1443)
## Changes Fixed seg fault when specifying environment key for tasks |
|
Gleb Kanterov |
09aa3cb9e9
|
Add more tests for `merge.Override` (#1439)
## Changes Add test coverage to ensure we respect return value and error ## Tests Unit tests |
|
dependabot[bot] |
3ce833f826
|
Bump github.com/hashicorp/terraform-exec from 0.20.0 to 0.21.0 (#1442)
Bumps [github.com/hashicorp/terraform-exec](https://github.com/hashicorp/terraform-exec) from 0.20.0 to 0.21.0. Highlights: tfexec adds `-allow-deferral` to the `(Terraform).Apply()` and `(Terraform).Plan()` methods. |
|
dependabot[bot] |
7262138b4d
|
Bump github.com/hashicorp/terraform-json from 0.21.0 to 0.22.1 (#1440)
Bumps [github.com/hashicorp/terraform-json](https://github.com/hashicorp/terraform-json) from 0.21.0 to 0.22.1. Highlights: `DeferredChanges` and `Complete` added to the `Plan` JSON (v0.22.0); `Complete` changed to a pointer value for older Terraform versions (v0.22.1). |
|
dependabot[bot] |
dc13e4b37e
|
Bump github.com/fatih/color from 1.16.0 to 1.17.0 (#1441)
Bumps [github.com/fatih/color](https://github.com/fatih/color) from 1.16.0 to 1.17.0. Highlights: fixed multi-parameter `Println` spacing. |
|
Andrew Nester |
a014d50a6a
|
Fixed panic when loading incorrectly defined jobs (#1402)
## Changes If only a key was defined for a job in the YAML config, validation previously failed with a segfault. This PR validates that jobs are correctly defined and returns an error if not. ## Tests Added regression test |
|
Gleb Kanterov |
04e56aa472
|
Add `merge.Override` transform (#1428)
## Changes Add `merge.Override` transform. It allows overriding one `dyn.Value` with another, preserving source locations for the parts of the sub-tree where nothing has changed. This is different from merging, where values are concatenated. `OverrideVisitor` visits the changes during the override process and allows controlling which changes are allowed, or updating the effective value. The primary use case is Python code updating bundle configuration. During override, we update locations only for changed values. This allows us to keep track of the locations where values were initially defined, which are used for error reporting. For instance, merging: ```yaml resources: # location=left.yaml:0 jobs: # location=left.yaml:1 job_0: # location=left.yaml:2 name: "job_0" # location=left.yaml:3 ``` with ```yaml resources: # location=right.yaml:0 jobs: # location=right.yaml:1 job_0: # location=right.yaml:2 name: "job_0" # location=right.yaml:3 description: job 0 # location=right.yaml:4 job_1: # location=right.yaml:5 name: "job_1" # location=right.yaml:5 ``` produces ```yaml resources: # location=left.yaml:0 jobs: # location=left.yaml:1 job_0: # location=left.yaml:2 name: "job_0" # location=left.yaml:3 description: job 0 # location=right.yaml:4 job_1: # location=right.yaml:5 name: "job_1" # location=right.yaml:5 ``` ## Tests Unit tests |
|
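The gist of the location-preserving override can be illustrated with a small, self-contained Go sketch (illustrative types only; this is not the CLI's `dyn`/`merge` API):

```go
package main

import "fmt"

type Node struct {
	Location string           // source location, e.g. "left.yaml:3"
	Scalar   string           // leaf value; empty for mapping nodes
	Fields   map[string]*Node // children for mapping nodes
}

// override merges right into left, keeping the left node (and its original
// location) wherever the effective value is unchanged, and taking the right
// node (with its new location) where the value differs or is newly added.
func override(left, right *Node) *Node {
	if left == nil {
		return right
	}
	if right == nil {
		return left
	}
	// Leaf values: an unchanged value keeps its original location.
	if len(right.Fields) == 0 && len(left.Fields) == 0 {
		if left.Scalar == right.Scalar {
			return left
		}
		return right
	}
	// Mapping nodes: merge key by key, keeping the left node's location.
	out := &Node{Location: left.Location, Fields: map[string]*Node{}}
	for k, l := range left.Fields {
		out.Fields[k] = override(l, right.Fields[k])
	}
	for k, r := range right.Fields {
		if _, ok := left.Fields[k]; !ok {
			out.Fields[k] = r // a newly added key carries its right-hand location
		}
	}
	return out
}

func main() {
	left := &Node{Location: "left.yaml:2", Fields: map[string]*Node{
		"name": {Location: "left.yaml:3", Scalar: "job_0"},
	}}
	right := &Node{Location: "right.yaml:2", Fields: map[string]*Node{
		"name":        {Location: "right.yaml:3", Scalar: "job_0"},
		"description": {Location: "right.yaml:4", Scalar: "job 0"},
	}}
	merged := override(left, right)
	fmt.Println(merged.Fields["name"].Location)        // left.yaml:3 (unchanged value keeps its old location)
	fmt.Println(merged.Fields["description"].Location) // right.yaml:4 (new value carries the new location)
}
```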
Pieter Noordhuis |
dd94107853
|
Remove dependency on `ConfigFilePath` from path translation mutator (#1437)
## Changes This is one step toward removing the `path.Paths` struct embedding from resource types. Going forward, we'll exclusively use the `dyn.Value` tree for location information. ## Tests Existing unit tests that cover path resolution with fallback behavior pass. |
|
Pieter Noordhuis |
4556d33e6b
|
Don't hide commands of services that are already hidden (#1438)
## Changes Currently, the help output of services in preview doesn't show any of their commands because the commands themselves are hidden as well. This change updates that behavior to not hide commands in preview if the service itself is also in preview. This makes the help output of services in preview actually usable. ## Tests n/a |
|
Miles Yucht |
f7d4b272f4
|
Improve token refresh flow (#1434)
## Changes Currently, there are a number of issues with the non-happy-path flows for token refresh in the CLI. If the token refresh fails, the raw error message is presented to the user, as seen below. This message is very difficult for users to interpret and doesn't give any clear direction on how to resolve this issue. ``` Error: token refresh: Post "https://adb-<WSID>.azuredatabricks.net/oidc/v1/token": http 400: {"error":"invalid_request","error_description":"Refresh token is invalid"} ``` When logging in again, I've noticed that the timeout for logging in is very short, only 45 seconds. If a user is using a password manager and needs to log in to that first, or needs to do MFA, 45 seconds may not be enough time. Additionally, when logging in to an account-level profile, it is quite frustrating for users to need to re-enter account ID information when that information is already stored in the user's `.databrickscfg` file. This PR tackles these issues. First, the presentation of error messages from `databricks auth token` is improved substantially by converting the `error` into a human-readable message. When the refresh token is invalid, it will present a command for the user to run to reauthenticate. If the token fetching failed for some other reason, that reason will be presented in a nice way, providing front-line debugging steps and ultimately redirecting users to file a ticket at this repo if they can't resolve the issue themselves. After this PR, the new error message is: ``` Error: a new access token could not be retrieved because the refresh token is invalid. To reauthenticate, run `.databricks/databricks auth login --host https://adb-<WSID>.azuredatabricks.net` ``` To improve the login flow, this PR modifies `databricks auth login` to auto-complete the account ID from the profile when present. Additionally, it increases the login timeout from 45 seconds to 1 hour to give the user sufficient time to log in as needed. To test this change, I needed to refactor some components of the CLI around profile management, the token cache, and the API client used to fetch OAuth tokens. These are now settable in the context, and a demonstration of how they can be set and used is found in `auth_test.go`. Separately, this also demonstrates a sort-of integration test of the CLI by executing the Cobra command for `databricks auth token` from tests, which may be useful for testing other end-to-end functionality in the CLI. In particular, I believe this is necessary in order to set flag values (like the `--profile` flag in this case) for use in testing. ## Tests Unit tests cover the unhappy and happy paths using the mocked API client, token cache, and profiler. Manually tested --------- Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com> |
|
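A rough sketch of the error-rewriting idea described above (the function name and message text are illustrative, not the CLI's actual implementation):

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// rewriteRefreshError translates a raw token-refresh failure into an
// actionable message that tells the user how to reauthenticate.
func rewriteRefreshError(err error, host string) error {
	if err == nil {
		return nil
	}
	// An invalid refresh token means cached credentials are stale: point the
	// user at the login command instead of surfacing the raw HTTP 400 body.
	if strings.Contains(err.Error(), "Refresh token is invalid") {
		return fmt.Errorf("a new access token could not be retrieved because the refresh token is invalid. To reauthenticate, run `databricks auth login --host %s`", host)
	}
	return fmt.Errorf("token refresh failed: %w", err)
}

func main() {
	raw := errors.New(`http 400: {"error":"invalid_request","error_description":"Refresh token is invalid"}`)
	fmt.Println(rewriteRefreshError(raw, "https://adb-1234.azuredatabricks.net"))
}
```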
Ilia Babanov |
157877a152
|
Fix bundle destroy integration test (#1435)
I've updated the `deploy_then_remove_resources` test template in the previous PR, but didn't notice that it was also used in the destroy test. Now the destroy test also checks the deletion of jobs. |
|
dependabot[bot] |
216d2b058a
|
Bump github.com/databricks/databricks-sdk-go from 0.39.0 to 0.40.1 (#1431)
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.39.0 to 0.40.1. Highlights: a `traceparent` header for distributed tracing and logging of cancelled and failed requests (v0.40.1); unlimited retry timeouts via a negative `RETRY_TIMEOUT_SECONDS` and a breaking rework of the `w.Apps` workspace-level service API (v0.40.0). |
|
Ilia Babanov |
2035516fde
|
Don't merge-in remote resources during deployments (#1432)
## Changes `check_running_resources` now pulls the remote state without modifying the bundle state, similar to how it behaved before. This avoids a problem where we fail to compute deployment metadata for a deleted job (which we shouldn't do in the first place). `deploy_then_remove_resources_test` now also deploys and deletes a job (in addition to a pipeline), which catches the error that this PR fixes. ## Tests Unit and integ tests |
|
Andrew Nester |
0a21428a48
|
Upgrade to 1.43 terraform provider (#1429)
## Changes Upgrade to 1.43 terraform provider |
|
shreyas-goenka |
a71929b943
|
Add line about Docker installation to README.md (#1363)
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com> |
|
dependabot[bot] |
5920da4320
|
Bump golang.org/x/term from 0.19.0 to 0.20.0 (#1422)
Bumps [golang.org/x/term](https://github.com/golang/term) from 0.19.0 to 0.20.0. |
|
shreyas-goenka |
63617253bd
|
Assert custom marshalling is implemented for resources (#1425)
## Changes This PR ensures every resource implements a custom marshaller / unmarshaller. This is required because we directly embed Go SDK structs, which implement custom marshalling overrides. Since the struct is embedded, the [custom marshalling overrides](https://pkg.go.dev/encoding/json#example-package-CustomMarshalJSON) are promoted to the top level. If the embedded struct itself is nil, then JSON marshal / unmarshal will panic because it tries to call `MarshalJSON` / `UnmarshalJSON` on a nil object. Fixing this issue at the Go SDK level does not seem possible. Discussed with @hectorcast-db. |
|
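A minimal reproduction of the failure mode described above, with illustrative types rather than the actual resource structs, plus one possible way an outer marshaller avoids the nil panic:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Settings stands in for an embedded Go SDK struct with a custom marshaller.
type Settings struct {
	Name string `json:"name"`
}

func (s Settings) MarshalJSON() ([]byte, error) {
	type alias Settings
	return json.Marshal(alias(s))
}

// Resource embeds *Settings, so Settings.MarshalJSON is promoted to Resource.
// Marshalling a Resource whose embedded pointer is nil panics, because the
// promoted method is invoked through the nil *Settings.
type Resource struct {
	*Settings
	ID string `json:"id"`
}

// SafeResource defines its own marshaller, so the promoted method is never the
// entry point and a nil embedded struct is handled explicitly.
type SafeResource struct {
	*Settings
	ID string `json:"id"`
}

func (r SafeResource) MarshalJSON() ([]byte, error) {
	out := map[string]any{"id": r.ID}
	if r.Settings != nil {
		out["name"] = r.Settings.Name
	}
	return json.Marshal(out)
}

func main() {
	b, _ := json.Marshal(SafeResource{ID: "abc"})
	fmt.Println(string(b)) // {"id":"abc"}

	_, _ = json.Marshal(Resource{ID: "abc"}) // panics: value method called via nil *Settings
}
```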
shreyas-goenka |
95bbe2ece1
|
Fix flaky tests for the parallel mutator (#1426)
## Changes Around 0.5% to 1% of the time, the tests would fail due to concurrent access to the underlying slice in the mutator. This PR makes the test thread-safe, preventing race conditions. Example of failed run: https://github.com/databricks/cli/actions/runs/9004657555/job/24738145829 |
|
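A generic Go illustration of this class of fix (not the CLI's test code): a shared slice appended to from multiple goroutines is guarded by a mutex so the test cannot race:

```go
package main

import (
	"fmt"
	"sync"
)

// collector records which mutators ran; the mutex makes concurrent appends safe.
type collector struct {
	mu   sync.Mutex
	seen []string
}

func (c *collector) add(name string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.seen = append(c.seen, name)
}

func main() {
	c := &collector{}
	var wg sync.WaitGroup
	for i := 0; i < 10; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			c.add(fmt.Sprintf("mutator-%d", i))
		}(i)
	}
	wg.Wait()
	fmt.Println(len(c.seen)) // always 10; without the mutex this is a data race
}
```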
dependabot[bot] |
649016d50d
|
Bump golang.org/x/oauth2 from 0.19.0 to 0.20.0 (#1421)
Bumps [golang.org/x/oauth2](https://github.com/golang/oauth2) from 0.19.0 to 0.20.0. |
|
dependabot[bot] |
648309d939
|
Bump golang.org/x/text from 0.14.0 to 0.15.0 (#1419)
Bumps [golang.org/x/text](https://github.com/golang/text) from 0.14.0 to 0.15.0. |
|
Andrew Nester |
a393c87ed9
|
Upgrade TF provider to 1.42.0 (#1418)
## Changes Upgrade TF provider to 1.42.0 Also fixes #1258 |
|
Andrew Nester |
4724ecb324
|
Release v0.219.0 (#1412)
Bundles: * Don't fail while parsing outdated terraform state ([#1404](https://github.com/databricks/cli/pull/1404)). * Annotate DLT pipelines when deployed using DABs ([#1410](https://github.com/databricks/cli/pull/1410)). API Changes: * Changed `databricks libraries cluster-status` command. New request type is compute.ClusterStatus. * Changed `databricks libraries cluster-status` command to return compute.ClusterStatusResponse. * Added `databricks serving-endpoints get-open-api` command. OpenAPI commit 21f9f1482f9d0d15228da59f2cd9f0863d2a6d55 (2024-04-23) Dependency updates: * Bump github.com/databricks/databricks-sdk-go from 0.38.0 to 0.39.0 ([#1405](https://github.com/databricks/cli/pull/1405)). |
|
shreyas-goenka |
30215860e7
|
Fix description memoization in bundle schema (#1409)
## Changes Fixes https://github.com/databricks/cli/issues/559 The CLI generation is now stable and does not produce a diff for the `bundle_descriptions.json` file. Before, a pointer to the schema was stored in the memo, which would be mutated later to include the description. This led to duplicate documentation for schema components that were used in multiple places. This PR fixes this issue. E.g.: before, all references of `pause_status` would have the same description. ## Tests Added regression test. |
|
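A simplified illustration of the memoization bug, using a hypothetical `Schema` type: storing a pointer in the memo means a later in-place mutation (adding a description) leaks into every other use of the memoized schema:

```go
package main

import "fmt"

type Schema struct {
	Type        string
	Description string
}

func main() {
	// Buggy pattern: the memo holds a pointer, so annotating it at one call
	// site also changes what every other call site sees.
	memo := map[string]*Schema{}
	s := &Schema{Type: "string"}
	memo["pause_status"] = s
	s.Description = "Pause status for job A"
	fmt.Println(memo["pause_status"].Description) // "Pause status for job A" — wrong for other references

	// Fix: memoize a value (or copy on lookup) so per-call-site mutations don't alias.
	memoCopy := map[string]Schema{}
	memoCopy["pause_status"] = Schema{Type: "string"}
	reused := memoCopy["pause_status"] // copy
	reused.Description = "Pause status for job A"
	fmt.Println(memoCopy["pause_status"].Description) // "" — memoized entry untouched
}
```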
shreyas-goenka |
507053ee50
|
Annotate DLT pipelines when deployed using DABs (#1410)
## Changes This PR annotates any pipelines that were deployed using DABs to have `deployment.kind` set to "BUNDLE", mirroring the annotation for Jobs (similar PR for jobs FYI: https://github.com/databricks/cli/pull/880). Breakglass UI is not yet available for pipelines, so this annotation will just be used for revenue attribution ATM. Note: The API field has been deployed in all regions including GovCloud. ## Tests Unit tests and manually. Manually verified that the kind and metadata_file_path are being set by DABs, and are returned by a GET API to a pipeline deployed using a DAB. Example: ``` "deployment": { "kind":"BUNDLE", "metadata_file_path":"/Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default/state/metadata.json" }, ``` |
|
Ilia Babanov |
153141d3ea
|
Don't fail while parsing outdated terraform state (#1404)
`terraform show -json` (`terraform.Show()`) fails if the state file contains resources with fields that no longer conform to the provider schemas. This can happen when you deploy a bundle with one version of the CLI, then update the CLI to a version that uses a different databricks terraform provider, and try to run `bundle run` or `bundle summary`. Those commands don't recreate local terraform state (only `terraform apply` or `plan` do) and terraform itself fails while parsing it. [Terraform docs](https://developer.hashicorp.com/terraform/language/state#format) point out that it's best to use `terraform show` after a successful `apply` or `plan`. Here we parse the state ourselves. The state file format is internal to terraform, but it's more stable than our resource schemas. We only parse a subset of fields from the state, and only update the ID and ModifiedStatus of bundle resources in the `terraform.Load` mutator. |
|
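A sketch of the parse-a-subset approach, assuming the common terraform state layout shown in the sample JSON below (this is not the CLI's exact parser; unknown attributes are simply ignored):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// stateFile declares only the fields we care about; everything else in the
// terraform state is skipped by encoding/json.
type stateFile struct {
	Resources []struct {
		Type      string `json:"type"`
		Name      string `json:"name"`
		Instances []struct {
			Attributes struct {
				ID string `json:"id"`
			} `json:"attributes"`
		} `json:"instances"`
	} `json:"resources"`
}

func main() {
	raw := []byte(`{
	  "version": 4,
	  "resources": [
	    {"type": "databricks_job", "name": "my_job",
	     "instances": [{"attributes": {"id": "123", "some_new_field": "ignored"}}]}
	  ]
	}`)

	var st stateFile
	if err := json.Unmarshal(raw, &st); err != nil {
		panic(err)
	}
	// Only the IDs are needed to annotate bundle resources; attributes from
	// newer or older provider schemas don't cause parse failures.
	for _, r := range st.Resources {
		for _, inst := range r.Instances {
			fmt.Printf("%s.%s -> id=%s\n", r.Type, r.Name, inst.Attributes.ID)
		}
	}
}
```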
dependabot[bot] |
781688c9cb
|
Bump github.com/databricks/databricks-sdk-go from 0.38.0 to 0.39.0 (#1405)
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.38.0 to 0.39.0. Highlights: retries for "worker env WorkerEnvId(workerenv-XXXXX) not found" errors, an SDK update to OpenAPI spec 21f9f1482f9d0d15228da59f2cd9f0863d2a6d55 (2024-04-23), and breaking API changes, including the reworked `ClusterStatus` method on `w.Libraries`, which now takes `compute.ClusterStatus` and returns `compute.ClusterStatusResponse`. |
|
Pieter Noordhuis |
a292eefc2e
|
Release v0.218.1 (#1401)
This is a bugfix release. CLI: * Pass `DATABRICKS_CONFIG_FILE` for `auth profiles` ([#1394](https://github.com/databricks/cli/pull/1394)). Bundles: * Show a better error message for using wheel tasks with older DBR versions ([#1373](https://github.com/databricks/cli/pull/1373)). * Allow variable references in non-string fields in the JSON schema ([#1398](https://github.com/databricks/cli/pull/1398)). * Fix variable overrides in targets for non-string variables ([#1397](https://github.com/databricks/cli/pull/1397)). * Fix bundle schema for variables ([#1396](https://github.com/databricks/cli/pull/1396)). * Fix bundle documentation URL ([#1399](https://github.com/databricks/cli/pull/1399)). Internal: * Removed autogenerated docs for the CLI commands ([#1392](https://github.com/databricks/cli/pull/1392)). * Remove `JSON.parse` call from homebrew-tap action ([#1393](https://github.com/databricks/cli/pull/1393)). * Ensure that Python dependencies are installed during upgrade ([#1390](https://github.com/databricks/cli/pull/1390)). |
|
Pieter Noordhuis |
db84a707cd
|
Fix bundle documentation URL (#1399)
Closes #1395. |
|
shreyas-goenka |
d949f2b4f2
|
Fix bundle schema for variables (#1396)
## Changes This PR fixes the variable schema to: 1. Allow non-string values in the "default" value of a variable. 2. Allow non-string overrides in a target for a variable. ## Tests Manually. There are no longer squiggly lines. Before: <img width="329" alt="Screenshot 2024-04-24 at 3 26 43 PM" src="https://github.com/databricks/cli/assets/88374338/43be02c2-80a4-4f80-bd79-0f3e1e93ee17"> After: <img width="361" alt="Screenshot 2024-04-24 at 3 26 10 PM" src="https://github.com/databricks/cli/assets/88374338/2c1fb892-a2a2-478b-8d2e-9bda6d844b54"> |
|
shreyas-goenka |
e652333103
|
Fix variable overrides in targets for non-string variables (#1397)
Before variable overrides that were not string in a target would not work. This PR fixes that. Tested manually and via a unit test. |
|
shreyas-goenka |
6fd581d173
|
Allow variable references in non-string fields in the JSON schema (#1398)
## Tests Verified manually. Before: <img width="373" alt="Screenshot 2024-04-24 at 7 18 44 PM" src="https://github.com/databricks/cli/assets/88374338/b4aef51f-0c16-4589-9d47-cdec9ab91158"> After: <img width="364" alt="Screenshot 2024-04-24 at 7 18 31 PM" src="https://github.com/databricks/cli/assets/88374338/3d8e412e-77ee-4641-943d-f99eab26ba02"> <img width="356" alt="Screenshot 2024-04-24 at 7 16 54 PM" src="https://github.com/databricks/cli/assets/88374338/2aed369a-3c6a-4754-9c76-0969423f319e"> Manually verified the schema diff is sane. Example: ``` < "type": "boolean", < "description": "If inference tables are enabled or not. NOTE: If you have already disabled payload logging once, you cannot enable again." --- > "description": "If inference tables are enabled or not. NOTE: If you have already disabled payload logging once, you cannot enable again.", > "anyOf": [ > { > "type": "boolean" > }, > { > "type": "string", > "pattern": "\\$\\{([a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*)*)\\}" > } > ] ``` |
|
Jim Idle |
4c71f8cac4
|
Ensure that python dependencies are installed during upgrade (#1390)
## Changes The installer.Upgrade() processing did not install Python dependencies. This resulted in errors such as: ``` ModuleNotFoundError: No module named 'databricks.labs.blueprint' ``` Any new dependencies are now installed during the upgrade process. Resolves: databrickslabs/ucx#1276 ## Tests The TestUpgraderWorksForReleases test now checks to see if the upgrade process resulted in the dependencies being installed. --------- Signed-off-by: Jim.Idle <jimi@idle.ws> |
|
Kartik Gupta |
1c02224902
|
Pass `DATABRICKS_CONFIG_FILE` env var to sdk config during `auth profiles` (#1394)
## Changes * Currently, when we use the `auth profiles` command with the `DATABRICKS_CONFIG_FILE` env var set, the file pointed to by the env var is ONLY used for loading the profile names (ini file sections). It is not passed to the go sdk config object. We also don't use the env variable loader in the go sdk config object, so this env var is ignored by the config and only the default file is read. * This PR explicitly sets the config file path in the go sdk config object. ## Tests * integration tests in vscode |
|
Lennart Kats (databricks) |
60122f6035
|
Show a better error message for using wheel tasks with older DBR versions (#1373)
## Changes This is a minor improvement to the error about wheel tasks with older DBR versions, since we get questions about it every now and then. It also adds a pointer to the docs that were added since the original message was committed. --------- Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com> |
|
Andrew Nester |
d99f2b808b
|
Remove `JSON.parse` call from homebrew-tap action (#1393)
## Changes `needs.goreleaser.outputs.artifacts` already contains a valid JS object, so there is no need to make it a string and try to parse it. |
|
shreyas-goenka |
5120e94302
|
Removed autogenerated docs for the CLI commands (#1392)
This documentation is very outdated. We don't need this because the CLI is self-documenting. Also users who land on this page (because of SEO for example) can be confused because of the lack of documentation for new commands. Example issue: https://github.com/databricks/cli/issues/1331 |
|
Pieter Noordhuis |
27d35d5e1c
|
Release v0.218.0 (#1391)
This release marks the general availability of Databricks Asset Bundles. CLI: * Publish Docker images ([#1353](https://github.com/databricks/cli/pull/1353)). * Add support for multi-arch Docker images ([#1362](https://github.com/databricks/cli/pull/1362)). * Do not prefill https:// in prompt for Databricks Host ([#1364](https://github.com/databricks/cli/pull/1364)). * Add better documentation for the `auth login` command ([#1366](https://github.com/databricks/cli/pull/1366)). * Add URLs for authentication documentation to the auth command help ([#1365](https://github.com/databricks/cli/pull/1365)). Bundles: * Fix compute override for foreach tasks ([#1357](https://github.com/databricks/cli/pull/1357)). * Transform artifact files source patterns in build not upload stage ([#1359](https://github.com/databricks/cli/pull/1359)). * Convert between integer and float in normalization ([#1371](https://github.com/databricks/cli/pull/1371)). * Disable locking for development mode ([#1302](https://github.com/databricks/cli/pull/1302)). * Resolve variable references inside variable lookup fields ([#1368](https://github.com/databricks/cli/pull/1368)). * Added validate mutator to surface additional bundle warnings ([#1352](https://github.com/databricks/cli/pull/1352)). * Upgrade terraform-provider-databricks to 1.40.0 ([#1376](https://github.com/databricks/cli/pull/1376)). * Print host in `bundle validate` when passed via profile or environment variables ([#1378](https://github.com/databricks/cli/pull/1378)). * Cleanup remote file path on bundle destroy ([#1374](https://github.com/databricks/cli/pull/1374)). * Add docs URL for `run_as` in error message ([#1381](https://github.com/databricks/cli/pull/1381)). * Enable job queueing by default ([#1385](https://github.com/databricks/cli/pull/1385)). * Added support for job environments ([#1379](https://github.com/databricks/cli/pull/1379)). * Processing and completion of positional args to bundle run ([#1120](https://github.com/databricks/cli/pull/1120)). * Add legacy option for `run_as` ([#1384](https://github.com/databricks/cli/pull/1384)). API Changes: * Changed `databricks lakehouse-monitors cancel-refresh` command with new required argument order. * Changed `databricks lakehouse-monitors create` command with new required argument order. * Changed `databricks lakehouse-monitors delete` command with new required argument order. * Changed `databricks lakehouse-monitors get` command with new required argument order. * Changed `databricks lakehouse-monitors get-refresh` command with new required argument order. * Changed `databricks lakehouse-monitors list-refreshes` command with new required argument order. * Changed `databricks lakehouse-monitors run-refresh` command with new required argument order. * Changed `databricks lakehouse-monitors update` command with new required argument order. * Changed `databricks account workspace-assignment update` command to return response. OpenAPI commit 94684175b8bd65f8701f89729351f8069e8309c9 (2024-04-11) Dependency updates: * Bump github.com/databricks/databricks-sdk-go from 0.37.0 to 0.38.0 ([#1361](https://github.com/databricks/cli/pull/1361)). * Bump golang.org/x/net from 0.22.0 to 0.23.0 ([#1380](https://github.com/databricks/cli/pull/1380)). |
|
Miles Yucht |
5ee4b41cd5
|
Decouple winget release (#1389)
## Changes We are starting to sign Windows CLI executables, but this has to be done from a machine with a Yubikey storing the signing certificate for the immediate future. As such, we will only trigger Winget publishing once the signed binaries have been uploaded to Github. Additionally, as an extra precaution, we will only release the signed binaries via Winget. ## Tests |
|
shreyas-goenka |
1d9bf4b2c4
|
Add legacy option for `run_as` (#1384)
## Changes This PR partially reverts the changes in https://github.com/databricks/cli/pull/1233 and puts the old code under an "experimental.use_legacy_run_as" configuration. This gives customers who ran into the breaking change made in the PR a way out. ## Tests Both manually and via unit tests. Manually verified that run_as works for pipelines now. And if a user wants to use the feature they need to be both a Metastore and a workspace admin. --------- Error when the deploying user is a workspace admin but not a metastore admin: ``` Error: terraform apply: exit status 1 Error: cannot update permissions: User is not a metastore admin for Metastore 'deco-uc-prod-aws-us-east-1'. with databricks_permissions.pipeline_foo, on bundle.tf.json line 23, in resource.databricks_permissions.pipeline_foo: 23: } ``` -------- Output of bundle validate: ``` ➜ bundle-playground git:(master) ✗ cli bundle validate Warning: You are using the legacy mode of run_as. The support for this mode is experimental and might be removed in a future release of the CLI. In order to run the DLT pipelines in your DAB as the run_as user this mode changes the owners of the pipelines to the run_as identity, which requires the user deploying the bundle to be a workspace admin, and also a Metastore admin if the pipeline target is in UC. at experimental.use_legacy_run_as in databricks.yml:13:22 Name: bundle-playground Target: default Workspace: Host: https://dbc-a39a1eb1-ef95.cloud.databricks.com User: shreyas.goenka@databricks.com Path: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default Found 1 warning ``` |
|
Pieter Noordhuis |
3108883a8f
|
Processing and completion of positional args to bundle run (#1120)
## Changes With this change, both job parameters and task parameters can be specified as positional arguments to bundle run. How the positional arguments are interpreted depends on the configuration of the job. ### Examples: For a job that has job parameters configured a user can specify: ``` databricks bundle run my_job -- --param1=value1 --param2=value2 ``` And the run is kicked off with job parameters set to: ```json { "param1": "value1", "param2": "value2" } ``` Similarly, for a job that doesn't use job parameters and only has `notebook_task` tasks, a user can specify: ``` databricks bundle run my_notebook_job -- --param1=value1 --param2=value2 ``` And the run is kicked off with task level `notebook_params` configured as: ```json { "param1": "value1", "param2": "value2" } ``` For a job that doesn't use job parameters and only has either `spark_python_task` or `python_wheel_task` tasks, a user can specify: ``` databricks bundle run my_python_file_job -- --flag=value other arguments ``` And the run is kicked off with task level `python_params` configured as: ```json [ "--flag=value", "other", "arguments" ] ``` The same is applied to jobs with only `spark_jar_task` or `spark_submit_task` tasks. ## Tests Unit tests. Tested the completions manually. |
|
Andrew Nester |
1872aa12b3
|
Added support for job environments (#1379)
## Changes The main changes are: 1. Don't link artifacts to libraries anymore; instead, iterate over all jobs and tasks when uploading artifacts and update local paths to remote ones 2. Iterate over `jobs.environments` to check if there are any local libraries and verify that they exist locally 3. Added tests to check environments are handled correctly An end-to-end test will follow up ## Tests Added regression test, existing tests (including integration one) pass |
|
Lennart Kats (databricks) |
000a7fef8c
|
Enable job queueing by default (#1385)
## Changes This enables queueing for jobs by default, following the behavior from API 2.2+. Queueing is a best practice and will be the default in API 2.2. Since we're still using API 2.1, which has queueing disabled by default, this PR enables queueing using a mutator. Customers can manually turn off queueing for any job by adding the following to their job spec: ``` queue: enabled: false ``` ## Tests Unit tests, manual confirmation of property after deployment. --------- Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com> |
|
Pieter Noordhuis |
cd675ded9a
|
Update `testutil` helpers to return path (#1383)
## Changes I spotted a few call sites where the path of a test file was synthesized multiple times. It is easier to capture the path as a variable and reuse it. |
|
Pieter Noordhuis |
b296f90767
|
Add trailing newline in usage string (#1382)
## Changes The default template includes a final newline but this was missing from the cmdgroup template. This change also adds test coverage for inherited flags and the flag group description. |
|
dependabot[bot] |
ebbb716164
|
Bump golang.org/x/net from 0.22.0 to 0.23.0 (#1380)
Bumps [golang.org/x/net](https://github.com/golang/net) from 0.22.0 to 0.23.0. |
|
shreyas-goenka |
6ca57a7e68
|
Add docs URL for `run_as` in error message (#1381) | |
shreyas-goenka |
e008c2bd8c
|
Cleanup remote file path on bundle destroy (#1374)
## Changes The sync struct initialization would recreate the deleted `file_path`. This PR avoids initializing the sync object when deleting the snapshot, thus fixing the lingering `file_path` after `bundle destroy`. ## Tests Manually, and an integration test to prevent regression. |
|
shreyas-goenka |
f6c4d6d152
|
Add NOTICE for using Terraform 1.5.5 licensed under MPL 2.0 (#1377)
We package the terraform 1.5.5 binary in our docker container images for the CLI. This thus needs to be included in our notice for the repository. |
|
shreyas-goenka |
331313ea5f
|
Print host in `bundle validate` when passed via profile or environment variables (#1378)
## Changes Fixes to get host from the workspace client rather than only printing the host when it's configured in the bundle config. ## Tests Manually. When a profile was specified for auth. Before: ``` ➜ bundle-playground git:(master) ✗ cli bundle validate Name: bundle-playground Target: default Workspace: Host: User: shreyas.goenka@databricks.com Path: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default ``` After: ``` ➜ bundle-playground git:(master) ✗ cli bundle validate Name: bundle-playground Target: default Workspace: Host: https://e2-dogfood.staging.cloud.databricks.com User: shreyas.goenka@databricks.com Path: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default ``` |
|
Pieter Noordhuis |
6e59b13452
|
Update Spark version in integration tests to 13.3 (#1375)
## Tests Integration test run pending. |