Commit Graph

1080 Commits

Author SHA1 Message Date
Miles Yucht f7d4b272f4
Improve token refresh flow (#1434)
## Changes
Currently, there are a number of issues with the non-happy-path flows
for token refresh in the CLI.

If the token refresh fails, the raw error message is presented to the
user, as seen below. This message is very difficult for users to
interpret and doesn't give any clear direction on how to resolve this
issue.
```
Error: token refresh: Post "https://adb-<WSID>.azuredatabricks.net/oidc/v1/token": http 400: {"error":"invalid_request","error_description":"Refresh token is invalid"}
```

When logging in again, I've noticed that the timeout for logging in is
very short, only 45 seconds. If a user is using a password manager and
needs to log in to that first, or needs to do MFA, 45 seconds may not be
enough time. Additionally, when logging in to an account-level profile,
it is quite frustrating for users to need to re-enter account ID
information when that information is already stored in the user's
`.databrickscfg` file.

This PR tackles these two issues. First, the presentation of error
messages from `databricks auth token` is improved substantially by
converting the `error` into a human-readable message. When the refresh
token is invalid, it will present a command for the user to run to
reauthenticate. If the token fetching failed for some other reason, that
reason will be presented in a nice way, providing front-line debugging
steps and ultimately redirecting users to file a ticket at this repo if
they can't resolve the issue themselves. After this PR, the new error
message is:
```
Error: a new access token could not be retrieved because the refresh token is invalid. To reauthenticate, run `.databricks/databricks auth login --host https://adb-<WSID>.azuredatabricks.net`
```
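
As an illustration of the approach (not the exact implementation), the refresh
error can be mapped to an actionable message along these lines; the helper name
and the string matching below are hypothetical, and the real code likely
inspects a typed error instead:
```go
package main

import (
	"fmt"
	"strings"
)

// humanizeTokenError is a hypothetical helper that turns the raw OAuth error
// into a human-readable message with a suggested next step.
func humanizeTokenError(host string, err error) error {
	if err == nil {
		return nil
	}
	if strings.Contains(err.Error(), "Refresh token is invalid") {
		return fmt.Errorf("a new access token could not be retrieved because the refresh token is invalid. To reauthenticate, run `databricks auth login --host %s`", host)
	}
	return fmt.Errorf("a new access token could not be retrieved: %w. If the problem persists, please report an issue on the CLI repository", err)
}

func main() {
	raw := fmt.Errorf(`http 400: {"error":"invalid_request","error_description":"Refresh token is invalid"}`)
	fmt.Println(humanizeTokenError("https://adb-1234.azuredatabricks.net", raw))
}
```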

To improve the login flow, this PR modifies `databricks auth login` to
auto-complete the account ID from the profile when present.
Additionally, it increases the login timeout from 45 seconds to 1 hour
to give the user sufficient time to log in as needed.
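
For reference, looking up the stored account ID amounts to reading the profile
section from `.databrickscfg`. The snippet below uses the go-ini package and a
hard-coded path purely for illustration; the CLI's own profile-loading code may
differ:
```go
package main

import (
	"fmt"

	"gopkg.in/ini.v1"
)

func main() {
	// Illustrative only: look up account_id for a named profile so the
	// user doesn't have to re-enter it during `auth login`.
	cfg, err := ini.Load("/home/user/.databrickscfg")
	if err != nil {
		panic(err)
	}
	accountID := cfg.Section("ACCOUNT-PROFILE").Key("account_id").String()
	fmt.Println("prefilled account ID:", accountID)
}
```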

To test this change, I needed to refactor some components of the CLI
around profile management, the token cache, and the API client used to
fetch OAuth tokens. These are now settable in the context, and a
demonstration of how they can be set and used is found in
`auth_test.go`.

Separately, this also demonstrates a sort-of integration test of the CLI
by executing the Cobra command for `databricks auth token` from tests,
which may be useful for testing other end-to-end functionality in the
CLI. In particular, I believe this is necessary in order to set flag
values (like the `--profile` flag in this case) for use in testing.
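
A rough sketch of what executing a Cobra command from a test looks like; the
command wiring below is simplified and hypothetical, while the real test builds
the actual CLI command tree and injects the mocks via the context:
```go
package auth_test

import (
	"bytes"
	"context"
	"testing"

	"github.com/spf13/cobra"
)

// buildCmd stands in for constructing the CLI's root command.
func buildCmd() *cobra.Command {
	root := &cobra.Command{Use: "databricks"}
	token := &cobra.Command{
		Use: "token",
		RunE: func(cmd *cobra.Command, args []string) error {
			cmd.Println("eyJ...") // pretend token output
			return nil
		},
	}
	token.Flags().String("profile", "", "profile to use")
	auth := &cobra.Command{Use: "auth"}
	auth.AddCommand(token)
	root.AddCommand(auth)
	return root
}

func TestAuthTokenCommand(t *testing.T) {
	var out bytes.Buffer
	cmd := buildCmd()
	cmd.SetOut(&out)
	cmd.SetErr(&out)
	// Setting args on the root command is what makes flag values such as
	// --profile visible to the command under test.
	cmd.SetArgs([]string{"auth", "token", "--profile", "my-profile"})
	if err := cmd.ExecuteContext(context.Background()); err != nil {
		t.Fatal(err)
	}
}
```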

## Tests
Unit tests cover the unhappy and happy paths using the mocked API
client, token cache, and profiler.

Manually tested

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-05-16 10:22:09 +00:00
Ilia Babanov 157877a152
Fix bundle destroy integration test (#1435)
I've updated the `deploy_then_remove_resources` test template in the
previous PR, but didn't notice that it was also used in the destroy test.
Now the destroy test also checks deletion of jobs.
2024-05-16 09:32:55 +00:00
dependabot[bot] 216d2b058a
Bump github.com/databricks/databricks-sdk-go from 0.39.0 to 0.40.1 (#1431)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.39.0 to 0.40.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.40.1</h2>
<ul>
<li>Fixed codecov for repository (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/909">#909</a>).</li>
<li>Add traceparent header to enable distributed tracing. (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/914">#914</a>).</li>
<li>Log cancelled and failed requests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/919">#919</a>).</li>
</ul>
<p>Dependency updates:</p>
<ul>
<li>Bump golang.org/x/net from 0.22.0 to 0.24.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/884">#884</a>).</li>
<li>Bump golang.org/x/net from 0.17.0 to 0.23.0 in /examples/zerolog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/896">#896</a>).</li>
<li>Bump golang.org/x/net from 0.21.0 to 0.23.0 in /examples/slog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/897">#897</a>).</li>
</ul>
<h2>v0.40.0</h2>
<h2>0.40.0</h2>
<ul>
<li>Allow unlimited timeouts in retries (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/904">#904</a>).
By setting RETRY_TIMEOUT_SECONDS to a negative value, WorkspaceClient
and AccountClient will retry retriable failures indefinitely. As a
reminder, without setting this parameter, the default retry timeout is 5
minutes.</li>
</ul>
<p>API Changes:</p>
<ul>
<li>Changed <code>Create</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service . New request type is <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#CreateAppRequest">serving.CreateAppRequest</a>.</li>
<li>Changed <code>Create</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Removed <code>DeleteApp</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetApp</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetAppDeploymentStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetApps</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetEvents</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>CreateDeployment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Delete</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Get</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>GetDeployment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>GetEnvironment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>List</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>ListDeployments</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Stop</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppEvents">serving.AppEvents</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppManifest">serving.AppManifest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppServiceStatus">serving.AppServiceStatus</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeleteAppResponse">serving.DeleteAppResponse</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeployAppRequest">serving.DeployAppRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeploymentStatus">serving.DeploymentStatus</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeploymentStatusState">serving.DeploymentStatusState</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetAppDeploymentStatusRequest">serving.GetAppDeploymentStatusRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetAppResponse">serving.GetAppResponse</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetEventsRequest">serving.GetEventsRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ListAppEventsResponse">serving.ListAppEventsResponse</a>.</li>
<li>Changed <code>Apps</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ListAppsResponse">serving.ListAppsResponse</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppList">serving.AppList</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeployment">serving.AppDeployment</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeploymentState">serving.AppDeploymentState</a>.</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>0.40.1</h2>
<ul>
<li>Fixed codecov for repository (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/909">#909</a>).</li>
<li>Add traceparent header to enable distributed tracing. (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/914">#914</a>).</li>
<li>Log cancelled and failed requests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/919">#919</a>).</li>
</ul>
<p>Dependency updates:</p>
<ul>
<li>Bump golang.org/x/net from 0.22.0 to 0.24.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/884">#884</a>).</li>
<li>Bump golang.org/x/net from 0.17.0 to 0.23.0 in /examples/zerolog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/896">#896</a>).</li>
<li>Bump golang.org/x/net from 0.21.0 to 0.23.0 in /examples/slog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/897">#897</a>).</li>
</ul>
<h2>0.40.0</h2>
<ul>
<li>Allow unlimited timeouts in retries (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/904">#904</a>).
By setting RETRY_TIMEOUT_SECONDS to a negative value, WorkspaceClient
and AccountClient will retry retriable failures indefinitely. As a
reminder, without setting this parameter, the default retry timeout is 5
minutes.</li>
</ul>
<p>API Changes:</p>
<ul>
<li>Changed <code>Create</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service . New request type is <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#CreateAppRequest">serving.CreateAppRequest</a>.</li>
<li>Changed <code>Create</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Removed <code>DeleteApp</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetApp</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetAppDeploymentStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetApps</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetEvents</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>CreateDeployment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Delete</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Get</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>GetDeployment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>GetEnvironment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>List</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>ListDeployments</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Stop</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppEvents">serving.AppEvents</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppManifest">serving.AppManifest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppServiceStatus">serving.AppServiceStatus</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeleteAppResponse">serving.DeleteAppResponse</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeployAppRequest">serving.DeployAppRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeploymentStatus">serving.DeploymentStatus</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeploymentStatusState">serving.DeploymentStatusState</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetAppDeploymentStatusRequest">serving.GetAppDeploymentStatusRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetAppResponse">serving.GetAppResponse</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetEventsRequest">serving.GetEventsRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ListAppEventsResponse">serving.ListAppEventsResponse</a>.</li>
<li>Changed <code>Apps</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ListAppsResponse">serving.ListAppsResponse</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppList">serving.AppList</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeployment">serving.AppDeployment</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeploymentState">serving.AppDeploymentState</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeploymentStatus">serving.AppDeploymentStatus</a>.</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="72334ef449"><code>72334ef</code></a>
Release v0.40.1 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/920">#920</a>)</li>
<li><a
href="63343c8c66"><code>63343c8</code></a>
Log cancelled and failed requests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/919">#919</a>)</li>
<li><a
href="27d08a67df"><code>27d08a6</code></a>
Add traceparent header to enable distributed tracing. (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/914">#914</a>)</li>
<li><a
href="b5322db1a7"><code>b5322db</code></a>
Bump golang.org/x/net from 0.21.0 to 0.23.0 in /examples/slog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/897">#897</a>)</li>
<li><a
href="5edb2dfc92"><code>5edb2df</code></a>
Bump golang.org/x/net from 0.17.0 to 0.23.0 in /examples/zerolog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/896">#896</a>)</li>
<li><a
href="0e8dba0f70"><code>0e8dba0</code></a>
Bump golang.org/x/net from 0.22.0 to 0.24.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/884">#884</a>)</li>
<li><a
href="b6e76fc44b"><code>b6e76fc</code></a>
Fixed codecov for repository (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/909">#909</a>)</li>
<li><a
href="70aa8eea19"><code>70aa8ee</code></a>
Release v0.40.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/908">#908</a>)</li>
<li><a
href="dae1642c5d"><code>dae1642</code></a>
Add ToLibraryList() for ClusterStatusResponse (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/906">#906</a>)</li>
<li><a
href="c6516aa754"><code>c6516aa</code></a>
Bump version of mockery (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/907">#907</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.39.0...v0.40.1">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.39.0&new-version=0.40.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-05-16 09:04:58 +00:00
Ilia Babanov 2035516fde
Don't merge-in remote resources during deployments (#1432)
## Changes
`check_running_resources` now pulls the remote state without modifying
the bundle state, similar to how it did before. This avoids a problem
where we would fail to compute deployment metadata for a deleted job
(which we shouldn't do in the first place).

`deploy_then_remove_resources_test` now also deploys and deletes a job
(in addition to a pipeline), which catches the error that this PR fixes.

## Tests
Unit and integ tests
2024-05-15 12:41:44 +00:00
Andrew Nester 0a21428a48
Upgrade to 1.43 terraform provider (#1429)
## Changes
Upgrade to 1.43 terraform provider
2024-05-14 12:19:34 +00:00
shreyas-goenka a71929b943
Add line about Docker installation to README.md (#1363)
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-05-14 10:58:55 +00:00
dependabot[bot] 5920da4320
Bump golang.org/x/term from 0.19.0 to 0.20.0 (#1422)
Bumps [golang.org/x/term](https://github.com/golang/term) from 0.19.0 to
0.20.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="46c790f81f"><code>46c790f</code></a>
go.mod: update golang.org/x dependencies</li>
<li>See full diff in <a
href="https://github.com/golang/term/compare/v0.19.0...v0.20.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golang.org/x/term&package-manager=go_modules&previous-version=0.19.0&new-version=0.20.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-14 10:53:59 +00:00
shreyas-goenka 63617253bd
Assert custom marshalling is implemented for resources (#1425)
## Changes
This PR ensures every resource implements a custom marshaller /
unmarshaller. This is required because we directly embed Go SDK structs,
which implement custom marshalling overrides. Since the struct is
embedded, the [custom marshalling
overrides](https://pkg.go.dev/encoding/json#example-package-CustomMarshalJSON)
are promoted to the top level. If the embedded struct itself is nil,
then JSON marshal / unmarshal will panic because it tries to call
`MarshalJSON` / `UnmarshalJSON` on a nil object.

Fixing this issue at the Go SDK level does not seem possible. Discussed
with @hectorcast-db.
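
A minimal sketch of the kind of assertion this adds, using a hypothetical
stand-in type; the real test iterates over the actual resource structs:
```go
package resources_test

import (
	"encoding/json"
	"reflect"
	"testing"
)

// exampleResource stands in for a bundle resource; defining explicit
// marshallers on the wrapper type is what the PR asserts for every resource.
type exampleResource struct {
	ID string `json:"id"`
}

func (r exampleResource) MarshalJSON() ([]byte, error) {
	type plain exampleResource // plain has no custom methods
	return json.Marshal(plain(r))
}

func (r *exampleResource) UnmarshalJSON(b []byte) error {
	type plain exampleResource
	return json.Unmarshal(b, (*plain)(r))
}

func TestResourcesImplementCustomMarshalling(t *testing.T) {
	marshaler := reflect.TypeOf((*json.Marshaler)(nil)).Elem()
	unmarshaler := reflect.TypeOf((*json.Unmarshaler)(nil)).Elem()

	// The real assertion loops over every resource type in the bundle config.
	for _, typ := range []reflect.Type{reflect.TypeOf(exampleResource{})} {
		if !typ.Implements(marshaler) {
			t.Errorf("%s does not implement json.Marshaler", typ)
		}
		if !reflect.PointerTo(typ).Implements(unmarshaler) {
			t.Errorf("*%s does not implement json.Unmarshaler", typ)
		}
	}
}
```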
2024-05-14 10:30:48 +00:00
shreyas-goenka 95bbe2ece1
Fix flaky tests for the parallel mutator (#1426)
## Changes
Around 0.5% to 1% of the time, the tests would fail due to concurrent
access to the underlying slice in the mutator. This PR makes the test
thread-safe, preventing race conditions.

Example of failed run:
https://github.com/databricks/cli/actions/runs/9004657555/job/24738145829
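
The fix boils down to guarding the shared slice with a mutex. A self-contained
illustration (not the actual test code) that passes cleanly under `go test -race`:
```go
package parallel_test

import (
	"sync"
	"testing"
)

// collector records which mutators ran; the mutex makes concurrent appends
// from parallel goroutines safe.
type collector struct {
	mu   sync.Mutex
	seen []string
}

func (c *collector) add(name string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.seen = append(c.seen, name)
}

func TestParallelMutatorsRecordSafely(t *testing.T) {
	c := &collector{}
	var wg sync.WaitGroup
	for _, name := range []string{"a", "b", "c"} {
		wg.Add(1)
		go func(n string) {
			defer wg.Done()
			c.add(n)
		}(name)
	}
	wg.Wait()
	if len(c.seen) != 3 {
		t.Fatalf("expected 3 entries, got %d", len(c.seen))
	}
}
```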
2024-05-13 12:16:43 +00:00
dependabot[bot] 649016d50d
Bump golang.org/x/oauth2 from 0.19.0 to 0.20.0 (#1421)
Bumps [golang.org/x/oauth2](https://github.com/golang/oauth2) from
0.19.0 to 0.20.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="84cb9f7f5c"><code>84cb9f7</code></a>
oauth2: fix typo in comment</li>
<li><a
href="4b7f0bdbc7"><code>4b7f0bd</code></a>
go.mod: update cloud.google.com/go/compute/metadata dependency</li>
<li><a
href="e11eea88a8"><code>e11eea8</code></a>
microsoft: added DeviceAuthURL to AzureADEndpoint</li>
<li>See full diff in <a
href="https://github.com/golang/oauth2/compare/v0.19.0...v0.20.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golang.org/x/oauth2&package-manager=go_modules&previous-version=0.19.0&new-version=0.20.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-13 11:32:55 +00:00
dependabot[bot] 648309d939
Bump golang.org/x/text from 0.14.0 to 0.15.0 (#1419)
Bumps [golang.org/x/text](https://github.com/golang/text) from 0.14.0 to
0.15.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="8d533a0c40"><code>8d533a0</code></a>
encoding/charmap: update UCM spec file URL prefix</li>
<li>See full diff in <a
href="https://github.com/golang/text/compare/v0.14.0...v0.15.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golang.org/x/text&package-manager=go_modules&previous-version=0.14.0&new-version=0.15.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-13 11:32:35 +00:00
Andrew Nester a393c87ed9
Upgrade TF provider to 1.42.0 (#1418)
## Changes
Upgrade TF provider to 1.42.0

Also fixes #1258
2024-05-06 11:41:37 +00:00
Andrew Nester 4724ecb324
Release v0.219.0 (#1412)
Bundles:
* Don't fail while parsing outdated terraform state
([#1404](https://github.com/databricks/cli/pull/1404)).
* Annotate DLT pipelines when deployed using DABs
([#1410](https://github.com/databricks/cli/pull/1410)).


API Changes:
* Changed `databricks libraries cluster-status` command. New request
type is compute.ClusterStatus.
 * Changed `databricks libraries cluster-status` command to return .
 * Added `databricks serving-endpoints get-open-api` command.

OpenAPI commit 21f9f1482f9d0d15228da59f2cd9f0863d2a6d55 (2024-04-23)
Dependency updates:
* Bump github.com/databricks/databricks-sdk-go from 0.38.0 to 0.39.0
([#1405](https://github.com/databricks/cli/pull/1405)).
2024-05-01 12:09:06 +00:00
shreyas-goenka 30215860e7
Fix description memoization in bundle schema (#1409)
## Changes
Fixes https://github.com/databricks/cli/issues/559

The CLI generation is now stable and does not produce a diff for the
`bundle_descriptions.json` file.

Before, a pointer to the schema was stored in the memo, which would be
mutated later to include the description. This led to duplicate
documentation for schema components that were used in multiple places.
This PR fixes this issue.

E.g., before this fix, all references of `pause_status` would have the
same description.
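
An illustration of the bug class with hypothetical types (not the actual schema
code): handing out the memoized pointer lets a later description mutation leak
into every reference, while handing out a copy keeps each reference independent.
```go
package main

import "fmt"

// Schema is a hypothetical stand-in for a JSON schema node.
type Schema struct {
	Description string
}

var memo = map[string]*Schema{}

// lookupShared returns the memoized pointer: any caller that sets a
// Description mutates the shared entry, duplicating docs everywhere.
func lookupShared(key string) *Schema { return memo[key] }

// lookupCopy returns a copy, so per-reference descriptions stay local.
func lookupCopy(key string) *Schema {
	cp := *memo[key]
	return &cp
}

func main() {
	memo["pause_status"] = &Schema{}

	a := lookupShared("pause_status")
	a.Description = "Whether this trigger is paused"
	fmt.Println(memo["pause_status"].Description) // mutated: the bug

	b := lookupCopy("pause_status")
	b.Description = "Something reference-specific"
	fmt.Println(memo["pause_status"].Description) // memo entry unchanged
}
```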

## Tests
Added regression test.
2024-05-01 11:04:37 +00:00
shreyas-goenka 507053ee50
Annotate DLT pipelines when deployed using DABs (#1410)
## Changes
This PR annotates any pipelines that were deployed using DABs to have
`deployment.kind` set to "BUNDLE", mirroring the annotation for Jobs
(similar PR for jobs FYI: https://github.com/databricks/cli/pull/880).

Breakglass UI is not yet available for pipelines, so this annotation
will just be used for revenue attribution ATM.

Note: The API field has been deployed in all regions including GovCloud.

## Tests
Unit tests and manually.

Manually verified that the kind and metadata_file_path are being set by
DABs, and are returned by a GET API to a pipeline deployed using a DAB.
Example:
```
    "deployment": {
      "kind":"BUNDLE",
      "metadata_file_path":"/Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default/state/metadata.json"
    },
```
2024-05-01 08:37:03 +00:00
Ilia Babanov 153141d3ea
Don't fail while parsing outdated terraform state (#1404)
`terraform show -json` (`terraform.Show()`) fails if the state file
contains resources with fields that no longer conform to the provider
schemas.

This can happen when you deploy a bundle with one version of the CLI,
then update the CLI to a version that uses a different Databricks
Terraform provider, and try to run `bundle run` or `bundle summary`.
Those commands don't recreate local terraform state (only `terraform
apply` or `plan` do), and terraform itself fails while parsing it.
[Terraform
docs](https://developer.hashicorp.com/terraform/language/state#format)
point out that it's best to use `terraform show` after successful
`apply` or `plan`.

Here we parse the state ourselves. The state file format is internal to
terraform, but it's more stable than our resource schemas. We only parse
a subset of fields from the state, and only update ID and ModifiedStatus
of bundle resources in the `terraform.Load` mutator.
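
A sketch of the idea: define only the handful of fields we care about and
unmarshal the state file directly, rather than going through `terraform show`.
The struct and field names below are illustrative, not the mutator's actual
types:
```go
package terraform

import (
	"encoding/json"
	"os"
)

// stateFile models only the subset of the (internal, version 4) Terraform
// state layout needed to recover resource IDs.
type stateFile struct {
	Resources []stateResource `json:"resources"`
}

type stateResource struct {
	Mode      string          `json:"mode"`
	Type      string          `json:"type"`
	Name      string          `json:"name"`
	Instances []stateInstance `json:"instances"`
}

type stateInstance struct {
	Attributes struct {
		ID string `json:"id"`
	} `json:"attributes"`
}

// loadState reads and parses the state file without invoking terraform.
func loadState(path string) (*stateFile, error) {
	b, err := os.ReadFile(path)
	if err != nil {
		return nil, err
	}
	var s stateFile
	if err := json.Unmarshal(b, &s); err != nil {
		return nil, err
	}
	return &s, nil
}
```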
2024-05-01 08:22:35 +00:00
dependabot[bot] 781688c9cb
Bump github.com/databricks/databricks-sdk-go from 0.38.0 to 0.39.0 (#1405)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.38.0 to 0.39.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.39.0</h2>
<h2>0.39.0</h2>
<ul>
<li>Ignored flaky integration tests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/894">#894</a>).</li>
<li>Added retries for &quot;worker env WorkerEnvId(workerenv-XXXXX) not
found&quot; (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/890">#890</a>).</li>
<li>Updated SDK to OpenAPI spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/899">#899</a>).</li>
</ul>
<p>Note: This release contains breaking changes, please see the API
changes below for more details.</p>
<p>API Changes:</p>
<ul>
<li>Added <code>IngestionDefinition</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineSpec">pipelines.PipelineSpec</a>.</li>
<li>Added <code>Deployment</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineSpec">pipelines.PipelineSpec</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatus">compute.ClusterStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusResponse">compute.ClusterStatusResponse</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryInstallStatus">compute.LibraryInstallStatus</a>.</li>
<li>Added <code>WarehouseId</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#NotebookTask">jobs.NotebookTask</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun">jobs.SubmitRun</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#DeploymentKind">pipelines.DeploymentKind</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#IngestionConfig">pipelines.IngestionConfig</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#ManagedIngestionPipelineDefinition">pipelines.ManagedIngestionPipelineDefinition</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineDeployment">pipelines.PipelineDeployment</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#SchemaSpec">pipelines.SchemaSpec</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#TableSpec">pipelines.TableSpec</a>.</li>
<li>Added <code>GetOpenApi</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetOpenApiRequest">serving.GetOpenApiRequest</a>.</li>
<li>Added <code>SchemaId</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#SchemaInfo">catalog.SchemaInfo</a>.</li>
<li>Added <code>Operation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultOperation">catalog.ValidationResultOperation</a>.</li>
<li>Added <code>Requirements</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#Library">compute.Library</a>.</li>
<li>Removed <code>AwsOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <code>AzureOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <code>GcpOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultAwsOperation">catalog.ValidationResultAwsOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultAzureOperation">catalog.ValidationResultAzureOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultGcpOperation">catalog.ValidationResultGcpOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusRequest">compute.ClusterStatusRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryFullStatusStatus">compute.LibraryFullStatusStatus</a>.</li>
<li>Changed <code>ClusterStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibrariesAPI">w.Libraries</a>
workspace-level service . New request type is <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatus">compute.ClusterStatus</a>.</li>
<li>Changed <code>ClusterStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibrariesAPI">w.Libraries</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusResponse">compute.ClusterStatusResponse</a>.</li>
<li>Changed <code>Status</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryFullStatus">compute.LibraryFullStatus</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryInstallStatus">compute.LibraryInstallStatus</a>.</li>
</ul>
<p>OpenAPI SHA: 21f9f1482f9d0d15228da59f2cd9f0863d2a6d55, Date:
2024-04-23</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>0.39.0</h2>
<ul>
<li>Ignored flaky integration tests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/894">#894</a>).</li>
<li>Added retries for &quot;worker env WorkerEnvId(workerenv-XXXXX) not
found&quot; (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/890">#890</a>).</li>
<li>Updated SDK to OpenAPI spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/899">#899</a>).</li>
</ul>
<p>Note: This release contains breaking changes, please see the API
changes below for more details.</p>
<p>API Changes:</p>
<ul>
<li>Added <code>IngestionDefinition</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineSpec">pipelines.PipelineSpec</a>.</li>
<li>Added <code>Deployment</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineSpec">pipelines.PipelineSpec</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatus">compute.ClusterStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusResponse">compute.ClusterStatusResponse</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryInstallStatus">compute.LibraryInstallStatus</a>.</li>
<li>Added <code>WarehouseId</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#NotebookTask">jobs.NotebookTask</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun">jobs.SubmitRun</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#DeploymentKind">pipelines.DeploymentKind</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#IngestionConfig">pipelines.IngestionConfig</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#ManagedIngestionPipelineDefinition">pipelines.ManagedIngestionPipelineDefinition</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineDeployment">pipelines.PipelineDeployment</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#SchemaSpec">pipelines.SchemaSpec</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#TableSpec">pipelines.TableSpec</a>.</li>
<li>Added <code>GetOpenApi</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetOpenApiRequest">serving.GetOpenApiRequest</a>.</li>
<li>Added <code>SchemaId</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#SchemaInfo">catalog.SchemaInfo</a>.</li>
<li>Added <code>Operation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultOperation">catalog.ValidationResultOperation</a>.</li>
<li>Added <code>Requirements</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#Library">compute.Library</a>.</li>
<li>Removed <code>AwsOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <code>AzureOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <code>GcpOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultAwsOperation">catalog.ValidationResultAwsOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultAzureOperation">catalog.ValidationResultAzureOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultGcpOperation">catalog.ValidationResultGcpOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusRequest">compute.ClusterStatusRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryFullStatusStatus">compute.LibraryFullStatusStatus</a>.</li>
<li>Changed <code>ClusterStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibrariesAPI">w.Libraries</a>
workspace-level service . New request type is <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatus">compute.ClusterStatus</a>.</li>
<li>Changed <code>ClusterStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibrariesAPI">w.Libraries</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusResponse">compute.ClusterStatusResponse</a>.</li>
<li>Changed <code>Status</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryFullStatus">compute.LibraryFullStatus</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryInstallStatus">compute.LibraryInstallStatus</a>.</li>
</ul>
<p>OpenAPI SHA: 21f9f1482f9d0d15228da59f2cd9f0863d2a6d55, Date:
2024-04-23</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7672dece38"><code>7672dec</code></a>
Release v0.39.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/901">#901</a>)</li>
<li><a
href="2f56ab8431"><code>2f56ab8</code></a>
Update SDK to OpenAPI spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/899">#899</a>)</li>
<li><a
href="fa3a5d24eb"><code>fa3a5d2</code></a>
Add retries for &quot;worker env WorkerEnvId(workerenv-XXXXX) not
found&quot; (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/890">#890</a>)</li>
<li><a
href="219975c53f"><code>219975c</code></a>
Ignore flaky integration tests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/894">#894</a>)</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.38.0...v0.39.0">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.38.0&new-version=0.39.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-04-30 14:41:24 +00:00
Pieter Noordhuis a292eefc2e
Release v0.218.1 (#1401)
This is a bugfix release.

CLI:
* Pass `DATABRICKS_CONFIG_FILE` for `auth profiles`
([#1394](https://github.com/databricks/cli/pull/1394)).

Bundles:
* Show a better error message for using wheel tasks with older DBR
versions ([#1373](https://github.com/databricks/cli/pull/1373)).
* Allow variable references in non-string fields in the JSON schema
([#1398](https://github.com/databricks/cli/pull/1398)).
* Fix variable overrides in targets for non-string variables
([#1397](https://github.com/databricks/cli/pull/1397)).
* Fix bundle schema for variables
([#1396](https://github.com/databricks/cli/pull/1396)).
* Fix bundle documentation URL
([#1399](https://github.com/databricks/cli/pull/1399)).

Internal:
* Removed autogenerated docs for the CLI commands
([#1392](https://github.com/databricks/cli/pull/1392)).
* Remove `JSON.parse` call from homebrew-tap action
([#1393](https://github.com/databricks/cli/pull/1393)).
* Ensure that Python dependencies are installed during upgrade
([#1390](https://github.com/databricks/cli/pull/1390)).
2024-04-25 13:19:23 +00:00
Pieter Noordhuis db84a707cd
Fix bundle documentation URL (#1399)
Closes #1395.
2024-04-25 11:25:26 +00:00
shreyas-goenka d949f2b4f2
Fix bundle schema for variables (#1396)
## Changes
This PR fixes the variable schema to:
1. Allow non-string values in the "default" value of a variable.
2. Allow non-string overrides in a target for a variable. 

## Tests
Manually. There are no longer squiggly lines. 

Before:
<img width="329" alt="Screenshot 2024-04-24 at 3 26 43 PM"
src="https://github.com/databricks/cli/assets/88374338/43be02c2-80a4-4f80-bd79-0f3e1e93ee17">


After:
<img width="361" alt="Screenshot 2024-04-24 at 3 26 10 PM"
src="https://github.com/databricks/cli/assets/88374338/2c1fb892-a2a2-478b-8d2e-9bda6d844b54">
2024-04-25 11:23:50 +00:00
shreyas-goenka e652333103
Fix variable overrides in targets for non-string variables (#1397)
Before, variable overrides that were not strings in a target would not
work. This PR fixes that. Tested manually and via a unit test.
2024-04-25 11:21:10 +00:00
shreyas-goenka 6fd581d173
Allow variable references in non-string fields in the JSON schema (#1398)
## Tests
Verified manually.

Before:
<img width="373" alt="Screenshot 2024-04-24 at 7 18 44 PM"
src="https://github.com/databricks/cli/assets/88374338/b4aef51f-0c16-4589-9d47-cdec9ab91158">

After:
<img width="364" alt="Screenshot 2024-04-24 at 7 18 31 PM"
src="https://github.com/databricks/cli/assets/88374338/3d8e412e-77ee-4641-943d-f99eab26ba02">
<img width="356" alt="Screenshot 2024-04-24 at 7 16 54 PM"
src="https://github.com/databricks/cli/assets/88374338/2aed369a-3c6a-4754-9c76-0969423f319e">

Manually verified the schema diff is sane. Example:
```
<                               "type": "boolean",
<                               "description": "If inference tables are enabled or not. NOTE: If you have already disabled payload logging once, you cannot enable again."
---
>                               "description": "If inference tables are enabled or not. NOTE: If you have already disabled payload logging once, you cannot enable again.",
>                               "anyOf": [
>                                 {
>                                   "type": "boolean"
>                                 },
>                                 {
>                                   "type": "string",
>                                   "pattern": "\\$\\{([a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*)*)\\}"
>                                 }
>                               ]
```
2024-04-25 11:20:45 +00:00
Jim Idle 4c71f8cac4
Ensure that python dependencies are installed during upgrade (#1390)
## Changes
The installer.Upgrade() processing did not install Python dependencies.
This resulted in errors such as:

```
ModuleNotFoundError: No module named 'databricks.labs.blueprint'
```

Any new dependencies are now installed during the upgrade process.

Resolves: databrickslabs/ucx#1276

## Tests

The TestUpgraderWorksForReleases test now checks to see if the upgrade
process resulted in the dependencies being installed.

---------

Signed-off-by: Jim.Idle <jimi@idle.ws>
2024-04-24 17:34:09 +00:00
Kartik Gupta 1c02224902
Pass `DATABRICKS_CONFIG_FILE` env var to sdk config during `auth profiles` (#1394)
## Changes
* Currently, when the `auth profiles` command is used with the
`DATABRICKS_CONFIG_FILE` env var set, the file pointed to by the env var
is ONLY used for loading the profile names (ini file sections). It is
not passed to the Go SDK config object. We also don't use the env variable
loader in the Go SDK config object, so this env var is ignored by the
config and only the default file is read.
* This PR explicitly sets the config file path in the Go SDK config
object.

## Tests
* integration tests in vscode
2024-04-24 09:18:13 +00:00
Lennart Kats (databricks) 60122f6035
Show a better error message for using wheel tasks with older DBR versions (#1373)
## Changes

This is a minor improvement to the error about wheel tasks with older
DBR versions, since we get questions about it every now and then. It
also adds a pointer to the docs that were added since the original
message was committed.
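For context, the usual remedies the docs point to are running the wheel task on DBR 13.1+ or enabling the wheel wrapper. A minimal sketch of the latter, assuming the `experimental.python_wheel_wrapper` setting from the bundle schema:

```yaml
experimental:
  python_wheel_wrapper: true   # wrap wheel tasks so they can run on older DBR versions
```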

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-04-23 19:36:25 +00:00
Andrew Nester d99f2b808b
Remove `JSON.parse` call from homebrew-tap action (#1393)
## Changes
`needs.goreleaser.outputs.artifacts` already contains a valid JS object, so
there is no need to convert it to a string and parse it.
2024-04-23 13:31:36 +00:00
shreyas-goenka 5120e94302
Removed autogenerated docs for the CLI commands (#1392)
This documentation is very outdated. We don't need it because the CLI
is self-documenting. Users who land on this page (because of SEO,
for example) can also be confused by the lack of documentation for
new commands.

Example issue: https://github.com/databricks/cli/issues/1331
2024-04-23 13:25:25 +00:00
Pieter Noordhuis 27d35d5e1c
Release v0.218.0 (#1391)
This release marks the general availability of Databricks Asset Bundles.

CLI:
* Publish Docker images
([#1353](https://github.com/databricks/cli/pull/1353)).
* Add support for multi-arch Docker images
([#1362](https://github.com/databricks/cli/pull/1362)).
* Do not prefill https:// in prompt for Databricks Host
([#1364](https://github.com/databricks/cli/pull/1364)).
* Add better documentation for the `auth login` command
([#1366](https://github.com/databricks/cli/pull/1366)).
* Add URLs for authentication documentation to the auth command help
([#1365](https://github.com/databricks/cli/pull/1365)).

Bundles:
* Fix compute override for foreach tasks
([#1357](https://github.com/databricks/cli/pull/1357)).
* Transform artifact files source patterns in build not upload stage
([#1359](https://github.com/databricks/cli/pull/1359)).
* Convert between integer and float in normalization
([#1371](https://github.com/databricks/cli/pull/1371)).
* Disable locking for development mode
([#1302](https://github.com/databricks/cli/pull/1302)).
* Resolve variable references inside variable lookup fields
([#1368](https://github.com/databricks/cli/pull/1368)).
* Added validate mutator to surface additional bundle warnings
([#1352](https://github.com/databricks/cli/pull/1352)).
* Upgrade terraform-provider-databricks to 1.40.0
([#1376](https://github.com/databricks/cli/pull/1376)).
* Print host in `bundle validate` when passed via profile or environment
variables ([#1378](https://github.com/databricks/cli/pull/1378)).
* Cleanup remote file path on bundle destroy
([#1374](https://github.com/databricks/cli/pull/1374)).
* Add docs URL for `run_as` in error message
([#1381](https://github.com/databricks/cli/pull/1381)).
* Enable job queueing by default
([#1385](https://github.com/databricks/cli/pull/1385)).
* Added support for job environments
([#1379](https://github.com/databricks/cli/pull/1379)).
* Processing and completion of positional args to bundle run
([#1120](https://github.com/databricks/cli/pull/1120)).
* Add legacy option for `run_as`
([#1384](https://github.com/databricks/cli/pull/1384)).

API Changes:
* Changed `databricks lakehouse-monitors cancel-refresh` command with
new required argument order.
* Changed `databricks lakehouse-monitors create` command with new
required argument order.
* Changed `databricks lakehouse-monitors delete` command with new
required argument order.
* Changed `databricks lakehouse-monitors get` command with new required
argument order.
* Changed `databricks lakehouse-monitors get-refresh` command with new
required argument order.
* Changed `databricks lakehouse-monitors list-refreshes` command with
new required argument order.
* Changed `databricks lakehouse-monitors run-refresh` command with new
required argument order.
* Changed `databricks lakehouse-monitors update` command with new
required argument order.
* Changed `databricks account workspace-assignment update` command to
return response.

OpenAPI commit 94684175b8bd65f8701f89729351f8069e8309c9 (2024-04-11)

Dependency updates:
* Bump github.com/databricks/databricks-sdk-go from 0.37.0 to 0.38.0
([#1361](https://github.com/databricks/cli/pull/1361)).
* Bump golang.org/x/net from 0.22.0 to 0.23.0
([#1380](https://github.com/databricks/cli/pull/1380)).
2024-04-23 10:54:40 +00:00
Miles Yucht 5ee4b41cd5
Decouple winget release (#1389)
## Changes
We are starting to sign Windows CLI executables, but for the immediate
future this has to be done from a machine with a YubiKey storing the
signing certificate. As such, we will only trigger Winget publishing once
the signed binaries have been uploaded to GitHub.

Additionally, as an extra precaution, we will only release the signed
binaries via Winget.

## Tests
2024-04-23 07:17:14 +00:00
shreyas-goenka 1d9bf4b2c4
Add legacy option for `run_as` (#1384)
## Changes
This PR partially reverts the changes in
https://github.com/databricks/cli/pull/1233 and puts the old code under
an "experimental.use_legacy_run_as" configuration. This gives customers
who ran into the breaking change made in the PR a way out.
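A minimal sketch of opting into the legacy behavior (the service principal value is illustrative):

```yaml
experimental:
  use_legacy_run_as: true

run_as:
  service_principal_name: "<application-id of the service principal>"
```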


## Tests
Both manually and via unit tests.

Manually verified that run_as works for pipelines now. If a user
wants to use the feature, they need to be both a metastore admin and a
workspace admin.

---------

Error when the deploying user is a workspace admin but not a metastore
admin:
```
Error: terraform apply: exit status 1

Error: cannot update permissions: User is not a metastore admin for Metastore 'deco-uc-prod-aws-us-east-1'.

  with databricks_permissions.pipeline_foo,
  on bundle.tf.json line 23, in resource.databricks_permissions.pipeline_foo:
  23:       }
```

--------

Output of bundle validate:
```
➜  bundle-playground git:(master) ✗ cli bundle validate
Warning: You are using the legacy mode of run_as. The support for this mode is experimental and might be removed in a future release of the CLI. In order to run the DLT pipelines in your DAB as the run_as user this mode changes the owners of the pipelines to the run_as identity, which requires the user deploying the bundle to be a workspace admin, and also a Metastore admin if the pipeline target is in UC.
  at experimental.use_legacy_run_as
  in databricks.yml:13:22

Name: bundle-playground
Target: default
Workspace:
  Host: https://dbc-a39a1eb1-ef95.cloud.databricks.com
  User: shreyas.goenka@databricks.com
  Path: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Found 1 warning
```
2024-04-22 11:51:41 +00:00
Pieter Noordhuis 3108883a8f
Processing and completion of positional args to bundle run (#1120)
## Changes

With this change, both job parameters and task parameters can be
specified as positional arguments to bundle run. How the positional
arguments are interpreted depends on the configuration of the job.

### Examples:

For a job that has job parameters configured, a user can specify:

```
databricks bundle run my_job -- --param1=value1 --param2=value2
```

And the run is kicked off with job parameters set to:
```json
{
  "param1": "value1",
  "param2": "value2"
}
```
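"Job parameters configured" here means the job definition carries a top-level `parameters` section, for example (a minimal sketch; names are illustrative):

```yaml
resources:
  jobs:
    my_job:
      name: my_job
      parameters:
        - name: param1
          default: ""
        - name: param2
          default: ""
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./my_notebook.py
```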

Similarly, for a job that doesn't use job parameters and only has
`notebook_task` tasks, a user can specify:

```
databricks bundle run my_notebook_job -- --param1=value1 --param2=value2
```

And the run is kicked off with task level `notebook_params` configured
as:
```json
{
  "param1": "value1",
  "param2": "value2"
}
```

For a job that doesn't use job parameters and only has either
`spark_python_task` or `python_wheel_task` tasks, a user can specify:

```
databricks bundle run my_python_file_job -- --flag=value other arguments
```

And the run is kicked off with task level `python_params` configured as:
```json
[
  "--flag=value",
  "other",
  "arguments"
]
```

The same is applied to jobs with only `spark_jar_task` or
`spark_submit_task` tasks.

## Tests

Unit tests. Tested the completions manually.
2024-04-22 11:50:13 +00:00
Andrew Nester 1872aa12b3
Added support for job environments (#1379)
## Changes
The main changes are:
1. Don't link artifacts to libraries anymore; instead, iterate
over all jobs and tasks when uploading artifacts and update local paths
to remote ones
2. Iterate over `jobs.environments` to check if there are any local
libraries and verify that they exist locally
3. Add tests to check that environments are handled correctly

End-to-end test will follow up
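For reference, a job environment with a local library might look like this in bundle config (a minimal sketch; names and paths are illustrative):

```yaml
resources:
  jobs:
    my_wheel_job:
      tasks:
        - task_key: main
          environment_key: default
          python_wheel_task:
            package_name: my_package
            entry_point: main
      environments:
        - environment_key: default
          spec:
            client: "1"
            dependencies:
              - ./dist/my_package-0.1.0-py3-none-any.whl   # local library that must exist locally
```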

## Tests
Added regression test, existing tests (including integration one) pass
2024-04-22 11:44:34 +00:00
Lennart Kats (databricks) 000a7fef8c
Enable job queueing by default (#1385)
## Changes

This enables queueing for jobs by default, following the behavior from
API 2.2+. Queueing is a best practice and will be the default in API 2.2.
Since we're still using API 2.1, which has queueing disabled by default,
this PR enables queueing using a mutator.

Customers can manually turn off queueing for any job by adding the
following to their job spec:

```
queue:
  enabled: false
```
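In bundle configuration terms, that snippet sits at the job level, for example (a minimal sketch; the job name is illustrative):

```yaml
resources:
  jobs:
    my_job:
      queue:
        enabled: false   # opt this job out of the new default
```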

## Tests

Unit tests, manual confirmation of property after deployment.

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-04-22 10:36:39 +00:00
Pieter Noordhuis cd675ded9a
Update `testutil` helpers to return path (#1383)
## Changes

I spotted a few call sites where the path of a test file was synthesized
multiple times. It is easier to capture the path as a variable and reuse
it.
2024-04-19 15:05:36 +00:00
Pieter Noordhuis b296f90767
Add trailing newline in usage string (#1382)
## Changes

The default template includes a final newline but this was missing from
the cmdgroup template. This change also adds test coverage for inherited
flags and the flag group description.
2024-04-19 14:12:52 +00:00
dependabot[bot] ebbb716164
Bump golang.org/x/net from 0.22.0 to 0.23.0 (#1380)
Bumps [golang.org/x/net](https://github.com/golang/net) from 0.22.0 to
0.23.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c48da13158"><code>c48da13</code></a>
http2: fix TestServerContinuationFlood flakes</li>
<li><a
href="762b58d1cf"><code>762b58d</code></a>
http2: fix tipos in comment</li>
<li><a
href="ba872109ef"><code>ba87210</code></a>
http2: close connections when receiving too many headers</li>
<li><a
href="ebc8168ac8"><code>ebc8168</code></a>
all: fix some typos</li>
<li><a
href="3678185f8a"><code>3678185</code></a>
http2: make TestCanonicalHeaderCacheGrowth faster</li>
<li><a
href="448c44f928"><code>448c44f</code></a>
http2: remove clientTester</li>
<li><a
href="c7877ac421"><code>c7877ac</code></a>
http2: convert the remaining clientTester tests to testClientConn</li>
<li><a
href="d8870b0bf2"><code>d8870b0</code></a>
http2: use synthetic time in TestIdleConnTimeout</li>
<li><a
href="d73acffdc9"><code>d73acff</code></a>
http2: only set up deadline when Server.IdleTimeout is positive</li>
<li><a
href="89f602b7bb"><code>89f602b</code></a>
http2: validate client/outgoing trailers</li>
<li>Additional commits viewable in <a
href="https://github.com/golang/net/compare/v0.22.0...v0.23.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golang.org/x/net&package-manager=go_modules&previous-version=0.22.0&new-version=0.23.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/databricks/cli/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-19 14:10:18 +00:00
shreyas-goenka 6ca57a7e68
Add docs URL for `run_as` in error message (#1381) 2024-04-19 14:09:33 +00:00
shreyas-goenka e008c2bd8c
Cleanup remote file path on bundle destroy (#1374)
## Changes
The sync struct initialization would recreate the deleted `file_path`.
This PR avoids initializing the sync object when deleting the snapshot,
thus fixing the lingering `file_path` after `bundle destroy`.

## Tests
Manually, and an integration test to prevent regression.
2024-04-19 11:48:04 +00:00
shreyas-goenka f6c4d6d152
Add NOTICE for using Terraform 1.5.5 licensed under MPL 2.0 (#1377)
We package the Terraform 1.5.5 binary in our Docker container images for
the CLI. It therefore needs to be included in the NOTICE for the
repository.
2024-04-19 11:44:05 +00:00
shreyas-goenka 331313ea5f
Print host in `bundle validate` when passed via profile or environment variables (#1378)
## Changes
This fixes `bundle validate` to get the host from the workspace client
rather than only printing the host when it's configured in the bundle config.

## Tests
Manually. When a profile was specified for auth.

Before:
```
➜  bundle-playground git:(master) ✗ cli bundle validate
Name: bundle-playground
Target: default
Workspace:
  Host: 
  User: shreyas.goenka@databricks.com
  Path: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default
```


After:
```
➜  bundle-playground git:(master) ✗ cli bundle validate
Name: bundle-playground
Target: default
Workspace:
  Host: https://e2-dogfood.staging.cloud.databricks.com
  User: shreyas.goenka@databricks.com
  Path: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default
```
2024-04-19 11:43:50 +00:00
Pieter Noordhuis 6e59b13452
Update Spark version in integration tests to 13.3 (#1375)
## Tests

Integration test run pending.
2024-04-19 11:31:54 +00:00
shreyas-goenka 3c14204e98
Followup improvements to the Docker setup script (#1369)
## Changes
This PR:
1. Uses bash to run the setup.sh script instead of the native BusyBox sh
shipped with Alpine.
2. Verifies the checksums of the installed Terraform CLI binaries.
 
## Tests
Manually. The docker image successfully builds.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-04-18 20:52:11 +00:00
Andrew Nester 6b81b627fe
Upgrade terraform-provider-databricks to 1.40.0 (#1376)
## Changes
Upgrade terraform-provider-databricks to 1.40.0
2024-04-18 20:20:01 +00:00
Andrew Nester 27f51c760f
Added validate mutator to surface additional bundle warnings (#1352)
## Changes
All these validators will return warnings as part of the `bundle validate`
run.

Added 2 mutators:
1. To check that if tasks use `job_cluster_key`, it is actually defined
2. To check if there are any files to sync as part of deployment
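For reference, the first check passes when a task's `job_cluster_key` matches a cluster defined on the job, for example (a minimal sketch; names are illustrative):

```yaml
resources:
  jobs:
    my_job:
      job_clusters:
        - job_cluster_key: main_cluster
          new_cluster:
            spark_version: 13.3.x-scala2.12
            num_workers: 1
      tasks:
        - task_key: main
          job_cluster_key: main_cluster   # must match a key under job_clusters
          notebook_task:
            notebook_path: ./notebook.py
```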

Also added `bundle.Parallel` to run them in parallel

To make sure mutators under `bundle.Parallel` do not mutate config, this PR
introduces new `ReadOnlyMutator`, `ReadOnlyBundle` and `ReadOnlyConfig` types.

Example 

```
databricks bundle validate -p deco-staging
Warning: unknown field: new_cluster
  at resources.jobs.my_job
  in bundle.yml:24:7

Warning: job_cluster_key high_cpu_workload_job_cluster is not defined
  at resources.jobs.my_job.tasks[0].job_cluster_key
  in bundle.yml:35:28

Warning: There are no files to sync, please check your your .gitignore and sync.exclude configuration
  at sync.exclude
  in bundle.yml:18:5

Name: test
Target: default
Workspace:
  Host: https://acme.databricks.com
  User: andrew.nester@databricks.com
  Path: /Users/andrew.nester@databricks.com/.bundle/test/default

Found 3 warnings
```

## Tests
Added unit tests
2024-04-18 15:13:16 +00:00
shreyas-goenka eb9665d2ee
Add better documentation for the `auth login` command (#1366)
This PR improves the documentation for the `auth login` command,
accounting for the various ways this command can be used.

---------

Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-04-18 11:55:42 +00:00
Andrew Nester 542156c30b
Resolve variable references inside variable lookup fields (#1368)
## Changes
Allows for the syntax below

```
variables:
  service_principal_app_id:
    description: 'The app id of the service principal for running workflows as.'
    lookup:
      service_principal: "sp-${bundle.environment}"
```
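The looked-up value can then be referenced like any other variable elsewhere in the bundle, for example (a sketch; assuming the bundle defines a `run_as` block):

```yaml
run_as:
  service_principal_name: "${var.service_principal_app_id}"
```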

Fixes #1259

## Tests
Added regression test
2024-04-18 09:56:16 +00:00
Lennart Kats (databricks) c3a7d17d1d
Disable locking for development mode (#1302)
## Changes

This changes `databricks bundle deploy` so that it skips the lock
acquisition/release step for a `mode: development` target:
* This saves about 2 seconds (measured over 100 runs on a quiet/busy
workspace).
* This helps avoid the `deploy lock acquired by lennart@company.com at
2024-02-28 15:48:38.40603 +0100 CET. Use --force-lock to override` error
* Risk: this may cause deployment conflicts, but since dev mode
deployments are always scoped to a user, that risk should be minimal

Update after discussion:
* This behavior can now be disabled via a setting.
* Docs PR: https://github.com/databricks/docs/pull/15873
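A minimal sketch of keeping lock acquisition even in development mode, assuming the `bundle.deployment.lock` path from the bundle schema (treat the exact key path as illustrative and check the schema):

```yaml
bundle:
  name: my_bundle
  deployment:
    lock:
      enabled: true   # keep lock acquisition for `mode: development` targets
```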

## Measurements

### 100 deployments of the "python_default" project to an empty
workspace

_Before this branch:_
p50 time: 11.479 seconds
p90 time: 11.757 seconds

_After this branch:_
p50 time: 9.386 seconds
p90 time: 9.599 seconds

### 100 deployments of the "python_default" project to a busy (staging)
workspace

_Before this branch:_
* p50 time: 13.335 seconds
* p90 time: 15.295 seconds

_After this branch:_
* p50 time: 11.397 seconds
* p90 time: 11.743 seconds

### Typical duration of deployment steps

* Acquiring Deployment Lock: 1.096 seconds
* Deployment Preparations and Operations: 1.477 seconds
* Uploading Artifacts: 1.26 seconds
* Finalizing Deployment: 9.699 seconds
* Releasing Deployment Lock: 1.198 seconds

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
Co-authored-by: Andrew Nester <andrew.nester.dev@gmail.com>
2024-04-18 01:59:39 +00:00
Pieter Noordhuis 77d6820075
Convert between integer and float in normalization (#1371)
## Changes

We currently issue a warning if an integer is used where a floating
point number is expected. But if they are convertible, we should convert
and not issue a warning. This change fixes normalization if they are
convertible between each other. We still produce a warning if the type
conversion leads to a loss in precision.

## Tests

Unit tests pass.
2024-04-17 08:58:07 +00:00
dependabot[bot] c949655f9f
Bump github.com/databricks/databricks-sdk-go from 0.37.0 to 0.38.0 (#1361)
[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.37.0&new-version=0.38.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-04-16 12:03:21 +00:00
shreyas-goenka 569bb1cf63
Add support for multi-arch Docker images (#1362)
## Changes
This PR follows instructions in
https://goreleaser.com/cookbooks/multi-platform-docker-images/ to create
a multi-arch docker CLI image. Thus customers can simply specify `docker
pull ghcr.io/databricks/cli:latest` to pull and run the image.

The current approach uses the `docker manifest` support in goreleaser to
create a multi-arch image. This has a couple of pros and cons. TL;DR: the
changes as they are in this PR are good to go and very low risk. The
information provided here is just FYI.

Pros:
Fewer configurations/workflows for us to manage/maintain. Goreleaser
makes sure the correct CLI binary is in place when building the CLI and
also takes care of publishing it to the GitHub Container Registry.

Cons:
Goreleaser only supports [docker
manifest](https://docs.docker.com/reference/cli/docker/manifest/) to
create multi-arch images. This has a few minor disadvantages:
1. `goreleaser` pushes all intermediate images (arm64- and amd64-specific
images) to the registry. This is required for the manifest to reference
them. See: https://github.com/goreleaser/goreleaser/issues/2606

Note: we have a migration path here: if someday we stop publishing
intermediate images, we can simply tag the "multi-arch" image as both
`latest-amd64` and `latest-arm64`. For now, these are separate images.
See: https://github.com/databricks/cli/pkgs/container/cli

2. `docker manifest` is technically an experimental command, though it has
been available for multiple years now, and the indirect dependency via
`goreleaser` should be fine. In any case, we can migrate by moving our
Docker build process off goreleaser if we need to.

## Tests
Tested manually by publishing a new release for `v0.0.0-docker` in
ghcr.io.
1. Package: https://github.com/databricks/cli/pkgs/container/cli
2. Release workflow:
https://github.com/databricks/cli/actions/runs/8689359851

Tested the image itself by running it manually:
```
➜  cli git:(feature/multi-arch-docker) docker pull ghcr.io/databricks/cli:latest
latest: Pulling from databricks/cli
bca4290a9639: Already exists 
6d445556910d: Already exists 
Digest: sha256:82eabc500b541a89182aed4d3158c955e57c1e84d9616b76510aceb1d9024425
Status: Downloaded newer image for ghcr.io/databricks/cli:latest
ghcr.io/databricks/cli:latest

What's Next?
  View a summary of image vulnerabilities and recommendations → docker scout quickview ghcr.io/databricks/cli:latest
➜  cli git:(feature/multi-arch-docker) docker run ghcr.io/databricks/cli --version
Databricks CLI v0.0.0-docker
```
2024-04-16 11:26:19 +00:00