Compare commits

...

171 Commits

Author SHA1 Message Date
Denis Bilenko b938d567b9 Break integration test to see new failure reporting logic 2025-02-11 14:40:30 +01:00
Denis Bilenko 878fa80322
acc: Fix RecordRequests to support requests without body (#2333)
## Changes
Do not paste the request body into the output if it is not valid JSON.
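
As an illustration, the guard described above can be as simple as a
`json.Valid` check; this is a hedged sketch with illustrative names, not
the repository's actual recorder code:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// recordBody returns the request body for the log only when it is valid JSON;
// an empty or non-JSON body yields nil instead of breaking the output.
func recordBody(body []byte) json.RawMessage {
	if len(body) == 0 || !json.Valid(body) {
		return nil
	}
	return json.RawMessage(body)
}

func main() {
	fmt.Println(recordBody(nil) == nil)                // true: no body, nothing recorded
	fmt.Println(string(recordBody([]byte(`{"a":1}`)))) // {"a":1}
}
```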

## Tests
While working on #2334 I found that if I try to record a test that calls
/api/2.0/preview/scim/v2/Me, which has no request body, it crashes.
2025-02-11 10:50:52 +00:00
Denis Bilenko 8d849fe868
acc: Disable custom server on CLOUD_ENV (#2332)
We're not using the local server when CLOUD_ENV is enabled, so there is
no need to set up a custom one.
2025-02-11 10:37:48 +00:00
Denis Bilenko f2096eddcc
acc: Do not show all replacements on every failure (#2331)
## Changes
- Only print replacements if the VERBOSE_TEST flag is set.
- This is set on CI but not when you run "go test" or "make test".

Note: an environment variable is used so that it can be set in the Makefile.
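
For illustration only, the gate could look roughly like this (helper and
variable names are assumptions, not the runner's actual code):

```go
package acceptance

import (
	"os"
	"testing"
)

// logReplacements prints the replacement table only when VERBOSE_TEST is set,
// which the Makefile can export on CI while keeping local "go test" runs quiet.
func logReplacements(t *testing.T, table string) {
	if os.Getenv("VERBOSE_TEST") == "" {
		return
	}
	t.Logf("Replacements:\n%s", table)
}
```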

## Tests
Manually.
2025-02-11 09:38:53 +00:00
dependabot[bot] e81ec4ee23
Bump golang.org/x/mod from 0.22.0 to 0.23.0 (#2324)
Bumps [golang.org/x/mod](https://github.com/golang/mod) from 0.22.0 to
0.23.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="52289f1fa7"><code>52289f1</code></a>
modfile: fix trailing empty lines in require blocks</li>
<li>See full diff in <a
href="https://github.com/golang/mod/compare/v0.22.0...v0.23.0">compare
view</a></li>
</ul>
</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 17:13:49 +01:00
dependabot[bot] 6f3dbaec4c
Bump golang.org/x/text from 0.21.0 to 0.22.0 (#2323)
Bumps [golang.org/x/text](https://github.com/golang/text) from 0.21.0 to
0.22.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3b64043c9e"><code>3b64043</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="1e59086680"><code>1e59086</code></a>
message/pipeline: add two Unalias calls</li>
<li>See full diff in <a
href="https://github.com/golang/text/compare/v0.21.0...v0.22.0">compare
view</a></li>
</ul>
</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 16:13:38 +01:00
dependabot[bot] f6c50a6318
Bump golang.org/x/term from 0.28.0 to 0.29.0 (#2325)
Bumps [golang.org/x/term](https://github.com/golang/term) from 0.28.0 to
0.29.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="743b2709ab"><code>743b270</code></a>
go.mod: update golang.org/x dependencies</li>
<li>See full diff in <a
href="https://github.com/golang/term/compare/v0.28.0...v0.29.0">compare
view</a></li>
</ul>
</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 15:49:52 +01:00
Andrew Nester f7a45d0c7e
Upgrade to TF provider 1.65.1 (#2328)
## Changes
Upgrade to TF provider 1.65.1

Notable changes:
- Now it's possible to use the `run_as` field in `pipelines` definitions
- Added support for `performance_target` for `jobs`
2025-02-10 14:06:02 +00:00
dependabot[bot] 4bc231ad4f
Bump golang.org/x/oauth2 from 0.25.0 to 0.26.0 (#2322)
Bumps [golang.org/x/oauth2](https://github.com/golang/oauth2) from
0.25.0 to 0.26.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b9c813be7d"><code>b9c813b</code></a>
google: add warning about externally-provided credentials</li>
<li>See full diff in <a
href="https://github.com/golang/oauth2/compare/v0.25.0...v0.26.0">compare
view</a></li>
</ul>
</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 14:58:18 +01:00
shreyas-goenka 6953a84db6
Serialize recorded requests with indentation in acceptance tests (#2329)
## Changes
This PR indents the recorded requests to make them easier to review.
They can still be parsed using jq.
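
Roughly speaking, indentation that stays jq-friendly can be produced with
`json.MarshalIndent`; a minimal sketch under that assumption (not
necessarily the exact call used by the recorder):

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	req := map[string]any{
		"method": "POST",
		"path":   "/api/2.1/jobs/create",
		"body":   map[string]any{"name": "abc"},
	}
	// Indented output is still valid JSON, so it remains parseable with jq.
	out, err := json.MarshalIndent(req, "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```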

## Tests
Existing tests.
2025-02-10 19:03:27 +05:30
shreyas-goenka ddedc4272d
Return 501 status code when API stub is not implemented (#2327)
## Changes
Addresses feedback from
https://github.com/databricks/cli/pull/2292#discussion_r1946846865
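
A minimal sketch of the idea: a test-server fallback that answers 501 for
any endpoint without a stub (a hypothetical handler, not the project's
actual testserver code):

```go
package main

import (
	"fmt"
	"net/http"
)

func main() {
	mux := http.NewServeMux()
	// Stubbed endpoints are registered explicitly; everything else hits the fallback.
	mux.HandleFunc("/api/2.0/preview/scim/v2/Me", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, `{"userName":"tester@example.com"}`)
	})
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// 501 makes unstubbed calls fail loudly instead of looking like an API 404.
		http.Error(w, "no stub found for "+r.Method+" "+r.URL.Path, http.StatusNotImplemented)
	})
	http.ListenAndServe("127.0.0.1:0", mux) // error ignored in this sketch
}
```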

## Tests
Manually, confirmed that unstubbed API calls still cause acceptance
tests to fail.
2025-02-10 13:05:12 +00:00
Denis Bilenko d282f33a22
Append newline to "-o json" for validate/summary/run (#2326)
## Changes
- Insert a newline after rendering indented JSON in bundle
validate/summary/run.
- This prevents the "No newline at end of file" message in various cases,
for example when switching between recording the raw output of a command
and output processed by jq (which does add a trailing newline), or when
running diff in acceptance tests.

## Tests
Manually running validate:

```
~/work/dabs_cuj_brickfood % ../cli/cli-main bundle validate -o json | tail -n 2  # without change
Error: root_path must start with '~/' or contain the current username to ensure uniqueness when using 'mode: development'

  }
}%
~/work/dabs_cuj_brickfood % ../cli/cli bundle validate -o json | tail -n 2  # with change
Error: root_path must start with '~/' or contain the current username to ensure uniqueness when using 'mode: development'

  }
}
~/work/dabs_cuj_brickfood %
```

Via #2316 -- see cleaner output there.
2025-02-10 14:00:49 +01:00
dependabot[bot] 047691dd91
Bump github.com/databricks/databricks-sdk-go from 0.56.1 to 0.57.0 (#2321)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.56.1 to 0.57.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.57.0</h2>
<h2>[Release] Release v0.57.0</h2>
<h3>New Features and Improvements</h3>
<ul>
<li>Add support for async OAuth token refreshes (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1135">#1135</a>).</li>
</ul>
<h3>API Changes:</h3>
<ul>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/billing#BudgetPolicyAPI">a.BudgetPolicy</a>
account-level service.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/settings#EnableIpAccessListsAPI">a.EnableIpAccessLists</a>
account-level service.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#LakeviewEmbeddedAPI">w.LakeviewEmbedded</a>
workspace-level service and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#QueryExecutionAPI">w.QueryExecution</a>
workspace-level service.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#RedashConfigAPI">w.RedashConfig</a>
workspace-level service.</li>
<li>Added <code>GcpOauthToken</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#TemporaryCredentials">catalog.TemporaryCredentials</a>.</li>
<li>Added <code>Options</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateCatalog">catalog.UpdateCatalog</a>.</li>
<li>Added <code>StatementId</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#QueryAttachment">dashboards.QueryAttachment</a>.</li>
<li>Added <code>EffectivePerformanceTarget</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun">jobs.BaseRun</a>.</li>
<li>Added <code>PerformanceTarget</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#CreateJob">jobs.CreateJob</a>.</li>
<li>Added <code>PerformanceTarget</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobSettings">jobs.JobSettings</a>.</li>
<li>Added <code>EffectivePerformanceTarget</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run">jobs.Run</a>.</li>
<li>Added <code>PerformanceTarget</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunNow">jobs.RunNow</a>.</li>
<li>Added <code>Disabled</code> and
<code>EffectivePerformanceTarget</code> fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunTask">jobs.RunTask</a>.</li>
<li>Added <code>UserAuthorizedScopes</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#CreateCustomAppIntegration">oauth2.CreateCustomAppIntegration</a>.</li>
<li>Added <code>UserAuthorizedScopes</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#GetCustomAppIntegrationOutput">oauth2.GetCustomAppIntegrationOutput</a>.</li>
<li>Added <code>UserAuthorizedScopes</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateCustomAppIntegration">oauth2.UpdateCustomAppIntegration</a>.</li>
<li>Added <code>Contents</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#HttpRequestResponse">serving.HttpRequestResponse</a>.</li>
<li>Changed <code>HttpRequest</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service to type <code>HttpRequest</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service.</li>
<li>Changed <code>HttpRequest</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#HttpRequestResponse">serving.HttpRequestResponse</a>.</li>
<li>Removed <code>SecurableKind</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CatalogInfo">catalog.CatalogInfo</a>.</li>
<li>Removed <code>SecurableKind</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ConnectionInfo">catalog.ConnectionInfo</a>.</li>
<li>Removed <code>StatusCode</code> and <code>Text</code> fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalFunctionResponse">serving.ExternalFunctionResponse</a>.</li>
</ul>
<p>OpenAPI SHA: c72c58f97b950fcb924a90ef164bcb10cfcd5ece, Date:
2025-02-03</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7cb1883c85"><code>7cb1883</code></a>
[Release] Release v0.57.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1140">#1140</a>)</li>
<li><a
href="31fdc692bb"><code>31fdc69</code></a>
[Feature] Add support for async OAuth token refreshes (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1135">#1135</a>)</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.56.1...v0.57.0">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>



---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2025-02-10 12:50:52 +00:00
shreyas-goenka ee440e65fe
Serialize all header values in acceptance tests (#2311)
## Changes
Based on feedback in
https://github.com/databricks/cli/pull/2296#discussion_r1946660650.
Previously we only serialized the first value for a header in the
requests log. Now we serialize all values for a header key.
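
The difference amounts to recording `Header.Values` rather than
`Header.Get`; a small sketch with an illustrative header name:

```go
package main

import (
	"fmt"
	"net/http"
)

func main() {
	h := http.Header{}
	h.Add("X-Example-Header", "one")
	h.Add("X-Example-Header", "two")

	// Before: only the first value per key ended up in the requests log.
	first := h.Get("X-Example-Header")

	// After: every value for the key is serialized.
	all := h.Values("X-Example-Header")

	fmt.Println(first, all) // one [one two]
}
```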

## Tests
Existing test
2025-02-10 12:18:05 +00:00
Denis Bilenko 4ebc86282f
acc: Split bundle/templates and bundle/templates-machinery (#2317)
The tests in acceptance/bundle/template focus on standard templates. The
plan is to extend them with "bundle deploy" and enable them on the
cloud.

The tests in acceptance/bundle/template-machinery focus on specific
aspects of template implementation. Most of them are expected to remain
local-only.
2025-02-10 11:55:34 +01:00
Denis Bilenko cc07380185
acc: Summarize unexpected files (#2320)
## Changes
When there are many unexpected files, it's good to see them as a list
rather than scattered throughout the output.

## Tests
Manually, example output:
```
    acceptance_test.go:363: Test produced unexpected files:
        output/my_default_sql/.databricks/bundle/dev/sync-snapshots/71c79ded90615dc7.json
        output/my_default_sql/.databricks/bundle/dev/terraform/.terraform/providers/registry.terraform.io/databricks/databricks/1.64.1/darwin_arm64
        output/my_default_sql/.databricks/bundle/dev/terraform/plan
        output/my_default_sql/.databricks/bundle/prod/sync-snapshots/83e677e75259c93b.json
        output/my_default_sql/.databricks/bundle/prod/terraform/.terraform/providers/registry.terraform.io/databricks/databricks/1.64.1/darwin_arm64
```
2025-02-10 11:53:00 +01:00
Denis Bilenko 2175dd24a4
Do not gitignore .databricks and terraform (#2318)
For acceptance/bundle/templates I'd like to run "bundle deploy". This
would create a .databricks directory inside the materialized output. It
might make sense to commit some of this as part of the golden-files
output. Even if we do not commit anything, the test runner will see
those files and show the difference, so git should also see them.

Also rename .gitignore to out.gitignore in those tests, since that
includes .databricks as well.
2025-02-10 11:42:39 +01:00
Denis Bilenko 06e342afc5
Silence a comment in Makefile (#2315)
It was not intended to be printed. Follow-up to #2298
2025-02-10 09:16:31 +00:00
Andrew Nester f8aaa7fce3
Added support to generate Git based jobs (#2304)
## Changes
This will generate bundle YAML configuration for Git-based jobs but
won't download any related files, as they are in the Git repo.

Fixes #1423 

## Tests
Added unit test

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2025-02-07 18:37:03 +00:00
Andrew Nester 2a97dcaa45
Raise an error when there are multiple local libraries with the same basename used (#2297)
## Changes
Raise an error when multiple local libraries with the same basename are
used.

Fixes #1674
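
The check boils down to grouping local library paths by basename and
failing on a repeat; a minimal sketch under that reading (names are
illustrative, not the mutator's actual code):

```go
package main

import (
	"fmt"
	"path/filepath"
)

// findDuplicateBasenames returns an error naming any basename shared by more than one path.
func findDuplicateBasenames(paths []string) error {
	seen := map[string]string{}
	for _, p := range paths {
		base := filepath.Base(p)
		if prev, ok := seen[base]; ok {
			return fmt.Errorf("multiple local libraries share the basename %q: %s and %s", base, prev, p)
		}
		seen[base] = p
	}
	return nil
}

func main() {
	fmt.Println(findDuplicateBasenames([]string{"dist/a/lib.whl", "dist/b/lib.whl"}))
}
```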

## Tests
Added a unit test
2025-02-07 17:55:16 +00:00
Denis Bilenko 6d83ffd109
acc: enable bundle/scripts on cloud (#2313) 2025-02-07 17:42:47 +00:00
Denis Bilenko 989aabe5f1
acc: Make variable tests local-only (#2312)
Makes use of #2294
2025-02-07 17:42:35 +00:00
Andrew Nester 5aa89230e9
Use CreatePipeline instead of PipelineSpec for resources.Pipeline struct (#2287)
## Changes
`CreatePipeline` is a more complete structure (a superset of PipelineSpec)
which enables support for additional fields such as `run_as` and
`allow_duplicate_names` in DABs configuration. Note: these fields also
require support in the Terraform provider in order to work correctly.

## Tests
Existing tests pass + no fields are removed from JSON schema
2025-02-07 17:22:51 +00:00
Denis Bilenko ff4a5c2269
acc: Implement config merge (#2294)
## Changes
Instead of using only the leaf-most config, all configs, from the root
acceptance/test.toml through any intermediate ones down to the leaf
config, are merged into one. Maps are merged, slices are appended, and
other values are overridden.
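
A minimal sketch of those merge rules (maps merged recursively, slices
appended, other values overridden); illustrative only, not the acceptance
runner's implementation:

```go
package main

import "fmt"

// merge applies child over parent: maps merge recursively, slices append,
// and anything else is overridden by the child value.
func merge(parent, child any) any {
	switch c := child.(type) {
	case map[string]any:
		p, ok := parent.(map[string]any)
		if !ok {
			return c
		}
		out := map[string]any{}
		for k, v := range p {
			out[k] = v
		}
		for k, v := range c {
			out[k] = merge(out[k], v)
		}
		return out
	case []any:
		p, _ := parent.([]any)
		return append(append([]any{}, p...), c...)
	default:
		return child
	}
}

func main() {
	root := map[string]any{"LocalOnly": false, "Repls": []any{"a"}}
	leaf := map[string]any{"LocalOnly": true, "Repls": []any{"b"}}
	fmt.Println(merge(root, leaf)) // map[LocalOnly:true Repls:[a b]]
}
```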

I had to disable caching because it is tricky when merging is involved
(a deep copy is needed). There is a performance impact, but currently it
is tiny, about 1%.

Also, remove the empty root config.

## Tests

Manually checked that inheritance of LocalOnly setting worked for these
tests:

Before - integration tests showed:

```
PASS acceptance.TestAccept/bundle/templates/wrong-url (0.70s)
PASS acceptance.TestAccept/bundle/templates/wrong-path (0.44s)
```

After:

```
SKIP acceptance.TestAccept/bundle/templates/wrong-url (0.00s)
SKIP acceptance.TestAccept/bundle/templates/wrong-path (0.00s)
      acceptance_test.go:216: Disabled via LocalOnly setting in bundle/templates/test.toml, bundle/templates/wrong-path/test.toml (CLOUD_ENV=***)
```
2025-02-07 17:38:27 +01:00
shreyas-goenka f71583fbc0
Error when unknown API endpoint is used in testserver (#2292)
## Changes
This PR fails the acceptance test when an unknown endpoint (i.e. not
stubbed) is used. We want to ensure that all API endpoints used in an
acceptance test are stubbed and do not otherwise silently fail with a
404.

On failure, the logs include a configuration that developers can simply
copy-paste into `test.toml` to stub the missing API endpoint. It'll
look something like:
```
[[Server]]
Pattern = "<method> <path>"
Response.Body = '''
<response body here>
'''
Response.StatusCode = <response status-code here>
```


## Tests
Manually:

output.txt when an endpoint is not found: 
```
>>> [CLI] jobs create --json {"name":"abc"}
Error: No stub found for pattern: POST /api/2.1/jobs/create
```

How this renders in the test logs:
```
    --- FAIL: TestAccept/workspace/jobs/create (0.03s)
        server.go:46: 
            
            ----------------------------------------
            No stub found for pattern: POST /api/2.1/jobs/create
            
            To stub a response for this request, you can add
            the following to test.toml:
            [[Server]]
            Pattern = "POST /api/2.1/jobs/create"
            Response.Body = '''
            <response body here>
            '''
            Response.StatusCode = <response status-code here>
            ----------------------------------------
```

Manually checked that the debug mode still works.
2025-02-07 16:26:48 +00:00
Denis Bilenko 6b1a778fe1
Fix flaky acceptance test (#2310)
## Changes
Replace timestamps with a fixed string before the output is sorted (and
before test runner replacements are applied).
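
The stabilization step is essentially a regexp rewrite of the timestamp
prefix before sorting; a sketch under that assumption (pattern and
placeholder are illustrative):

```go
package main

import (
	"fmt"
	"regexp"
)

// Replace HH:MM:SS prefixes with a fixed token so that sorting the lines is
// deterministic even when two log lines land in the same second.
var tsRE = regexp.MustCompile(`\b\d{2}:\d{2}:\d{2}\b`)

func normalize(line string) string {
	return tsRE.ReplaceAllString(line, "HH:MM:SS")
}

func main() {
	fmt.Println(normalize("10:07:59 Debug: non-retriable error:  pid=12345"))
}
```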

Otherwise the test sometimes fails with the error below. Note that the
timestamps themselves do not show the problem, because they were replaced.

```
    --- FAIL: TestAccept/bundle/debug (0.78s)
        acceptance_test.go:404: Diff:
            --- bundle/debug/out.stderr.parallel.txt
            +++ /var/folders/5y/9kkdnjw91p11vsqwk0cvmk200000gp/T/TestAcceptbundledebug1859985035/001/out.stderr.parallel.txt
            @@ -8,8 +8,8 @@
             10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync
             10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:folder_permissions
             10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:validate_sync_patterns
            -10:07:59 Debug: Path /Workspace/Users/[USERNAME]/.bundle/debug/default/files has type directory (ID: 0) pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync
             10:07:59 Debug: non-retriable error:  pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync sdk=true
            +10:07:59 Debug: Path /Workspace/Users/[USERNAME]/.bundle/debug/default/files has type directory (ID: 0) pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync
             < {} pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync sdk=true
             < {} pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync sdk=true
             < } pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync sdk=true
```

## Tests
Running `hyperfine --min-runs 10 'go test ../.. -run
^TestAccept$/^bundle$/^debug$ -count=1' --show-output` detects flakiness
on main but not with this PR.
2025-02-07 16:17:50 +00:00
Andrew Nester ecb816446e
Update app deploy test to confirm app config changes on redeploy (#2301)
## Changes
Adds an additional step to the integration test which changes the app
config and confirms it's updated after redeploy

## Tests
```
    helpers_test.go:156: stderr: Deleting files...
    helpers_test.go:156: stderr: Destroy complete!
--- PASS: TestDeployBundleWithApp (470.25s)
PASS
coverage: [no statements]
ok      github.com/databricks/cli/integration/bundle    470.981s        coverage: [no statements]
```
2025-02-07 14:54:24 +00:00
Denis Bilenko ecc05689ca
Add a couple of tests for bundle init with custom template (#2293)
These test a custom template and what happens if a helper function
returns an error.
2025-02-07 13:13:12 +00:00
shreyas-goenka 65ac9a336a
Add doc string for the `auth token` command (#2302)
## Changes
The intent of this PR is to clarify that the `databricks auth token`
command is not supported for M2M auth. Fixes:
https://github.com/databricks/cli/issues/1939

## Tests
Manually.
2025-02-07 11:51:37 +00:00
Denis Bilenko 54e16d5f62
Always print warnings and errors; clean up format (#2213)
## Changes
- Print warnings and errors by default.
- Fix ErrAlreadyPrinted not to be logged at Error level.
- Format log messages as "Warn: message" instead of "WARN" to make it
more readable and in line with the rest of the output.
- Only print attributes (pid, mutator, etc.) and the time when the overall
level is debug (so --debug output has not changed much).

## Tests
- Existing acceptance tests show how warning messages appear in various
test cases.
- Added a new test for `--debug` output.
- Added a sort_lines.py helper to avoid a dependency on 'sort', which is
locale-sensitive.
2025-02-07 11:29:40 +00:00
Gleb Kanterov 75127fe42e
Extend testserver for deployment (#2299)
## Changes
Extend testserver for bundle deployment:

- Allocate a new workspace per test case to isolate test cases from each
other
- Support jobs get/list/create
- Support creation and listing of workspace files

## Tests
Using existing acceptance tests
2025-02-07 10:26:20 +00:00
Ilya Kuznetsov 27eb0c4072
Allow 'any' examples in JSON schema (#2289)
## Changes

1. Allow `any` examples in the json-schema type since we have many of them
in the OpenAPI spec
2. Fix an issue with missing overrides annotations when re-generating the
schema

## Tests
2025-02-06 19:27:55 +00:00
Denis Bilenko e0903fbd37
Include 'go mod tidy' into 'make' and 'make tidy' (#2298)
Apparently, it's not part of golangci-lint, so you can send PRs that
fail this check on CI.
2025-02-05 14:58:29 +00:00
Marcin Wojtyczka 27caf413f2
Add support for extras to the labs CLI (#2288)
## Changes

Added support for extras / optional Python dependencies in the labs CLI.

Added a new `extras` field under `install`.

Example:

```yaml
install:
  script: install.py
  extras: cli
```

Resolves: #2257 

## Tests

Manual test
2025-02-05 13:24:15 +00:00
Andrew Nester 5c90752797
acc: Added acceptance test for CLI commands inside bundle with and without profile flag (#2270)
## Changes
This encodes existing behaviour in CLI as reported here: #1358
2025-02-05 11:53:36 +00:00
shreyas-goenka 57b8d336e0
Add ability to record headers in acceptance tests (#2296)
## Changes
HTTP headers like the User-Agent are an important part of our internal
ETL pipelines. This PR adds the ability to validate the headers used in
an HTTP request as part of our acceptance tests.

## Tests
Modifying existing test.
2025-02-05 09:32:15 +00:00
Ilya Kuznetsov 1678503cb0
Fix docs template (#2283)
## Changes

A comment breaks the markdown front-matter, so the description cannot be read.

## Tests
2025-02-05 09:01:51 +00:00
Pieter Noordhuis 2e1455841c
Update CODEOWNERS for cmd/labs (#2295)
## Changes

The `CODEOWNERS` file must live in one of the directories specified in
the [docs][docs], so the existing file under `cmd/labs` didn't work.
This change moves the contents to the top-level file and includes
@alexott as owner.

[docs]:
https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners#codeowners-file-location
2025-02-04 21:20:02 +00:00
rikjansen-hu dcc61cd763
Fix env variable for AzureCli local config (#2248)
## Changes

Solves #1722 (the current solution passes the wrong variable)

## Tests

None; this is a simple find-and-replace on a previous PR.
Proof that this is the correct
[variable](https://learn.microsoft.com/en-us/cli/azure/azure-cli-configuration#cli-configuration-file).
This just passes the variable along to the Terraform environment, where it
[should](https://github.com/hashicorp/terraform/issues/25416) be picked
up by Terraform.

Co-authored-by: Rik Jansen <rik.jansen2@nl.abnamro.com>
2025-02-04 19:30:02 +01:00
Simon Poltier 84b694f2a1
accept JSON includes (#2265)
#2201 disabled using JSON as part of a bundle definition. I believe this
was not intended.

## Changes
Accept JSON files as includes, just like YAML files.
## Tests
Covered by the tests in #2201
2025-02-04 19:28:19 +01:00
shreyas-goenka d86ad91899
Allow test servers to return errors responses (#2291)
## Changes
When returning a non-`200` status code, Databricks APIs return a
response body of the format:
```
{
  "error_code": "Error code",
  "message": "Human-readable error message."
}
```

This PR adds the ability to stub non-200 status codes in the test
server, allowing us to mock API errors from Databricks.
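
For illustration, a stubbed handler could write that error shape with an
arbitrary status code roughly like this (a hypothetical helper, reusing
the response format quoted above):

```go
package main

import (
	"encoding/json"
	"net/http"
)

type apiError struct {
	ErrorCode string `json:"error_code"`
	Message   string `json:"message"`
}

// writeError mirrors the Databricks error body shown above with the given status code.
func writeError(w http.ResponseWriter, status int, code, msg string) {
	w.Header().Set("Content-Type", "application/json")
	w.WriteHeader(status)
	json.NewEncoder(w).Encode(apiError{ErrorCode: code, Message: msg})
}

func main() {
	http.HandleFunc("/api/2.1/jobs/create", func(w http.ResponseWriter, r *http.Request) {
		writeError(w, http.StatusForbidden, "PERMISSION_DENIED", "Human-readable error message.")
	})
	http.ListenAndServe("127.0.0.1:0", nil) // error ignored in this sketch
}
```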
## Tests
New test
2025-02-04 17:38:11 +01:00
Denis Bilenko 07efe83023
Use go-version-file instead of go-version in github actions (#2290)
This minimizes the number of places where we hard-code the Go version.

Note: since the Go version is specified without a patch version ("1.23")
in go.mod, it will use the most recent release in the 1.23.x line. I
think this is fine.


https://github.com/actions/setup-go?tab=readme-ov-file#getting-go-version-from-the-gomod-file
2025-02-04 16:08:01 +00:00
dependabot[bot] 2eb9abb5ee
Bump github.com/spf13/pflag from 1.0.5 to 1.0.6 (#2281)
Bumps [github.com/spf13/pflag](https://github.com/spf13/pflag) from
1.0.5 to 1.0.6.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/spf13/pflag/releases">github.com/spf13/pflag's
releases</a>.</em></p>
<blockquote>
<h2>v1.0.6</h2>
<h2>What's Changed</h2>
<ul>
<li>Add exported functions to preserve <code>pkg/flag</code>
compatibility by <a
href="https://github.com/mckern"><code>@​mckern</code></a> in <a
href="https://redirect.github.com/spf13/pflag/pull/220">spf13/pflag#220</a></li>
<li>remove dead code for checking error nil by <a
href="https://github.com/yashbhutwala"><code>@​yashbhutwala</code></a>
in <a
href="https://redirect.github.com/spf13/pflag/pull/282">spf13/pflag#282</a></li>
<li>Add IPNetSlice and unit tests by <a
href="https://github.com/rpothier"><code>@​rpothier</code></a> in <a
href="https://redirect.github.com/spf13/pflag/pull/170">spf13/pflag#170</a></li>
<li>allow for blank ip addresses by <a
href="https://github.com/duhruh"><code>@​duhruh</code></a> in <a
href="https://redirect.github.com/spf13/pflag/pull/316">spf13/pflag#316</a></li>
<li>add github actions by <a
href="https://github.com/sagikazarmark"><code>@​sagikazarmark</code></a>
in <a
href="https://redirect.github.com/spf13/pflag/pull/419">spf13/pflag#419</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/mckern"><code>@​mckern</code></a> made
their first contribution in <a
href="https://redirect.github.com/spf13/pflag/pull/220">spf13/pflag#220</a></li>
<li><a
href="https://github.com/yashbhutwala"><code>@​yashbhutwala</code></a>
made their first contribution in <a
href="https://redirect.github.com/spf13/pflag/pull/282">spf13/pflag#282</a></li>
<li><a href="https://github.com/rpothier"><code>@​rpothier</code></a>
made their first contribution in <a
href="https://redirect.github.com/spf13/pflag/pull/170">spf13/pflag#170</a></li>
<li><a href="https://github.com/duhruh"><code>@​duhruh</code></a> made
their first contribution in <a
href="https://redirect.github.com/spf13/pflag/pull/316">spf13/pflag#316</a></li>
<li><a
href="https://github.com/sagikazarmark"><code>@​sagikazarmark</code></a>
made their first contribution in <a
href="https://redirect.github.com/spf13/pflag/pull/419">spf13/pflag#419</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/spf13/pflag/compare/v1.0.5...v1.0.6">https://github.com/spf13/pflag/compare/v1.0.5...v1.0.6</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="5ca813443b"><code>5ca8134</code></a>
Merge pull request <a
href="https://redirect.github.com/spf13/pflag/issues/419">#419</a> from
spf13/ci</li>
<li><a
href="100ab0eb25"><code>100ab0e</code></a>
disable unsupported dependency graph for now</li>
<li><a
href="a0f4ddd9fe"><code>a0f4ddd</code></a>
fix govet</li>
<li><a
href="f48cbd1964"><code>f48cbd1</code></a>
add github actions</li>
<li><a
href="d5e0c0615a"><code>d5e0c06</code></a>
allow for blank ip addresses (<a
href="https://redirect.github.com/spf13/pflag/issues/316">#316</a>)</li>
<li><a
href="85dd5c8bc6"><code>85dd5c8</code></a>
Add IPNetSlice and unit tests (<a
href="https://redirect.github.com/spf13/pflag/issues/170">#170</a>)</li>
<li><a
href="6971c29c4a"><code>6971c29</code></a>
remove dead code for checking error nil (<a
href="https://redirect.github.com/spf13/pflag/issues/282">#282</a>)</li>
<li><a
href="81378bbcd8"><code>81378bb</code></a>
Add exported functions to preserve <code>pkg/flag</code> compatibility
(<a
href="https://redirect.github.com/spf13/pflag/issues/220">#220</a>)</li>
<li>See full diff in <a
href="https://github.com/spf13/pflag/compare/v1.0.5...v1.0.6">compare
view</a></li>
</ul>
</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 17:00:52 +01:00
Denis Bilenko 9320bd1682
acc: Use [VARNAME] instead of $VARNAME (#2282)
$VARNAME is what we use for environment variables, so it's good to keep
these separate.

Some people use envsubst for homemade variable interpolation; it's also
good to have separation there.
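
For illustration, a bracketed placeholder substitution might look like
this (a sketch; the real replacement table lives in the acceptance runner):

```go
package main

import (
	"fmt"
	"strings"
)

func main() {
	output := "Path /Workspace/Users/tester@example.com/.bundle/debug/default/files"
	// A bracketed placeholder cannot be mistaken for a $USERNAME-style environment
	// variable reference, which tools like envsubst would try to expand.
	fmt.Println(strings.ReplaceAll(output, "tester@example.com", "[USERNAME]"))
}
```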
2025-02-03 14:10:19 +00:00
dependabot[bot] 838de2fde2
Bump github.com/hashicorp/terraform-exec from 0.21.0 to 0.22.0 (#2237)
Bumps
[github.com/hashicorp/terraform-exec](https://github.com/hashicorp/terraform-exec)
from 0.21.0 to 0.22.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/hashicorp/terraform-exec/releases">github.com/hashicorp/terraform-exec's
releases</a>.</em></p>
<blockquote>
<h2>v0.22.0</h2>
<p>ENHANCEMENTS:</p>
<ul>
<li>tfexec: Add support for <code>terraform init --json</code> via
<code>InitJSON</code> (<a
href="https://redirect.github.com/hashicorp/terraform-exec/pull/478">#478</a>)</li>
</ul>
<p>INTERNAL:</p>
<ul>
<li>go: Require Go 1.22 (previously 1.18) (<a
href="https://redirect.github.com/hashicorp/terraform-exec/pull/499">#499</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/hashicorp/terraform-exec/blob/main/CHANGELOG.md">github.com/hashicorp/terraform-exec's
changelog</a>.</em></p>
<blockquote>
<h1>0.22.0 (January 21, 2025)</h1>
<p>ENHANCEMENTS:</p>
<ul>
<li>tfexec: Add support for <code>terraform init --json</code> via
<code>InitJSON</code> (<a
href="https://redirect.github.com/hashicorp/terraform-exec/pull/478">#478</a>)</li>
</ul>
<p>INTERNAL:</p>
<ul>
<li>go: Require Go 1.22 (previously 1.18) (<a
href="https://redirect.github.com/hashicorp/terraform-exec/pull/499">#499</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="6801a6e6fd"><code>6801a6e</code></a>
v0.22.0 [skip ci]</li>
<li><a
href="dd2bc9a8a9"><code>dd2bc9a</code></a>
Update CHANGELOG.md (<a
href="https://redirect.github.com/hashicorp/terraform-exec/issues/501">#501</a>)</li>
<li><a
href="b5e5740e8d"><code>b5e5740</code></a>
build(deps): bump github.com/hashicorp/hc-install from 0.8.0 to 0.9.1
(<a
href="https://redirect.github.com/hashicorp/terraform-exec/issues/494">#494</a>)</li>
<li><a
href="abfb5ba96e"><code>abfb5ba</code></a>
tfexec: add InitJSON (<a
href="https://redirect.github.com/hashicorp/terraform-exec/issues/478">#478</a>)</li>
<li><a
href="840ecadf66"><code>840ecad</code></a>
ci/e2etests: Add latest major Terraform versions (<a
href="https://redirect.github.com/hashicorp/terraform-exec/issues/498">#498</a>)</li>
<li><a
href="4497f9edf3"><code>4497f9e</code></a>
go: Require Go 1.22 (previously 1.18) (<a
href="https://redirect.github.com/hashicorp/terraform-exec/issues/499">#499</a>)</li>
<li><a
href="b13b10be14"><code>b13b10b</code></a>
build(deps): bump github.com/zclconf/go-cty from 1.16.0 to 1.16.1 (<a
href="https://redirect.github.com/hashicorp/terraform-exec/issues/496">#496</a>)</li>
<li><a
href="6b0d5eb88a"><code>6b0d5eb</code></a>
build(deps): bump github.com/zclconf/go-cty from 1.15.1 to 1.16.0 (<a
href="https://redirect.github.com/hashicorp/terraform-exec/issues/495">#495</a>)</li>
<li><a
href="ef0b6c386e"><code>ef0b6c3</code></a>
build(deps): Bump workflows to latest trusted versions (<a
href="https://redirect.github.com/hashicorp/terraform-exec/issues/493">#493</a>)</li>
<li><a
href="c75d998af5"><code>c75d998</code></a>
build(deps): bump github.com/hashicorp/terraform-json from 0.23.0 to
0.24.0 (...</li>
<li>Additional commits viewable in <a
href="https://github.com/hashicorp/terraform-exec/compare/v0.21.0...v0.22.0">compare
view</a></li>
</ul>
</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 12:38:16 +01:00
dependabot[bot] 75db82ae1f
Bump actions/create-github-app-token from 1.11.1 to 1.11.2 (#2276)
Bumps
[actions/create-github-app-token](https://github.com/actions/create-github-app-token)
from 1.11.1 to 1.11.2.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/actions/create-github-app-token/releases">actions/create-github-app-token's
releases</a>.</em></p>
<blockquote>
<h2>v1.11.2</h2>
<h2><a
href="https://github.com/actions/create-github-app-token/compare/v1.11.1...v1.11.2">1.11.2</a>
(2025-01-30)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>deps:</strong> bump <code>@​octokit/request</code> from
9.1.3 to 9.1.4 in the production-dependencies group (<a
href="https://redirect.github.com/actions/create-github-app-token/issues/196">#196</a>)
(<a
href="b4192a5b36">b4192a5</a>),
closes <a
href="https://redirect.github.com/actions/create-github-app-token/issues/730">#730</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/730">#730</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/729">#729</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/727">#727</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/726">#726</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/723">#723</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/724">#724</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/722">#722</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/721">#721</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/720">#720</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/719">#719</a></li>
<li><strong>deps:</strong> bump undici from 6.19.8 to 7.2.0 (<a
href="https://redirect.github.com/actions/create-github-app-token/issues/198">#198</a>)
(<a
href="29aa0514a7">29aa051</a>),
closes <a
href="https://redirect.github.com/nodejs/undici/issues/3958">nodejs/undici#3958</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3955">nodejs/undici#3955</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3962">nodejs/undici#3962</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3921">nodejs/undici#3921</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3923">nodejs/undici#3923</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3925">nodejs/undici#3925</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3926">nodejs/undici#3926</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3924">nodejs/undici#3924</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3933">nodejs/undici#3933</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3916">nodejs/undici#3916</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3930">nodejs/undici#3930</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3938">nodejs/undici#3938</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/3937">#3937</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3940">nodejs/undici#3940</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3931">nodejs/undici#3931</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3941">nodejs/undici#3941</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3911">nodejs/undici#3911</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3888">nodejs/undici#3888</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3939">nodejs/undici#3939</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3947">nodejs/undici#3947</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3945">nodejs/undici#3945</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3916">nodejs/undici#3916</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3893">nodejs/undici#3893</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3902">nodejs/undici#3902</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/3901">#3901</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3903">nodejs/undici#3903</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3905">nodejs/undici#3905</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3900">nodejs/undici#3900</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3913">nodejs/undici#3913</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3910">nodejs/undici#3910</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3909">nodejs/undici#3909</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3906">nodejs/undici#3906</a>
<a
href="https://redirect.github.com/nodejs/undici/issues/3922">nodejs/undici#3922</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/3962">#3962</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/3955">#3955</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/3958">#3958</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/3945">#3945</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/3947">#3947</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/3939">#3939</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/3888">#3888</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/3911">#3911</a>
<a
href="https://redirect.github.com/actions/create-github-app-token/issues/3941">#3941</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="136412a57a"><code>136412a</code></a>
build(release): 1.11.2 [skip ci]</li>
<li><a
href="b4192a5b36"><code>b4192a5</code></a>
fix(deps): bump <code>@​octokit/request</code> from 9.1.3 to 9.1.4 in
the production-depend...</li>
<li><a
href="29aa0514a7"><code>29aa051</code></a>
fix(deps): bump undici from 6.19.8 to 7.2.0 (<a
href="https://redirect.github.com/actions/create-github-app-token/issues/198">#198</a>)</li>
<li><a
href="a5f8600f58"><code>a5f8600</code></a>
build(deps-dev): bump <code>@​sinonjs/fake-timers</code> from 13.0.2 to
14.0.0 (<a
href="https://redirect.github.com/actions/create-github-app-token/issues/199">#199</a>)</li>
<li><a
href="0edddd70c8"><code>0edddd7</code></a>
build(deps-dev): bump the development-dependencies group with 2 updates
(<a
href="https://redirect.github.com/actions/create-github-app-token/issues/197">#197</a>)</li>
<li><a
href="bb3ca765af"><code>bb3ca76</code></a>
docs(README): remove extra space in variable syntax in README example
(<a
href="https://redirect.github.com/actions/create-github-app-token/issues/201">#201</a>)</li>
<li>See full diff in <a
href="c1a285145b...136412a57a">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/create-github-app-token&package-manager=github_actions&previous-version=1.11.1&new-version=1.11.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 12:37:17 +01:00
dependabot[bot] 4f3a289333
Bump actions/stale from 9.0.0 to 9.1.0 (#2275)
Bumps [actions/stale](https://github.com/actions/stale) from 9.0.0 to
9.1.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/actions/stale/releases">actions/stale's
releases</a>.</em></p>
<blockquote>
<h2>v9.1.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Documentation update by <a
href="https://github.com/Marukome0743"><code>@​Marukome0743</code></a>
in <a
href="https://redirect.github.com/actions/stale/pull/1116">actions/stale#1116</a></li>
<li>Add workflow file for publishing releases to immutable action
package by <a
href="https://github.com/Jcambass"><code>@​Jcambass</code></a> in <a
href="https://redirect.github.com/actions/stale/pull/1179">actions/stale#1179</a></li>
<li>Update undici from 5.28.2 to 5.28.4 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/actions/stale/pull/1150">actions/stale#1150</a></li>
<li>Update actions/checkout from 3 to 4 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/actions/stale/pull/1091">actions/stale#1091</a></li>
<li>Update actions/publish-action from 0.2.2 to 0.3.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/actions/stale/pull/1147">actions/stale#1147</a></li>
<li>Update ts-jest from 29.1.1 to 29.2.5 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/actions/stale/pull/1175">actions/stale#1175</a></li>
<li>Update <code>@​actions/core</code> from 1.10.1 to 1.11.1 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/actions/stale/pull/1191">actions/stale#1191</a></li>
<li>Update <code>@​types/jest</code> from 29.5.11 to 29.5.14 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/actions/stale/pull/1193">actions/stale#1193</a></li>
<li>Update <code>@​actions/cache</code> from 3.2.2 to 4.0.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/actions/stale/pull/1194">actions/stale#1194</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/Marukome0743"><code>@​Marukome0743</code></a>
made their first contribution in <a
href="https://redirect.github.com/actions/stale/pull/1116">actions/stale#1116</a></li>
<li><a href="https://github.com/Jcambass"><code>@​Jcambass</code></a>
made their first contribution in <a
href="https://redirect.github.com/actions/stale/pull/1179">actions/stale#1179</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/actions/stale/compare/v9...v9.1.0">https://github.com/actions/stale/compare/v9...v9.1.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="5bef64f19d"><code>5bef64f</code></a>
build(deps): bump <code>@​actions/cache</code> from 3.2.2 to 4.0.0 (<a
href="https://redirect.github.com/actions/stale/issues/1194">#1194</a>)</li>
<li><a
href="fa77dfddd0"><code>fa77dfd</code></a>
build(deps-dev): bump <code>@​types/jest</code> from 29.5.11 to 29.5.14
(<a
href="https://redirect.github.com/actions/stale/issues/1193">#1193</a>)</li>
<li><a
href="f04443dce3"><code>f04443d</code></a>
build(deps): bump <code>@​actions/core</code> from 1.10.1 to 1.11.1 (<a
href="https://redirect.github.com/actions/stale/issues/1191">#1191</a>)</li>
<li><a
href="5c715b0513"><code>5c715b0</code></a>
build(deps-dev): bump ts-jest from 29.1.1 to 29.2.5 (<a
href="https://redirect.github.com/actions/stale/issues/1175">#1175</a>)</li>
<li><a
href="f69122271d"><code>f691222</code></a>
build(deps): bump actions/publish-action from 0.2.2 to 0.3.0 (<a
href="https://redirect.github.com/actions/stale/issues/1147">#1147</a>)</li>
<li><a
href="df990c2cf5"><code>df990c2</code></a>
build(deps): bump actions/checkout from 3 to 4 (<a
href="https://redirect.github.com/actions/stale/issues/1091">#1091</a>)</li>
<li><a
href="6e472ce44a"><code>6e472ce</code></a>
Merge pull request <a
href="https://redirect.github.com/actions/stale/issues/1179">#1179</a>
from actions/Jcambass-patch-1</li>
<li><a
href="d10ba64261"><code>d10ba64</code></a>
Merge pull request <a
href="https://redirect.github.com/actions/stale/issues/1150">#1150</a>
from actions/dependabot/npm_and_yarn/undici-5.28.4</li>
<li><a
href="bbf3da5f64"><code>bbf3da5</code></a>
resolve check failures</li>
<li><a
href="6a2e61d18b"><code>6a2e61d</code></a>
Add workflow file for publishing releases to immutable action
package</li>
<li>Additional commits viewable in <a
href="28ca103628...5bef64f19d">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/stale&package-manager=github_actions&previous-version=9.0.0&new-version=9.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 12:36:50 +01:00
dependabot[bot] 75932198f7
Bump astral-sh/ruff-action from 3.0.1 to 3.1.0 (#2274)
Bumps [astral-sh/ruff-action](https://github.com/astral-sh/ruff-action)
from 3.0.1 to 3.1.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff-action/releases">astral-sh/ruff-action's
releases</a>.</em></p>
<blockquote>
<h2>v3.1.0 🌈 Determine ruff version from optional or dependency
groups</h2>
<h2>Changes</h2>
<p>Big thank you to <a
href="https://github.com/AA-Turner"><code>@​AA-Turner</code></a> for
expanding the pyproject.toml parsing to also find the ruff version to
use in the following scenarios:</p>
<pre lang="toml"><code>[dependency-groups]
dev = [
    { include-group = &quot;docs&quot; },
    { include-group = &quot;lint&quot; },
]
docs = [
    &quot;sphinx&quot;,
]
lint = [
    &quot;ruff==0.8.3&quot;,
]
</code></pre>
<pre lang="toml"><code>[project.optional-dependencies]
lint = [
    &quot;ruff==0.8.3&quot;,
]
</code></pre>
<h2>🚀 Enhancements</h2>
<ul>
<li>Read the <code>[project.optional-dependencies]</code> and
<code>[dependency-groups]</code> tables <a
href="https://github.com/AA-Turner"><code>@​AA-Turner</code></a> (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/66">#66</a>)</li>
</ul>
<h2>v3.0.2 🌈 Full support for GHES</h2>
<h2>Changes</h2>
<p>This release fixes some issues that prevented use with GitHub
Enterprise Server instances.
Parsing the ruff version from pyproject.toml now also uses a library
that is fully TOML 1.0.0 compliant.</p>
<h2>🐛 Bug fixes</h2>
<ul>
<li>Do not expect GITHUB_TOKEN to be set or valid <a
href="https://github.com/eifinger"><code>@​eifinger</code></a> (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/65">#65</a>)</li>
<li>Use TOML 1.0.0 compliant library for parsing <a
href="https://github.com/eifinger"><code>@​eifinger</code></a> (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/47">#47</a>)</li>
</ul>
<h2>🧰 Maintenance</h2>
<ul>
<li>Fix compiled known versions <a
href="https://github.com/eifinger"><code>@​eifinger</code></a> (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/62">#62</a>)</li>
<li>chore: update known checksums for 0.9.3 @<a
href="https://github.com/apps/github-actions">github-actions[bot]</a>
(<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/61">#61</a>)</li>
<li>chore: update known checksums for 0.9.1 @<a
href="https://github.com/apps/github-actions">github-actions[bot]</a>
(<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/42">#42</a>)</li>
</ul>
<h2>📚 Documentation</h2>
<ul>
<li>Fix Markdown link to Install the latest version <a
href="https://github.com/eifinger"><code>@​eifinger</code></a> (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/58">#58</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="f14634c415"><code>f14634c</code></a>
Read the <code>[project.optional-dependencies]</code> and
<code>[dependency-groups]</code> tables (...</li>
<li><a
href="47de3deae8"><code>47de3de</code></a>
Bump <code>@​types/node</code> from 22.10.10 to 22.12.0 (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/60">#60</a>)</li>
<li><a
href="d8281c74d4"><code>d8281c7</code></a>
Do not expect GITHUB_TOKEN to be set or valid (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/65">#65</a>)</li>
<li><a
href="a634044659"><code>a634044</code></a>
Bump eifinger/actionlint-action from 1.9.0 to 1.9.1 (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/59">#59</a>)</li>
<li><a
href="2993ff4a65"><code>2993ff4</code></a>
Fix compiled known versions (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/62">#62</a>)</li>
<li><a
href="20a3b171f4"><code>20a3b17</code></a>
chore: update known checksums for 0.9.3 (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/61">#61</a>)</li>
<li><a
href="1c1aef9e3d"><code>1c1aef9</code></a>
Bump typescript from 5.7.2 to 5.7.3 (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/41">#41</a>)</li>
<li><a
href="0ceb04d9a0"><code>0ceb04d</code></a>
Bump release-drafter/release-drafter from 6.0.0 to 6.1.0 (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/50">#50</a>)</li>
<li><a
href="18db80c954"><code>18db80c</code></a>
Bump <code>@​types/node</code> from 22.10.5 to 22.10.10 (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/53">#53</a>)</li>
<li><a
href="0a5dfb89f1"><code>0a5dfb8</code></a>
Fix Markdown link to Install the latest version (<a
href="https://redirect.github.com/astral-sh/ruff-action/issues/58">#58</a>)</li>
<li>Additional commits viewable in <a
href="31a5185046...f14634c415">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=astral-sh/ruff-action&package-manager=github_actions&previous-version=3.0.1&new-version=3.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 12:36:42 +01:00
dependabot[bot] 91e04cc444
Bump golangci/golangci-lint-action from 6.1.1 to 6.2.0 (#2273)
Bumps
[golangci/golangci-lint-action](https://github.com/golangci/golangci-lint-action)
from 6.1.1 to 6.2.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/golangci/golangci-lint-action/releases">golangci/golangci-lint-action's
releases</a>.</em></p>
<blockquote>
<h2>v6.2.0</h2>
<!-- raw HTML omitted -->
<h2>What's Changed</h2>
<h3>Changes</h3>
<ul>
<li>chore: use new build tag syntax by <a
href="https://github.com/alexandear"><code>@​alexandear</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1133">golangci/golangci-lint-action#1133</a></li>
<li>feat: support linux arm64 public preview by <a
href="https://github.com/ldez"><code>@​ldez</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1144">golangci/golangci-lint-action#1144</a></li>
</ul>
<h3>Documentation</h3>
<ul>
<li>docs: update local development instructions by <a
href="https://github.com/dmitris"><code>@​dmitris</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1125">golangci/golangci-lint-action#1125</a></li>
</ul>
<h3>Dependencies</h3>
<ul>
<li>build(deps-dev): bump the dev-dependencies group with 3 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1112">golangci/golangci-lint-action#1112</a></li>
<li>build(deps): bump the dependencies group with 2 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1113">golangci/golangci-lint-action#1113</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 3 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1114">golangci/golangci-lint-action#1114</a></li>
<li>build(deps): bump <code>@​types/node</code> from 22.7.4 to 22.7.5 in
the dependencies group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1115">golangci/golangci-lint-action#1115</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 2 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1117">golangci/golangci-lint-action#1117</a></li>
<li>build(deps): bump <code>@​types/node</code> from 22.7.5 to 22.7.7 in
the dependencies group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1118">golangci/golangci-lint-action#1118</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 2 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1119">golangci/golangci-lint-action#1119</a></li>
<li>build(deps): bump <code>@​types/node</code> from 22.7.7 to 22.8.1 in
the dependencies group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1120">golangci/golangci-lint-action#1120</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 2 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1122">golangci/golangci-lint-action#1122</a></li>
<li>build(deps): bump the dependencies group with 2 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1123">golangci/golangci-lint-action#1123</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 2 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1126">golangci/golangci-lint-action#1126</a></li>
<li>build(deps): bump <code>@​types/node</code> from 22.8.7 to 22.9.0 in
the dependencies group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1127">golangci/golangci-lint-action#1127</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 3 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1128">golangci/golangci-lint-action#1128</a></li>
<li>build(deps): bump <code>@​types/node</code> from 22.9.0 to 22.9.3 in
the dependencies group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1130">golangci/golangci-lint-action#1130</a></li>
<li>build(deps): bump <code>@​types/node</code> from 22.9.3 to 22.10.1
in the dependencies group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1131">golangci/golangci-lint-action#1131</a></li>
<li>build(deps-dev): bump the dev-dependencies group across 1 directory
with 4 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1132">golangci/golangci-lint-action#1132</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 3 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1134">golangci/golangci-lint-action#1134</a></li>
<li>build(deps): bump <code>@​actions/cache</code> from 3.3.0 to 4.0.0
in the dependencies group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1135">golangci/golangci-lint-action#1135</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 2 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1136">golangci/golangci-lint-action#1136</a></li>
<li>build(deps): bump <code>@​types/node</code> from 22.10.1 to 22.10.2
in the dependencies group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1137">golangci/golangci-lint-action#1137</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 2 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1138">golangci/golangci-lint-action#1138</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 2 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1139">golangci/golangci-lint-action#1139</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 2 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1141">golangci/golangci-lint-action#1141</a></li>
<li>build(deps): bump <code>@​types/node</code> from 22.10.2 to 22.10.5
in the dependencies group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1142">golangci/golangci-lint-action#1142</a></li>
<li>build(deps-dev): bump the dev-dependencies group with 3 updates by
<a href="https://github.com/dependabot"><code>@​dependabot</code></a> in
<a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1143">golangci/golangci-lint-action#1143</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/dmitris"><code>@​dmitris</code></a> made
their first contribution in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1125">golangci/golangci-lint-action#1125</a></li>
<li><a
href="https://github.com/alexandear"><code>@​alexandear</code></a> made
their first contribution in <a
href="https://redirect.github.com/golangci/golangci-lint-action/pull/1133">golangci/golangci-lint-action#1133</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/golangci/golangci-lint-action/compare/v6.1.1...v6.2.0">https://github.com/golangci/golangci-lint-action/compare/v6.1.1...v6.2.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="ec5d18412c"><code>ec5d184</code></a>
feat: support linux arm64 public preview (<a
href="https://redirect.github.com/golangci/golangci-lint-action/issues/1144">#1144</a>)</li>
<li><a
href="a0297a1378"><code>a0297a1</code></a>
build(deps-dev): bump the dev-dependencies group with 3 updates (<a
href="https://redirect.github.com/golangci/golangci-lint-action/issues/1143">#1143</a>)</li>
<li><a
href="58eda26a51"><code>58eda26</code></a>
build(deps): bump <code>@​types/node</code> from 22.10.2 to 22.10.5 in
the dependencies gro...</li>
<li><a
href="44c2434506"><code>44c2434</code></a>
build(deps-dev): bump the dev-dependencies group with 2 updates (<a
href="https://redirect.github.com/golangci/golangci-lint-action/issues/1141">#1141</a>)</li>
<li><a
href="2f13b8027d"><code>2f13b80</code></a>
build(deps-dev): bump the dev-dependencies group with 2 updates (<a
href="https://redirect.github.com/golangci/golangci-lint-action/issues/1139">#1139</a>)</li>
<li><a
href="1ac36865a6"><code>1ac3686</code></a>
build(deps-dev): bump the dev-dependencies group with 2 updates (<a
href="https://redirect.github.com/golangci/golangci-lint-action/issues/1138">#1138</a>)</li>
<li><a
href="9937fdf718"><code>9937fdf</code></a>
build(deps): bump <code>@​types/node</code> from 22.10.1 to 22.10.2 in
the dependencies gro...</li>
<li><a
href="cb60b26e7a"><code>cb60b26</code></a>
build(deps-dev): bump the dev-dependencies group with 2 updates (<a
href="https://redirect.github.com/golangci/golangci-lint-action/issues/1136">#1136</a>)</li>
<li><a
href="774c35bccc"><code>774c35b</code></a>
build(deps): bump <code>@​actions/cache</code> from 3.3.0 to 4.0.0 in
the dependencies grou...</li>
<li><a
href="7ce548721e"><code>7ce5487</code></a>
build(deps-dev): bump the dev-dependencies group with 3 updates (<a
href="https://redirect.github.com/golangci/golangci-lint-action/issues/1134">#1134</a>)</li>
<li>Additional commits viewable in <a
href="971e284b60...ec5d18412c">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golangci/golangci-lint-action&package-manager=github_actions&previous-version=6.1.1&new-version=6.2.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 12:36:30 +01:00
Denis Bilenko f267318bb9
Include acceptance tests in integration tests (#2242)
## Changes
- Include the acceptance directory in integration tests. Acceptance tests
will not start the local server if CLOUD_ENV is set, so they become
integration tests.
- Add a dependency on vendor to the integration target, so that the CLI
can be built there.
- Implement a LocalOnly option in test.toml to opt out of running
acceptance tests as integration tests (see the sketch below). Use it in
certain tests that are difficult or unnecessary to fix when run as
integration tests.
- Update the terraform test to redact out timings.
- Clean up .workspace.current_user from the outputs of the tests.
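
For reference, opting a test out is a single setting in that test's
test.toml. A minimal sketch (the option name comes from this PR; the
comments are illustrative):

```toml
# test.toml: run this test only locally; skip it when acceptance tests
# are executed as integration tests (CLOUD_ENV set).
LocalOnly = true
```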

## Tests
Existing tests.
2025-02-03 10:43:25 +00:00
Denis Bilenko fcedfe4c78
acc: Consistent & detailed output for file issues (#2279)
## Changes
- Include the compact relPath in the error message title. Include full paths
on separate lines below.
- Previously, sometimes full paths were printed and sometimes only the relative path.

## Tests
Manually trigger the errors.
2025-02-03 10:29:13 +00:00
Denis Bilenko 2f798c4ded
acc: Remove initial '$CLI --version' call (#2280)
It has proven to be unnecessary.

```
~/work/cli/acceptance % hyperfine -w 2 'go test'  # with change:
Benchmark 1: go test
  Time (mean ± σ):      4.983 s ±  0.209 s    [User: 6.073 s, System: 9.869 s]
  Range (min … max):    4.792 s …  5.483 s    10 runs

~/work/cli/acceptance % git stash  # without change:
~/work/cli/acceptance % hyperfine -w 2 'go test'
Benchmark 1: go test
  Time (mean ± σ):      5.018 s ±  0.100 s    [User: 6.142 s, System: 10.234 s]
  Range (min … max):    4.899 s …  5.182 s    10 runs
```
2025-02-03 10:03:18 +00:00
Denis Bilenko e5730bf57e
Use real terraform in acceptance tests (#2267)
## Changes
- Add a script install_terraform.py that downloads terraform and the
provider and generates a config to use, inspired by
https://gist.github.com/pietern/1cb6b6f3e0a452328e13cdc75031105e
- Make acceptance tests run this script once before running the tests
and set the required env vars to make the CLI use this terraform
installation.
- Use an OS-specific directory for things that are built by the acceptance
test runner (CLI and terraform).

This enables acceptance tests against cloud #2242 and local test for
bundle deploy #2254.

## Tests
- Add an acceptance test for standalone terraform. This is useful to
debug terraform with TF_LOG=DEBUG to see that it uses local provider.
- Other acceptance tests are updated with regard to terraform exec path.
- The overall time for tests locally is unchanged (if terraform is
already fetched).
2025-01-31 13:53:13 +00:00
shreyas-goenka 787dbe9099
Add request body assertions to acceptance tests (#2263)
## Changes
With this PR, any acceptance tests that define custom server stubs in
`test.toml` will automatically record all HTTP requests made and assert
on them.

Builds on top of https://github.com/databricks/cli/pull/2226

## Tests
Modifying existing acceptance test.
2025-01-31 13:31:23 +00:00
shreyas-goenka 3c6eacb05b
Add feature to mock server APIs in acceptance tests (#2226)
## Changes
This PR allows us to define custom server stubs in a `test.toml` file. 

Note: A followup PR will add functionality to do assertions on the API
request itself.
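
Illustratively, a stub pairs a request pattern with a canned response.
The table and field names below are assumptions for the sake of example,
not the framework's confirmed schema:

```toml
# Hypothetical test.toml stub (names are illustrative).
[[Server]]
Pattern = "GET /api/2.0/preview/scim/v2/Me"
Response.Body = '{"id": "1234", "userName": "tester@databricks.com"}'
```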

## Tests
New acceptance test.
2025-01-30 10:43:07 +00:00
Denis Bilenko f1efbd7d9f
acc: add -norepl flag that disables replacements (for debugging) (#2269) 2025-01-30 10:38:54 +00:00
Denis Bilenko a03ea73011
Add ruff.toml with increased line-length (#2268)
The default is 88, which reformats too much.

This has no effect on templates but affects the Python script in this PR:
https://github.com/databricks/cli/pull/2267

For context, we do not set any line length for golang and have 177 .go
files with max line length 150 or more.
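
For illustration, the override is a single key in ruff.toml (the exact
value shown is an assumption; the PR only states it was raised above the
default of 88):

```toml
# ruff.toml: allow longer lines than ruff's default of 88.
line-length = 150
```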
2025-01-30 09:52:41 +00:00
Denis Bilenko 58ef34f320
acc: Include "id" into /api/2.0/preview/scim/v2/Me response (#2266)
This is something the terraform provider expects.
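
Illustratively, the stubbed response now includes an "id" field alongside
the user name (values below are made up):

```json
{"id": "1234567890", "userName": "tester@databricks.com"}
```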

Related to https://github.com/databricks/cli/pull/2242
2025-01-29 17:35:03 +00:00
shreyas-goenka 55c03cc119
Always close test HTTP server during cleanup (#2261)
## Changes
This PR registers the `server.Close()` function to be run during test
cleanup in the server initialization function. This ensures that all
test servers are closed as soon as the test they are scoped to finishes.

Motivated by https://github.com/databricks/cli/pull/2255/files where a
regression was introduced where we did not close the test server.
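
The mechanism is the standard `t.Cleanup` hook on an `httptest` server;
a minimal sketch of the pattern (illustrative, not the repository's
actual helper):

```go
package testserver_test

import (
	"net/http"
	"net/http/httptest"
	"testing"
)

// startTestServer returns a stub HTTP server that is closed automatically
// when the test (or subtest) it was created for finishes.
func startTestServer(t *testing.T) *httptest.Server {
	server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	}))
	// Registering Close in the initialization helper means callers cannot forget it.
	t.Cleanup(server.Close)
	return server
}
```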

## Tests
N/A
2025-01-29 15:54:33 +00:00
Andrew Nester ce965b22b2
[Release] Release v0.240.0 (#2264)
Bundles:
* Added support for double underscore variable references
([#2203](https://github.com/databricks/cli/pull/2203)).
* Do not wait for app compute to start on `bundle deploy`
([#2144](https://github.com/databricks/cli/pull/2144)).
* Remove bundle.git.inferred
([#2258](https://github.com/databricks/cli/pull/2258)).
* libs/python: Remove DetectInterpreters
([#2234](https://github.com/databricks/cli/pull/2234)).

API Changes:
 * Added `databricks access-control` command group.
 * Added `databricks serving-endpoints http-request` command.
* Changed `databricks serving-endpoints create` command with new
required argument order.
* Changed `databricks serving-endpoints get-open-api` command return
type to become non-empty.
* Changed `databricks recipients update` command return type to become
non-empty.

OpenAPI commit 0be1b914249781b5e903b7676fd02255755bc851 (2025-01-22)
Dependency updates:
* Bump github.com/databricks/databricks-sdk-go from 0.55.0 to 0.56.1
([#2238](https://github.com/databricks/cli/pull/2238)).
* Upgrade TF provider to 1.64.1
([#2247](https://github.com/databricks/cli/pull/2247)).
2025-01-29 16:55:53 +01:00
Denis Bilenko 38efedcd73
Remove bundle.git.inferred (#2258)
The only use case for it was to emit a warning, and based on the
discussion here
https://github.com/databricks/cli/pull/2213/files#r1933558087 the
warning is not useful, and logging it with reduced severity is also not
useful.
2025-01-29 14:15:52 +00:00
shreyas-goenka c3a6e11627
Add integration test for the /telemetry-ext endpoint (#2259)
## Changes
Followup from
https://github.com/databricks/cli/pull/2209#pullrequestreview-2580308075.

This PR adds an integration test to validate that the API type bindings
work against the telemetry endpoint.

## Tests
N/A

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2025-01-29 14:05:58 +00:00
Gleb Kanterov 13596eb605
PythonMutator: Fix relative path error (#2253)
## Changes

Fix relative path errors in the Python mutator, which had been failing
during deployment since v0.239.1.

Before that:

```
% databricks bundle deploy  
Deploying resources...
Updating deployment state...
Error: failed to compute relative path for job jobs_as_code_project_job: Rel: can't make resources/jobs_as_code_project_job.py relative to /Users/$USER/jobs_as_code_project
```

As a result, the bundle was deployed, but the deployment state wasn't
updated.

## Tests

Unit tests, adding acceptance tests in
https://github.com/databricks/cli/pull/2254
2025-01-29 13:56:57 +00:00
Andrew Nester ec7808da34
Added support for double underscore variable references (#2203)
## Changes
Added support for double underscore variable references.

Previously we made this restriction stricter for no particular reason;
the TF provider supports multiple underscores, and thus DABs should as
well.
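
For example, a variable name like the following is now accepted. This is
a made-up snippet to illustrate the reference syntax, not taken from the
PR itself:

```yaml
variables:
  cluster__node_type:
    default: i3.xlarge

resources:
  jobs:
    example_job:
      name: job-on-${var.cluster__node_type}
```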

Fixes #1753

## Tests
Added acceptance and integration tests
2025-01-29 13:38:28 +00:00
Ilya Kuznetsov 59d6fbfee9
Restore variable file tests (#2220)
## Changes

Uncomment flaky tests; they work properly with the latest changes from main.

## Tests
2025-01-29 13:34:26 +00:00
Ilya Kuznetsov 708c4fbb7a
Autogenerated documentation for bundle config (#2033)
## Changes

Documentation autogeneration tool. This tool uses the same annotations_*.yml
files as the json-schema.

The result will go
[here](https://docs.databricks.com/en/dev-tools/bundles/reference.html)
and
[here](https://docs.databricks.com/en/dev-tools/bundles/resources.html#cluster).

## Tests
Manually
2025-01-29 12:14:21 +00:00
shreyas-goenka 30f57d3b49
Add protos for bundle telemetry (#2209)
## Changes
These types correspond to the telemetry protobufs defined in universe.

## Tests
No tests are needed since this PR only adds the type bindings.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2025-01-29 11:49:53 +00:00
shreyas-goenka 884b5f26ed
Set bundle auth configuration in command context (#2195)
## Changes

This change is required to enable tracking execution time telemetry for
bundle commands. In order to track execution time for the command
generally, we need to have the databricks auth configuration available
at this point in the code:


41bbd89257/cmd/root/root.go (L99)

In order to do this we can rely on the `configUsed` context key.   

Most commands rely on the `root.MustWorkspaceClient` function which
automatically sets the client config in the `configUsed` context key.
Bundle commands, however, do not do so. They instead store their
workspace clients in the `&bundle.Bundle{}` object.

With this PR, the `configUsed` context key will be set for all `bundle`
commands. Functionally nothing changes.
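
Mechanically this is the usual context-key pattern; a generic sketch of
the idea (the names below are illustrative, not the repository's actual
helpers):

```go
package root

import (
	"context"

	"github.com/databricks/databricks-sdk-go/config"
)

// configUsedKey is an unexported key type to avoid collisions in the context.
type configUsedKey struct{}

// withConfigUsed stores the auth configuration a command ended up using so
// that later stages (e.g. telemetry) can read it back from the context.
func withConfigUsed(ctx context.Context, cfg *config.Config) context.Context {
	return context.WithValue(ctx, configUsedKey{}, cfg)
}

// getConfigUsed retrieves the stored configuration, if any.
func getConfigUsed(ctx context.Context) (*config.Config, bool) {
	cfg, ok := ctx.Value(configUsedKey{}).(*config.Config)
	return cfg, ok
}
```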

## Tests
Existing tests. Also manually verified that either
`root.MustConfigureBundle` or `utils.ConfigureBundleWithVariables` is
called for all bundle commands (except `bundle init`) thus ensuring this
context key would be set for all bundle commands.

refs for the functions:
1. `root.MustConfigureBundle`:
41bbd89257/cmd/root/bundle.go (L88)
2. `utils.ConfigureBundleWithVariables`:
41bbd89257/cmd/bundle/utils/utils.go (L19)

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2025-01-29 11:02:08 +00:00
shreyas-goenka 124515e8d2
Move TestServer from acceptance to libs/testserver (#2255)
## Changes
Just a move, no changes. As recommended here:
https://github.com/databricks/cli/pull/2226#discussion_r1932152627

## Tests
N/A
2025-01-29 10:42:21 +00:00
Andrew Nester 413ca5c134
Do not wait for app compute to start on `bundle deploy` (#2144)
## Changes
This allows DABs to avoid waiting for the compute to start when an app is
initially created as part of "bundle deploy", which significantly
improves deploy time.

Always set no_compute to true for apps

## Tests
Covered by `TestDeployBundleWithApp`, which currently fails until the TF
provider is upgraded to a version supporting the `no_compute` option.
2025-01-28 17:17:37 +00:00
Andrew Nester 099e9bed0f
Upgrade TF provider to 1.64.1 (#2247)
## Changes
- Added support for `no_compute` in Apps
- Added support for `run_as_repl` for job tasks
2025-01-28 14:34:44 +00:00
Denis Bilenko 4ba222ab36
Fix env_overrides not to use variables in workspace.profile (#2251)
This does not work when this test is run against cloud.

Needed for https://github.com/databricks/cli/pull/2242
2025-01-28 15:22:56 +01:00
Denis Bilenko 0256225408
acc: Exclude secrets from replacements (#2250)
They should never be printed by CLI anyway.
2025-01-28 15:12:47 +01:00
Denis Bilenko 5971bd5c1a
acc: Disable git hooks (#2249)
Otherwise hooks from universe and custom hooks run in tests.
2025-01-28 14:00:41 +00:00
shreyas-goenka 65e4f79dfe
Switch to using `[` from `<` in text replacements (#2224)
## Changes
Noticed this when working on
https://github.com/databricks/cli/pull/2221. `<` is a special HTML
character that is encoded during text replacement when using
`AssertEqualTexts`.


## Tests
N/A
2025-01-28 10:54:23 +00:00
Denis Bilenko 3ffac80007
acc: Use real terraform when CLOUD_ENV is set (#2245)
## Changes
- If CLOUD_ENV is set, do not override it with a dummy value. This allows
running acceptance tests as integration tests.
- Needed for https://github.com/databricks/cli/pull/2242

## Tests
Manually run the test suite against dogfood. `CLOUD_ENV=aws go test
./acceptance`
2025-01-28 10:23:44 +00:00
Denis Bilenko 11436faafe
acc: Avoid reading and applying replacements on large files; validate utf8 (#2244)
## Changes
- Do not start replacement / comparison if the file is too large or not
valid UTF-8 (a sketch of such a guard follows below).
- This helps to prevent replacements if there is accidentally a large
binary (e.g. terraform).
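
A minimal sketch of the guard described above (illustrative only, not the
actual code; the size cutoff is an assumed value):

```go
package acceptance

import (
	"errors"
	"fmt"
	"unicode/utf8"
)

// maxReplacementSize is an assumed cutoff for illustration; the real limit may differ.
const maxReplacementSize = 1 << 20 // 1 MiB

// checkReplaceable reports why a file should be skipped for replacements and comparison.
func checkReplaceable(data []byte) error {
	if len(data) > maxReplacementSize {
		return fmt.Errorf("file too large for replacements: %d bytes", len(data))
	}
	if !utf8.Valid(data) {
		return errors.New("file is not valid UTF-8, skipping replacements")
	}
	return nil
}
```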

## Tests
Found this problem when working on
https://github.com/databricks/cli/pull/2242 -- the tests tried to
apply replacements on the terraform binary and crashed. With this change,
an error is reported instead.
2025-01-28 10:22:29 +00:00
Denis Bilenko 60709e3d48
acc: Restore unexpected output error (#2243)
## Changes
Restore the original behaviour of acceptance tests: any unaccounted-for
files trigger an error (not just those that start with "out"). This was
changed in
https://github.com/databricks/cli/pull/2146/files#diff-2bb968d823f4afb825e1dcea2879bdbdedf2b7c15d4e77f47905691b14246a04L196
which started checking only files starting with "out*" and skipping
everything else.

## Tests
Existing tests.
2025-01-28 10:15:32 +00:00
Denis Bilenko be908ee1a1
Add acceptance test for 'experimental.scripts' (#2240) 2025-01-27 15:28:33 +00:00
Denis Bilenko 67d1413db5
Add default regex for DEV_VERSION (#2241)
## Changes

- Replace development version with $DEV_VERSION
- Update experimental-jobs-as-code to make use of it.

## Tests
- Existing tests.
- Using this in https://github.com/databricks/cli/pull/2213
2025-01-27 15:34:53 +01:00
Denis Bilenko 52bf7e388a
acc: Propagate user's UV_CACHE_DIR to tests (#2239)
There is a speed-up of 0.5s, but it is still 4.4s, so something else is
slow there.

Benchmarking bundle/templates/experimental-jobs-as-code:

```
# Without UV_CACHE_DIR
~/work/cli/acceptance/bundle/templates/experimental-jobs-as-code % hyperfine --warmup 2 'testme -count=1'
Benchmark 1: testme -count=1
  Time (mean ± σ):      4.950 s ±  0.079 s    [User: 2.730 s, System: 8.524 s]
  Range (min … max):    4.838 s …  5.076 s    10 runs

# With UV_CACHE_DIR
~/work/cli/acceptance/bundle/templates/experimental-jobs-as-code % hyperfine --warmup 2 'testme -count=1'
Benchmark 1: testme -count=1
  Time (mean ± σ):      4.410 s ±  0.049 s    [User: 2.669 s, System: 8.710 s]
  Range (min … max):    4.324 s …  4.467 s    10 runs
```
2025-01-27 15:25:56 +01:00
Denis Bilenko 65fbbd9a7c
libs/python: Remove DetectInterpreters (#2234)
## Changes
- Remove DetectInterpreters from the DetectExecutable call: python3 or
python should always be on the PATH. We don't need to detect
non-standard situations where python3.10 is present but python3 is not.
- I moved DetectInterpreters to cmd/labs where it is still used.

This is a follow up to https://github.com/databricks/cli/pull/2034

## Tests
Existing tests.
2025-01-27 13:22:08 +00:00
dependabot[bot] 4595c6f1b5
Bump github.com/databricks/databricks-sdk-go from 0.55.0 to 0.56.1 (#2238)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.55.0 to 0.56.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.56.1</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Do not send query parameters when set to zero value (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1136">#1136</a>).</li>
</ul>
<h2>v0.56.0</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Support Query parameters for all HTTP operations (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1124">#1124</a>).</li>
</ul>
<h3>Internal Changes</h3>
<ul>
<li>Add download target to MakeFile (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1125">#1125</a>).</li>
<li>Delete examples/mocking module (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1126">#1126</a>).</li>
<li>Scope the traversing directory in the Recursive list workspace test
(<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1120">#1120</a>).</li>
</ul>
<h3>API Changes:</h3>
<ul>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/iam#AccessControlAPI">w.AccessControl</a>
workspace-level service.</li>
<li>Added <code>HttpRequest</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service.</li>
<li>Added <code>ReviewState</code>, <code>Reviews</code> and
<code>RunnerCollaborators</code> fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomAssetNotebook">cleanrooms.CleanRoomAssetNotebook</a>.</li>
<li>Added <code>CleanRoomsNotebookOutput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunOutput">jobs.RunOutput</a>.</li>
<li>Added <code>RunAsRepl</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SparkJarTask">jobs.SparkJarTask</a>.</li>
<li>Added <code>Scopes</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateCustomAppIntegration">oauth2.UpdateCustomAppIntegration</a>.</li>
<li>Added <code>Contents</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetOpenApiResponse">serving.GetOpenApiResponse</a>.</li>
<li>Added <code>Activated</code>, <code>ActivationUrl</code>,
<code>AuthenticationType</code>, <code>Cloud</code>,
<code>Comment</code>, <code>CreatedAt</code>, <code>CreatedBy</code>,
<code>DataRecipientGlobalMetastoreId</code>, <code>IpAccessList</code>,
<code>MetastoreId</code>, <code>Name</code>, <code>Owner</code>,
<code>PropertiesKvpairs</code>, <code>Region</code>,
<code>SharingCode</code>, <code>Tokens</code>, <code>UpdatedAt</code>
and <code>UpdatedBy</code> fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientInfo">sharing.RecipientInfo</a>.</li>
<li>Added <code>ExpirationTime</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientInfo">sharing.RecipientInfo</a>.</li>
<li>Added <code>Pending</code> enum value for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomAssetStatusEnum">cleanrooms.CleanRoomAssetStatusEnum</a>.</li>
<li>Added <code>AddNodesFailed</code>,
<code>AutomaticClusterUpdate</code>, <code>AutoscalingBackoff</code> and
<code>AutoscalingFailed</code> enum values for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EventType">compute.EventType</a>.</li>
<li>Added <code>PendingWarehouse</code> enum value for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#MessageStatus">dashboards.MessageStatus</a>.</li>
<li>Added <code>Cpu</code>, <code>GpuLarge</code>,
<code>GpuMedium</code>, <code>GpuSmall</code> and
<code>MultigpuMedium</code> enum values for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingModelWorkloadType">serving.ServingModelWorkloadType</a>.</li>
<li>Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientsAPI">w.Recipients</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientInfo">sharing.RecipientInfo</a>.</li>
<li>Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientsAPI">w.Recipients</a>
workspace-level service return type to become non-empty.</li>
<li>Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientsAPI">w.Recipients</a>
workspace-level service to type <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientsAPI">w.Recipients</a>
workspace-level service.</li>
<li>Changed <code>Create</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service with new required argument order.</li>
<li>Changed <code>GetOpenApi</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service return type to become non-empty.</li>
<li>Changed <code>Patch</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service to type <code>Patch</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service.</li>
<li>Changed <code>Patch</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#EndpointTags">serving.EndpointTags</a>.</li>
<li>Changed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#EndpointTagList">serving.EndpointTagList</a>
to.</li>
<li>Changed <code>CollaboratorAlias</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomCollaborator">cleanrooms.CleanRoomCollaborator</a>
to be required.</li>
<li>Changed <code>CollaboratorAlias</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomCollaborator">cleanrooms.CleanRoomCollaborator</a>
to be required.</li>
<li>Changed <code>Behavior</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AiGatewayGuardrailPiiBehavior">serving.AiGatewayGuardrailPiiBehavior</a>
to no longer be required.</li>
<li>Changed <code>Behavior</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AiGatewayGuardrailPiiBehavior">serving.AiGatewayGuardrailPiiBehavior</a>
to no longer be required.</li>
<li>Changed <code>Config</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#CreateServingEndpoint">serving.CreateServingEndpoint</a>
to no longer be required.</li>
<li>Changed <code>ProjectId</code> and <code>Region</code> fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GoogleCloudVertexAiConfig">serving.GoogleCloudVertexAiConfig</a>
to be required.</li>
<li>Changed <code>ProjectId</code> and <code>Region</code> fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GoogleCloudVertexAiConfig">serving.GoogleCloudVertexAiConfig</a>
to be required.</li>
<li>Changed <code>WorkloadType</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityInput">serving.ServedEntityInput</a>
to type <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingModelWorkloadType">serving.ServingModelWorkloadType</a>.</li>
<li>Changed <code>WorkloadType</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityOutput">serving.ServedEntityOutput</a>
to type <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingModelWorkloadType">serving.ServingModelWorkloadType</a>.</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>[Release] Release v0.56.1</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Do not send query parameters when set to zero value (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1136">#1136</a>).</li>
</ul>
<h2>[Release] Release v0.56.0</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Support Query parameters for all HTTP operations (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1124">#1124</a>).</li>
</ul>
<h3>Internal Changes</h3>
<ul>
<li>Add download target to MakeFile (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1125">#1125</a>).</li>
<li>Delete examples/mocking module (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1126">#1126</a>).</li>
<li>Scope the traversing directory in the Recursive list workspace test
(<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1120">#1120</a>).</li>
</ul>
<h3>API Changes:</h3>
<ul>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/iam#AccessControlAPI">w.AccessControl</a>
workspace-level service.</li>
<li>Added <code>HttpRequest</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service.</li>
<li>Added <code>ReviewState</code>, <code>Reviews</code> and
<code>RunnerCollaborators</code> fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomAssetNotebook">cleanrooms.CleanRoomAssetNotebook</a>.</li>
<li>Added <code>CleanRoomsNotebookOutput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunOutput">jobs.RunOutput</a>.</li>
<li>Added <code>RunAsRepl</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SparkJarTask">jobs.SparkJarTask</a>.</li>
<li>Added <code>Scopes</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateCustomAppIntegration">oauth2.UpdateCustomAppIntegration</a>.</li>
<li>Added <code>Contents</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetOpenApiResponse">serving.GetOpenApiResponse</a>.</li>
<li>Added <code>Activated</code>, <code>ActivationUrl</code>,
<code>AuthenticationType</code>, <code>Cloud</code>,
<code>Comment</code>, <code>CreatedAt</code>, <code>CreatedBy</code>,
<code>DataRecipientGlobalMetastoreId</code>, <code>IpAccessList</code>,
<code>MetastoreId</code>, <code>Name</code>, <code>Owner</code>,
<code>PropertiesKvpairs</code>, <code>Region</code>,
<code>SharingCode</code>, <code>Tokens</code>, <code>UpdatedAt</code>
and <code>UpdatedBy</code> fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientInfo">sharing.RecipientInfo</a>.</li>
<li>Added <code>ExpirationTime</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientInfo">sharing.RecipientInfo</a>.</li>
<li>Added <code>Pending</code> enum value for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomAssetStatusEnum">cleanrooms.CleanRoomAssetStatusEnum</a>.</li>
<li>Added <code>AddNodesFailed</code>,
<code>AutomaticClusterUpdate</code>, <code>AutoscalingBackoff</code> and
<code>AutoscalingFailed</code> enum values for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EventType">compute.EventType</a>.</li>
<li>Added <code>PendingWarehouse</code> enum value for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#MessageStatus">dashboards.MessageStatus</a>.</li>
<li>Added <code>Cpu</code>, <code>GpuLarge</code>,
<code>GpuMedium</code>, <code>GpuSmall</code> and
<code>MultigpuMedium</code> enum values for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingModelWorkloadType">serving.ServingModelWorkloadType</a>.</li>
<li>Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientsAPI">w.Recipients</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientInfo">sharing.RecipientInfo</a>.</li>
<li>Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientsAPI">w.Recipients</a>
workspace-level service return type to become non-empty.</li>
<li>Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientsAPI">w.Recipients</a>
workspace-level service to type <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientsAPI">w.Recipients</a>
workspace-level service.</li>
<li>Changed <code>Create</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service with new required argument order.</li>
<li>Changed <code>GetOpenApi</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service return type to become non-empty.</li>
<li>Changed <code>Patch</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service to type <code>Patch</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service.</li>
<li>Changed <code>Patch</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#EndpointTags">serving.EndpointTags</a>.</li>
<li>Changed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#EndpointTagList">serving.EndpointTagList</a>
to.</li>
<li>Changed <code>CollaboratorAlias</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomCollaborator">cleanrooms.CleanRoomCollaborator</a>
to be required.</li>
<li>Changed <code>Behavior</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AiGatewayGuardrailPiiBehavior">serving.AiGatewayGuardrailPiiBehavior</a>
to no longer be required.</li>
<li>Changed <code>Config</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#CreateServingEndpoint">serving.CreateServingEndpoint</a>
to no longer be required.</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="bf617bb7a6"><code>bf617bb</code></a>
[Release] Release v0.56.1 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1137">#1137</a>)</li>
<li><a
href="18cebf1d5c"><code>18cebf1</code></a>
[Fix] Do not send query parameters when set to zero value (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1136">#1136</a>)</li>
<li><a
href="28ff749ee2"><code>28ff749</code></a>
[Release] Release v0.56.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1134">#1134</a>)</li>
<li><a
href="113454080f"><code>1134540</code></a>
[Internal] Add download target to MakeFile (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1125">#1125</a>)</li>
<li><a
href="e079db96f3"><code>e079db9</code></a>
[Fix] Support Query parameters for all HTTP operations (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1124">#1124</a>)</li>
<li><a
href="1045fb9697"><code>1045fb9</code></a>
[Internal] Delete examples/mocking module (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1126">#1126</a>)</li>
<li><a
href="914ab6b7e8"><code>914ab6b</code></a>
[Internal] Scope the traversing directory in the Recursive list
workspace tes...</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.55.0...v0.56.1">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.55.0&new-version=0.56.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2025-01-27 13:11:07 +00:00
Denis Bilenko b7dd70b8b3
acc: Add a couple of error tests for 'bundle init' (#2233)
This captures how we log errors related to subprocess runs and what the
output looks like.
2025-01-27 12:22:40 +00:00
Denis Bilenko 6e8f0ea8af
CI: Move ruff to 'lint' job (#2232)
This is where it belongs and also there is no need to run it 3 times.
2025-01-27 10:33:16 +00:00
Denis Bilenko 1cb32eca90
acc: Support custom replacements (#2231)
## Changes
- Ability to extend a list of replacements via test.toml
- Modify selftest to both demo this feature and to get rid of sed on
Windows.

## Tests
Acceptance tests. I'm also using it in
https://github.com/databricks/cli/pull/2213 for things like pid.
2025-01-27 09:11:06 +00:00
Denis Bilenko 82b0dd36d6
Add acceptance/selftest, showcasing basic features (#2229)
Also make TestInprocessMode use this test.
2025-01-27 09:17:22 +01:00
Denis Bilenko b3d98fe666
acc: Print replacements on error and rm duplicates (#2230)
## Changes
- If file comparison fails in an acceptance test, print the contents of all
applied replacements. Do it once per test.
- Remove duplicate entries in the replacement list.

## Tests
Manually, change out files of existing test, you'll get this printed
once, after first assertion:

```
        acceptance_test.go:307: Available replacements:
            REPL /Users/denis\.bilenko/work/cli/acceptance/build/databricks => $$CLI
            REPL /private/var/folders/5y/9kkdnjw91p11vsqwk0cvmk200000gp/T/TestAccept598522733/001 => $$TMPHOME
            ...
```
2025-01-27 07:45:09 +00:00
Denis Bilenko 468660dc45
Add an acc test covering failures when reading .git (#2223)
## Changes
- New test covering failures in reading .git. One case results in error,
some result in warning (not shown).
- New helper withdir runs commands in a subdirectory.

## Tests
New acceptance test.
2025-01-24 15:53:06 +00:00
Pieter Noordhuis f65508690d
Update publish-winget action to use Komac directly (#2228)
## Changes

For the most recent release, I had to re-run the "publish-winget" action
a couple of times before it passed. The underlying issue that causes the
failure should be solved by the latest version of the action, but upon
inspection of the latest version, I found that it always installs the
latest version of [Komac](https://github.com/russellbanks/Komac). To
both fix the issue and lock this down further, I updated our action to
call Komac directly instead of relying on a separate action to do this
for us.

## Tests

Successful run in
https://github.com/databricks/cli/actions/runs/12951529979.
2025-01-24 15:33:54 +00:00
Denis Bilenko 959e43e556
acc: Support per-test configuration; GOOS option to disable OS (#2227)
## Changes
- Acceptance tests load test.toml to configure test behaviour.
- If the file is not found in the test directory, parent directories are
searched, up to the test root.
- Currently there is one option: runtime.GOOS to switch off tests per
OS.

## Tests
Using it in https://github.com/databricks/cli/pull/2223 to disable test
on Windows that cannot be run there.
2025-01-24 14:28:23 +00:00
shreyas-goenka a47a058506
Limit test server to only accept GET on read endpoints (#2225)
## Changes
Now the test server will only match GET queries for these endpoints

## Tests
Existing tests.
2025-01-24 11:05:00 +00:00
Denis Bilenko b4ed235104
Include EvalSymlinks in SetPath and use SetPath on all paths (#2219)
## Changes
When adding a path, a few things need to be taken care of:
- symlink expansion
- forward/backward slashes, so that tests could do sed 's/\\\\/\//g' to
make it pass on Windows (see
acceptance/bundle/syncroot/dotdot-git/script)

SetPath() function takes care of both.

This PR uses SetPath() on all paths consistently.
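
As a rough sketch of what this normalization amounts to (the body below is an assumption for illustration, not the actual SetPath implementation):

```go
package pathutil // illustrative package name

import "path/filepath"

// normalizePath expands symlinks so the path matches what the OS reports
// (e.g. /private/var vs /var on macOS), then converts separators to forward
// slashes so the same replacement string works on Windows and Unix.
func normalizePath(p string) string {
	if resolved, err := filepath.EvalSymlinks(p); err == nil {
		p = resolved
	}
	return filepath.ToSlash(p)
}
```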

## Tests
Existing tests.
2025-01-24 10:18:44 +00:00
Denis Bilenko d6d9b994d4
acc: only print non-zero exit codes in errcode function (#2222)
Reduces noise in the output and matches how "Exit code" is handled for
the whole script.
2025-01-24 10:47:12 +01:00
Andrew Nester d784147e99
[Release] Release v0.239.1 (#2218)
CLI:
* Added text output templates for apps list and list-deployments
([#2175](https://github.com/databricks/cli/pull/2175)).
* Fix duplicate "apps" entry in help output
([#2191](https://github.com/databricks/cli/pull/2191)).

Bundles:
* Allow yaml-anchors in schema
([#2200](https://github.com/databricks/cli/pull/2200)).
* Show an error when non-yaml files used in include section
([#2201](https://github.com/databricks/cli/pull/2201)).
* Set WorktreeRoot to sync root outside git repo
([#2197](https://github.com/databricks/cli/pull/2197)).
* fix: Detailed message for using source-linked deployment with
file_path specified
([#2119](https://github.com/databricks/cli/pull/2119)).
* Allow using variables in enum fields
([#2199](https://github.com/databricks/cli/pull/2199)).
* Add experimental-jobs-as-code template
([#2177](https://github.com/databricks/cli/pull/2177)).
* Reading variables from file
([#2171](https://github.com/databricks/cli/pull/2171)).
* Fixed an apps message order and added output test
([#2174](https://github.com/databricks/cli/pull/2174)).
* Default to forward slash-separated paths for path translation
([#2145](https://github.com/databricks/cli/pull/2145)).
* Include a materialized copy of built-in templates
([#2146](https://github.com/databricks/cli/pull/2146)).
2025-01-23 15:54:55 +00:00
Ilya Kuznetsov 0487e816cc
Reading variables from file (#2171)
## Changes

New source of default values for variables - variable file
`.databricks/bundle/<target>/variable-overrides.json`

The CLI tries to stat and read that file every time during the variable
initialisation phase.
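
For illustration, such a file could look like the following (the variable names are made up; keys must match variables declared in the bundle):

```json
{
  "catalog": "main",
  "notifications": ["first@company.com", "second@company.com"]
}
```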


## Tests

Acceptance tests
2025-01-23 14:35:33 +00:00
Andrew Nester 8af9efaa62
Show an error when non-yaml files used in include section (#2201)
## Changes
The `include` section is used only to include other bundle configuration
YAML files. If any other file type is used, raise an error and guide
users to use `sync.include` instead.

## Tests
Added acceptance test

---------

Co-authored-by: Julia Crawford (Databricks) <julia.crawford@databricks.com>
2025-01-23 13:58:18 +00:00
Andrew Nester 6153423c56
Revert "Upgrade Go SDK to 0.56.0 (#2214)" (#2217)
This reverts commit 798189eb96.
2025-01-23 13:21:59 +00:00
Denis Bilenko ddd45e25ee
Pass USE_SDK_V2_{RESOURCES,DATA_SOURCES} to terraform (#2207)
## Changes
- Propagate env vars USE_SDK_V2_RESOURCES and $USE_SDK_V2_DATA_SOURCES
to terraform
- These are troubleshooting helpers for resources migrated to the new plugin
framework, recommended here:
https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/troubleshooting#plugin-framework-migration-problems
- This currently unblocks deploying quality monitors, see
https://github.com/databricks/terraform-provider-databricks/issues/4229#issuecomment-2520344690

## Tests
Manually testing that I can deploy quality monitor after this change
with `USE_SDK_V2_RESOURCES="databricks_quality_monitor"` set

### Main branch:
```
~/work/databricks_quality_monitor_repro % USE_SDK_V2_RESOURCES="databricks_quality_monitor" ../cli/cli-main bundle deploy
Uploading bundle files to /Workspace/Users/denis.bilenko@databricks.com/.bundle/quality_monitor_bundle/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
Error: terraform apply: exit status 1

Error: Provider produced inconsistent result after apply

When applying changes to databricks_quality_monitor.monitor_trips, provider
"provider[\"registry.terraform.io/databricks/databricks\"]" produced an
unexpected new value: .data_classification_config: block count changed from 0
to 1.

This is a bug in the provider, which should be reported in the provider's own
issue tracker.
```

### This branch:
```
~/work/databricks_quality_monitor_repro % USE_SDK_V2_RESOURCES="databricks_quality_monitor" ../cli/cli bundle deploy
Uploading bundle files to /Workspace/Users/denis.bilenko@databricks.com/.bundle/quality_monitor_bundle/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
```

### Config:
```
~/work/databricks_quality_monitor_repro % cat databricks.yml
bundle:
  name: quality_monitor_bundle

resources:
  quality_monitors:
    monitor_trips:
      table_name: main.denis-bilenko-cuj-pe34.trips_sanitized_1
      output_schema_name: main.denis-bilenko-cuj-pe34
      assets_dir: /Workspace/Users/${workspace.current_user.userName}/quality_monitor_issue
      snapshot: {}
```
2025-01-23 12:48:47 +00:00
Denis Bilenko 1f63aa0912
tests: Improve reporting in case of FS errors (#2216)
## Changes
If there are unreadable files in a directory, raise an error but
continue with further diagnostics, because the answer is in the script
output.

## Tests
Manually - I'm working on some tests that create unreadable files, the
report is much better with this change.
2025-01-23 11:46:22 +00:00
Andrew Nester 798189eb96
Upgrade Go SDK to 0.56.0 (#2214)
## Changes

Upgrade Go SDK to 0.56.0

Relevant changes:
- Support Query parameters for all HTTP operations
(https://github.com/databricks/databricks-sdk-go/pull/1124).
2025-01-23 11:17:52 +00:00
Ilya Kuznetsov f60ad32f07
Allow yaml-anchors in schema (#2200)
## Changes

Allows custom untyped fields in the root config in json-schema so it
doesn't highlight errors when using yaml-anchors.

Example use case:

```
tags: &job-tags
  environment: ${bundle.target}


resources:
  jobs:
    db1:
      tags:
        <<: *job-tags
    db2:
      tags:
        <<: *job-tags
```
  
One downside is that we don't highlight any unknown top-level properties
anymore (but they will still fail during CLI validation)

## Tests

Manually checked behavior in VSCode - it doesn't show validation error.
Also checked that other typed properties are still suggested
2025-01-23 11:11:44 +00:00
Denis Bilenko ba3a400327
Remove test-specific logic from generic test runner (#2215)
Revert changes to acceptance_test.go added in #2177 and add
test-specific fix.
2025-01-23 11:59:01 +01:00
Denis Bilenko 20c1902a45
Fix passing SingleTest to TestAccept (#2210) 2025-01-22 16:26:16 +00:00
Gleb Kanterov 3d91691f25
PythonMutator: propagate source locations (#1783)
## Changes
Add a mechanism to load Python source locations in the Python mutator.
Previously, locations pointed to generated YAML. Now, they point to
Python sources instead. Python process outputs "locations.json"
containing locations of bundle paths, examples:

```json
{"path": "resources.jobs.job_0", "file": "resources/job_0.py", "line": 3, "column": 5}
{"path": "resources.jobs.job_0.tasks[0].task_key", "file": "resources/job_0.py", "line": 10, "column": 5}
{"path": "resources.jobs.job_1", "file": "resources/job_1.py", "line": 5, "column": 7}
```

Such locations form a tree, and we assign locations of the closest
ancestor to each `dyn.Value` based on its path. For example,
`resources.jobs.job_0.tasks[0].task_key` is located at `job_0.py:10:5`
and `resources.jobs.job_0.tasks[0].email_notifications` is located at
`job_0.py:3:5`, because we use the location of the job as the most
precise approximation.

This feature is only enabled if `experimental/python` is used.

Note: for now, we don't update locations with relative paths, because it
has the side effect of changing how these paths are resolved.

## Example
```
% databricks bundle validate

Warning: job_cluster_key abc is not defined
  at resources.jobs.examples.tasks[0].job_cluster_key
  in resources/example.py:10:1
```

## Tests
Unit tests and manually
2025-01-22 15:37:37 +00:00
Denis Bilenko 54a470837c
Fix context propagation in bundle/deploy/terraform (#2208)
https://github.com/databricks/cli/pull/747#discussion_r1925248116
2025-01-22 13:28:13 +00:00
Denis Bilenko 667302b61b
Refactor env forwarding function in terraform (#2206)
No functional changes, just making it easier to add variables.
2025-01-22 12:51:17 +00:00
shreyas-goenka 6c3ddbd921
Add `auth.Env` function (#2204)
## Changes
`auth.Env` is a generic function that we can use for authenticated tools
downstream of the CLI.

## Tests
Unit test.
2025-01-22 12:14:54 +00:00
Denis Bilenko 876526a19a
Use local git config in tests (#2205)
I've seen this error: could not lock config file
$TMPDIR_GPARENT/TestAccept3968313522/002/.gitconfig: File exists

This is likely the cause.
2025-01-22 12:20:49 +01:00
Denis Bilenko e9902036b8
Set WorktreeRoot to sync root outside git repo (#2197)
## Changes
If git is not detected, set default worktree root to sync root.
Otherwise NewFileSet/View raise an error about worktree root being
outside view root in acceptance/bundle/sync-paths-dotdot.

This behavior is introduced in
https://github.com/databricks/cli/pull/1945

Stacked on https://github.com/databricks/cli/pull/2202

## Tests
Existing tests.
2025-01-22 10:50:13 +00:00
Ilya Kuznetsov c224be5c1f
Allow using variables in enum fields (#2199)
## Changes

It is possible to pass a variable to enum fields, but the JSON schema doesn't
accept it. This PR adds `oneOf` for enum types that includes the `${var-*}`
pattern.
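
Illustratively, an enum property in the schema ends up looking roughly like this (the enum values and the exact regex are placeholders, not the schema's real contents):

```json
{
  "oneOf": [
    { "enum": ["development", "production"] },
    { "type": "string", "pattern": "\\$\\{var\\.[\\w.]+\\}" }
  ]
}
```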

## Tests

Manually checked in VSCode
2025-01-22 10:30:17 +00:00
Denis Bilenko fde30ff1ab
Add a test for sync root outside of git root (#2202)
- Move acceptance/bundle/sync-paths-dotdot test to
acceptance/bundle/syncroot/dotdot-notgit
- Add new test acceptance/bundle/syncroot/dotdot-git

Fix replacer to work with this test and on Windows:
- Make PATH work on Windows by using EvalSymlinks.
- Make concatenated paths match within JSON by stripping quotes.
2025-01-22 10:17:45 +00:00
Denis Bilenko 3a32c63919
Add -inprocess mode for acceptance tests (#2184)
## Changes
- If you pass -inprocess flag to acceptance tests, they will run in the
same process as test itself. This enables debugging.
- If you set the singleTest variable at the top of acceptance_test.go, you'll
only run that test, in inprocess mode. This is intended for
debugging in VSCode.
- (minor) Converted KeepTmp to flag -keeptmp from env var KEEP_TMP for
consistency with other flags.

## Tests
- I verified that acceptance tests pass with -inprocess mode: `go test
-inprocess < /dev/null | cat`
- I verified that debugging in VSCode works: set a test name in
singleTest variable, set breakpoints inside CLI and click "debug test"
in VSCode.
2025-01-21 21:21:12 +00:00
Denis Bilenko 34a37cf4a8
Clone ReplacementContext before passing into test (#2198)
## Changes
- Add a new method Clone() on ReplacementContext
- Use it when passing common replacements to test cases.

## Tests
Manually. I have a different branch where this bug manifested and this
change helped.
2025-01-21 12:47:34 +00:00
Denis Bilenko de5155ed0a
Add acceptance for test for sync.paths equal to two dots (#2196)
Based on integration test from @andrewnester in #2194

Manually checked that this databricks.yml passes validation on v0.235.0
but fails on v0.236.0, very likely it was broken in
https://github.com/databricks/cli/pull/1945

This also adds replacements for the tmpdir, its parent and (just in case)
its grandparent.
2025-01-21 11:50:28 +00:00
Denis Bilenko 33613b5d2a
Add test for #2181 /Workspace not prepended (#2188) 2025-01-21 11:27:02 +00:00
Denis Bilenko 41bbd89257
Clean up unnecessary cleanup of inferred flag (#2193)
## Changes
The SelectTarget mutator (part of the Load phase) clears the bundle.git.inferred
flag, but that flag is not set until later, in the Initialize phase
(LoadGitDetails mutator).

## Tests
Existing tests.
2025-01-20 17:21:34 +00:00
Denis Bilenko ee4a4b4c24
Migrate quality_monitor_test.go to acceptance test (#2192) 2025-01-20 16:33:03 +00:00
Ilya Kuznetsov 84a73052d2
fix: Detailed message for using source-linked deployment with file_path specified (#2119)
## Changes

Resolves remaining comments from here
https://github.com/databricks/cli/pull/2046


[This](https://github.com/databricks/cli/pull/2046#discussion_r1907121844)
and
[this](https://github.com/databricks/cli/pull/2046#discussion_r1908928239)
are on hold until Pieter's response

## Tests
<!-- How is this tested? -->
2025-01-20 16:16:51 +00:00
Pieter Noordhuis 69f3c0a869
Fix duplicate "apps" entry in help output (#2191)
## Changes

This is not needed because the command group is already returned by
`workspace.All()`.

The additional command registration was added in #1679.

## Tests

Acceptance test.
2025-01-20 16:02:29 +00:00
Denis Bilenko 395a04a8d1
Run tests with coverage on CI (#2141)
Combine 'make cover' and 'make acc-cover' into single command. They
still write coverage into different files -- it would be useful to see
separate coverage numbers.

Note, we're not making use of coverage information yet. However, running
tests in CI with coverage will
- let us catch issues that only manifest when coverage is enabled, like
https://github.com/databricks/cli/pull/2150
- will let us know if there are any issues with running coverage on CI
before investing in additional coverage support
2025-01-20 15:41:24 +00:00
shreyas-goenka e6982d09ac
Add doc string for `bundle.uuid` (#2170)
Co-authored-by: Julia Crawford (Databricks) <julia.crawford@databricks.com>
2025-01-20 14:52:22 +00:00
shreyas-goenka 41a21af556
Refactor `bundle init` (#2074)
## Summary of changes
This PR introduces three new abstractions (sketched below):
1. `Resolver`: Resolves which reader and writer to use for a template.
2. `Writer`: Writes a template project to disk. Prompts the user if
necessary.
3. `Reader`: Reads a template specification from disk, built into the
CLI or from GitHub.
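
Purely as an illustration of how the pieces fit together (the shapes below are assumptions, not the actual definitions from this PR):

```go
package template // illustrative package name

import (
	"context"
	"io/fs"
)

// Reader makes template files available locally, whether they are built
// into the CLI or fetched from a Git repository.
type Reader interface {
	FS(ctx context.Context) (fs.FS, error)
}

// Writer renders a template into the target directory, prompting the user
// for any missing input values.
type Writer interface {
	Materialize(ctx context.Context, outputDir string) error
}

// Resolver picks the Reader/Writer pair to use for a given template
// reference (built-in name, Databricks-owned repo, or custom template).
type Resolver interface {
	Resolve(ctx context.Context, templateRef string) (Reader, Writer, error)
}
```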

Introducing these abstractions helps decouple reading a template from
writing it. When I tried adding telemetry for the `bundle init` command,
I noticed that the code in `cmd/init.go` was getting convoluted and hard
to test. A future change could have accidentally logged PII when a user
initialised a custom template.

Hedging against that risk is important here because we use a generic
untyped `map<string, string>` representation in the backend to log
telemetry for the `databricks bundle init`. Otherwise, we risk
accidentally breaking our compliance with our centralization
requirements.

### Details

After this PR there are two classes of templates that can be
initialized:
1. A `databricks` template: This could be a builtin template or a
template outside the CLI like mlops-stacks, which is still owned and
managed by Databricks. These templates log their telemetry arguments and
template name.
2. A `custom` template: These are templates created by and managed by
the end user. In these templates we do not log the template name and
args. Instead a generic placeholder string of "custom" is logged in our
telemetry system.

NOTE: The functionality of the `databricks bundle init` command remains
the same after this PR. Only the internal abstractions used are changed.

## Tests
New unit tests. Existing golden and unit tests. Also a fair bit of
manual testing.
2025-01-20 12:09:28 +00:00
Gleb Kanterov 31c10c1b82
Add experimental-jobs-as-code template (#2177)
## Changes

Add experimental-jobs-as-code template allowing defining jobs using
Python instead of YAML through the `databricks-bundles` PyPI package.

## Tests

Manually and acceptance tests.
2025-01-20 10:15:11 +00:00
Denis Bilenko 7034793d1d
Run 'ruff format' in quiet mode (#2187)
Otherwise it prints "83 files left unchanged".
2025-01-20 10:26:29 +01:00
Denis Bilenko 64fc1c8fe7
Add NoLog option on testcli.Runner (#2183)
## Changes
Setting Verbose=false on testcli.Runner disables all logging related to
the running process (stdout, stderr, error, args).

I'm using this in #2184, where I use the testcli runner to run acceptance
tests and seeing all of that output is not useful.

## Tests
Manually inspecting test output in #2184
2025-01-20 09:57:48 +01:00
Denis Bilenko 26f527ef64
Fix incorrect TestingT.Errorf usage and enable linting for this (#2182)
## Changes
- Fix incorrect use of Errorf on a literal string. This resulted in garbage
output in test diagnostics where % was replaced by "(MISSING)".
- Enable the linter on testingT.Errorf.

Note, the autofix by the linter is wrong: it proposes `t.Errorf("%s",
string)` but it should be `t.Error(string)`. That can be corrected manually
though.
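
For illustration, the problematic pattern and the fix look like this:

```go
package example_test

import "testing"

func TestErrorfMisuse(t *testing.T) {
	msg := "unexpected output: 100% difference"
	// Bad: msg is treated as a format string, so the "% d" inside it is
	// interpreted as a verb and the failure text contains "%!d(MISSING)".
	t.Errorf(msg)
	// Good: use Error for a plain message, or pass it through a format verb.
	t.Error(msg)
	t.Errorf("%s", msg)
}
```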

## Tests
Linter was tested manually by reverting the fix on Errorf.
2025-01-20 08:07:42 +00:00
Pieter Noordhuis 50f62692ce
Include a materialized copy of built-in templates (#2146)
## Changes

Include a materialized copy of built-in templates as reference output.

This updates the output comparison logic to work against an output
directory. The `doComparison` function now always works on real files.
It can now tell apart non-existing files and empty files (e.g., the
`.gitkeep` files in templates).
2025-01-17 15:03:59 +00:00
Pieter Noordhuis 0d5193a62c
Include help output for bundle commands in acceptance tests (#2178)
## Changes

This includes a change to the defaults for the output directory flags of
the "generate" commands. These defaults included the expanded working
directory. This can be omitted because it is implied.
2025-01-17 14:52:53 +00:00
Andrew Nester cff4f09cc8
Added text output templates for apps list and list-deployments (#2175)
## Changes
Added text output templates for apps list and list-deployments

Fixes #2172

## Tests
```
andrew.nester@HFW9Y94129 ~ % databricks apps list -p u2m
Name                          Url                                                            Compute Status  Deployment Status
abc                           https://abc-***.aws.databricksapps.com                         STOPPED
andre-test                    https://andre-test-***..aws.databricksapps.com                  ACTIVE          SUCCEEDED
andre-test2                   https://andre-test2-***..aws.databricksapps.com                 ACTIVE          SUCCEEDED
...
```
2025-01-17 14:42:44 +00:00
Andrew Nester 0c088d4050
Fixed an apps message order and added output test (#2174)
## Changes
Fixed an apps message order and added output test
2025-01-17 14:42:39 +00:00
Denis Bilenko 560c3d352e
Add test for passing --var twice for the same arg (#2176)
This shows that passing two --var for the same arg is rejected
currently.
2025-01-17 13:40:19 +00:00
Pieter Noordhuis 89eb556318
Migrate path translation tests to acceptance tests (#2122)
## Changes

The assertions on the output made are now captured in the `output.*`
files. These don't capture intent like actual assertions do, but we
still have regular test coverage in the path translation tests under
`bundle/config/mutator`.

## Tests

Tests pass.
2025-01-17 10:22:49 +00:00
Pieter Noordhuis 9061635789
Default to forward slash-separated paths for path translation (#2145)
## Changes

This came up in #2122 where relative library paths showed up with
backslashes on Windows. It's hard to run acceptance tests where paths
may be in either form. This change updates path translation logic to
always use forward slash-separated paths, including for absolute paths.
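
In Go terms this boils down to applying `filepath.ToSlash` consistently, e.g.:

```go
package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	// On Windows the native separator is "\"; ToSlash normalizes it so the
	// path written into bundle output is identical across platforms.
	fmt.Println(filepath.ToSlash(`C:\Users\me\dist\lib.whl`))
	// Prints "C:/Users/me/dist/lib.whl" on Windows (ToSlash is a no-op on Unix).
}
```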

## Tests

* Unit tests pass.
* Confirmed that code where library paths are used uses the `filepath`
package for path manipulation. The functions in this package always
normalize their inputs to be platform-native paths.
* Confirmed that code that uses absolute paths works with forward
slash-separated paths on Windows.
2025-01-17 09:38:01 +00:00
Pieter Noordhuis 2cd0d88bdd
Format Python code with ruff (#2166)
## Changes

The materialized templates included in #2146 include Python code that we
require to be formatted. Instead of running ruff as part of the
testcase, we can enforce that all Python code in the repository is
formatted. It won't be possible to have a passing acceptance test for
template initialization with unformatted code.
2025-01-17 07:38:47 +00:00
Andrew Nester 511c8887a8
[Release] Release v0.239.0 (#2167)
### New feature announcement

#### Databricks Apps support

You can now manage Databricks Apps using DABs by defining an `app`
resource in your bundle configuration.
For more information see Databricks documentation
https://docs.databricks.com/en/dev-tools/bundles/resources.html#app

#### Referencing complex variables in complex variables

You can now reference complex variables within other complex variables.
For more details see https://github.com/databricks/cli/pull/2157

CLI:
* Filter out system clusters in cluster picker
([#2131](https://github.com/databricks/cli/pull/2131)).
* Add command line flags for fields that are not in the API request body
([#2155](https://github.com/databricks/cli/pull/2155)).

Bundles:
* Added support for Databricks Apps in DABs
([#1928](https://github.com/databricks/cli/pull/1928)).
* Allow artifact path to be located outside the sync root
([#2128](https://github.com/databricks/cli/pull/2128)).
* Retry app deployment if there is an active deployment in progress
([#2153](https://github.com/databricks/cli/pull/2153)).
* Resolve variables in a loop
([#2164](https://github.com/databricks/cli/pull/2164)).
* Improve resolution of complex variables within complex variables
([#2157](https://github.com/databricks/cli/pull/2157)).
* Added output message to warn about slower deployments with apps
([#2161](https://github.com/databricks/cli/pull/2161)).
* Patch references to UC schemas to capture dependencies automatically
([#1989](https://github.com/databricks/cli/pull/1989)).
* Format default-python template
([#2110](https://github.com/databricks/cli/pull/2110)).
* Encourage the use of root_path in production to ensure single
deployment ([#1712](https://github.com/databricks/cli/pull/1712)).
* Log warnings to stderr for "bundle validate -o json"
([#2109](https://github.com/databricks/cli/pull/2109)).

API Changes:
* Changed `databricks account federation-policy update` command with new
required argument order.
* Changed `databricks account service-principal-federation-policy
update` command with new required argument order.

OpenAPI commit 779817ed8d63031f5ea761fbd25ee84f38feec0d (2025-01-08)
Dependency updates:
* Upgrade TF provider to 1.63.0
([#2162](https://github.com/databricks/cli/pull/2162)).
* Bump golangci-lint version to v1.63.4 from v1.63.1
([#2114](https://github.com/databricks/cli/pull/2114)).
* Bump astral-sh/setup-uv from 4 to 5
([#2116](https://github.com/databricks/cli/pull/2116)).
* Bump golang.org/x/oauth2 from 0.24.0 to 0.25.0
([#2080](https://github.com/databricks/cli/pull/2080)).
* Bump github.com/hashicorp/hc-install from 0.9.0 to 0.9.1
([#2079](https://github.com/databricks/cli/pull/2079)).
* Bump golang.org/x/term from 0.27.0 to 0.28.0
([#2078](https://github.com/databricks/cli/pull/2078)).
* Bump github.com/databricks/databricks-sdk-go from 0.54.0 to 0.55.0
([#2126](https://github.com/databricks/cli/pull/2126)).

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2025-01-16 16:25:17 +00:00
Denis Bilenko 2e70558dc1
Resolve variables in a loop (#2164)
## Changes
- Instead of doing 2 passes on variable resolution, loop until a pass makes
no more updates (or we hit a cap of 100 iterations); see the sketch below.
- Stacked on top of #2163 which is a regression test for this:
acceptance/bundle/variables/complex-transitive-deep
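
A sketch of the idea (the types and the single-pass helper are hypothetical stand-ins, not the actual mutator code):

```go
package resolve // illustrative package name

import "fmt"

// Config stands in for the bundle configuration being mutated.
type Config struct{}

// resolveAll applies one resolution pass repeatedly until a pass makes no
// substitutions, capped at 100 passes so reference cycles cannot loop forever.
func resolveAll(cfg *Config, resolveOnce func(*Config) (bool, error)) error {
	const maxPasses = 100
	for i := 0; i < maxPasses; i++ {
		changed, err := resolveOnce(cfg)
		if err != nil {
			return err
		}
		if !changed {
			return nil
		}
	}
	return fmt.Errorf("variable resolution did not converge after %d passes", maxPasses)
}
```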

## Tests
Existing tests, new regression tests.

These tests already passed before, added for completeness:
- acceptance/bundle/variables/cycle
- acceptance/bundle/variables/complex-cross-ref
2025-01-16 14:39:54 +00:00
shreyas-goenka f2bba632cb
Patch references to UC schemas to capture dependencies automatically (#1989)
## Changes
Fixes https://github.com/databricks/cli/issues/1977.  

This PR modifies the bundle configuration to capture the dependency that
a UC Volume or a DLT pipeline might have on a UC schema at deployment
time. It does so by replacing the schema name with a reference of the
form `${resources.schemas.foo.name}`.

For example:
The following UC Volume definition depends on the UC schema with the
name `schema_name`. This mutator converts this configuration

from:
```
resources:
  volumes:
    bar:
      catalog_name: catalog_name
      name: volume_name
      schema_name: schema_name

  schemas:
    foo:
      catalog_name: catalog_name
      name: schema_name
```

to:

```
resources:
  volumes:
    bar:
      catalog_name: catalog_name
      name: volume_name
      schema_name: ${resources.schemas.foo.name}

  schemas:
    foo:
      catalog_name: catalog_name
      name: schema_name
```


## Tests
Unit tests and manually.
2025-01-16 13:27:00 +00:00
Andrew Nester fa87f22706
Changed warning message for apps (#2165)
## Changes
Changed warning message for apps

Original warning message added here:
https://github.com/databricks/cli/pull/2161
2025-01-16 13:03:35 +00:00
Denis Bilenko bc1610f6e6
Add a test for complex variable resolution with 3 levels (#2163)
Follow up to #2157. That PR repeated variable resolution. This test
still does not resolve fully but would resolve with 3 passes. This is
slightly different from complex-transitive-deeper - this test does not
show any errors, the issue is purely not enough passes.
2025-01-16 12:14:00 +00:00
Andrew Nester 98244606b3
Upgrade TF provider to 1.63.0 (#2162)
## Changes
No significant changes to call out for DABs.
2025-01-16 12:04:00 +00:00
Andrew Nester 8f34fc7961
Added output message to warn about slower deployments with apps (#2161)
## Changes
When users create one or more Databricks apps in their bundle it can
lead to initial bundle deployment being slower because apps need to wait
until their compute is fully provisioned and started.

This PR adds a message to warn about it. This message will be removed
when `no_compute` option becomes available in TF provider and used in
DABs (https://github.com/databricks/cli/pull/2144)
2025-01-16 11:39:59 +00:00
Denis Bilenko b273dc5942
Enable linter 'copyloopvar' and fix the issues (#2160)
## Changes
- Remove all unnecessary copies of the loop variable; they are no longer
needed since Go 1.22 (https://go.dev/blog/loopvar-preview). See the example below.
- Enable the linter that catches this issue
https://github.com/karamaru-alpha/copyloopvar
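
The pattern being removed looks like this:

```go
package example_test

import "testing"

func TestCases(t *testing.T) {
	testCases := []struct{ name string }{{"a"}, {"b"}}
	for _, tc := range testCases {
		tc := tc // redundant since Go 1.22: each iteration already has its own tc; copyloopvar flags this line
		t.Run(tc.name, func(t *testing.T) {
			_ = tc.name
		})
	}
}
```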

## Tests
Existing tests.
2025-01-16 11:20:50 +00:00
Andrew Nester a002a24e41
Add a unique schema for recreate pipeline test (#2159)
## Changes
Pipeline backend requires `target` to be always specified.

In order to do this we will create a unique schema as part of the
`TestBundlePipelineRecreateWithoutAutoApprove` test, which will be used
in the pipelines.

## Tests
```
    helpers_test.go:148: stderr: Destroy complete!
--- PASS: TestBundlePipelineRecreateWithoutAutoApprove (415.39s)
PASS
coverage: [no statements]
ok      github.com/databricks/cli/integration/bundle    416.141s        coverage: [no statements]
```
2025-01-15 17:22:45 +00:00
Denis Bilenko 30dec59781
Improve resolution of complex variables within complex variables (#2157)
## Changes
- Remove ResolveVariableReferencesInComplexVariables - it blocked
complex-within-complex for no good reason.
- Repeat regular resolution twice; it helps with a couple of test cases we
have.

There may be a case for running it 3 times or more in a loop, but there
is no test case for that, so this PR is simple incremental improvement.

## Tests
Existing acceptance tests. Previously all unit tests for complex
variables were converted to acceptance tests, to capture this change and
ensure nothing breaks.
2025-01-15 18:03:43 +01:00
Denis Bilenko 39b03592d7
Migrate TestResolveComplexVariableWithVarReference (#2156)
This is the last test referencing
ResolveVariableReferencesInComplexVariables, allowing removal of that
mutator.
2025-01-15 17:52:17 +01:00
Denis Bilenko d53a78e926
Introduce $DATABRICKS_URL replacement in tests (#2158)
## Changes
It covers both https://$DATABRICKS_HOST and http://$DATABRICKS_HOST so
the test output does not change between local and the cloud.

## Tests
Existing tests using golden files (acceptance and integration) catch
this and were updated.
2025-01-15 17:24:12 +01:00
Andrew Nester 20179457b9
Process all the fields in top level request object even if it contains request body (#2155)
## Changes

When regenerating CLI with a new Go SDK
https://github.com/databricks/cli/pull/2126 I've noticed that some
parameters such as `no_compute` for apps are not added as flags for the
CLI commands.

This happened because we ignored all other top level fields if there's a
request body object field.

This PR relies on the new AllFields method from Genkit, which returns fields
from both the request object and the request body object.
2025-01-15 17:02:58 +01:00
Denis Bilenko 581565a1c4
Migrate more variable tests to acceptance (#2154) 2025-01-15 15:59:42 +01:00
Andrew Nester dd554412a6
Retry app deployment if there is an active deployment in progress (#2153)
## Changes
If, before running an app, the app was stopped with an active deployment,
then the Apps backend starts it and auto-deploys the last active
deployment. In such cases StartApp API won't return any active or
pending deployments in its response but doing the deploy immediately
after the start might result in the error `Cannot deploy app *** as
there is an active deployment in progress`.

From DABs side, we have to do a new deployment on every `bundle run`
(command which start the app and does deployment) because local files in
bundle might have been changed and users expect to have the app running
with new code.

Thus this PR works around the error by catching the “deployment in progress”
error, getting any active / pending deployments, waiting for them to
finish, and starting the new deployment again. If the second attempt fails, the
whole command fails.

## Tests
Added unit test.
2025-01-15 12:51:06 +01:00
Pieter Noordhuis 626045a17e
Use regular expressions for testdiff replacements (#2151)
## Changes

Replacement was split between the type `ReplacementContext` and the
`ReplaceOutput` function. The latter also ran a couple of regular
expressions. This change consolidates them such that it is up to the
caller to compose the set of replacements to use.

This change is required to accommodate UUID replacement in #2146.
2025-01-15 12:15:23 +01:00
Denis Bilenko 40e96b5af2
exec(test): Do not clear PATH; this breaks coverage on Windows (#2150)
## Changes
When setting up PATH in tests, put the desired entry first but keep the rest
as well. Otherwise it fails on Windows:

```
D:/a/cli/cli/libs/exec/exec_test.go:108
Error: Received unexpected error:
exit status 0xc0000135
```

Explanation from Claude:
> The error code 0xc0000135 is a Windows error indicating "Unable to
locate DLL"
> When code coverage is enabled, Go instruments the binary with coverage
tracking code, which requires additional DLL dependencies on Windows.
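
In the test this means prepending the desired directory rather than replacing PATH outright, roughly like so (binDir is a hypothetical name for the directory containing the binary under test):

```go
package exec_test

import (
	"os"
	"testing"
)

func setTestPath(t *testing.T, binDir string) {
	// Prepend the directory containing the binary under test, but keep the
	// original PATH so that, on Windows, the DLLs needed by the
	// coverage-instrumented binary can still be located.
	t.Setenv("PATH", binDir+string(os.PathListSeparator)+os.Getenv("PATH"))
}
```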

## Tests
Separate draft PR with coverage enabled on CI:
https://github.com/databricks/cli/pull/2141
2025-01-15 12:05:46 +01:00
Denis Bilenko 983a8a6633
Alias 'make' to 'make vendor fmt lint' (#2152)
The current default of building a binary is not frequently used.

The new default is useful after switching branches, stashing/unstashing,
AI changes.

Note "fmt" and "lint" are separate steps because "fmt" can fix
formatting and imports in presence of compilation errors and "lint"
cannot. Without compilation errors, "lint" also does formatting.
2025-01-15 11:41:06 +01:00
Denis Bilenko b76eee0e8c
Migrate resolution tests to acceptance tests (#2143) 2025-01-15 11:22:23 +01:00
Denis Bilenko 5592fa889e
acceptance: add -buildvcs=false when building on Windows (#2148)
I get an error on my local Windows otherwise: error obtaining VCS
status: exit status 128

On CI this does not make a difference.
2025-01-15 10:45:57 +01:00
Gleb Kanterov 25f8ee8d66
Format default-python template (#2110)
## Changes
Format code in default-python template, so it's already pre-formatted.

## Tests

```
$ databricks bundle init libs/template/templates/default-python
$ ruff format --diff my_project     
6 files already formatted
```
2025-01-15 10:40:29 +01:00
Denis Bilenko 55494a0bda
Add test about using variable in bundle.git.branch (#2118)
This test checks load git details functionality + variable interpolation
there.

The variables are not working there because LoadGitDetails mutator is
running before variable interpolation.

Additionally, correctly replace tmp path that is used for DATABRICKS_TF_EXEC_PATH
2025-01-15 10:34:51 +01:00
Denis Bilenko cc44e368b8
Golden files: always include JSON-ed string (cont) (#2147)
## Changes
Follow-up to #2142, which did not actually enable JSON conversion
because of a reversed condition on err.

## Tests
Tested via #2118
2025-01-15 09:46:06 +01:00
Denis Bilenko 6a7eefa54b
Use the same test names on Win as on other OSes (#2149)
## Changes
Use name format "TestAccept/bundle/variables/host" (previously slashes
were reversed on Windows).

## Tests
Manually run "go test ./acceptance -v -run
TestAccept/bundle/variables/host" in Windows VM.
2025-01-15 09:43:40 +01:00
Pieter Noordhuis 82e35530b0
Add acceptance tests for builtin templates (#2135)
## Changes

To accommodate:
* Add the server URL to the set of output replacements
* Include a call to the permissions API to the dummy server
* Run the main script in a subshell to isolate working directory changes
2025-01-14 18:23:34 +00:00
dependabot[bot] 72e677d0ac
Bump github.com/databricks/databricks-sdk-go from 0.54.0 to 0.55.0 (#2126)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.54.0 to 0.55.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.55.0</h2>
<h3>Internal Changes</h3>
<ul>
<li>Bump staticcheck to 0.5.1 and add go 1.23 test coverage (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1106">#1106</a>).</li>
<li>Bump x/net, x/crypto dependencies (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1107">#1107</a>).</li>
<li>Create custom codeql.yml (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1114">#1114</a>).</li>
<li>Decouple serving and oauth2 package (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1110">#1110</a>).</li>
<li>Migrate workflows that need write access to use hosted runners (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1112">#1112</a>).</li>
<li>Move package credentials in config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1115">#1115</a>).</li>
<li>Update Queries test (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1104">#1104</a>).</li>
</ul>
<h3>API Changes:</h3>
<ul>
<li>Added <code>NoCompute</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/apps#CreateAppRequest">apps.CreateAppRequest</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseJob">jobs.BaseJob</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun">jobs.BaseRun</a>.</li>
<li>Added <code>PageToken</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetJobRequest">jobs.GetJobRequest</a>.</li>
<li>Added <code>HasMore</code> and <code>NextPageToken</code> fields for
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Job">jobs.Job</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run">jobs.Run</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>.</li>
<li>Added <code>AuthorizationDetails</code> and <code>EndpointUrl</code>
fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DataPlaneInfo">serving.DataPlaneInfo</a>.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#AccountFederationPolicyAPI">a.AccountFederationPolicy</a>
account-level service with new required argument order.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#ServicePrincipalFederationPolicyAPI">a.ServicePrincipalFederationPolicy</a>
account-level service with new required argument order.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateAccountFederationPolicyRequest">oauth2.UpdateAccountFederationPolicyRequest</a>
to no longer be required.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateServicePrincipalFederationPolicyRequest">oauth2.UpdateServicePrincipalFederationPolicyRequest</a>
to no longer be required.</li>
<li>[Breaking] Changed <code>DaysOfWeek</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#RestartWindow">pipelines.RestartWindow</a>
to type <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#DayOfWeekList">pipelines.DayOfWeekList</a>.</li>
</ul>
<p>OpenAPI SHA: 779817ed8d63031f5ea761fbd25ee84f38feec0d, Date:
2025-01-08</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>[Release] Release v0.55.0</h2>
<h3>Internal Changes</h3>
<ul>
<li>Bump staticcheck to 0.5.1 and add go 1.23 test coverage (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1106">#1106</a>).</li>
<li>Bump x/net, x/crypto dependencies (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1107">#1107</a>).</li>
<li>Create custom codeql.yml (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1114">#1114</a>).</li>
<li>Decouple serving and oauth2 package (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1110">#1110</a>).</li>
<li>Migrate workflows that need write access to use hosted runners (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1112">#1112</a>).</li>
<li>Move package credentials in config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1115">#1115</a>).</li>
<li>Update Queries test (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1104">#1104</a>).</li>
</ul>
<h3>API Changes:</h3>
<ul>
<li>Added <code>NoCompute</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/apps#CreateAppRequest">apps.CreateAppRequest</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseJob">jobs.BaseJob</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun">jobs.BaseRun</a>.</li>
<li>Added <code>PageToken</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetJobRequest">jobs.GetJobRequest</a>.</li>
<li>Added <code>HasMore</code> and <code>NextPageToken</code> fields for
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Job">jobs.Job</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run">jobs.Run</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>.</li>
<li>Added <code>AuthorizationDetails</code> and <code>EndpointUrl</code>
fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DataPlaneInfo">serving.DataPlaneInfo</a>.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#AccountFederationPolicyAPI">a.AccountFederationPolicy</a>
account-level service with new required argument order.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#ServicePrincipalFederationPolicyAPI">a.ServicePrincipalFederationPolicy</a>
account-level service with new required argument order.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateAccountFederationPolicyRequest">oauth2.UpdateAccountFederationPolicyRequest</a>
to no longer be required.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateServicePrincipalFederationPolicyRequest">oauth2.UpdateServicePrincipalFederationPolicyRequest</a>
to no longer be required.</li>
<li>[Breaking] Changed <code>DaysOfWeek</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#RestartWindow">pipelines.RestartWindow</a>
to type <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#DayOfWeekList">pipelines.DayOfWeekList</a>.</li>
</ul>
<p>OpenAPI SHA: 779817ed8d63031f5ea761fbd25ee84f38feec0d, Date:
2025-01-08</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b83a7262d5"><code>b83a726</code></a>
[Release] Release v0.55.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1117">#1117</a>)</li>
<li><a
href="23d9c1ea2b"><code>23d9c1e</code></a>
[Internal] Move package credentials in config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1115">#1115</a>)</li>
<li><a
href="adc94cabf7"><code>adc94ca</code></a>
[Internal] Decouple serving and oauth2 package (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1110">#1110</a>)</li>
<li><a
href="83db3cbdab"><code>83db3cb</code></a>
[Internal] Create custom codeql.yml (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1114">#1114</a>)</li>
<li><a
href="2b55375727"><code>2b55375</code></a>
[Internal] Migrate workflows that need write access to use hosted
runners (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1">#1</a>...</li>
<li><a
href="03fb2681fa"><code>03fb268</code></a>
[Internal] Bump x/net, x/crypto dependencies (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1107">#1107</a>)</li>
<li><a
href="28e1a698ab"><code>28e1a69</code></a>
[Internal] Bump staticcheck to 0.5.1 and add go 1.23 test coverage (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1106">#1106</a>)</li>
<li><a
href="2399d721fe"><code>2399d72</code></a>
[Internal] Update Queries test (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1104">#1104</a>)</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.54.0...v0.55.0">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.54.0&new-version=0.55.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2025-01-14 16:02:34 +00:00
Denis Bilenko fca6abdfac
Golden files: always include JSON-ed string (#2142)
## Changes
Always include both the verbatim and the JSON-encoded version of a replacement.

This helps when the string in question contains \\ or other characters that
need to be quoted.

Needed for https://github.com/databricks/cli/pull/2118
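
As a rough illustration of what that means in practice (the helper and data layout below are made up, not the testdiff API), registering the JSON-encoded form next to the verbatim one lets the replacement still match when the value appears escaped inside JSON output:

```
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// addRepl records both the verbatim value and its JSON-escaped form, so a
// path like C:\Users\tester\build is also replaced when it appears as
// "C:\\Users\\tester\\build" inside JSON output.
func addRepl(repls *[][2]string, from, to string) {
	*repls = append(*repls, [2]string{from, to})
	fromJSON, _ := json.Marshal(from)
	toJSON, _ := json.Marshal(to)
	// Trim the surrounding quotes so the escaped form also matches as a substring.
	*repls = append(*repls, [2]string{
		strings.Trim(string(fromJSON), `"`),
		strings.Trim(string(toJSON), `"`),
	})
}

func main() {
	var repls [][2]string
	addRepl(&repls, `C:\Users\tester\build`, "[BUILD_DIR]")
	out := `{"path": "C:\\Users\\tester\\build\\cli.exe"}`
	for _, r := range repls {
		out = strings.ReplaceAll(out, r[0], r[1])
	}
	fmt.Println(out) // {"path": "[BUILD_DIR]\\cli.exe"}
}
```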

## Tests
Existing tests.
2025-01-14 15:50:55 +00:00
Denis Bilenko ccb2599b42
Add complex-transitive-deeper acceptance test (#2140)
Extension of the complex-transitive test that shows an error instead of
simply failing to interpolate.
2025-01-14 15:38:20 +00:00
Denis Bilenko a5e09ab28a
Coverage for acceptance tests (#2123)
## Changes

Add two new make commands:
- make acc-cover: runs acceptance tests and outputs
coverage-acceptance.txt
- make acc-showcover: shows coverage-acceptance.txt locally in the browser

Using the GOCOVERDIR functionality:
https://go.dev/blog/integration-test-coverage

This works, but a couple of issues came up:
- GOCOVERDIR does not play well with regular "go test -cover". Once this is
fixed, we can simplify the code and have 'make cover' output coverage
for everything at once. We can also probably get rid of CLI_GOCOVERDIR.
https://github.com/golang/go/issues/66225
- When running tests in parallel against the same directory there is a rare
conflict when writing the covmeta file. For this reason each test writes
coverage to its own directory, which is then merged by 'make acc-cover'.
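
A minimal sketch of that flow, with illustrative paths and a made-up test name: build an instrumented binary with -cover, give each invocation its own GOCOVERDIR, then merge the per-test directories the way 'make acc-cover' does.

```
package main

import (
	"os"
	"os/exec"
	"path/filepath"
)

func main() {
	// 1. Build an instrumented CLI binary (output path is illustrative).
	build := exec.Command("go", "build", "-cover", "-o", "build/databricks", ".")
	build.Stdout, build.Stderr = os.Stdout, os.Stderr
	if err := build.Run(); err != nil {
		panic(err)
	}

	// 2. Run it with a per-test GOCOVERDIR so parallel runs never write the
	//    same covmeta file.
	coverDir := filepath.Join("build", "cover", "some-test")
	if err := os.MkdirAll(coverDir, 0o755); err != nil {
		panic(err)
	}
	run := exec.Command("build/databricks", "--version")
	run.Env = append(os.Environ(), "GOCOVERDIR="+coverDir)
	run.Stdout, run.Stderr = os.Stdout, os.Stderr
	if err := run.Run(); err != nil {
		panic(err)
	}

	// 3. Merge and convert the per-test directories (what 'make acc-cover' does):
	//    go tool covdata merge -i build/cover/* -o build/cover-merged
	//    go tool covdata textfmt -i build/cover-merged -o coverage-acceptance.txt
}
```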

## Tests
Manually running the new make commands.
2025-01-14 14:19:00 +00:00
Denis Bilenko 2ae2b7e8c8
Enable acceptance tests for manually running against the cloud (#2120)
## Changes
- If the CLOUD_ENV variable is set, acceptance tests no longer set up a local server
or override the DATABRICKS_HOST/DATABRICKS_TOKEN/HOME env vars (see the sketch below).
- Updated the replacements logic in testdiff to use the tester /
tester@databricks.com convention.
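
A minimal sketch of the switch, using an assumed helper name (the real harness wires this up inside the acceptance test setup):

```
package acceptance_test

import (
	"os"
	"testing"
)

// configureAuth is a hypothetical helper sketching the CLOUD_ENV switch: with
// CLOUD_ENV empty, point the CLI at a local stub server with a fake token and
// an isolated HOME; with CLOUD_ENV set, leave the caller's real credentials alone.
func configureAuth(t *testing.T, localServerURL string) {
	if os.Getenv("CLOUD_ENV") != "" {
		return // cloud run: keep DATABRICKS_HOST/DATABRICKS_TOKEN/HOME as provided
	}
	t.Setenv("DATABRICKS_HOST", localServerURL)
	t.Setenv("DATABRICKS_TOKEN", "dapi1234")
	t.Setenv("HOME", t.TempDir()) // do not read the user's ~/.databrickscfg
}
```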

## Tests
Manually running the current acceptance tests against dogfood on my laptop,
all tests pass except for 2 failures.

```
    --- FAIL: TestAccept/bundle/variables/env_overrides (0.09s)
    --- FAIL: TestAccept/bundle/variables/resolve-builtin (1.30s)
```
2025-01-14 13:50:28 +00:00
Andrew Nester fe31e4d02e
Fixed a typo in TestDeployBundleWithApp test (#2138)
## Changes
Fixed a typo in TestDeployBundleWithApp test

## Tests
```
   helpers_test.go:148: stderr: Destroy complete!
--- PASS: TestDeployBundleWithApp (647.51s)
PASS
coverage: [no statements]
ok      github.com/databricks/cli/integration/bundle    647.985s        coverage: [no statements]
```
2025-01-14 13:24:22 +00:00
Denis Bilenko 98a1e73a0f
Simplify replacements logic for golden files (#2132)
## Changes
- Do not sort, use fixed order of replacements.
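
A small sketch of why a fixed, registration-order application is preferable (paths are made up): replacements run one after another, so registering the longer, more specific string first keeps it from being partially rewritten by its parent directory's replacement, which a sorted order does not guarantee.

```
package main

import (
	"fmt"
	"strings"
)

// applyInOrder walks the replacement list in registration order; no sorting.
func applyInOrder(s string, repls [][2]string) string {
	for _, r := range repls {
		s = strings.ReplaceAll(s, r[0], r[1])
	}
	return s
}

func main() {
	repls := [][2]string{
		{"/tmp/test123/build/databricks", "[CLI]"}, // more specific entry registered first
		{"/tmp/test123", "[TMPDIR]"},
	}
	out := applyInOrder("ran /tmp/test123/build/databricks in /tmp/test123", repls)
	fmt.Println(out) // ran [CLI] in [TMPDIR]
}
```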

## Tests
Existing tests.
2025-01-14 11:00:38 +00:00
Denis Bilenko 2b452973f3
Enable linter 'unconvert' and fix the issues found (#2136) 2025-01-14 10:56:38 +00:00
Pieter Noordhuis 5d9bc3b553
Allow artifact path to be located outside the sync root (#2128)
## Changes

We perform a check during path translation that the path being
referenced is contained in the bundle's sync root. If it isn't, it's not
a valid remote reference. However, this doesn't apply to paths that are
_always_ local, such as the artifact path. An artifact's build command
is executed in its path. Files created by the artifact build (e.g.
wheels or JARs) don't need to be in the sync root because they have a
dedicated and different upload path into `${workspace.artifact_path}`.

Therefore, this check that a path is contained in the bundle's sync root
doesn't apply to artifact paths. This change modifies the structure of
path translation to allow opting out of this check.

Fixes #1927.
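
A rough sketch of the relaxed check under these assumptions (helper names are illustrative, not the bundle's actual code):

```
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// underRoot reports whether path sits inside root (both assumed absolute).
func underRoot(root, path string) bool {
	rel, err := filepath.Rel(root, path)
	return err == nil && rel != ".." && !strings.HasPrefix(rel, ".."+string(filepath.Separator))
}

// validatePath sketches the relaxed check: always-local paths (like an
// artifact's build path) may escape the sync root; remote references may not.
func validatePath(root, path string, alwaysLocal bool) error {
	if alwaysLocal {
		return nil // artifact outputs are uploaded via ${workspace.artifact_path} instead
	}
	if !underRoot(root, path) {
		return fmt.Errorf("path %q is not contained in the sync root %q", path, root)
	}
	return nil
}

func main() {
	fmt.Println(validatePath("/repo/bundle", "/repo/libs/wheel", true))  // <nil>
	fmt.Println(validatePath("/repo/bundle", "/repo/libs/wheel", false)) // error
}
```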

## Tests

* Existing and new tests pass.
* Manually confirmed that building and using a wheel built outside the
sync root path works as expected.
* No acceptance tests because we don't run build as part of validate.
2025-01-14 08:34:55 +00:00
Denis Bilenko e682eeba80
Pin all github actions to commit hash (#2129)
## Changes
- Pin all github actions to commit hash.
- Modify vedantmgoyal2009/winget-releaser to use a tag format that
dependabot can understand.

Pinning is done by
https://github.com/databricks/cli/blob/denik/pin-actions-script/pin_actions.py
(100% chatgpt authored). Commits and tags are verified manually.

This format should be recognized by dependabot enabled in
https://github.com/databricks/cli/pull/2112

## Tests
Existing tests.
2025-01-14 07:39:34 +00:00
Pieter Noordhuis e1f5f60a8d
Filter out system clusters in cluster picker (#2131)
## Changes

As of the clusters API v2.1 the results include system clusters. On
large workspaces this can lead to long load times and include many
irrelevant results. The cluster picker should only show interactive
clusters.

Also see #1754.
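
A toy sketch of the filtering idea; the type and the source values are stand-ins rather than the SDK's actual names:

```
package main

import "fmt"

// cluster is a stand-in for the SDK's cluster listing type; the Source values
// here ("UI", "API", "JOB", ...) are illustrative.
type cluster struct {
	Name   string
	Source string
}

// interactiveOnly keeps clusters created by a person (UI/API) and drops
// job- and system-created ones before they reach the picker.
func interactiveOnly(all []cluster) []cluster {
	var out []cluster
	for _, c := range all {
		if c.Source == "UI" || c.Source == "API" {
			out = append(out, c)
		}
	}
	return out
}

func main() {
	all := []cluster{
		{Name: "shared-dev", Source: "UI"},
		{Name: "job-123-run", Source: "JOB"},
	}
	fmt.Println(interactiveOnly(all)) // [{shared-dev UI}]
}
```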

## Tests

Manually confirmed the picker runs fast on a large workspace.
2025-01-14 07:38:28 +00:00
637 changed files with 24999 additions and 3454 deletions

View File

@ -1 +1 @@
a6a317df8327c9b1e5cb59a03a42ffa2aabeef6d
c72c58f97b950fcb924a90ef164bcb10cfcd5ece

View File

@ -109,16 +109,19 @@ var {{.CamelName}}Overrides []func(
{{- end }}
)
{{- $excludeFromJson := list "http-request"}}
func new{{.PascalName}}() *cobra.Command {
cmd := &cobra.Command{}
{{- $canUseJson := and .CanUseJson (not (in $excludeFromJson .KebabName )) -}}
{{- if .Request}}
var {{.CamelName}}Req {{.Service.Package.Name}}.{{.Request.PascalName}}
{{- if .RequestBodyField }}
{{.CamelName}}Req.{{.RequestBodyField.PascalName}} = &{{.Service.Package.Name}}.{{.RequestBodyField.Entity.PascalName}}{}
{{- end }}
{{- if .CanUseJson}}
{{- if $canUseJson}}
var {{.CamelName}}Json flags.JsonFlag
{{- end}}
{{- end}}
@ -135,14 +138,14 @@ func new{{.PascalName}}() *cobra.Command {
{{- $request = .RequestBodyField.Entity -}}
{{- end -}}
{{if $request }}// TODO: short flags
{{- if .CanUseJson}}
{{- if $canUseJson}}
cmd.Flags().Var(&{{.CamelName}}Json, "json", `either inline JSON string or @path/to/file.json with request body`)
{{- end}}
{{$method := .}}
{{ if not .IsJsonOnly }}
{{range $request.Fields -}}
{{range .AllFields -}}
{{- if not .Required -}}
{{if .Entity.IsObject }}// TODO: complex arg: {{.Name}}
{{if .Entity.IsObject}}{{if not (eq . $method.RequestBodyField) }}// TODO: complex arg: {{.Name}}{{end}}
{{else if .Entity.IsAny }}// TODO: any: {{.Name}}
{{else if .Entity.ArrayValue }}// TODO: array: {{.Name}}
{{else if .Entity.MapValue }}// TODO: map via StringToStringVar: {{.Name}}
@ -177,7 +180,7 @@ func new{{.PascalName}}() *cobra.Command {
{{- $hasRequiredArgs := and (not $hasIdPrompt) $hasPosArgs -}}
{{- $hasSingleRequiredRequestBodyFieldWithPrompt := and (and $hasIdPrompt $request) (eq 1 (len $request.RequiredRequestBodyFields)) -}}
{{- $onlyPathArgsRequiredAsPositionalArguments := and $request (eq (len .RequiredPositionalArguments) (len $request.RequiredPathFields)) -}}
{{- $hasDifferentArgsWithJsonFlag := and (not $onlyPathArgsRequiredAsPositionalArguments) (and .CanUseJson (or $request.HasRequiredRequestBodyFields )) -}}
{{- $hasDifferentArgsWithJsonFlag := and (not $onlyPathArgsRequiredAsPositionalArguments) (and $canUseJson (or $request.HasRequiredRequestBodyFields )) -}}
{{- $hasCustomArgHandler := or $hasRequiredArgs $hasDifferentArgsWithJsonFlag -}}
{{- $atleastOneArgumentWithDescription := false -}}
@ -239,7 +242,7 @@ func new{{.PascalName}}() *cobra.Command {
ctx := cmd.Context()
{{if .Service.IsAccounts}}a := root.AccountClient(ctx){{else}}w := root.WorkspaceClient(ctx){{end}}
{{- if .Request }}
{{ if .CanUseJson }}
{{ if $canUseJson }}
if cmd.Flags().Changed("json") {
diags := {{.CamelName}}Json.Unmarshal(&{{.CamelName}}Req{{ if .RequestBodyField }}.{{.RequestBodyField.PascalName}}{{ end }})
if diags.HasError() {
@ -255,7 +258,7 @@ func new{{.PascalName}}() *cobra.Command {
return fmt.Errorf("please provide command input in JSON format by specifying the --json flag")
}{{- end}}
{{- if $hasPosArgs }}
{{- if and .CanUseJson $hasSingleRequiredRequestBodyFieldWithPrompt }} else {
{{- if and $canUseJson $hasSingleRequiredRequestBodyFieldWithPrompt }} else {
{{- end}}
{{- if $hasIdPrompt}}
if len(args) == 0 {
@ -279,9 +282,9 @@ func new{{.PascalName}}() *cobra.Command {
{{$method := .}}
{{- range $arg, $field := .RequiredPositionalArguments}}
{{- template "args-scan" (dict "Arg" $arg "Field" $field "Method" $method "HasIdPrompt" $hasIdPrompt)}}
{{- template "args-scan" (dict "Arg" $arg "Field" $field "Method" $method "HasIdPrompt" $hasIdPrompt "ExcludeFromJson" $excludeFromJson)}}
{{- end -}}
{{- if and .CanUseJson $hasSingleRequiredRequestBodyFieldWithPrompt }}
{{- if and $canUseJson $hasSingleRequiredRequestBodyFieldWithPrompt }}
}
{{- end}}
@ -392,7 +395,8 @@ func new{{.PascalName}}() *cobra.Command {
{{- $method := .Method -}}
{{- $arg := .Arg -}}
{{- $hasIdPrompt := .HasIdPrompt -}}
{{- $optionalIfJsonIsUsed := and (not $hasIdPrompt) (and $field.IsRequestBodyField $method.CanUseJson) }}
{{ $canUseJson := and $method.CanUseJson (not (in .ExcludeFromJson $method.KebabName)) }}
{{- $optionalIfJsonIsUsed := and (not $hasIdPrompt) (and $field.IsRequestBodyField $canUseJson) }}
{{- if $optionalIfJsonIsUsed }}
if !cmd.Flags().Changed("json") {
{{- end }}

6
.gitattributes vendored
View File

@ -1,11 +1,13 @@
cmd/account/access-control/access-control.go linguist-generated=true
cmd/account/billable-usage/billable-usage.go linguist-generated=true
cmd/account/budget-policy/budget-policy.go linguist-generated=true
cmd/account/budgets/budgets.go linguist-generated=true
cmd/account/cmd.go linguist-generated=true
cmd/account/credentials/credentials.go linguist-generated=true
cmd/account/csp-enablement-account/csp-enablement-account.go linguist-generated=true
cmd/account/custom-app-integration/custom-app-integration.go linguist-generated=true
cmd/account/disable-legacy-features/disable-legacy-features.go linguist-generated=true
cmd/account/enable-ip-access-lists/enable-ip-access-lists.go linguist-generated=true
cmd/account/encryption-keys/encryption-keys.go linguist-generated=true
cmd/account/esm-enablement-account/esm-enablement-account.go linguist-generated=true
cmd/account/federation-policy/federation-policy.go linguist-generated=true
@ -31,6 +33,7 @@ cmd/account/users/users.go linguist-generated=true
cmd/account/vpc-endpoints/vpc-endpoints.go linguist-generated=true
cmd/account/workspace-assignment/workspace-assignment.go linguist-generated=true
cmd/account/workspaces/workspaces.go linguist-generated=true
cmd/workspace/access-control/access-control.go linguist-generated=true
cmd/workspace/aibi-dashboard-embedding-access-policy/aibi-dashboard-embedding-access-policy.go linguist-generated=true
cmd/workspace/aibi-dashboard-embedding-approved-domains/aibi-dashboard-embedding-approved-domains.go linguist-generated=true
cmd/workspace/alerts-legacy/alerts-legacy.go linguist-generated=true
@ -74,6 +77,7 @@ cmd/workspace/instance-pools/instance-pools.go linguist-generated=true
cmd/workspace/instance-profiles/instance-profiles.go linguist-generated=true
cmd/workspace/ip-access-lists/ip-access-lists.go linguist-generated=true
cmd/workspace/jobs/jobs.go linguist-generated=true
cmd/workspace/lakeview-embedded/lakeview-embedded.go linguist-generated=true
cmd/workspace/lakeview/lakeview.go linguist-generated=true
cmd/workspace/libraries/libraries.go linguist-generated=true
cmd/workspace/metastores/metastores.go linguist-generated=true
@ -98,11 +102,13 @@ cmd/workspace/providers/providers.go linguist-generated=true
cmd/workspace/quality-monitors/quality-monitors.go linguist-generated=true
cmd/workspace/queries-legacy/queries-legacy.go linguist-generated=true
cmd/workspace/queries/queries.go linguist-generated=true
cmd/workspace/query-execution/query-execution.go linguist-generated=true
cmd/workspace/query-history/query-history.go linguist-generated=true
cmd/workspace/query-visualizations-legacy/query-visualizations-legacy.go linguist-generated=true
cmd/workspace/query-visualizations/query-visualizations.go linguist-generated=true
cmd/workspace/recipient-activation/recipient-activation.go linguist-generated=true
cmd/workspace/recipients/recipients.go linguist-generated=true
cmd/workspace/redash-config/redash-config.go linguist-generated=true
cmd/workspace/registered-models/registered-models.go linguist-generated=true
cmd/workspace/repos/repos.go linguist-generated=true
cmd/workspace/resource-quotas/resource-quotas.go linguist-generated=true

1
.github/CODEOWNERS vendored
View File

@ -1 +1,2 @@
* @pietern @andrewnester @shreyas-goenka @denik
cmd/labs @alexott @nfx

View File

@ -18,7 +18,7 @@ jobs:
pull-requests: write
steps:
- uses: actions/stale@v9
- uses: actions/stale@5bef64f19d7facfb25b37b414482c7164d639639 # v9.1.0
with:
stale-issue-message: This issue has not received a response in a while. If you want to keep this issue open, please leave a comment below and auto-close will be canceled.
stale-pr-message: This PR has not received an update in a while. If you want to keep this PR open, please leave a comment below or push a new commit and auto-close will be canceled.

View File

@ -25,7 +25,7 @@ jobs:
if: "${{ github.event.pull_request.head.repo.fork }}"
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Delete old comments
env:

View File

@ -20,7 +20,7 @@ jobs:
steps:
- name: Generate GitHub App Token
id: generate-token
uses: actions/create-github-app-token@v1
uses: actions/create-github-app-token@136412a57a7081aa63c935a2cc2918f76c34f514 # v1.11.2
with:
app-id: ${{ secrets.DECO_WORKFLOW_TRIGGER_APP_ID }}
private-key: ${{ secrets.DECO_WORKFLOW_TRIGGER_PRIVATE_KEY }}

View File

@ -23,7 +23,7 @@ jobs:
steps:
- name: Generate GitHub App Token
id: generate-token
uses: actions/create-github-app-token@v1
uses: actions/create-github-app-token@136412a57a7081aa63c935a2cc2918f76c34f514 # v1.11.2
with:
app-id: ${{ secrets.DECO_WORKFLOW_TRIGGER_APP_ID }}
private-key: ${{ secrets.DECO_WORKFLOW_TRIGGER_PRIVATE_KEY }}

View File

@ -10,19 +10,65 @@ on:
jobs:
publish-to-winget-pkgs:
runs-on:
group: databricks-protected-runner-group
labels: windows-server-latest
group: databricks-deco-testing-runner-group
labels: ubuntu-latest-deco
environment: release
steps:
- uses: vedantmgoyal2009/winget-releaser@93fd8b606a1672ec3e5c6c3bb19426be68d1a8b0 # https://github.com/vedantmgoyal2009/winget-releaser/releases/tag/v2
with:
identifier: Databricks.DatabricksCLI
installers-regex: 'windows_.*-signed\.zip$' # Only signed Windows releases
token: ${{ secrets.ENG_DEV_ECOSYSTEM_BOT_TOKEN }}
fork-user: eng-dev-ecosystem-bot
- name: Checkout repository and submodules
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
# Use the tag from the input, or the ref name if the input is not provided.
# The ref name is equal to the tag name when this workflow is triggered by the "sign-cli" command.
release-tag: ${{ inputs.tag || github.ref_name }}
# When updating the version of komac, make sure to update the checksum in the next step.
# Find both at https://github.com/russellbanks/Komac/releases.
- name: Download komac binary
run: |
curl -s -L -o $RUNNER_TEMP/komac-2.9.0-x86_64-unknown-linux-gnu.tar.gz https://github.com/russellbanks/Komac/releases/download/v2.9.0/komac-2.9.0-x86_64-unknown-linux-gnu.tar.gz
- name: Verify komac binary
run: |
echo "d07a12831ad5418fee715488542a98ce3c0e591d05c850dd149fe78432be8c4c $RUNNER_TEMP/komac-2.9.0-x86_64-unknown-linux-gnu.tar.gz" | sha256sum -c -
- name: Untar komac binary to temporary path
run: |
mkdir -p $RUNNER_TEMP/komac
tar -xzf $RUNNER_TEMP/komac-2.9.0-x86_64-unknown-linux-gnu.tar.gz -C $RUNNER_TEMP/komac
- name: Add komac to PATH
run: echo "$RUNNER_TEMP/komac" >> $GITHUB_PATH
- name: Confirm komac version
run: komac --version
# Use the tag from the input, or the ref name if the input is not provided.
# The ref name is equal to the tag name when this workflow is triggered by the "sign-cli" command.
- name: Strip "v" prefix from version
id: strip_version
run: echo "version=$(echo ${{ inputs.tag || github.ref_name }} | sed 's/^v//')" >> "$GITHUB_OUTPUT"
- name: Get URLs of signed Windows binaries
id: get_windows_urls
run: |
urls=$(
gh api https://api.github.com/repos/databricks/cli/releases/tags/${{ inputs.tag || github.ref_name }} | \
jq -r .assets[].browser_download_url | \
grep -E '_windows_.*-signed\.zip$' | \
tr '\n' ' '
)
if [ -z "$urls" ]; then
echo "No signed Windows binaries found" >&2
exit 1
fi
echo "urls=$urls" >> "$GITHUB_OUTPUT"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Publish to Winget
run: |
komac update Databricks.DatabricksCLI \
--version ${{ steps.strip_version.outputs.version }} \
--submit \
--urls ${{ steps.get_windows_urls.outputs.urls }} \
env:
KOMAC_FORK_OWNER: eng-dev-ecosystem-bot
GITHUB_TOKEN: ${{ secrets.ENG_DEV_ECOSYSTEM_BOT_TOKEN }}

View File

@ -45,20 +45,20 @@ jobs:
steps:
- name: Checkout repository and submodules
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Setup Go
uses: actions/setup-go@v5
uses: actions/setup-go@3041bf56c941b39c61721a86cd11f3bb1338122a # v5.2.0
with:
go-version: 1.23.4
go-version-file: go.mod
- name: Setup Python
uses: actions/setup-python@v5
uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b # v5.3.0
with:
python-version: '3.9'
- name: Install uv
uses: astral-sh/setup-uv@v5
uses: astral-sh/setup-uv@887a942a15af3a7626099df99e897a18d9e5ab3a # v5.1.0
- name: Set go env
run: |
@ -71,18 +71,18 @@ jobs:
make vendor
pip3 install wheel
- name: Run tests
run: make test
- name: Run tests with coverage
run: make cover
golangci:
linters:
needs: cleanups
name: lint
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-go@v5
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/setup-go@3041bf56c941b39c61721a86cd11f3bb1338122a # v5.2.0
with:
go-version: 1.23.4
go-version-file: go.mod
# Use different schema from regular job, to avoid overwriting the same key
cache-dependency-path: |
go.sum
@ -95,10 +95,15 @@ jobs:
# Exit with status code 1 if there are differences (i.e. unformatted files)
git diff --exit-code
- name: golangci-lint
uses: golangci/golangci-lint-action@v6
uses: golangci/golangci-lint-action@ec5d18412c0aeab7936cb16880d708ba2a64e1ae # v6.2.0
with:
version: v1.63.4
args: --timeout=15m
- name: Run ruff
uses: astral-sh/ruff-action@f14634c415d3e63ffd4d550a22f037df4c734a60 # v3.1.0
with:
version: "0.9.1"
args: "format --check"
validate-bundle-schema:
needs: cleanups
@ -106,12 +111,12 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Setup Go
uses: actions/setup-go@v5
uses: actions/setup-go@3041bf56c941b39c61721a86cd11f3bb1338122a # v5.2.0
with:
go-version: 1.23.4
go-version-file: go.mod
# Use different schema from regular job, to avoid overwriting the same key
cache-dependency-path: |
go.sum

View File

@ -26,15 +26,15 @@ jobs:
steps:
- name: Checkout repository and submodules
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
fetch-tags: true
- name: Setup Go
uses: actions/setup-go@v5
uses: actions/setup-go@3041bf56c941b39c61721a86cd11f3bb1338122a # v5.2.0
with:
go-version: 1.23.4
go-version-file: go.mod
# The default cache key for this action considers only the `go.sum` file.
# We include .goreleaser.yaml here to differentiate from the cache used by the push action
@ -48,27 +48,27 @@ jobs:
- name: Run GoReleaser
id: releaser
uses: goreleaser/goreleaser-action@v6
uses: goreleaser/goreleaser-action@9ed2f89a662bf1735a48bc8557fd212fa902bebf # v6.1.0
with:
version: ~> v2
args: release --snapshot --skip docker
- name: Upload macOS binaries
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: cli_darwin_snapshot
path: |
dist/*_darwin_*/
- name: Upload Linux binaries
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: cli_linux_snapshot
path: |
dist/*_linux_*/
- name: Upload Windows binaries
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: cli_windows_snapshot
path: |
@ -88,7 +88,7 @@ jobs:
# Snapshot release may only be updated for commits to the main branch.
if: github.ref == 'refs/heads/main'
uses: softprops/action-gh-release@v1
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
name: Snapshot
prerelease: true

View File

@ -18,15 +18,15 @@ jobs:
steps:
- name: Checkout repository and submodules
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
fetch-tags: true
- name: Setup Go
uses: actions/setup-go@v5
uses: actions/setup-go@3041bf56c941b39c61721a86cd11f3bb1338122a # v5.2.0
with:
go-version: 1.23.4
go-version-file: go.mod
# The default cache key for this action considers only the `go.sum` file.
# We include .goreleaser.yaml here to differentiate from the cache used by the push action
@ -37,7 +37,7 @@ jobs:
# Log into the GitHub Container Registry. The goreleaser action will create
# the docker images and push them to the GitHub Container Registry.
- uses: "docker/login-action@v3"
- uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567 # v3.3.0
with:
registry: "ghcr.io"
username: "${{ github.actor }}"
@ -46,11 +46,11 @@ jobs:
# QEMU is required to build cross platform docker images using buildx.
# It allows virtualization of the CPU architecture at the application level.
- name: Set up QEMU dependency
uses: docker/setup-qemu-action@v3
uses: docker/setup-qemu-action@53851d14592bedcffcf25ea515637cff71ef929a # v3.3.0
- name: Run GoReleaser
id: releaser
uses: goreleaser/goreleaser-action@v6
uses: goreleaser/goreleaser-action@9ed2f89a662bf1735a48bc8557fd212fa902bebf # v6.1.0
with:
version: ~> v2
args: release
@ -71,7 +71,7 @@ jobs:
echo "VERSION=${VERSION:1}" >> $GITHUB_ENV
- name: Update setup-cli
uses: actions/github-script@v7
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
with:
github-token: ${{ secrets.DECO_GITHUB_TOKEN }}
script: |
@ -99,7 +99,7 @@ jobs:
echo "VERSION=${VERSION:1}" >> $GITHUB_ENV
- name: Update homebrew-tap
uses: actions/github-script@v7
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
with:
github-token: ${{ secrets.DECO_GITHUB_TOKEN }}
script: |
@ -140,7 +140,7 @@ jobs:
echo "VERSION=${VERSION:1}" >> $GITHUB_ENV
- name: Update CLI version in the VSCode extension
uses: actions/github-script@v7
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
with:
github-token: ${{ secrets.DECO_GITHUB_TOKEN }}
script: |

6
.gitignore vendored
View File

@ -20,14 +20,12 @@ dist/
*.log
coverage.txt
coverage-acceptance.txt
__pycache__
*.pyc
.terraform
.terraform.lock.hcl
.vscode/launch.json
.vscode/tasks.json
.databricks
.ruff_cache

View File

@ -15,12 +15,20 @@ linters:
- intrange
- mirror
- perfsprint
- unconvert
linters-settings:
govet:
enable-all: true
disable:
- fieldalignment
- shadow
settings:
printf:
funcs:
- (github.com/databricks/cli/internal/testutil.TestingT).Infof
- (github.com/databricks/cli/internal/testutil.TestingT).Errorf
- (github.com/databricks/cli/internal/testutil.TestingT).Fatalf
- (github.com/databricks/cli/internal/testutil.TestingT).Skipf
gofmt:
rewrite-rules:
- pattern: 'a[b:len(a)]'
@ -41,6 +49,8 @@ linters-settings:
disable:
# good check, but we have too many assert.(No)?Errorf? so excluding for now
- require-error
copyloopvar:
check-alias: true
issues:
exclude-dirs-use-default: false # recommended by docs https://golangci-lint.run/usage/false-positives/
max-issues-per-linter: 1000

View File

@ -1,5 +1,89 @@
# Version changelog
## [Release] Release v0.240.0
Bundles:
* Added support for double underscore variable references ([#2203](https://github.com/databricks/cli/pull/2203)).
* Do not wait for app compute to start on `bundle deploy` ([#2144](https://github.com/databricks/cli/pull/2144)).
* Remove bundle.git.inferred ([#2258](https://github.com/databricks/cli/pull/2258)).
* libs/python: Remove DetectInterpreters ([#2234](https://github.com/databricks/cli/pull/2234)).
API Changes:
* Added `databricks access-control` command group.
* Added `databricks serving-endpoints http-request` command.
* Changed `databricks serving-endpoints create` command with new required argument order.
* Changed `databricks serving-endpoints get-open-api` command return type to become non-empty.
* Changed `databricks recipients update` command return type to become non-empty.
OpenAPI commit 0be1b914249781b5e903b7676fd02255755bc851 (2025-01-22)
Dependency updates:
* Bump github.com/databricks/databricks-sdk-go from 0.55.0 to 0.56.1 ([#2238](https://github.com/databricks/cli/pull/2238)).
* Upgrade TF provider to 1.64.1 ([#2247](https://github.com/databricks/cli/pull/2247)).
## [Release] Release v0.239.1
CLI:
* Added text output templates for apps list and list-deployments ([#2175](https://github.com/databricks/cli/pull/2175)).
* Fix duplicate "apps" entry in help output ([#2191](https://github.com/databricks/cli/pull/2191)).
Bundles:
* Allow yaml-anchors in schema ([#2200](https://github.com/databricks/cli/pull/2200)).
* Show an error when non-yaml files used in include section ([#2201](https://github.com/databricks/cli/pull/2201)).
* Set WorktreeRoot to sync root outside git repo ([#2197](https://github.com/databricks/cli/pull/2197)).
* fix: Detailed message for using source-linked deployment with file_path specified ([#2119](https://github.com/databricks/cli/pull/2119)).
* Allow using variables in enum fields ([#2199](https://github.com/databricks/cli/pull/2199)).
* Add experimental-jobs-as-code template ([#2177](https://github.com/databricks/cli/pull/2177)).
* Reading variables from file ([#2171](https://github.com/databricks/cli/pull/2171)).
* Fixed an apps message order and added output test ([#2174](https://github.com/databricks/cli/pull/2174)).
* Default to forward slash-separated paths for path translation ([#2145](https://github.com/databricks/cli/pull/2145)).
* Include a materialized copy of built-in templates ([#2146](https://github.com/databricks/cli/pull/2146)).
## [Release] Release v0.239.0
### New feature announcement
#### Databricks Apps support
You can now manage Databricks Apps using DABs by defining an `app` resource in your bundle configuration.
For more information see Databricks documentation https://docs.databricks.com/en/dev-tools/bundles/resources.html#app
#### Referencing complex variables in complex variables
You can now reference complex variables within other complex variables.
For more details see https://github.com/databricks/cli/pull/2157
CLI:
* Filter out system clusters in cluster picker ([#2131](https://github.com/databricks/cli/pull/2131)).
* Add command line flags for fields that are not in the API request body ([#2155](https://github.com/databricks/cli/pull/2155)).
Bundles:
* Added support for Databricks Apps in DABs ([#1928](https://github.com/databricks/cli/pull/1928)).
* Allow artifact path to be located outside the sync root ([#2128](https://github.com/databricks/cli/pull/2128)).
* Retry app deployment if there is an active deployment in progress ([#2153](https://github.com/databricks/cli/pull/2153)).
* Resolve variables in a loop ([#2164](https://github.com/databricks/cli/pull/2164)).
* Improve resolution of complex variables within complex variables ([#2157](https://github.com/databricks/cli/pull/2157)).
* Added output message to warn about slower deployments with apps ([#2161](https://github.com/databricks/cli/pull/2161)).
* Patch references to UC schemas to capture dependencies automatically ([#1989](https://github.com/databricks/cli/pull/1989)).
* Format default-python template ([#2110](https://github.com/databricks/cli/pull/2110)).
* Encourage the use of root_path in production to ensure single deployment ([#1712](https://github.com/databricks/cli/pull/1712)).
* Log warnings to stderr for "bundle validate -o json" ([#2109](https://github.com/databricks/cli/pull/2109)).
API Changes:
* Changed `databricks account federation-policy update` command with new required argument order.
* Changed `databricks account service-principal-federation-policy update` command with new required argument order.
OpenAPI commit 779817ed8d63031f5ea761fbd25ee84f38feec0d (2025-01-08)
Dependency updates:
* Upgrade TF provider to 1.63.0 ([#2162](https://github.com/databricks/cli/pull/2162)).
* Bump golangci-lint version to v1.63.4 from v1.63.1 ([#2114](https://github.com/databricks/cli/pull/2114)).
* Bump astral-sh/setup-uv from 4 to 5 ([#2116](https://github.com/databricks/cli/pull/2116)).
* Bump golang.org/x/oauth2 from 0.24.0 to 0.25.0 ([#2080](https://github.com/databricks/cli/pull/2080)).
* Bump github.com/hashicorp/hc-install from 0.9.0 to 0.9.1 ([#2079](https://github.com/databricks/cli/pull/2079)).
* Bump golang.org/x/term from 0.27.0 to 0.28.0 ([#2078](https://github.com/databricks/cli/pull/2078)).
* Bump github.com/databricks/databricks-sdk-go from 0.54.0 to 0.55.0 ([#2126](https://github.com/databricks/cli/pull/2126)).
## [Release] Release v0.238.0
Bundles:

View File

@ -1,12 +1,18 @@
default: build
default: vendor fmt lint tidy
PACKAGES=./acceptance/... ./libs/... ./internal/... ./cmd/... ./bundle/... .
GOTESTSUM_FORMAT ?= pkgname-and-test-fails
GOTESTSUM_CMD ?= gotestsum --format ${GOTESTSUM_FORMAT} --no-summary=skipped
lint:
golangci-lint run --fix
tidy:
@# not part of golangci-lint, apparently
go mod tidy
lintcheck:
golangci-lint run ./...
@ -14,17 +20,26 @@ lintcheck:
# formatting/goimports will not be applied by 'make lint'. However, it will be applied by 'make fmt'.
# If you need to ensure that formatting & imports are always fixed, do "make fmt lint"
fmt:
ruff format -q
golangci-lint run --enable-only="gofmt,gofumpt,goimports" --fix ./...
test:
gotestsum --format ${GOTESTSUM_FORMAT} --no-summary=skipped -- ${PACKAGES}
${GOTESTSUM_CMD} -- ${PACKAGES}
cover:
gotestsum --format ${GOTESTSUM_FORMAT} --no-summary=skipped -- -coverprofile=coverage.txt ${PACKAGES}
rm -fr ./acceptance/build/cover/
VERBOSE_TEST=1 CLI_GOCOVERDIR=build/cover ${GOTESTSUM_CMD} -- -coverprofile=coverage.txt ${PACKAGES}
rm -fr ./acceptance/build/cover-merged/
mkdir -p acceptance/build/cover-merged/
go tool covdata merge -i $$(printf '%s,' acceptance/build/cover/* | sed 's/,$$//') -o acceptance/build/cover-merged/
go tool covdata textfmt -i acceptance/build/cover-merged -o coverage-acceptance.txt
showcover:
go tool cover -html=coverage.txt
acc-showcover:
go tool cover -html=coverage-acceptance.txt
build: vendor
go build -mod vendor
@ -33,16 +48,19 @@ snapshot:
vendor:
go mod vendor
schema:
go run ./bundle/internal/schema ./bundle/internal/schema ./bundle/schema/jsonschema.json
INTEGRATION = gotestsum --format github-actions --rerun-fails --jsonfile output.json --packages "./integration/..." -- -parallel 4 -timeout=2h
docs:
go run ./bundle/docsgen ./bundle/internal/schema ./bundle/docsgen
integration:
INTEGRATION = gotestsum --format github-actions --rerun-fails --jsonfile output.json --packages "./acceptance ./integration/..." -- -parallel 4 -timeout=2h
integration: vendor
$(INTEGRATION)
integration-short:
$(INTEGRATION) -short
integration-short: vendor
VERBOSE_TEST=1 $(INTEGRATION) -short
.PHONY: lint lintcheck fmt test cover showcover build snapshot vendor schema integration integration-short
.PHONY: lint tidy lintcheck fmt test cover showcover build snapshot vendor schema integration integration-short acc-cover acc-showcover docs

9
NOTICE
View File

@ -105,3 +105,12 @@ License - https://github.com/wI2L/jsondiff/blob/master/LICENSE
https://github.com/hexops/gotextdiff
Copyright (c) 2009 The Go Authors. All rights reserved.
License - https://github.com/hexops/gotextdiff/blob/main/LICENSE
https://github.com/BurntSushi/toml
Copyright (c) 2013 TOML authors
https://github.com/BurntSushi/toml/blob/master/COPYING
dario.cat/mergo
Copyright (c) 2013 Dario Castañé. All rights reserved.
Copyright (c) 2012 The Go Authors. All rights reserved.
https://github.com/darccio/mergo/blob/master/LICENSE

1
acceptance/.gitignore vendored Normal file
View File

@ -0,0 +1 @@
build

View File

@ -17,3 +17,5 @@ For more complex tests one can also use:
- `errcode` helper: if the command fails with non-zero code, it appends `Exit code: N` to the output but returns success to caller (bash), allowing continuation of script.
- `trace` helper: prints the arguments before executing the command.
- custom output files: redirect output to custom file (it must start with `out`), e.g. `$CLI bundle validate > out.txt 2> out.error.txt`.
See [selftest](./selftest) for a toy test.

View File

@ -1,9 +1,13 @@
package acceptance_test
import (
"context"
"encoding/json"
"errors"
"flag"
"fmt"
"io"
"net/http"
"os"
"os/exec"
"path/filepath"
@ -13,19 +17,45 @@ import (
"strings"
"testing"
"time"
"unicode/utf8"
"github.com/google/uuid"
"github.com/databricks/cli/internal/testutil"
"github.com/databricks/cli/libs/env"
"github.com/databricks/cli/libs/testdiff"
"github.com/databricks/cli/libs/testserver"
"github.com/databricks/databricks-sdk-go"
"github.com/stretchr/testify/require"
)
var KeepTmp = os.Getenv("KEEP_TMP") != ""
var (
KeepTmp bool
NoRepl bool
VerboseTest bool = os.Getenv("VERBOSE_TEST") != ""
)
// In order to debug CLI running under acceptance test, set this to full subtest name, e.g. "bundle/variables/empty"
// Then install your breakpoints and click "debug test" near TestAccept in VSCODE.
// example: var SingleTest = "bundle/variables/empty"
var SingleTest = ""
// If enabled, instead of compiling and running CLI externally, we'll start in-process server that accepts and runs
// CLI commands. The $CLI in test scripts is a helper that just forwards command-line arguments to this server (see bin/callserver.py).
// Also disables parallelism in tests.
var InprocessMode bool
func init() {
flag.BoolVar(&InprocessMode, "inprocess", SingleTest != "", "Run CLI in the same process as test (for debugging)")
flag.BoolVar(&KeepTmp, "keeptmp", false, "Do not delete TMP directory after run")
flag.BoolVar(&NoRepl, "norepl", false, "Do not apply any replacements (for debugging)")
}
const (
EntryPointScript = "script"
CleanupScript = "script.cleanup"
PrepareScript = "script.prepare"
MaxFileSize = 100_000
)
var Scripts = map[string]bool{
@ -35,37 +65,132 @@ var Scripts = map[string]bool{
}
func TestAccept(t *testing.T) {
testAccept(t, InprocessMode, SingleTest)
}
func TestInprocessMode(t *testing.T) {
if InprocessMode {
t.Skip("Already tested by TestAccept")
}
require.Equal(t, 1, testAccept(t, true, "selftest"))
}
func testAccept(t *testing.T, InprocessMode bool, singleTest string) int {
repls := testdiff.ReplacementsContext{}
cwd, err := os.Getwd()
require.NoError(t, err)
execPath := BuildCLI(t, cwd)
// $CLI is what test scripts are using
buildDir := filepath.Join(cwd, "build", fmt.Sprintf("%s_%s", runtime.GOOS, runtime.GOARCH))
// Download terraform and provider and create config; this also creates build directory.
RunCommand(t, []string{"python3", filepath.Join(cwd, "install_terraform.py"), "--targetdir", buildDir}, ".")
coverDir := os.Getenv("CLI_GOCOVERDIR")
if coverDir != "" {
require.NoError(t, os.MkdirAll(coverDir, os.ModePerm))
coverDir, err = filepath.Abs(coverDir)
require.NoError(t, err)
t.Logf("Writing coverage to %s", coverDir)
}
execPath := ""
if InprocessMode {
cmdServer := StartCmdServer(t)
t.Setenv("CMD_SERVER_URL", cmdServer.URL)
execPath = filepath.Join(cwd, "bin", "callserver.py")
} else {
execPath = BuildCLI(t, buildDir, coverDir)
}
t.Setenv("CLI", execPath)
repls.SetPath(execPath, "[CLI]")
// Make helper scripts available
t.Setenv("PATH", fmt.Sprintf("%s%c%s", filepath.Join(cwd, "bin"), os.PathListSeparator, os.Getenv("PATH")))
server := StartServer(t)
AddHandlers(server)
// Redirect API access to local server:
t.Setenv("DATABRICKS_HOST", fmt.Sprintf("http://127.0.0.1:%d", server.Port))
t.Setenv("DATABRICKS_TOKEN", "dapi1234")
tempHomeDir := t.TempDir()
repls.SetPath(tempHomeDir, "[TMPHOME]")
t.Logf("$TMPHOME=%v", tempHomeDir)
homeDir := t.TempDir()
// Do not read user's ~/.databrickscfg
t.Setenv(env.HomeEnvVar(), homeDir)
// Make use of uv cache; since we set HomeEnvVar to temporary directory, it is not picked up automatically
uvCache := getUVDefaultCacheDir(t)
t.Setenv("UV_CACHE_DIR", uvCache)
repls := testdiff.ReplacementsContext{}
repls.Set(execPath, "$CLI")
ctx := context.Background()
cloudEnv := os.Getenv("CLOUD_ENV")
if cloudEnv == "" {
defaultServer := testserver.New(t)
AddHandlers(defaultServer)
// Redirect API access to local server:
t.Setenv("DATABRICKS_HOST", defaultServer.URL)
homeDir := t.TempDir()
// Do not read user's ~/.databrickscfg
t.Setenv(env.HomeEnvVar(), homeDir)
}
terraformrcPath := filepath.Join(buildDir, ".terraformrc")
t.Setenv("TF_CLI_CONFIG_FILE", terraformrcPath)
t.Setenv("DATABRICKS_TF_CLI_CONFIG_FILE", terraformrcPath)
repls.SetPath(terraformrcPath, "[DATABRICKS_TF_CLI_CONFIG_FILE]")
terraformExecPath := filepath.Join(buildDir, "terraform")
if runtime.GOOS == "windows" {
terraformExecPath += ".exe"
}
t.Setenv("DATABRICKS_TF_EXEC_PATH", terraformExecPath)
t.Setenv("TERRAFORM", terraformExecPath)
repls.SetPath(terraformExecPath, "[TERRAFORM]")
// do it last so that full paths match first:
repls.SetPath(buildDir, "[BUILD_DIR]")
var config databricks.Config
if cloudEnv == "" {
// use fake token for local tests
config = databricks.Config{Token: "dbapi1234"}
} else {
// non-local tests rely on environment variables
config = databricks.Config{}
}
workspaceClient, err := databricks.NewWorkspaceClient(&config)
require.NoError(t, err)
user, err := workspaceClient.CurrentUser.Me(ctx)
require.NoError(t, err)
require.NotNil(t, user)
testdiff.PrepareReplacementsUser(t, &repls, *user)
testdiff.PrepareReplacementsWorkspaceClient(t, &repls, workspaceClient)
testdiff.PrepareReplacementsUUID(t, &repls)
testdiff.PrepareReplacementsDevVersion(t, &repls)
testdiff.PrepareReplacementSdkVersion(t, &repls)
testdiff.PrepareReplacementsGoVersion(t, &repls)
testDirs := getTests(t)
require.NotEmpty(t, testDirs)
if singleTest != "" {
testDirs = slices.DeleteFunc(testDirs, func(n string) bool {
return n != singleTest
})
require.NotEmpty(t, testDirs, "singleTest=%#v did not match any tests\n%#v", singleTest, testDirs)
}
for _, dir := range testDirs {
t.Run(dir, func(t *testing.T) {
t.Parallel()
runTest(t, dir, repls)
testName := strings.ReplaceAll(dir, "\\", "/")
t.Run(testName, func(t *testing.T) {
if !InprocessMode {
t.Parallel()
}
runTest(t, dir, coverDir, repls.Clone())
})
}
return len(testDirs)
}
func getTests(t *testing.T) []string {
@ -88,7 +213,19 @@ func getTests(t *testing.T) []string {
return testDirs
}
func runTest(t *testing.T, dir string, repls testdiff.ReplacementsContext) {
func runTest(t *testing.T, dir, coverDir string, repls testdiff.ReplacementsContext) {
config, configPath := LoadConfig(t, dir)
isEnabled, isPresent := config.GOOS[runtime.GOOS]
if isPresent && !isEnabled {
t.Skipf("Disabled via GOOS.%s setting in %s", runtime.GOOS, configPath)
}
cloudEnv := os.Getenv("CLOUD_ENV")
if config.LocalOnly && cloudEnv != "" {
t.Skipf("Disabled via LocalOnly setting in %s (CLOUD_ENV=%s)", configPath, cloudEnv)
}
var tmpDir string
var err error
if KeepTmp {
@ -101,6 +238,9 @@ func runTest(t *testing.T, dir string, repls testdiff.ReplacementsContext) {
tmpDir = t.TempDir()
}
repls.SetPathWithParents(tmpDir, "[TMPDIR]")
repls.Repls = append(repls.Repls, config.Repls...)
scriptContents := readMergedScriptContents(t, dir)
testutil.WriteFile(t, filepath.Join(tmpDir, EntryPointScript), scriptContents)
@ -111,70 +251,175 @@ func runTest(t *testing.T, dir string, repls testdiff.ReplacementsContext) {
args := []string{"bash", "-euo", "pipefail", EntryPointScript}
cmd := exec.Command(args[0], args[1:]...)
cmd.Env = os.Environ()
// Start a new server with a custom configuration if the acceptance test
// specifies a custom server stubs.
var server *testserver.Server
// Start a new server for this test if either:
// 1. A custom server spec is defined in the test configuration.
// 2. The test is configured to record requests and assert on them. We need
// a duplicate of the default server to record requests because the default
// server otherwise is a shared resource.
if cloudEnv == "" && (len(config.Server) > 0 || config.RecordRequests) {
server = testserver.New(t)
server.RecordRequests = config.RecordRequests
server.IncludeRequestHeaders = config.IncludeRequestHeaders
// If no custom server stubs are defined, add the default handlers.
if len(config.Server) == 0 {
AddHandlers(server)
}
for _, stub := range config.Server {
require.NotEmpty(t, stub.Pattern)
server.Handle(stub.Pattern, func(fakeWorkspace *testserver.FakeWorkspace, req *http.Request) (any, int) {
statusCode := http.StatusOK
if stub.Response.StatusCode != 0 {
statusCode = stub.Response.StatusCode
}
return stub.Response.Body, statusCode
})
}
cmd.Env = append(cmd.Env, "DATABRICKS_HOST="+server.URL)
}
if coverDir != "" {
// Creating individual coverage directory for each test, because writing to the same one
// results in sporadic failures like this one (only if tests are running in parallel):
// +error: coverage meta-data emit failed: writing ... rename .../tmp.covmeta.b3f... .../covmeta.b3f2c...: no such file or directory
coverDir = filepath.Join(coverDir, strings.ReplaceAll(dir, string(os.PathSeparator), "--"))
err := os.MkdirAll(coverDir, os.ModePerm)
require.NoError(t, err)
cmd.Env = append(cmd.Env, "GOCOVERDIR="+coverDir)
}
// Each local test should use a new token that will result into a new fake workspace,
// so that test don't interfere with each other.
if cloudEnv == "" {
tokenSuffix := strings.ReplaceAll(uuid.NewString(), "-", "")
token := "dbapi" + tokenSuffix
cmd.Env = append(cmd.Env, "DATABRICKS_TOKEN="+token)
repls.Set(token, "[DATABRICKS_TOKEN]")
}
// Write combined output to a file
out, err := os.Create(filepath.Join(tmpDir, "output.txt"))
require.NoError(t, err)
cmd.Stdout = out
cmd.Stderr = out
cmd.Dir = tmpDir
outB, err := cmd.CombinedOutput()
err = cmd.Run()
out := formatOutput(string(outB), err)
out = repls.Replace(out)
doComparison(t, filepath.Join(dir, "output.txt"), "script output", out)
// Write the requests made to the server to a output file if the test is
// configured to record requests.
if config.RecordRequests {
f, err := os.OpenFile(filepath.Join(tmpDir, "out.requests.txt"), os.O_CREATE|os.O_WRONLY, 0o644)
require.NoError(t, err)
for key := range outputs {
if key == "output.txt" {
// handled above
continue
for _, req := range server.Requests {
reqJson, err := json.MarshalIndent(req, "", " ")
require.NoError(t, err)
reqJsonWithRepls := repls.Replace(string(reqJson))
_, err = f.WriteString(reqJsonWithRepls + "\n")
require.NoError(t, err)
}
pathNew := filepath.Join(tmpDir, key)
newValBytes, err := os.ReadFile(pathNew)
if err != nil {
if errors.Is(err, os.ErrNotExist) {
t.Errorf("%s: expected to find this file but could not (%s)", key, tmpDir)
} else {
t.Errorf("%s: could not read: %s", key, err)
}
continue
}
pathExpected := filepath.Join(dir, key)
newVal := repls.Replace(string(newValBytes))
doComparison(t, pathExpected, pathNew, newVal)
err = f.Close()
require.NoError(t, err)
}
// Include exit code in output (if non-zero)
formatOutput(out, err)
require.NoError(t, out.Close())
printedRepls := false
// Compare expected outputs
for relPath := range outputs {
doComparison(t, repls, dir, tmpDir, relPath, &printedRepls)
}
// Make sure there are not unaccounted for new files
files, err := os.ReadDir(tmpDir)
require.NoError(t, err)
for _, f := range files {
name := f.Name()
if _, ok := inputs[name]; ok {
files := ListDir(t, tmpDir)
unexpected := []string{}
for _, relPath := range files {
if _, ok := inputs[relPath]; ok {
continue
}
if _, ok := outputs[name]; ok {
if _, ok := outputs[relPath]; ok {
continue
}
t.Errorf("Unexpected output: %s", f)
if strings.HasPrefix(name, "out") {
unexpected = append(unexpected, relPath)
if strings.HasPrefix(relPath, "out") {
// We have a new file starting with "out"
// Show the contents & support overwrite mode for it:
pathNew := filepath.Join(tmpDir, name)
newVal := testutil.ReadFile(t, pathNew)
newVal = repls.Replace(newVal)
doComparison(t, filepath.Join(dir, name), filepath.Join(tmpDir, name), newVal)
doComparison(t, repls, dir, tmpDir, relPath, &printedRepls)
}
}
if len(unexpected) > 0 {
t.Error("Test produced unexpected files:\n" + strings.Join(unexpected, "\n"))
}
}
func doComparison(t *testing.T, pathExpected, pathNew, valueNew string) {
valueNew = testdiff.NormalizeNewlines(valueNew)
valueExpected := string(readIfExists(t, pathExpected))
valueExpected = testdiff.NormalizeNewlines(valueExpected)
testdiff.AssertEqualTexts(t, pathExpected, pathNew, valueExpected, valueNew)
if testdiff.OverwriteMode {
if valueNew != "" {
t.Logf("Overwriting: %s", pathExpected)
testutil.WriteFile(t, pathExpected, valueNew)
} else {
t.Logf("Removing: %s", pathExpected)
_ = os.Remove(pathExpected)
func doComparison(t *testing.T, repls testdiff.ReplacementsContext, dirRef, dirNew, relPath string, printedRepls *bool) {
pathRef := filepath.Join(dirRef, relPath)
pathNew := filepath.Join(dirNew, relPath)
bufRef, okRef := tryReading(t, pathRef)
bufNew, okNew := tryReading(t, pathNew)
if !okRef && !okNew {
t.Errorf("Both files are missing or have errors: %s\npathRef: %s\npathNew: %s", relPath, pathRef, pathNew)
return
}
valueRef := testdiff.NormalizeNewlines(bufRef)
valueNew := testdiff.NormalizeNewlines(bufNew)
// Apply replacements to the new value only.
// The reference value is stored after applying replacements.
if !NoRepl {
valueNew = repls.Replace(valueNew)
}
// The test did not produce an expected output file.
if okRef && !okNew {
t.Errorf("Missing output file: %s\npathRef: %s\npathNew: %s", relPath, pathRef, pathNew)
testdiff.AssertEqualTexts(t, pathRef, pathNew, valueRef, valueNew)
if testdiff.OverwriteMode {
t.Logf("Removing output file: %s", relPath)
require.NoError(t, os.Remove(pathRef))
}
return
}
// The test produced an unexpected output file.
if !okRef && okNew {
t.Errorf("Unexpected output file: %s\npathRef: %s\npathNew: %s", relPath, pathRef, pathNew)
testdiff.AssertEqualTexts(t, pathRef, pathNew, valueRef, valueNew)
if testdiff.OverwriteMode {
t.Logf("Writing output file: %s", relPath)
testutil.WriteFile(t, pathRef, valueNew)
}
return
}
// Compare the reference and new values.
equal := testdiff.AssertEqualTexts(t, pathRef, pathNew, valueRef, valueNew)
if !equal && testdiff.OverwriteMode {
t.Logf("Overwriting existing output file: %s", relPath)
testutil.WriteFile(t, pathRef, valueNew)
}
if VerboseTest && !equal && printedRepls != nil && !*printedRepls {
*printedRepls = true
var items []string
for _, item := range repls.Repls {
items = append(items, fmt.Sprintf("REPL %s => %s", item.Old, item.New))
}
t.Log("Available replacements:\n" + strings.Join(items, "\n"))
}
}
@ -182,18 +427,23 @@ func doComparison(t *testing.T, pathExpected, pathNew, valueNew string) {
// Note, cleanups are not executed if main script fails; that's not a huge issue, since it runs it temp dir.
func readMergedScriptContents(t *testing.T, dir string) string {
scriptContents := testutil.ReadFile(t, filepath.Join(dir, EntryPointScript))
// Wrap script contents in a subshell such that changing the working
// directory only affects the main script and not cleanup.
scriptContents = "(\n" + scriptContents + ")\n"
prepares := []string{}
cleanups := []string{}
for {
x := readIfExists(t, filepath.Join(dir, CleanupScript))
if len(x) > 0 {
cleanups = append(cleanups, string(x))
x, ok := tryReading(t, filepath.Join(dir, CleanupScript))
if ok {
cleanups = append(cleanups, x)
}
x = readIfExists(t, filepath.Join(dir, PrepareScript))
if len(x) > 0 {
prepares = append(prepares, string(x))
x, ok = tryReading(t, filepath.Join(dir, PrepareScript))
if ok {
prepares = append(prepares, x)
}
if dir == "" || dir == "." {
@ -210,28 +460,30 @@ func readMergedScriptContents(t *testing.T, dir string) string {
return strings.Join(prepares, "\n")
}
func BuildCLI(t *testing.T, cwd string) string {
execPath := filepath.Join(cwd, "build", "databricks")
func BuildCLI(t *testing.T, buildDir, coverDir string) string {
execPath := filepath.Join(buildDir, "databricks")
if runtime.GOOS == "windows" {
execPath += ".exe"
}
start := time.Now()
args := []string{"go", "build", "-mod", "vendor", "-o", execPath}
cmd := exec.Command(args[0], args[1:]...)
cmd.Dir = ".."
out, err := cmd.CombinedOutput()
elapsed := time.Since(start)
t.Logf("%s took %s", args, elapsed)
require.NoError(t, err, "go build failed: %s: %s\n%s", args, err, out)
if len(out) > 0 {
t.Logf("go build output: %s: %s", args, out)
args := []string{
"go", "build",
"-mod", "vendor",
"-o", execPath,
}
// Quick check + warm up cache:
cmd = exec.Command(execPath, "--version")
out, err = cmd.CombinedOutput()
require.NoError(t, err, "%s --version failed: %s\n%s", execPath, err, out)
if coverDir != "" {
args = append(args, "-cover")
}
if runtime.GOOS == "windows" {
// Get this error on my local Windows:
// error obtaining VCS status: exit status 128
// Use -buildvcs=false to disable VCS stamping.
args = append(args, "-buildvcs=false")
}
RunCommand(t, args, "..")
return execPath
}
@ -252,29 +504,45 @@ func copyFile(src, dst string) error {
return err
}
func formatOutput(out string, err error) string {
func formatOutput(w io.Writer, err error) {
if err == nil {
return out
return
}
if exiterr, ok := err.(*exec.ExitError); ok {
exitCode := exiterr.ExitCode()
out += fmt.Sprintf("\nExit code: %d\n", exitCode)
fmt.Fprintf(w, "\nExit code: %d\n", exitCode)
} else {
out += fmt.Sprintf("\nError: %s\n", err)
fmt.Fprintf(w, "\nError: %s\n", err)
}
return out
}
func readIfExists(t *testing.T, path string) []byte {
data, err := os.ReadFile(path)
if err == nil {
return data
func tryReading(t *testing.T, path string) (string, bool) {
info, err := os.Stat(path)
if err != nil {
if !errors.Is(err, os.ErrNotExist) {
t.Errorf("%s: %s", path, err)
}
return "", false
}
if !errors.Is(err, os.ErrNotExist) {
t.Fatalf("%s: %s", path, err)
if info.Size() > MaxFileSize {
t.Errorf("%s: ignoring, too large: %d", path, info.Size())
return "", false
}
return []byte{}
data, err := os.ReadFile(path)
if err != nil {
// already checked ErrNotExist above
t.Errorf("%s: %s", path, err)
return "", false
}
if !utf8.Valid(data) {
t.Errorf("%s: not valid utf-8", path)
return "", false
}
return string(data), true
}
func CopyDir(src, dst string, inputs, outputs map[string]bool) error {
@ -289,8 +557,10 @@ func CopyDir(src, dst string, inputs, outputs map[string]bool) error {
return err
}
if strings.HasPrefix(name, "out") {
outputs[relPath] = true
if strings.HasPrefix(relPath, "out") {
if !info.IsDir() {
outputs[relPath] = true
}
return nil
} else {
inputs[relPath] = true
@ -309,3 +579,59 @@ func CopyDir(src, dst string, inputs, outputs map[string]bool) error {
return copyFile(path, destPath)
})
}
func ListDir(t *testing.T, src string) []string {
var files []string
err := filepath.Walk(src, func(path string, info os.FileInfo, err error) error {
if err != nil {
// Do not FailNow here.
// The output comparison is happening after this call which includes output.txt which
// includes errors printed by commands which include explanation why a given file cannot be read.
t.Errorf("Error when listing %s: path=%s: %s", src, path, err)
return nil
}
if info.IsDir() {
return nil
}
relPath, err := filepath.Rel(src, path)
if err != nil {
return err
}
files = append(files, relPath)
return nil
})
if err != nil {
t.Errorf("Failed to list %s: %s", src, err)
}
return files
}
func getUVDefaultCacheDir(t *testing.T) string {
// According to uv docs https://docs.astral.sh/uv/concepts/cache/#caching-in-continuous-integration
// the default cache directory is
// "A system-appropriate cache directory, e.g., $XDG_CACHE_HOME/uv or $HOME/.cache/uv on Unix and %LOCALAPPDATA%\uv\cache on Windows"
cacheDir, err := os.UserCacheDir()
require.NoError(t, err)
if runtime.GOOS == "windows" {
return cacheDir + "\\uv\\cache"
} else {
return cacheDir + "/uv"
}
}
func RunCommand(t *testing.T, args []string, dir string) {
start := time.Now()
cmd := exec.Command(args[0], args[1:]...)
cmd.Dir = dir
out, err := cmd.CombinedOutput()
elapsed := time.Since(start)
t.Logf("%s took %s", args, elapsed)
require.NoError(t, err, "%s failed: %s\n%s", args, err, out)
if len(out) > 0 {
t.Logf("%s output: %s", args, out)
}
}

View File

@ -0,0 +1,5 @@
[DEFAULT]
host = $DATABRICKS_HOST
[profile_name]
host = https://test@non-existing-subdomain.databricks.com

View File

@ -0,0 +1,14 @@
bundle:
name: test-auth
workspace:
host: $DATABRICKS_HOST
targets:
dev:
default: true
workspace:
host: $DATABRICKS_HOST
prod:
workspace:
host: https://bar.com

View File

@ -0,0 +1,32 @@
=== Inside the bundle, no flags
>>> errcode [CLI] current-user me
"[USERNAME]"
=== Inside the bundle, target flags
>>> errcode [CLI] current-user me -t dev
"[USERNAME]"
=== Inside the bundle, target and matching profile
>>> errcode [CLI] current-user me -t dev -p DEFAULT
"[USERNAME]"
=== Inside the bundle, profile flag not matching bundle host. Badness: should use profile from flag instead and not fail
>>> errcode [CLI] current-user me -p profile_name
Error: cannot resolve bundle auth configuration: config host mismatch: profile uses host https://non-existing-subdomain.databricks.com, but CLI configured to use [DATABRICKS_URL]
Exit code: 1
=== Inside the bundle, target and not matching profile
>>> errcode [CLI] current-user me -t dev -p profile_name
Error: cannot resolve bundle auth configuration: config host mismatch: profile uses host https://non-existing-subdomain.databricks.com, but CLI configured to use [DATABRICKS_URL]
Exit code: 1
=== Outside the bundle, no flags
>>> errcode [CLI] current-user me
"[USERNAME]"
=== Outside the bundle, profile flag
>>> errcode [CLI] current-user me -p profile_name
"[USERNAME]"

View File

@ -0,0 +1,30 @@
# Replace placeholder with an actual host URL
envsubst < databricks.yml > out.yml && mv out.yml databricks.yml
envsubst < .databrickscfg > out && mv out .databrickscfg
export DATABRICKS_CONFIG_FILE=.databrickscfg
host=$DATABRICKS_HOST
unset DATABRICKS_HOST
title "Inside the bundle, no flags"
trace errcode $CLI current-user me | jq .userName
title "Inside the bundle, target flags"
trace errcode $CLI current-user me -t dev | jq .userName
title "Inside the bundle, target and matching profile"
trace errcode $CLI current-user me -t dev -p DEFAULT | jq .userName
title "Inside the bundle, profile flag not matching bundle host. Badness: should use profile from flag instead and not fail"
trace errcode $CLI current-user me -p profile_name | jq .userName
title "Inside the bundle, target and not matching profile"
trace errcode $CLI current-user me -t dev -p profile_name
cd ..
export DATABRICKS_HOST=$host
title "Outside the bundle, no flags"
trace errcode $CLI current-user me | jq .userName
title "Outside the bundle, profile flag"
trace errcode $CLI current-user me -p profile_name | jq .userName

View File

@ -0,0 +1,8 @@
Badness = "When the -p flag is used inside the bundle folder for any CLI command, the CLI uses the bundle host anyway instead of the profile one"
# Some of the clouds have the DATABRICKS_HOST variable set up without the https:// prefix.
# As a result, the output is replaced with the DATABRICKS_URL variable instead of DATABRICKS_HOST.
# This is a workaround to replace DATABRICKS_HOST with DATABRICKS_URL.
[[Repls]]
Old='DATABRICKS_HOST'
New='DATABRICKS_URL'

31 acceptance/bin/callserver.py Executable file
View File

@ -0,0 +1,31 @@
#!/usr/bin/env python3
import sys
import os
import json
import urllib.request
from urllib.parse import urlencode
env = {}
for key, value in os.environ.items():
if len(value) > 10_000:
sys.stderr.write(f"Dropping key={key} value len={len(value)}\n")
continue
env[key] = value
q = {
"args": " ".join(sys.argv[1:]),
"cwd": os.getcwd(),
"env": json.dumps(env),
}
url = os.environ["CMD_SERVER_URL"] + "/?" + urlencode(q)
if len(url) > 100_000:
sys.exit("url too large")
resp = urllib.request.urlopen(url)
assert resp.status == 200, (resp.status, resp.url, resp.headers)
result = json.load(resp)
sys.stderr.write(result["stderr"])
sys.stdout.write(result["stdout"])
exitcode = int(result["exitcode"])
sys.exit(exitcode)
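Aside: the script above is only the client half of a simple HTTP protocol. It forwards argv, the working directory, and a size-filtered environment as query parameters to CMD_SERVER_URL, then relays the JSON response's stdout, stderr, and exitcode. Below is a hypothetical, minimal server sketch that satisfies the same shape; it is not the repository's actual command server, and the listen address is an assumption.

#!/usr/bin/env python3
# Hypothetical illustration only: a minimal server matching the request/response
# shape that callserver.py expects. Not the acceptance suite's real command server.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        q = parse_qs(urlparse(self.path).query)
        args = q.get("args", [""])[0]              # caller's argv, space-joined
        cwd = q.get("cwd", [""])[0]                # caller's working directory
        env = json.loads(q.get("env", ["{}"])[0])  # caller's (filtered) environment
        # A real server would run or simulate the command here; this one echoes.
        body = json.dumps({
            "stdout": f"args={args} cwd={cwd} env_keys={len(env)}\n",
            "stderr": "",
            "exitcode": 0,
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # callserver.py reads the address from CMD_SERVER_URL, e.g. http://127.0.0.1:8080
    HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()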

View File

@ -4,6 +4,7 @@ Helper to sort blocks in text file. A block is a set of lines separated from oth
This is to workaround non-determinism in the output.
"""
import sys
blocks = []
@ -11,10 +12,10 @@ blocks = []
for line in sys.stdin:
if not line.strip():
if blocks and blocks[-1]:
blocks.append('')
blocks.append("")
continue
if not blocks:
blocks.append('')
blocks.append("")
blocks[-1] += line
blocks.sort()
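Aside: only the changed lines of sort_blocks.py appear in this hunk. A tiny self-contained sketch of the same block-sorting idea (not the file's exact contents):

# Standalone illustration of sorting newline-separated blocks, as
# sort_blocks.py does to work around non-deterministic test output.
text = "beta\n1\n\nalpha\n2\n"
blocks = [b.strip("\n") for b in text.split("\n\n") if b.strip()]
print("\n\n".join(sorted(blocks)))
# Output:
# alpha
# 2
#
# beta
# 1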

10 acceptance/bin/sort_lines.py Executable file
View File

@ -0,0 +1,10 @@
#!/usr/bin/env python3
"""
Helper to sort lines in text file. Similar to 'sort' but no dependence on locale or presence of 'sort' in PATH.
"""
import sys
lines = sys.stdin.readlines()
lines.sort()
sys.stdout.write("".join(lines))

View File

@ -1 +0,0 @@
databricks

View File

@ -0,0 +1,50 @@
bundle:
name: same_name_libraries
variables:
cluster:
default:
spark_version: 15.4.x-scala2.12
node_type_id: i3.xlarge
data_security_mode: SINGLE_USER
num_workers: 0
spark_conf:
spark.master: "local[*, 4]"
spark.databricks.cluster.profile: singleNode
custom_tags:
ResourceClass: SingleNode
artifacts:
whl1:
type: whl
path: ./whl1
whl2:
type: whl
path: ./whl2
resources:
jobs:
test:
name: "test"
tasks:
- task_key: task1
new_cluster: ${var.cluster}
python_wheel_task:
entry_point: main
package_name: my_default_python
libraries:
- whl: ./whl1/dist/*.whl
- task_key: task2
new_cluster: ${var.cluster}
python_wheel_task:
entry_point: main
package_name: my_default_python
libraries:
- whl: ./whl2/dist/*.whl
- task_key: task3
new_cluster: ${var.cluster}
python_wheel_task:
entry_point: main
package_name: my_default_python
libraries:
- whl: ./whl1/dist/*.whl

View File

@ -0,0 +1,14 @@
>>> errcode [CLI] bundle deploy
Building whl1...
Building whl2...
Error: Duplicate local library name my_default_python-0.0.1-py3-none-any.whl
at resources.jobs.test.tasks[0].libraries[0].whl
resources.jobs.test.tasks[1].libraries[0].whl
in databricks.yml:36:15
databricks.yml:43:15
Local library names must be unique
Exit code: 1

View File

@ -0,0 +1,2 @@
trace errcode $CLI bundle deploy
rm -rf whl1 whl2

View File

@ -0,0 +1,36 @@
"""
setup.py configuration script describing how to build and package this project.
This file is primarily used by the setuptools library and typically should not
be executed directly. See README.md for how to deploy, test, and run
the my_default_python project.
"""
from setuptools import setup, find_packages
import sys
sys.path.append("./src")
import my_default_python
setup(
name="my_default_python",
version=my_default_python.__version__,
url="https://databricks.com",
author="[USERNAME]",
description="wheel file based on my_default_python/src",
packages=find_packages(where="./src"),
package_dir={"": "src"},
entry_points={
"packages": [
"main=my_default_python.main:main",
],
},
install_requires=[
# Dependencies in case the output wheel file is used as a library dependency.
# For defining dependencies, when this package is used in Databricks, see:
# https://docs.databricks.com/dev-tools/bundles/library-dependencies.html
"setuptools"
],
)

View File

@ -0,0 +1 @@
__version__ = "0.0.1"

View File

@ -0,0 +1 @@
print("hello")

View File

@ -0,0 +1,36 @@
"""
setup.py configuration script describing how to build and package this project.
This file is primarily used by the setuptools library and typically should not
be executed directly. See README.md for how to deploy, test, and run
the my_default_python project.
"""
from setuptools import setup, find_packages
import sys
sys.path.append("./src")
import my_default_python
setup(
name="my_default_python",
version=my_default_python.__version__,
url="https://databricks.com",
author="[USERNAME]",
description="wheel file based on my_default_python/src",
packages=find_packages(where="./src"),
package_dir={"": "src"},
entry_points={
"packages": [
"main=my_default_python.main:main",
],
},
install_requires=[
# Dependencies in case the output wheel file is used as a library dependency.
# For defining dependencies, when this package is used in Databricks, see:
# https://docs.databricks.com/dev-tools/bundles/library-dependencies.html
"setuptools"
],
)

View File

@ -0,0 +1 @@
__version__ = "0.0.1"

View File

@ -0,0 +1 @@
print("hello")

View File

@ -0,0 +1,2 @@
bundle:
name: debug

View File

@ -0,0 +1,15 @@
10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel
10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=fast_validate(readonly)
10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=fast_validate(readonly) mutator (read-only)=parallel
10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=fast_validate(readonly) mutator (read-only)=parallel mutator (read-only)=validate:SingleNodeCluster
10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=fast_validate(readonly) mutator (read-only)=parallel mutator (read-only)=validate:artifact_paths
10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=fast_validate(readonly) mutator (read-only)=parallel mutator (read-only)=validate:job_cluster_key_defined
10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=fast_validate(readonly) mutator (read-only)=parallel mutator (read-only)=validate:job_task_cluster_spec
10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync
10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:folder_permissions
10:07:59 Debug: ApplyReadOnly pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:validate_sync_patterns
10:07:59 Debug: Path /Workspace/Users/[USERNAME]/.bundle/debug/default/files has type directory (ID: 0) pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync
10:07:59 Debug: non-retriable error: pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync sdk=true
< {} pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync sdk=true
< {} pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync sdk=true
< } pid=12345 mutator=validate mutator (read-only)=parallel mutator (read-only)=validate:files_to_sync sdk=true

View File

@ -0,0 +1,92 @@
10:07:59 Info: start pid=12345 version=[DEV_VERSION] args="[CLI], bundle, validate, --debug"
10:07:59 Debug: Found bundle root at [TMPDIR] (file [TMPDIR]/databricks.yml) pid=12345
10:07:59 Debug: Apply pid=12345 mutator=load
10:07:59 Info: Phase: load pid=12345 mutator=load
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=EntryPoint
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=scripts.preinit
10:07:59 Debug: No script defined for preinit, skipping pid=12345 mutator=load mutator=seq mutator=scripts.preinit
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=ProcessRootIncludes
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=ProcessRootIncludes mutator=seq
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=VerifyCliVersion
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=EnvironmentsToTargets
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=ComputeIdToClusterId
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=InitializeVariables
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=DefineDefaultTarget(default)
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=PythonMutator(load)
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=validate:unique_resource_keys
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=SelectDefaultTarget
10:07:59 Debug: Apply pid=12345 mutator=load mutator=seq mutator=SelectDefaultTarget mutator=SelectTarget(default)
10:07:59 Debug: Apply pid=12345 mutator=<func>
10:07:59 Debug: Apply pid=12345 mutator=initialize
10:07:59 Info: Phase: initialize pid=12345 mutator=initialize
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=validate:AllResourcesHaveValues
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=RewriteSyncPaths
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=SyncDefaultPath
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=SyncInferRoot
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=PopulateCurrentUser
10:07:59 Debug: GET /api/2.0/preview/scim/v2/Me
< HTTP/1.1 200 OK
< {
< "id": "[USERID]",
< "userName": "[USERNAME]"
< } pid=12345 mutator=initialize mutator=seq mutator=PopulateCurrentUser sdk=true
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=LoadGitDetails
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ApplySourceLinkedDeploymentPreset
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=DefineDefaultWorkspaceRoot
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ExpandWorkspaceRoot
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=DefaultWorkspacePaths
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=PrependWorkspacePrefix
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=RewriteWorkspacePrefix
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=SetVariables
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=PythonMutator(init)
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=PythonMutator(load_resources)
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=PythonMutator(apply_mutators)
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ResolveVariableReferences
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ResolveResourceReferences
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ResolveVariableReferences
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=MergeJobClusters
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=MergeJobParameters
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=MergeJobTasks
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=MergePipelineClusters
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=MergeApps
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=CaptureSchemaDependency
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=CheckPermissions
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=SetRunAs
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=OverrideCompute
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ConfigureDashboardDefaults
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ConfigureVolumeDefaults
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ProcessTargetMode
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ApplyPresets
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=DefaultQueueing
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ExpandPipelineGlobPaths
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ConfigureWSFS
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=TranslatePaths
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=PythonWrapperWarning
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=apps.Validate
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ValidateSharedRootPermissions
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=ApplyBundlePermissions
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=FilterCurrentUserFromPermissions
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=metadata.AnnotateJobs
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=metadata.AnnotatePipelines
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=terraform.Initialize
10:07:59 Debug: Using Terraform from DATABRICKS_TF_EXEC_PATH at [TERRAFORM] pid=12345 mutator=initialize mutator=seq mutator=terraform.Initialize
10:07:59 Debug: Using Terraform CLI config from DATABRICKS_TF_CLI_CONFIG_FILE at [DATABRICKS_TF_CLI_CONFIG_FILE] pid=12345 mutator=initialize mutator=seq mutator=terraform.Initialize
10:07:59 Debug: Environment variables for Terraform: ...redacted... pid=12345 mutator=initialize mutator=seq mutator=terraform.Initialize
10:07:59 Debug: Apply pid=12345 mutator=initialize mutator=seq mutator=scripts.postinit
10:07:59 Debug: No script defined for postinit, skipping pid=12345 mutator=initialize mutator=seq mutator=scripts.postinit
10:07:59 Debug: Apply pid=12345 mutator=validate
10:07:59 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/[USERNAME]/.bundle/debug/default/files
< HTTP/1.1 404 Not Found
10:07:59 Debug: POST /api/2.0/workspace/mkdirs
> {
> "path": "/Workspace/Users/[USERNAME]/.bundle/debug/default/files"
> }
< HTTP/1.1 200 OK
10:07:59 Debug: GET /api/2.0/workspace/get-status?path=/Workspace/Users/[USERNAME]/.bundle/debug/default/files
< HTTP/1.1 200 OK
< {
< "object_type": "DIRECTORY",
< "path": "/Workspace/Users/[USERNAME]/.bundle/debug/default/files"
10:07:59 Info: completed execution pid=12345 exit_code=0

View File

@ -0,0 +1,7 @@
Name: debug
Target: default
Workspace:
User: [USERNAME]
Path: /Workspace/Users/[USERNAME]/.bundle/debug/default
Validation OK!

View File

@ -0,0 +1,4 @@
$CLI bundle validate --debug 2> full.stderr.txt
grep -vw parallel full.stderr.txt > out.stderr.txt
grep -w parallel full.stderr.txt | sed 's/[0-9]/0/g' | sort_lines.py > out.stderr.parallel.txt
rm full.stderr.txt

View File

@ -0,0 +1,18 @@
LocalOnly = true
[[Repls]]
# The keys are unsorted and also vary per OS
Old = 'Environment variables for Terraform: ([A-Z_ ,]+) '
New = 'Environment variables for Terraform: ...redacted... '
[[Repls]]
Old = 'pid=[0-9]+'
New = 'pid=12345'
[[Repls]]
Old = '\d\d:\d\d:\d\d'
New = '10:07:59'
[[Repls]]
Old = '\\'
New = '/'
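Aside: a rough illustration of how [[Repls]] entries like the ones above normalize captured output, applied in order as regex substitutions. This is not the acceptance framework's actual implementation; the helper name and sample input are made up.

# Illustration only: apply the Old -> New replacements above to test output.
import re

repls = [
    (r'Environment variables for Terraform: ([A-Z_ ,]+) ',
     'Environment variables for Terraform: ...redacted... '),
    (r'pid=[0-9]+', 'pid=12345'),
    (r'\d\d:\d\d:\d\d', '10:07:59'),
    (r'\\', '/'),
]

def normalize(output: str) -> str:
    for old, new in repls:
        output = re.sub(old, new, output)
    return output

print(normalize(r"14:02:11 Debug: Apply pid=4711 path=C:\tmp\x"))
# -> 10:07:59 Debug: Apply pid=12345 path=C:/tmp/x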

View File

@ -0,0 +1,2 @@
bundle:
name: git_job

View File

@ -0,0 +1,17 @@
resources:
jobs:
out:
name: gitjob
tasks:
- task_key: test_task
notebook_task:
notebook_path: some/test/notebook.py
- task_key: test_task_2
notebook_task:
notebook_path: /Workspace/Users/foo@bar.com/some/test/notebook.py
source: WORKSPACE
git_source:
git_branch: main
git_commit: abcdef
git_provider: github
git_url: https://git.databricks.com

View File

@ -0,0 +1,2 @@
Job is using Git source, skipping downloading files
Job configuration successfully saved to out.job.yml

View File

@ -0,0 +1 @@
$CLI bundle generate job --existing-job-id 1234 --config-dir . --key out

View File

@ -0,0 +1,33 @@
LocalOnly = true # This test needs to run against stubbed Databricks API
[[Server]]
Pattern = "GET /api/2.1/jobs/get"
Response.Body = '''
{
"job_id": 11223344,
"settings": {
"name": "gitjob",
"git_source": {
"git_url": "https://git.databricks.com",
"git_provider": "github",
"git_branch": "main",
"git_commit": "abcdef"
},
"tasks": [
{
"task_key": "test_task",
"notebook_task": {
"notebook_path": "some/test/notebook.py"
}
},
{
"task_key": "test_task_2",
"notebook_task": {
"source": "WORKSPACE",
"notebook_path": "/Workspace/Users/foo@bar.com/some/test/notebook.py"
}
}
]
}
}
'''

View File

@ -0,0 +1,2 @@
bundle:
name: git-permerror

View File

@ -0,0 +1,81 @@
=== No permission to access .git. Badness: inferred flag is set to true even though we did not infer branch. bundle_root_path is not correct in subdir case.
>>> chmod 000 .git
>>> [CLI] bundle validate
Warn: failed to read .git: unable to load repository specific gitconfig: open config: permission denied
Error: unable to load repository specific gitconfig: open config: permission denied
Name: git-permerror
Target: default
Workspace:
User: [USERNAME]
Path: /Workspace/Users/[USERNAME]/.bundle/git-permerror/default
Found 1 error
Exit code: 1
>>> [CLI] bundle validate -o json
Warn: failed to read .git: unable to load repository specific gitconfig: open config: permission denied
Error: unable to load repository specific gitconfig: open config: permission denied
Exit code: 1
{
"bundle_root_path": "."
}
>>> withdir subdir/a/b [CLI] bundle validate -o json
Warn: failed to read .git: unable to load repository specific gitconfig: open config: permission denied
Error: unable to load repository specific gitconfig: open config: permission denied
Exit code: 1
{
"bundle_root_path": "."
}
=== No permissions to read .git/HEAD. Badness: warning is not shown. inferred is incorrectly set to true. bundle_root_path is not correct in subdir case.
>>> chmod 000 .git/HEAD
>>> [CLI] bundle validate -o json
Warn: failed to load current branch: open HEAD: permission denied
Warn: failed to load latest commit: open HEAD: permission denied
{
"bundle_root_path": "."
}
>>> withdir subdir/a/b [CLI] bundle validate -o json
Warn: failed to load current branch: open HEAD: permission denied
Warn: failed to load latest commit: open HEAD: permission denied
{
"bundle_root_path": "."
}
=== No permissions to read .git/config. Badness: inferred is incorrectly set to true. bundle_root_path is not correct in subdir case.
>>> chmod 000 .git/config
>>> [CLI] bundle validate -o json
Warn: failed to read .git: unable to load repository specific gitconfig: open config: permission denied
Error: unable to load repository specific gitconfig: open config: permission denied
Exit code: 1
{
"bundle_root_path": "."
}
>>> withdir subdir/a/b [CLI] bundle validate -o json
Warn: failed to read .git: unable to load repository specific gitconfig: open config: permission denied
Error: unable to load repository specific gitconfig: open config: permission denied
Exit code: 1
{
"bundle_root_path": "."
}

View File

@ -0,0 +1,26 @@
mkdir myrepo
cd myrepo
cp ../databricks.yml .
git-repo-init
mkdir -p subdir/a/b
printf "=== No permission to access .git. Badness: inferred flag is set to true even though we did not infer branch. bundle_root_path is not correct in subdir case.\n"
trace chmod 000 .git
errcode trace $CLI bundle validate
errcode trace $CLI bundle validate -o json | jq .bundle.git
errcode trace withdir subdir/a/b $CLI bundle validate -o json | jq .bundle.git
printf "\n\n=== No permissions to read .git/HEAD. Badness: warning is not shown. inferred is incorrectly set to true. bundle_root_path is not correct in subdir case.\n"
chmod 700 .git
trace chmod 000 .git/HEAD
errcode trace $CLI bundle validate -o json | jq .bundle.git
errcode trace withdir subdir/a/b $CLI bundle validate -o json | jq .bundle.git
printf "\n\n=== No permissions to read .git/config. Badness: inferred is incorrectly set to true. bundle_root_path is not correct in subdir case.\n"
chmod 666 .git/HEAD
trace chmod 000 .git/config
errcode trace $CLI bundle validate -o json | jq .bundle.git
errcode trace withdir subdir/a/b $CLI bundle validate -o json | jq .bundle.git
cd ..
rm -fr myrepo

View File

@ -0,0 +1,5 @@
Badness = "inferred flag is incorrectly set to true; bundle_root_path is not correct; Warn and Error describe the same issue; Warn goes to stderr, Error goes to stdout (for backward compat); warning about permissions is repeated twice"
[GOOS]
# This test relies on chmod which does not work on Windows
windows = false

View File

@ -0,0 +1,21 @@
>>> [CLI] bundle deploy --help
Deploy bundle
Usage:
databricks bundle deploy [flags]
Flags:
--auto-approve Skip interactive approvals that might be required for deployment.
-c, --cluster-id string Override cluster in the deployment with the given cluster ID.
--fail-on-active-runs Fail if there are running jobs or pipelines in the deployment.
--force Force-override Git branch validation.
--force-lock Force acquisition of deployment lock.
-h, --help help for deploy
Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle deploy --help

View File

@ -0,0 +1,22 @@
>>> [CLI] bundle deployment --help
Deployment related commands
Usage:
databricks bundle deployment [command]
Available Commands:
bind Bind bundle-defined resources to existing resources
unbind Unbind bundle-defined resources from its managed remote resource
Flags:
-h, --help help for deployment
Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"
Use "databricks bundle deployment [command] --help" for more information about a command.

View File

@ -0,0 +1 @@
trace $CLI bundle deployment --help

View File

@ -0,0 +1,18 @@
>>> [CLI] bundle destroy --help
Destroy deployed bundle resources
Usage:
databricks bundle destroy [flags]
Flags:
--auto-approve Skip interactive approvals for deleting resources and files
--force-lock Force acquisition of deployment lock.
-h, --help help for destroy
Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle destroy --help

View File

@ -0,0 +1,24 @@
>>> [CLI] bundle generate dashboard --help
Generate configuration for a dashboard
Usage:
databricks bundle generate dashboard [flags]
Flags:
-s, --dashboard-dir string directory to write the dashboard representation to (default "src")
--existing-id string ID of the dashboard to generate configuration for
--existing-path string workspace path of the dashboard to generate configuration for
-f, --force force overwrite existing files in the output directory
-h, --help help for dashboard
--resource string resource key of dashboard to watch for changes
-d, --resource-dir string directory to write the configuration to (default "resources")
--watch watch for changes to the dashboard and update the configuration
Global Flags:
--debug enable debug logging
--key string resource key to use for the generated configuration
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle generate dashboard --help

View File

@ -0,0 +1,21 @@
>>> [CLI] bundle generate job --help
Generate bundle configuration for a job
Usage:
databricks bundle generate job [flags]
Flags:
-d, --config-dir string Dir path where the output config will be stored (default "resources")
--existing-job-id int Job ID of the job to generate config for
-f, --force Force overwrite existing files in the output directory
-h, --help help for job
-s, --source-dir string Dir path where the downloaded files will be stored (default "src")
Global Flags:
--debug enable debug logging
--key string resource key to use for the generated configuration
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle generate job --help

View File

@ -0,0 +1,21 @@
>>> [CLI] bundle generate pipeline --help
Generate bundle configuration for a pipeline
Usage:
databricks bundle generate pipeline [flags]
Flags:
-d, --config-dir string Dir path where the output config will be stored (default "resources")
--existing-pipeline-id string ID of the pipeline to generate config for
-f, --force Force overwrite existing files in the output directory
-h, --help help for pipeline
-s, --source-dir string Dir path where the downloaded files will be stored (default "src")
Global Flags:
--debug enable debug logging
--key string resource key to use for the generated configuration
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle generate pipeline --help

View File

@ -0,0 +1,25 @@
>>> [CLI] bundle generate --help
Generate bundle configuration
Usage:
databricks bundle generate [command]
Available Commands:
app Generate bundle configuration for a Databricks app
dashboard Generate configuration for a dashboard
job Generate bundle configuration for a job
pipeline Generate bundle configuration for a pipeline
Flags:
-h, --help help for generate
--key string resource key to use for the generated configuration
Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"
Use "databricks bundle generate [command] --help" for more information about a command.

View File

@ -0,0 +1 @@
trace $CLI bundle generate --help

View File

@ -0,0 +1,31 @@
>>> [CLI] bundle init --help
Initialize using a bundle template.
TEMPLATE_PATH optionally specifies which template to use. It can be one of the following:
- default-python: The default Python template for Notebooks / Delta Live Tables / Workflows
- default-sql: The default SQL template for .sql files that run with Databricks SQL
- dbt-sql: The dbt SQL template (databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks)
- mlops-stacks: The Databricks MLOps Stacks template (github.com/databricks/mlops-stacks)
- a local file system path with a template directory
- a Git repository URL, e.g. https://github.com/my/repository
See https://docs.databricks.com/en/dev-tools/bundles/templates.html for more information on templates.
Usage:
databricks bundle init [TEMPLATE_PATH] [flags]
Flags:
--branch string Git branch to use for template initialization
--config-file string JSON file containing key value pairs of input parameters required for template initialization.
-h, --help help for init
--output-dir string Directory to write the initialized template to.
--tag string Git tag to use for template initialization
--template-dir string Directory path within a Git repository containing the template.
Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle init --help

View File

@ -0,0 +1,17 @@
>>> [CLI] bundle open --help
Open a resource in the browser
Usage:
databricks bundle open [flags]
Flags:
--force-pull Skip local cache and load the state from the remote workspace
-h, --help help for open
Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle open --help

View File

@ -0,0 +1,57 @@
>>> [CLI] bundle run --help
Run the job or pipeline identified by KEY.
The KEY is the unique identifier of the resource to run. In addition to
customizing the run using any of the available flags, you can also specify
keyword or positional arguments as shown in these examples:
databricks bundle run my_job -- --key1 value1 --key2 value2
Or:
databricks bundle run my_job -- value1 value2 value3
If the specified job uses job parameters or the job has a notebook task with
parameters, the first example applies and flag names are mapped to the
parameter names.
If the specified job does not use job parameters and the job has a Python file
task or a Python wheel task, the second example applies.
Usage:
databricks bundle run [flags] KEY
Job Flags:
--params stringToString comma separated k=v pairs for job parameters (default [])
Job Task Flags:
Note: please prefer use of job-level parameters (--param) over task-level parameters.
For more information, see https://docs.databricks.com/en/workflows/jobs/create-run-jobs.html#pass-parameters-to-a-databricks-job-task
--dbt-commands strings A list of commands to execute for jobs with DBT tasks.
--jar-params strings A list of parameters for jobs with Spark JAR tasks.
--notebook-params stringToString A map from keys to values for jobs with notebook tasks. (default [])
--pipeline-params stringToString A map from keys to values for jobs with pipeline tasks. (default [])
--python-named-params stringToString A map from keys to values for jobs with Python wheel tasks. (default [])
--python-params strings A list of parameters for jobs with Python tasks.
--spark-submit-params strings A list of parameters for jobs with Spark submit tasks.
--sql-params stringToString A map from keys to values for jobs with SQL tasks. (default [])
Pipeline Flags:
--full-refresh strings List of tables to reset and recompute.
--full-refresh-all Perform a full graph reset and recompute.
--refresh strings List of tables to update.
--refresh-all Perform a full graph update.
--validate-only Perform an update to validate graph correctness.
Flags:
-h, --help help for run
--no-wait Don't wait for the run to complete.
--restart Restart the run if it is already running.
Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle run --help

View File

@ -0,0 +1,16 @@
>>> [CLI] bundle schema --help
Generate JSON Schema for bundle configuration
Usage:
databricks bundle schema [flags]
Flags:
-h, --help help for schema
Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle schema --help

View File

@ -0,0 +1,17 @@
>>> [CLI] bundle summary --help
Summarize resources deployed by this bundle
Usage:
databricks bundle summary [flags]
Flags:
--force-pull Skip local cache and load the state from the remote workspace
-h, --help help for summary
Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle summary --help

View File

@ -0,0 +1,19 @@
>>> [CLI] bundle sync --help
Synchronize bundle tree to the workspace
Usage:
databricks bundle sync [flags]
Flags:
--full perform full synchronization (default is incremental)
-h, --help help for sync
--interval duration file system polling interval (for --watch) (default 1s)
--output type type of the output format
--watch watch local file system for changes
Global Flags:
--debug enable debug logging
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle sync --help

View File

@ -0,0 +1,16 @@
>>> [CLI] bundle validate --help
Validate configuration
Usage:
databricks bundle validate [flags]
Flags:
-h, --help help for validate
Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"

View File

@ -0,0 +1 @@
trace $CLI bundle validate --help

View File

@ -0,0 +1,33 @@
>>> [CLI] bundle --help
Databricks Asset Bundles let you express data/AI/analytics projects as code.
Online documentation: https://docs.databricks.com/en/dev-tools/bundles/index.html
Usage:
databricks bundle [command]
Available Commands:
deploy Deploy bundle
deployment Deployment related commands
destroy Destroy deployed bundle resources
generate Generate bundle configuration
init Initialize using a bundle template
open Open a resource in the browser
run Run a job or pipeline update
schema Generate JSON Schema for bundle configuration
summary Summarize resources deployed by this bundle
sync Synchronize bundle tree to the workspace
validate Validate configuration
Flags:
-h, --help help for bundle
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"
Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
Use "databricks bundle [command] --help" for more information about a command.

View File

@ -0,0 +1 @@
trace $CLI bundle --help

View File

@ -0,0 +1,6 @@
bundle:
name: non_yaml_in_includes
include:
- test.py
- resources/*.yml

View File

@ -0,0 +1,10 @@
Error: Files in the 'include' configuration section must be YAML or JSON files.
in databricks.yml:5:4
The file test.py in the 'include' configuration section is not a YAML or JSON file, and only such files are supported. To include files to sync, specify them in the 'sync.include' configuration section instead.
Name: non_yaml_in_includes
Found 1 error
Exit code: 1

View File

@ -0,0 +1 @@
$CLI bundle validate

View File

@ -0,0 +1 @@
print("Hello world")

View File

@ -1,5 +1,5 @@
>>> $CLI bundle validate -o json -t default
>>> [CLI] bundle validate -o json -t default
{
"autoscale": {
"max_workers": 7,
@ -15,7 +15,7 @@
"spark_version": "13.3.x-scala2.12"
}
>>> $CLI bundle validate -o json -t development
>>> [CLI] bundle validate -o json -t development
{
"autoscale": {
"max_workers": 3,

View File

@ -1,10 +1,10 @@
>>> $CLI bundle validate -o json -t development
>>> [CLI] bundle validate -o json -t development
{
"foo": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/development/state/metadata.json"
"metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/override_job_cluster/development/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",
@ -27,12 +27,12 @@
}
}
>>> $CLI bundle validate -o json -t staging
>>> [CLI] bundle validate -o json -t staging
{
"foo": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/staging/state/metadata.json"
"metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/override_job_cluster/staging/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",

View File

@ -1,10 +1,10 @@
>>> $CLI bundle validate -o json -t development
>>> [CLI] bundle validate -o json -t development
{
"foo": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/development/state/metadata.json"
"metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/override_job_cluster/development/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",
@ -27,21 +27,21 @@
}
}
>>> $CLI bundle validate -t development
>>> [CLI] bundle validate -t development
Name: override_job_cluster
Target: development
Workspace:
User: tester@databricks.com
Path: /Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/development
User: [USERNAME]
Path: /Workspace/Users/[USERNAME]/.bundle/override_job_cluster/development
Validation OK!
>>> $CLI bundle validate -o json -t staging
>>> [CLI] bundle validate -o json -t staging
{
"foo": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/staging/state/metadata.json"
"metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/override_job_cluster/staging/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",
@ -64,11 +64,11 @@ Validation OK!
}
}
>>> $CLI bundle validate -t staging
>>> [CLI] bundle validate -t staging
Name: override_job_cluster
Target: staging
Workspace:
User: tester@databricks.com
Path: /Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/staging
User: [USERNAME]
Path: /Workspace/Users/[USERNAME]/.bundle/override_job_cluster/staging
Validation OK!

View File

@ -1,5 +1,5 @@
>>> errcode $CLI bundle validate -o json -t development
>>> errcode [CLI] bundle validate -o json -t development
Error: file ./test1.py not found

View File

@ -28,7 +28,7 @@
]
}
>>> errcode $CLI bundle validate -o json -t staging
>>> errcode [CLI] bundle validate -o json -t staging
Error: file ./test1.py not found
@ -63,14 +63,14 @@ Exit code: 1
]
}
>>> errcode $CLI bundle validate -t staging
>>> errcode [CLI] bundle validate -t staging
Error: file ./test1.py not found
Name: override_job_tasks
Target: staging
Workspace:
User: tester@databricks.com
Path: /Workspace/Users/tester@databricks.com/.bundle/override_job_tasks/staging
User: [USERNAME]
Path: /Workspace/Users/[USERNAME]/.bundle/override_job_tasks/staging
Found 1 error

View File

@ -1,5 +1,5 @@
>>> $CLI bundle validate -o json -t dev
>>> [CLI] bundle validate -o json -t dev
Warning: expected map, found string
at resources.clusters.my_cluster
in databricks.yml:6:17
@ -13,7 +13,7 @@ Warning: expected map, found string
}
}
>>> $CLI bundle validate -t dev
>>> [CLI] bundle validate -t dev
Warning: expected map, found string
at resources.clusters.my_cluster
in databricks.yml:6:17
@ -21,7 +21,7 @@ Warning: expected map, found string
Name: merge-string-map
Target: dev
Workspace:
User: tester@databricks.com
Path: /Workspace/Users/tester@databricks.com/.bundle/merge-string-map/dev
User: [USERNAME]
Path: /Workspace/Users/[USERNAME]/.bundle/merge-string-map/dev
Found 1 warning

View File

@ -1,5 +1,5 @@
>>> $CLI bundle validate -o json -t development
>>> [CLI] bundle validate -o json -t development
{
"foo": {
"clusters": [
@ -14,14 +14,14 @@
],
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_pipeline_cluster/development/state/metadata.json"
"metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/override_pipeline_cluster/development/state/metadata.json"
},
"name": "job",
"permissions": []
}
}
>>> $CLI bundle validate -o json -t staging
>>> [CLI] bundle validate -o json -t staging
{
"foo": {
"clusters": [
@ -36,7 +36,7 @@
],
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_pipeline_cluster/staging/state/metadata.json"
"metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/override_pipeline_cluster/staging/state/metadata.json"
},
"name": "job",
"permissions": []

View File

@ -1,5 +1,5 @@
bundle:
name: path_translation_nominal
name: fallback
include:
- "resources/*.yml"

View File

@ -0,0 +1,67 @@
[
{
"job_cluster_key": "default",
"notebook_task": {
"notebook_path": "/Workspace/Users/[USERNAME]/.bundle/fallback/development/files/src/notebook"
},
"task_key": "notebook_example"
},
{
"job_cluster_key": "default",
"spark_python_task": {
"python_file": "/Workspace/Users/[USERNAME]/.bundle/fallback/development/files/src/file.py"
},
"task_key": "spark_python_example"
},
{
"dbt_task": {
"commands": [
"dbt run",
"dbt run"
],
"project_directory": "/Workspace/Users/[USERNAME]/.bundle/fallback/development/files/src/dbt_project"
},
"job_cluster_key": "default",
"task_key": "dbt_example"
},
{
"job_cluster_key": "default",
"sql_task": {
"file": {
"path": "/Workspace/Users/[USERNAME]/.bundle/fallback/development/files/src/sql.sql"
},
"warehouse_id": "cafef00d"
},
"task_key": "sql_example"
},
{
"job_cluster_key": "default",
"libraries": [
{
"whl": "dist/wheel1.whl"
},
{
"whl": "dist/wheel2.whl"
}
],
"python_wheel_task": {
"package_name": "my_package"
},
"task_key": "python_wheel_example"
},
{
"job_cluster_key": "default",
"libraries": [
{
"jar": "target/jar1.jar"
},
{
"jar": "target/jar2.jar"
}
],
"spark_jar_task": {
"main_class_name": "com.example.Main"
},
"task_key": "spark_jar_example"
}
]

View File

@ -0,0 +1,22 @@
[
{
"file": {
"path": "/Workspace/Users/[USERNAME]/.bundle/fallback/development/files/src/file1.py"
}
},
{
"notebook": {
"path": "/Workspace/Users/[USERNAME]/.bundle/fallback/development/files/src/notebook1"
}
},
{
"file": {
"path": "/Workspace/Users/[USERNAME]/.bundle/fallback/development/files/src/file2.py"
}
},
{
"notebook": {
"path": "/Workspace/Users/[USERNAME]/.bundle/fallback/development/files/src/notebook2"
}
}
]

View File

@ -0,0 +1,16 @@
>>> [CLI] bundle validate -t development -o json
>>> [CLI] bundle validate -t error
Error: notebook this value is overridden not found. Local notebook references are expected
to contain one of the following file extensions: [.py, .r, .scala, .sql, .ipynb]
Name: fallback
Target: error
Workspace:
User: [USERNAME]
Path: /Workspace/Users/[USERNAME]/.bundle/fallback/error
Found 1 error
Exit code: 1

Some files were not shown because too many files have changed in this diff.