Compare commits

...

58 Commits

Author SHA1 Message Date
Shreyas Goenka 44586ffa05
address comments 2025-01-16 19:14:52 +01:00
Shreyas Goenka 0f62d0edcf
Merge remote-tracking branch 'origin' into refactor-bundle-init-squashed 2025-01-16 18:38:41 +01:00
Andrew Nester 511c8887a8
[Release] Release v0.239.0 (#2167)
### New feature announcement

#### Databricks Apps support

You can now manage Databricks Apps using DABs by defining an `app`
resource in your bundle configuration.
For more information, see the Databricks documentation:
https://docs.databricks.com/en/dev-tools/bundles/resources.html#app

#### Referencing complex variables in complex variables

You can now reference complex variables within other complex variables.
For more details see https://github.com/databricks/cli/pull/2157
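
As an illustration, a complex variable can now pull values from another
complex variable (hypothetical names, not from the release notes):

```
variables:
  defaults:
    type: complex
    default:
      node_type_id: "i3.xlarge"
      num_workers: 2
  cluster:
    type: complex
    default:
      spark_version: "15.4.x-scala2.12"
      node_type_id: ${var.defaults.node_type_id}
```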

CLI:
* Filter out system clusters in cluster picker
([#2131](https://github.com/databricks/cli/pull/2131)).
* Add command line flags for fields that are not in the API request body
([#2155](https://github.com/databricks/cli/pull/2155)).

Bundles:
* Added support for Databricks Apps in DABs
([#1928](https://github.com/databricks/cli/pull/1928)).
* Allow artifact path to be located outside the sync root
([#2128](https://github.com/databricks/cli/pull/2128)).
* Retry app deployment if there is an active deployment in progress
([#2153](https://github.com/databricks/cli/pull/2153)).
* Resolve variables in a loop
([#2164](https://github.com/databricks/cli/pull/2164)).
* Improve resolution of complex variables within complex variables
([#2157](https://github.com/databricks/cli/pull/2157)).
* Added output message to warn about slower deployments with apps
([#2161](https://github.com/databricks/cli/pull/2161)).
* Patch references to UC schemas to capture dependencies automatically
([#1989](https://github.com/databricks/cli/pull/1989)).
* Format default-python template
([#2110](https://github.com/databricks/cli/pull/2110)).
* Encourage the use of root_path in production to ensure single
deployment ([#1712](https://github.com/databricks/cli/pull/1712)).
* Log warnings to stderr for "bundle validate -o json"
([#2109](https://github.com/databricks/cli/pull/2109)).

API Changes:
* Changed `databricks account federation-policy update` command with new
required argument order.
* Changed `databricks account service-principal-federation-policy
update` command with new required argument order.

OpenAPI commit 779817ed8d63031f5ea761fbd25ee84f38feec0d (2025-01-08)
Dependency updates:
* Upgrade TF provider to 1.63.0
([#2162](https://github.com/databricks/cli/pull/2162)).
* Bump golangci-lint version to v1.63.4 from v1.63.1
([#2114](https://github.com/databricks/cli/pull/2114)).
* Bump astral-sh/setup-uv from 4 to 5
([#2116](https://github.com/databricks/cli/pull/2116)).
* Bump golang.org/x/oauth2 from 0.24.0 to 0.25.0
([#2080](https://github.com/databricks/cli/pull/2080)).
* Bump github.com/hashicorp/hc-install from 0.9.0 to 0.9.1
([#2079](https://github.com/databricks/cli/pull/2079)).
* Bump golang.org/x/term from 0.27.0 to 0.28.0
([#2078](https://github.com/databricks/cli/pull/2078)).
* Bump github.com/databricks/databricks-sdk-go from 0.54.0 to 0.55.0
([#2126](https://github.com/databricks/cli/pull/2126)).

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2025-01-16 16:25:17 +00:00
Denis Bilenko 2e70558dc1
Resolve variables in a loop (#2164)
## Changes
- Instead of doing 2 passes of variable resolution, loop until there
are no more updates (or we reach a limit of 100 iterations).
- Stacked on top of #2163 which is a regression test for this:
acceptance/bundle/variables/complex-transitive-deep
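
As an illustration, a minimal Go sketch of such a fixed-point loop
(hypothetical helper names; not the actual mutator code):

```
package main

import "fmt"

// Config stands in for the bundle configuration tree.
type Config struct{}

// resolveVariableReferences performs one resolution pass and reports
// how many references it replaced (hypothetical helper).
func resolveVariableReferences(cfg *Config) (int, error) { return 0, nil }

const maxPasses = 100

// resolveInLoop repeats resolution until a pass makes no updates, with
// a hard cap to guard against reference cycles.
func resolveInLoop(cfg *Config) error {
	for i := 0; i < maxPasses; i++ {
		updates, err := resolveVariableReferences(cfg)
		if err != nil {
			return err
		}
		if updates == 0 {
			return nil // fixed point reached
		}
	}
	return fmt.Errorf("variable resolution did not converge after %d passes", maxPasses)
}
```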

## Tests
Existing tests, new regression tests.

These tests already passed before, added for completeness:
- acceptance/bundle/variables/cycle
- acceptance/bundle/variables/complex-cross-ref
2025-01-16 14:39:54 +00:00
shreyas-goenka f2bba632cb
Patch references to UC schemas to capture dependencies automatically (#1989)
## Changes
Fixes https://github.com/databricks/cli/issues/1977.  

This PR modifies the bundle configuration to capture the dependency that
a UC Volume or a DLT pipeline might have on a UC schema at deployment
time. It does so by replacing the schema name with a reference of the
form `${resources.schemas.foo.name}`.

For example:
The following UC Volume definition depends on the UC schema with the
name `schema_name`. This mutator converts this configuration

from:
```
resources:
  volumes:
    bar:
      catalog_name: catalog_name
      name: volume_name
      schema_name: schema_name

  schemas:
    foo:
      catalog_name: catalog_name
      name: schema_name
```

to:

```
resources:
  volumes:
    bar:
      catalog_name: catalog_name
      name: volume_name
      schema_name: ${resources.schemas.foo.name}

  schemas:
    foo:
      catalog_name: catalog_name
      name: schema_name
```


## Tests
Unit tests and manually.
2025-01-16 13:27:00 +00:00
Andrew Nester fa87f22706
Changed warning message for apps (#2165)
## Changes
Changed warning message for apps

Original warning message added here:
https://github.com/databricks/cli/pull/2161
2025-01-16 13:03:35 +00:00
Denis Bilenko bc1610f6e6
Add a test for complex variable resolution with 3 levels (#2163)
Follow-up to #2157. That PR repeated variable resolution. This test
still does not resolve fully but would resolve with 3 passes. It is
slightly different from complex-transitive-deeper: this test does not
show any errors; the issue is purely not enough passes.
2025-01-16 12:14:00 +00:00
Andrew Nester 98244606b3
Upgrade TF provider to 1.63.0 (#2162)
## Changes
No significant changes to call out for DABs.
2025-01-16 12:04:00 +00:00
Andrew Nester 8f34fc7961
Added output message to warn about slower deployments with apps (#2161)
## Changes
When users create one or more Databricks apps in their bundle, the
initial bundle deployment can be slower because apps need to wait until
their compute is fully provisioned and started.

This PR adds a message to warn about it. The message will be removed
when the `no_compute` option becomes available in the TF provider and is
used in DABs (https://github.com/databricks/cli/pull/2144)
2025-01-16 11:39:59 +00:00
Denis Bilenko b273dc5942
Enable linter 'copyloopvar' and fix the issues (#2160)
## Changes
- Remove all unnecessary copies of the loop variable; they have not
been necessary since Go 1.22 https://go.dev/blog/loopvar-preview
- Enable the linter that catches this issue:
https://github.com/karamaru-alpha/copyloopvar
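
For context, a self-contained sketch of the pattern this linter flags
(illustrative; not taken from this PR's diff):

```
package main

import (
	"fmt"
	"sync"
)

func main() {
	tasks := []string{"a", "b", "c"}
	var wg sync.WaitGroup

	for _, task := range tasks {
		task := task // flagged by copyloopvar: redundant since Go 1.22
		wg.Add(1)
		go func() {
			defer wg.Done()
			fmt.Println(task)
		}()
	}
	wg.Wait()
}

// Since Go 1.22 every iteration gets a fresh loop variable, so the
// `task := task` copy above can simply be deleted.
```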

## Tests
Existing tests.
2025-01-16 11:20:50 +00:00
Andrew Nester a002a24e41
Add a unique schema for recreate pipeline test (#2159)
## Changes
The pipeline backend requires `target` to always be specified.

To satisfy this, we create a unique schema as part of the
`TestBundlePipelineRecreateWithoutAutoApprove` test, which is then used
in the pipelines.

## Tests
```
    helpers_test.go:148: stderr: Destroy complete!
--- PASS: TestBundlePipelineRecreateWithoutAutoApprove (415.39s)
PASS
coverage: [no statements]
ok      github.com/databricks/cli/integration/bundle    416.141s        coverage: [no statements]
```
2025-01-15 17:22:45 +00:00
Denis Bilenko 30dec59781
Improve resolution of complex variables within complex variables (#2157)
## Changes
- Remove ResolveVariableReferencesInComplexVariables - it blocked
complex-within-complex resolution for no good reason.
- Repeat regular resolution twice; it helps with a couple of test cases
we have.

There may be a case for running it 3 times or more in a loop, but there
is no test case for that, so this PR is a simple incremental
improvement.

## Tests
Existing acceptance tests. Previously all unit tests for complex
variables were converted to acceptance tests, to capture this change and
ensure nothing breaks.
2025-01-15 18:03:43 +01:00
Denis Bilenko 39b03592d7
Migrate TestResolveComplexVariableWithVarReference (#2156)
This is the last test referencing
ResolveVariableReferencesInComplexVariables, allowing removal of that
mutator.
2025-01-15 17:52:17 +01:00
Denis Bilenko d53a78e926
Introduce $DATABRICKS_URL replacement in tests (#2158)
## Changes
It covers both https://$DATABRICKS_HOST and http://$DATABRICKS_HOST so
the test output does not change between local runs and the cloud.

## Tests
Existing tests using golden files (acceptance and integration) catch
this and were updated.
2025-01-15 17:24:12 +01:00
Andrew Nester 20179457b9
Process all the fields in top level request object even if it contains request body (#2155)
## Changes

When regenerating the CLI with a new Go SDK
(https://github.com/databricks/cli/pull/2126) I noticed that some
parameters, such as `no_compute` for apps, were not added as flags for
the CLI commands.

This happened because we ignored all other top-level fields if there was
a request body object field.

This PR relies on the new AllFields method from Genkit, which returns
fields from both the request object and the request body object.
2025-01-15 17:02:58 +01:00
Denis Bilenko 581565a1c4
Migrate more variable tests to acceptance (#2154) 2025-01-15 15:59:42 +01:00
Andrew Nester dd554412a6
Retry app deployment if there is an active deployment in progress (#2153)
## Changes
If, before running an app, the app was stopped with an active
deployment, the Apps backend starts it and auto-deploys the last active
deployment. In such cases the StartApp API won't return any active or
pending deployments in its response, but deploying immediately after the
start might result in the error `Cannot deploy app *** as there is an
active deployment in progress`.

From the DABs side, we have to do a new deployment on every `bundle run`
(the command which starts the app and does the deployment) because local
files in the bundle might have changed and users expect the app to run
with the new code.

Thus this PR works around the error by catching the "deployment in
progress" error, getting any active or pending deployments, waiting for
them to finish, and then starting the new deployment again. If the
second attempt fails, the whole command fails.
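
A rough Go sketch of that retry flow (hypothetical helper names; the
actual implementation differs):

```
package main

import (
	"context"
	"errors"
)

// Hypothetical stand-ins for the Apps API calls.
var errInProgress = errors.New("active deployment in progress")

func startDeployment(ctx context.Context, app string) error          { return nil }
func waitForActiveDeployments(ctx context.Context, app string) error { return nil }

// deployWithRetry starts a deployment; if the backend reports an active
// deployment in progress, it waits for that to finish and retries once.
func deployWithRetry(ctx context.Context, app string) error {
	err := startDeployment(ctx, app)
	if err == nil || !errors.Is(err, errInProgress) {
		return err
	}
	if err := waitForActiveDeployments(ctx, app); err != nil {
		return err
	}
	// A failure of the second attempt fails the whole command.
	return startDeployment(ctx, app)
}
```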

## Tests
Added unit test.
2025-01-15 12:51:06 +01:00
Pieter Noordhuis 626045a17e
Use regular expressions for testdiff replacements (#2151)
## Changes

Replacement was split between the type `ReplacementContext` and the
`ReplaceOutput` function. The latter also ran a couple of regular
expressions. This change consolidates them such that it is up to the
caller to compose the set of replacements to use.

This change is required to accommodate UUID replacement in #2146.
2025-01-15 12:15:23 +01:00
Denis Bilenko 40e96b5af2
exec(test): Do not clear PATH; this breaks coverage on Windows (#2150)
## Changes
When setting up PATH in tests, put desired entry first but keep the rest
as well. Otherwise it fails on Windows

```
D:/a/cli/cli/libs/exec/exec_test.go:108
Error: Received unexpected error:
exit status 0xc0000135
```

Explanation from Claude:
> The error code 0xc0000135 is a Windows error indicating "Unable to
locate DLL"
> When code coverage is enabled, Go instruments the binary with coverage
tracking code, which requires additional DLL dependencies on Windows.

## Tests
Separate draft PR with coverage enabled on CI:
https://github.com/databricks/cli/pull/2141
2025-01-15 12:05:46 +01:00
Denis Bilenko 983a8a6633
Alias 'make' to 'make vendor fmt lint' (#2152)
The current default of building a binary is not frequently used.

The new default is useful after switching branches, stashing/unstashing,
or AI changes.

Note that "fmt" and "lint" are separate steps because "fmt" can fix
formatting and imports in the presence of compilation errors and "lint"
cannot. Without compilation errors, "lint" also does formatting.
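
A minimal sketch of such a default target (assumed tool choices; the
repository's actual Makefile differs):

```
.PHONY: default vendor fmt lint

default: vendor fmt lint

vendor:
	go mod vendor

fmt:
	gofmt -w .
	goimports -w .

lint:
	golangci-lint run --fix ./...
```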
2025-01-15 11:41:06 +01:00
Denis Bilenko b76eee0e8c
Migrate resolution tests to acceptance tests (#2143) 2025-01-15 11:22:23 +01:00
Denis Bilenko 5592fa889e
acceptance: add -buildvcs=false when building on Windows (#2148)
I get an error on my local Windows otherwise: error obtaining VCS
status: exit status 128

On CI this does not make a difference.
2025-01-15 10:45:57 +01:00
Gleb Kanterov 25f8ee8d66
Format default-python template (#2110)
## Changes
Format code in default-python template, so it's already pre-formatted.

## Tests

```
$ databricks bundle init libs/template/templates/default-python
$ ruff format --diff my_project     
6 files already formatted
```
2025-01-15 10:40:29 +01:00
Denis Bilenko 55494a0bda
Add test about using variable in bundle.git.branch (#2118)
This test checks the load-git-details functionality plus variable
interpolation there.

The variables do not work there because the LoadGitDetails mutator runs
before variable interpolation.

Additionally, correctly replace the tmp path that is used for
DATABRICKS_TF_EXEC_PATH.
2025-01-15 10:34:51 +01:00
Denis Bilenko cc44e368b8
Golden files: always include JSON-ed string (cont) (#2147)
## Changes
Follow-up to #2142, which did not actually enable JSON conversion
because of a reversed condition on err.

## Tests
Tested via #2118
2025-01-15 09:46:06 +01:00
Denis Bilenko 6a7eefa54b
Use the same test names on Win as on other OSes (#2149)
## Changes
Use name format "TestAccept/bundle/variables/host" (previously slashes
were reversed on Windows).

## Tests
Manually run "go test ./acceptance -v -run
TestAccept/bundle/variables/host" in Windows VM.
2025-01-15 09:43:40 +01:00
Pieter Noordhuis 82e35530b0
Add acceptance tests for builtin templates (#2135)
## Changes

To accommodate:
* Add the server URL to the set of output replacements
* Include a call to the permissions API in the dummy server
* Run the main script in a subshell to isolate working directory changes
2025-01-14 18:23:34 +00:00
dependabot[bot] 72e677d0ac
Bump github.com/databricks/databricks-sdk-go from 0.54.0 to 0.55.0 (#2126)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.54.0 to 0.55.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.55.0</h2>
<h3>Internal Changes</h3>
<ul>
<li>Bump staticcheck to 0.5.1 and add go 1.23 test coverage (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1106">#1106</a>).</li>
<li>Bump x/net, x/crypto dependencies (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1107">#1107</a>).</li>
<li>Create custom codeql.yml (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1114">#1114</a>).</li>
<li>Decouple serving and oauth2 package (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1110">#1110</a>).</li>
<li>Migrate workflows that need write access to use hosted runners (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1112">#1112</a>).</li>
<li>Move package credentials in config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1115">#1115</a>).</li>
<li>Update Queries test (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1104">#1104</a>).</li>
</ul>
<h3>API Changes:</h3>
<ul>
<li>Added <code>NoCompute</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/apps#CreateAppRequest">apps.CreateAppRequest</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseJob">jobs.BaseJob</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun">jobs.BaseRun</a>.</li>
<li>Added <code>PageToken</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetJobRequest">jobs.GetJobRequest</a>.</li>
<li>Added <code>HasMore</code> and <code>NextPageToken</code> fields for
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Job">jobs.Job</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run">jobs.Run</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>.</li>
<li>Added <code>AuthorizationDetails</code> and <code>EndpointUrl</code>
fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DataPlaneInfo">serving.DataPlaneInfo</a>.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#AccountFederationPolicyAPI">a.AccountFederationPolicy</a>
account-level service with new required argument order.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#ServicePrincipalFederationPolicyAPI">a.ServicePrincipalFederationPolicy</a>
account-level service with new required argument order.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateAccountFederationPolicyRequest">oauth2.UpdateAccountFederationPolicyRequest</a>
to no longer be required.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateServicePrincipalFederationPolicyRequest">oauth2.UpdateServicePrincipalFederationPolicyRequest</a>
to no longer be required.</li>
<li>[Breaking] Changed <code>DaysOfWeek</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#RestartWindow">pipelines.RestartWindow</a>
to type <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#DayOfWeekList">pipelines.DayOfWeekList</a>.</li>
</ul>
<p>OpenAPI SHA: 779817ed8d63031f5ea761fbd25ee84f38feec0d, Date:
2025-01-08</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>[Release] Release v0.55.0</h2>
<h3>Internal Changes</h3>
<ul>
<li>Bump staticcheck to 0.5.1 and add go 1.23 test coverage (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1106">#1106</a>).</li>
<li>Bump x/net, x/crypto dependencies (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1107">#1107</a>).</li>
<li>Create custom codeql.yml (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1114">#1114</a>).</li>
<li>Decouple serving and oauth2 package (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1110">#1110</a>).</li>
<li>Migrate workflows that need write access to use hosted runners (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1112">#1112</a>).</li>
<li>Move package credentials in config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1115">#1115</a>).</li>
<li>Update Queries test (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/1104">#1104</a>).</li>
</ul>
<h3>API Changes:</h3>
<ul>
<li>Added <code>NoCompute</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/apps#CreateAppRequest">apps.CreateAppRequest</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseJob">jobs.BaseJob</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun">jobs.BaseRun</a>.</li>
<li>Added <code>PageToken</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetJobRequest">jobs.GetJobRequest</a>.</li>
<li>Added <code>HasMore</code> and <code>NextPageToken</code> fields for
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Job">jobs.Job</a>.</li>
<li>Added <code>HasMore</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run">jobs.Run</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>.</li>
<li>Added <code>AuthorizationDetails</code> and <code>EndpointUrl</code>
fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DataPlaneInfo">serving.DataPlaneInfo</a>.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#AccountFederationPolicyAPI">a.AccountFederationPolicy</a>
account-level service with new required argument order.</li>
<li>[Breaking] Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#ServicePrincipalFederationPolicyAPI">a.ServicePrincipalFederationPolicy</a>
account-level service with new required argument order.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateAccountFederationPolicyRequest">oauth2.UpdateAccountFederationPolicyRequest</a>
to no longer be required.</li>
<li>Changed <code>UpdateMask</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateServicePrincipalFederationPolicyRequest">oauth2.UpdateServicePrincipalFederationPolicyRequest</a>
to no longer be required.</li>
<li>[Breaking] Changed <code>DaysOfWeek</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#RestartWindow">pipelines.RestartWindow</a>
to type <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#DayOfWeekList">pipelines.DayOfWeekList</a>.</li>
</ul>
<p>OpenAPI SHA: 779817ed8d63031f5ea761fbd25ee84f38feec0d, Date:
2025-01-08</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b83a7262d5"><code>b83a726</code></a>
[Release] Release v0.55.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1117">#1117</a>)</li>
<li><a
href="23d9c1ea2b"><code>23d9c1e</code></a>
[Internal] Move package credentials in config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1115">#1115</a>)</li>
<li><a
href="adc94cabf7"><code>adc94ca</code></a>
[Internal] Decouple serving and oauth2 package (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1110">#1110</a>)</li>
<li><a
href="83db3cbdab"><code>83db3cb</code></a>
[Internal] Create custom codeql.yml (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1114">#1114</a>)</li>
<li><a
href="2b55375727"><code>2b55375</code></a>
[Internal] Migrate workflows that need write access to use hosted
runners (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1">#1</a>...</li>
<li><a
href="03fb2681fa"><code>03fb268</code></a>
[Internal] Bump x/net, x/crypto dependencies (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1107">#1107</a>)</li>
<li><a
href="28e1a698ab"><code>28e1a69</code></a>
[Internal] Bump staticcheck to 0.5.1 and add go 1.23 test coverage (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1106">#1106</a>)</li>
<li><a
href="2399d721fe"><code>2399d72</code></a>
[Internal] Update Queries test (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/1104">#1104</a>)</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.54.0...v0.55.0">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>



---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2025-01-14 16:02:34 +00:00
Denis Bilenko fca6abdfac
Golden files: always include JSON-ed string (#2142)
## Changes
Always include both the verbatim and the JSON version of a replacement.

This helps when the string in question contains `\` or other characters
that need to be quoted.

Needed for https://github.com/databricks/cli/pull/2118

## Tests
Existing tests.
2025-01-14 15:50:55 +00:00
Denis Bilenko ccb2599b42
Add complex-transitive-deeper acceptance test (#2140)
An extension of the complex-transitive test that shows an error instead
of simply failing to interpolate.
2025-01-14 15:38:20 +00:00
Denis Bilenko a5e09ab28a
Coverage for acceptance tests (#2123)
## Changes

Add two new make commands:
- make acc-cover: runs acceptance tests and outputs
coverage-acceptance.txt
- make acc-showcover: shows coverage-acceptance.txt locally in a browser

Using the GOCOVERDIR functionality:
https://go.dev/blog/integration-test-coverage

This works, but there are a couple of issues encountered:
- GOCOVERDIR does not play well with regular "go test -cover". Once this
is fixed, we can simplify the code and have 'make cover' output coverage
for everything at once. We can also probably get rid of CLI_GOCOVERDIR.
https://github.com/golang/go/issues/66225
- When running tests in parallel against the same directory there is a
rare conflict on writing the covmeta file. For this reason each test
writes coverage to its own directory, which is then merged together by
'make acc-cover'.
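
For reference, a rough sketch of collecting and merging GOCOVERDIR-based
coverage with the stock Go tooling (illustrative commands, not the exact
Makefile recipe):

```
# Build an instrumented binary.
go build -cover -o cli .

# Each test run writes to its own coverage directory.
mkdir -p cov/test1 cov/test2 cov/merged
GOCOVERDIR=$PWD/cov/test1 ./cli --version
GOCOVERDIR=$PWD/cov/test2 ./cli --help

# Merge per-test directories, then convert to the text format.
go tool covdata merge -i=cov/test1,cov/test2 -o cov/merged
go tool covdata textfmt -i=cov/merged -o coverage-acceptance.txt
```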


## Tests
Manually running the new make commands.
2025-01-14 14:19:00 +00:00
Denis Bilenko 2ae2b7e8c8
Enable acceptance tests for manually running against the cloud (#2120)
## Changes
- If the CLOUD_ENV variable is set, acceptance tests will no longer set
up a server or override DATABRICKS_HOST/DATABRICKS_TOKEN/HOME env vars.
- I've updated the replacements logic in testdiff to use the tester /
tester@databricks.com convention.

## Tests
Manually running the current acceptance tests against dogfood on my
laptop, I get all tests passing except for 2 failures:

```
    --- FAIL: TestAccept/bundle/variables/env_overrides (0.09s)
    --- FAIL: TestAccept/bundle/variables/resolve-builtin (1.30s)
```
2025-01-14 13:50:28 +00:00
Andrew Nester fe31e4d02e
Fixed a typo in TestDeployBundleWithApp test (#2138)
## Changes
Fixed a typo in TestDeployBundleWithApp test

## Tests
```
   helpers_test.go:148: stderr: Destroy complete!
--- PASS: TestDeployBundleWithApp (647.51s)
PASS
coverage: [no statements]
ok      github.com/databricks/cli/integration/bundle    647.985s        coverage: [no statements]
```
2025-01-14 13:24:22 +00:00
Denis Bilenko 98a1e73a0f
Simplify replacements logic for golden files (#2132)
## Changes
- Do not sort; use a fixed order of replacements.

## Tests
Existing tests.
2025-01-14 11:00:38 +00:00
Denis Bilenko 2b452973f3
Enable linter 'unconvert' and fix the issues found (#2136) 2025-01-14 10:56:38 +00:00
Pieter Noordhuis 5d9bc3b553
Allow artifact path to be located outside the sync root (#2128)
## Changes

We perform a check during path translation that the path being
referenced is contained in the bundle's sync root. If it isn't, it's not
a valid remote reference. However, this doesn't apply to paths that are
_always_ local, such as the artifact path. An artifact's build command
is executed in its path. Files created by the artifact build (e.g.
wheels or JARs) don't need to be in the sync root because they have a
dedicated and different upload path into `${workspace.artifact_path}`.

Therefore, this check that a path is contained in the bundle's sync root
doesn't apply to artifact paths. This change modifies the structure of
path translation to allow opting out of this check.

Fixes #1927.
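
For illustration, a bundle where the wheel is built outside the sync
root (hypothetical paths and build command):

```
# databricks.yml (the sync root is this directory)
bundle:
  name: my_bundle

artifacts:
  shared_wheel:
    type: whl
    build: python -m build --wheel
    # Located outside the sync root; its outputs are uploaded into
    # ${workspace.artifact_path} rather than synced.
    path: ../shared-lib
```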

## Tests

* Existing and new tests pass.
* Manually confirmed that building and using a wheel built outside the
sync root path works as expected.
* No acceptance tests because we don't run build as part of validate.
2025-01-14 08:34:55 +00:00
Denis Bilenko e682eeba80
Pin all github actions to commit hash (#2129)
## Changes
- Pin all GitHub actions to a commit hash.
- Modify vedantmgoyal2009/winget-releaser to use a tag format that
dependabot can understand.

Pinning is done by
https://github.com/databricks/cli/blob/denik/pin-actions-script/pin_actions.py
(100% chatgpt authored). Commits and tags are verified manually.

This format should be recognized by dependabot enabled in
https://github.com/databricks/cli/pull/2112

## Tests
Existing tests.
2025-01-14 07:39:34 +00:00
Pieter Noordhuis e1f5f60a8d
Filter out system clusters in cluster picker (#2131)
## Changes

As of the clusters API v2.1 the results include system clusters. On
large workspaces this can lead to long load times and include many
irrelevant results. The cluster picker should only show interactive
clusters.

Also see #1754.

## Tests

Manually confirmed the picker runs fast on a large workspace.
2025-01-14 07:38:28 +00:00
Andrew Nester 913e10a037
Added support for Databricks Apps in DABs (#1928)
## Changes
Now it's possible to configure a new `app` resource in a bundle and
point it to the custom `source_code_path` location where the Databricks
App code is defined.

On `databricks bundle deploy`, DABs will create an app. All subsequent
`databricks bundle deploy` executions will update the existing app if
there are any updates.

On `databricks bundle run <my_app>`, DABs will execute the app
deployment. If the app is not started yet, it will start the app first.

### Bundle configuration

```
bundle:
  name: apps

variables:
  my_job_id:
    description: "ID of job to run app"
    lookup:
      job: "My Job"
  databricks_name:
    description: "Name for app user"
  additional_flags:
    description: "Additional flags to run command app"
    default: ""
  my_app_config:
    type: complex
    description: "Configuration for my Databricks App"
    default:
      command:
        - flask
        - --app
        - hello
        - run
        - ${var.additional_flags}
      env:
        - name: DATABRICKS_NAME
          value: ${var.databricks_name}

resources:
  apps:
    my_app:
      name: "anester-app" # required and has to be unique
      description: "My App"
      source_code_path: ./app # required and points to location of app code
      config: ${var.my_app_config}
      resources:
        - name: "my-job"
          description: "A job for app to be able to run"
          job:
            id: ${var.my_job_id}
            permission: "CAN_MANAGE_RUN"
      permissions:
        - user_name: "foo@bar.com"
          level: "CAN_VIEW"
        - service_principal_name: "my_sp"
          level: "CAN_MANAGE"

targets:
  dev:
    variables:
      databricks_name: "Andrew (from dev)"
      additional_flags: --debug
  
  prod:
    variables:
      databricks_name: "Andrew (from prod)"
```

### Execution
1. `databricks bundle deploy -t dev`
2. `databricks bundle run my_app -t dev`

**If app is started**
```
✓ Getting the status of the app my-app
✓ App is in RUNNING state
✓ Preparing source code for new app deployment.
✓ Deployment is pending
✓ Starting app with command: flask --app hello run --debug
✓ App started successfully
You can access the app at <app-url>
```

**If app is not started**
```
✓ Getting the status of the app my-app
✓ App is in UNAVAILABLE state
✓ Starting the app my-app
✓ App is starting...
....
✓ App is starting...
✓ App is started!
✓ Preparing source code for new app deployment.
✓ Downloading source code from /Workspace/Users/...
✓ Starting app with command: flask --app hello run --debug
✓ App started successfully
You can access the app at <app-url>
```

## Tests
Added unit and config tests + manual test.

```
--- PASS: TestAccDeployBundleWithApp (404.59s)
PASS
coverage: 36.8% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       405.035s        coverage: 36.8% of statements in ./...
```
2025-01-13 16:43:48 +00:00
Denis Bilenko a6412e4334
Remove redundant lines from PrepareReplacementsUser (#2130)
They are not necessary because they are added below. Also, they will
cause a crash if u.Name is nil.
2025-01-13 16:12:03 +00:00
dependabot[bot] 8234604cad
Bump golang.org/x/term from 0.27.0 to 0.28.0 (#2078)
Bumps [golang.org/x/term](https://github.com/golang/term) from 0.27.0 to
0.28.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="40b02d69cd"><code>40b02d6</code></a>
go.mod: update golang.org/x dependencies</li>
<li>See full diff in <a
href="https://github.com/golang/term/compare/v0.27.0...v0.28.0">compare
view</a></li>
</ul>
</details>
<br />



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-13 13:26:55 +00:00
dependabot[bot] f8ab384bfb
Bump github.com/hashicorp/hc-install from 0.9.0 to 0.9.1 (#2079)
Bumps
[github.com/hashicorp/hc-install](https://github.com/hashicorp/hc-install)
from 0.9.0 to 0.9.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/hashicorp/hc-install/releases">github.com/hashicorp/hc-install's
releases</a>.</em></p>
<blockquote>
<h2>v0.9.1</h2>
<h2>What's Changed</h2>
<ul>
<li>build(deps): bump github.com/go-git/go-git/v5 from 5.12.0 to 5.13.0
by <a href="https://github.com/dependabot"><code>@​dependabot</code></a>
in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/268">hashicorp/hc-install#268</a></li>
<li>build(deps): bump github.com/ProtonMail/go-crypto from 1.1.0 to
1.1.2 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/261">hashicorp/hc-install#261</a></li>
<li>build(deps): bump github.com/ProtonMail/go-crypto from 1.1.0-alpha.2
to 1.1.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/259">hashicorp/hc-install#259</a></li>
<li>build(deps): bump github.com/ProtonMail/go-crypto from 1.1.2 to
1.1.3 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/263">hashicorp/hc-install#263</a></li>
<li>build(deps): bump golang.org/x/mod from 0.21.0 to 0.22.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/262">hashicorp/hc-install#262</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/imakewebthings"><code>@​imakewebthings</code></a>
made their first contribution in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/252">hashicorp/hc-install#252</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/hashicorp/hc-install/compare/v0.9.0...v0.9.1">https://github.com/hashicorp/hc-install/compare/v0.9.0...v0.9.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="a9cdf85469"><code>a9cdf85</code></a>
Prepare for 0.9.1 release (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/269">#269</a>)</li>
<li><a
href="18d08ba3e4"><code>18d08ba</code></a>
build(deps): Bump workflows to latest trusted versions (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/266">#266</a>)</li>
<li><a
href="e716f0ac3e"><code>e716f0a</code></a>
build(deps): bump github.com/go-git/go-git/v5 from 5.12.0 to 5.13.0 (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/268">#268</a>)</li>
<li><a
href="cca0f6dd33"><code>cca0f6d</code></a>
ci: Report code coverage (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/264">#264</a>)</li>
<li><a
href="131f8ffdb0"><code>131f8ff</code></a>
build(deps): bump github.com/ProtonMail/go-crypto from 1.1.2 to 1.1.3
(<a
href="https://redirect.github.com/hashicorp/hc-install/issues/263">#263</a>)</li>
<li><a
href="2609a7830a"><code>2609a78</code></a>
build(deps): bump golang.org/x/mod from 0.21.0 to 0.22.0 (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/262">#262</a>)</li>
<li><a
href="b9043f8dd1"><code>b9043f8</code></a>
build(deps): bump github.com/ProtonMail/go-crypto from 1.1.0 to 1.1.2
(<a
href="https://redirect.github.com/hashicorp/hc-install/issues/261">#261</a>)</li>
<li><a
href="c1dc8ac751"><code>c1dc8ac</code></a>
build(deps): bump github.com/ProtonMail/go-crypto from 1.1.0-alpha.2 to
1.1.0...</li>
<li><a
href="8ed2e0f78e"><code>8ed2e0f</code></a>
build(deps): Bump workflows to latest trusted versions (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/258">#258</a>)</li>
<li><a
href="7a0461e713"><code>7a0461e</code></a>
build(deps): Bump workflows to latest trusted versions (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/257">#257</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/hashicorp/hc-install/compare/v0.9.0...v0.9.1">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/hashicorp/hc-install | [>= 0.8.a, < 0.9] |
</details>



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-13 13:26:47 +00:00
dependabot[bot] 244a5b6bc6
Bump golang.org/x/oauth2 from 0.24.0 to 0.25.0 (#2080)
Bumps [golang.org/x/oauth2](https://github.com/golang/oauth2) from
0.24.0 to 0.25.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="49a531d12a"><code>49a531d</code></a>
all: make method and struct comments match the names</li>
<li>See full diff in <a
href="https://github.com/golang/oauth2/compare/v0.24.0...v0.25.0">compare
view</a></li>
</ul>
</details>
<br />



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-13 13:26:35 +00:00
Denis Bilenko 1ead1b2e36
Move merge fix-ups after variable resolution (#2125)
## Changes
Move mutator.Merge{JobClusters,JobParameters,JobTasks,PipelineClusters}
after variable resolution. This helps with the case when a key contains
a variable.
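
For example (illustrative config), two entries that only merge into one
once the variable in the key is resolved:

```
variables:
  cluster_key:
    default: main_cluster

resources:
  jobs:
    my_job:
      job_clusters:
        - job_cluster_key: ${var.cluster_key}
        - job_cluster_key: main_cluster
          new_cluster:
            num_workers: 2
```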

@pietern mentioned here
https://github.com/databricks/cli/pull/2101#pullrequestreview-2539168762
it should be safe.

## Tests
Existing acceptance that was capturing the bug is updated with corrected
output.
2025-01-13 13:01:31 +00:00
Denis Bilenko cae21b36de
Add a test re using variable in host (#2117)
Related issue: https://github.com/databricks/cli/issues/2095
2025-01-13 12:31:09 +00:00
Lennart Kats (databricks) 3e40a0c2f1
Encourage the use of root_path in production to ensure single deployment (#1712)
## Changes

This updates `mode: production` to allow `root_path` to indicate
uniqueness. Historically, we required `run_as` for this, which isn't
actually very effective for that purpose. `run_as` also had the problem
that it doesn't work for pipelines.
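
A sketch of the encouraged pattern (hypothetical workspace path):

```
targets:
  prod:
    mode: production
    workspace:
      # A unique root path per target ensures a single deployment.
      root_path: /Workspace/Shared/.bundle/prod/${bundle.name}
```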

This is a cherry-pick from https://github.com/databricks/cli/pull/1387

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2025-01-13 12:19:12 +00:00
Gleb Kanterov f8f804fe17
PythonMutator: update instrumentation (#2124)
## Changes
Update instrumentation for PythonMutator to handle `experimental/python`
config.

## Tests
Unit tests
2025-01-13 09:16:29 +00:00
dependabot[bot] d525ff67be
Bump astral-sh/setup-uv from 4 to 5 (#2116)
Bumps [astral-sh/setup-uv](https://github.com/astral-sh/setup-uv) from 4
to 5.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/setup-uv/releases">astral-sh/setup-uv's
releases</a>.</em></p>
<blockquote>
<h2>v5.0.0 🎄 Merry Christmas - Help fastly and users by default</h2>
<h2>Changes</h2>
<p>This Christmas 🎄 release is a bit early but still full of presents 🎁.
Since we are changing some of the defaults this can lead to breaking
changes, thus the major version increase.</p>
<p>Here are the highlights:</p>
<h3><a
href="https://redirect.github.com/astral-sh/setup-uv/pull/193">Default
to enable-cache: true on GitHub hosted runners</a></h3>
<p>Did you know that Fastly, the company hosting PyPI,
theoretically has to pay $12.5 million per month and so far has served
more than 2.41 <strong>exabytes</strong> of data?
<img
src="https://github.com/user-attachments/assets/f2f6cb3f-68f6-4e37-abb1-d3bf1f278533"
alt="image" /></p>
<p>This is why <a
href="https://redirect.github.com/astral-sh/setup-uv/issues/54">they
asked us</a> to turn on caching by default. After weighting the pros and
cons we decided to automatically upload the cache to the GitHub Actions
cache when running on GitHub hosted runners. You can still disable that
with <code>enable-cache: false</code>.</p>
<p>I remember when I first got into actions and didn't understand all
the magic. I was baffled that some actions did something behind the
scenes to make everything faster. I hope with this change we help a lot
of users who don't want to or are afraid to understand what
<code>enable-cache</code> does.</p>
<h3><a
href="https://redirect.github.com/astral-sh/setup-uv/pull/185">Add
**/requirements*.txt to default cache-dependency-glob</a></h3>
<p>If caching is enabled we automatically searched for a
<code>uv.lock</code> file and when this changed we knew we had to
refresh the cache. A lot of projects don't use this but rather the good
old <code>requirements.txt</code>. We now automatically search for both
<code>uv.lock</code>and <code>requirements*.txt</code> (this means also
<code>requirements-test.txt</code>, <code>requirements-dev.txt</code>,
...) files.
You can change this with <code>cache-dependency-glob</code></p>
<h3><a
href="https://redirect.github.com/astral-sh/setup-uv/pull/194">Auto
activate venv when python-version is set</a></h3>
<p>Some workflows install packages on the fly. This automatically works
when using a python version that is already present on the runner. But
if uv installs the version, e.g. because it is a free-threaded version
or an old one, it is a <a
href="https://astral.sh/blog/python-build-standalone">standalone-build</a>
and installing packages &quot;into the system&quot; is not possible.</p>
<p>We now automatically create a new virtual environment with <code>uv
venv</code> and activate it for the rest of the workflow if
<code>python-version</code> is used. This means you can now do</p>
<pre lang="yaml"><code>- name: Install uv
  uses: astral-sh/setup-uv@auto-environment
  with:
    python-version: 3.13t
- run: uv pip install -i
https://pypi.anaconda.org/scientific-python-nightly-wheels/simple cython
</code></pre>
<h2>🚨 Breaking changes</h2>
<ul>
<li>Default to enable-cache: true on GitHub hosted runners <a
href="https://github.com/eifinger"><code>@​eifinger</code></a> (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/193">#193</a>)</li>
<li>Add **/requirements*.txt to default cache-dependency-glob <a
href="https://github.com/eifinger"><code>@​eifinger</code></a> (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/185">#185</a>)</li>
</ul>
<h2>🐛 Bug fixes</h2>
<ul>
<li>Always use api.github.com <a
href="https://github.com/eifinger"><code>@​eifinger</code></a> (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/191">#191</a>)</li>
</ul>
<h2>🚀 Enhancements</h2>
<ul>
<li>Auto activate venv when python-version is set <a
href="https://github.com/eifinger"><code>@​eifinger</code></a> (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/194">#194</a>)</li>
<li>Add python version to cache key <a
href="https://github.com/eifinger"><code>@​eifinger</code></a> (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/187">#187</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="887a942a15"><code>887a942</code></a>
Set VIRTUAL_ENV to .venv instead of .venv/bin (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/210">#210</a>)</li>
<li><a
href="d174a24c07"><code>d174a24</code></a>
Align use of <code>actions/setup-python</code> with uv docu (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/207">#207</a>)</li>
<li><a
href="12c852e6ba"><code>12c852e</code></a>
Remove uv version from cache key (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/206">#206</a>)</li>
<li><a
href="180f8b4439"><code>180f8b4</code></a>
Fix wrong cacheDependencyPathHash (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/201">#201</a>)</li>
<li><a
href="e3fb95a689"><code>e3fb95a</code></a>
Warn instead of fail for no-dependency-glob (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/200">#200</a>)</li>
<li><a
href="2af22b5b2d"><code>2af22b5</code></a>
chore: update known checksums for 0.5.11 (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/198">#198</a>)</li>
<li><a
href="dd578776bb"><code>dd57877</code></a>
Auto activate venv when python-version is set (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/194">#194</a>)</li>
<li><a
href="85aa0bf0c1"><code>85aa0bf</code></a>
chore: update known checksums for 0.5.10 (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/196">#196</a>)</li>
<li><a
href="1f2cbfa7bb"><code>1f2cbfa</code></a>
Bump <code>@​types/node</code> from 22.10.1 to 22.10.2 (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/189">#189</a>)</li>
<li><a
href="25b3ce6330"><code>25b3ce6</code></a>
chore: update known checksums for 0.5.9 (<a
href="https://redirect.github.com/astral-sh/setup-uv/issues/195">#195</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/setup-uv/compare/v4...v5">compare
view</a></li>
</ul>
</details>
<br />



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-10 16:49:05 +00:00
Pieter Noordhuis dc3a157fdc
Remove cleanup in testcli package (#2108)
## Changes

The main CLI entry point used to be a global variable, and the global
state had to be cleaned up after every test run. This hasn't been the
case for a while, and instead, the CLI is initialized in a function
call. State accumulated by a single CLI "instance" can no longer leak
into other instances, so we no longer have to perform cleanup.

## Tests

Existing tests pass.
2025-01-10 10:16:53 +00:00
Denis Bilenko 99cd3fe184
Bump golangci-lint version to v1.63.4 from v1.63.1 (#2114) 2025-01-10 10:49:57 +01:00
Denis Bilenko 75cd582021
Remove lint.sh; re-add 'make fmt' (#2113)
See the Makefile for an explanation of the difference between 'make fmt'
and 'make lint'.

I also removed lint.sh. The original motivation was to use it in aider,
but it's not a good fit there, because aider passes filenames, and that
does not work well with most Go linters, which require whole packages to
work.

Follow up to #2062, #2056, #2051.
2025-01-10 10:49:33 +01:00
Denis Bilenko 72e833a897
Configure dependabot to check for new github-actions (#2112) 2025-01-10 09:39:00 +00:00
Denis Bilenko f2c4cae9f1
Increase close-after-stale from 7 to 30 days (#2111)
Giving 7 days to react before closing is too aggressive, IMO. Changed it
to 30.

Also changed the 'stale' threshold from 30 to 60 days.

Also removed the dry-run setting; it does not appear to do anything.
2025-01-10 09:32:39 +00:00
Denis Bilenko 6d3b4159bd
Log warnings to stderr for "bundle validate -o json" (#2109)
## Changes
Previously diagnostics were not seen in JSON output mode. This change
prints them to stderr.

This also fixes acceptance tests to preprocess all output with
s/execPath/$CLI/, not just output.txt.

## Tests
Existing acceptance tests. In one case I've added a non-JSON command to
check that the outputs match.
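As a hedged sketch of the stream separation this change establishes (illustrative code, not the CLI's actual rendering path): JSON stays on stdout so it remains parseable, while diagnostics go to stderr.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

func main() {
	// Machine-readable output goes to stdout; a pipe to jq sees only this.
	out := map[string]any{"name": "my_bundle", "target": "dev"}
	if err := json.NewEncoder(os.Stdout).Encode(out); err != nil {
		fmt.Fprintln(os.Stderr, "Error:", err)
		os.Exit(1)
	}
	// Diagnostics go to stderr, so they are visible but never corrupt the JSON.
	fmt.Fprintln(os.Stderr, "Warning: example diagnostic rendered to stderr")
}
```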
2025-01-10 08:51:59 +00:00
shreyas-goenka b0c1c23630
Add `uuid` to builtin templates (#2088)
## Changes
This is useful for tracking telemetry associated with the templates and
can later be useful for functional use cases as well. MLOps Stacks does
the same here: https://github.com/databricks/mlops-stacks/pull/185

## Tests
Existing tests.
2025-01-09 18:19:34 +00:00
Denis Bilenko a0455bcaef
Migrate bundle/tests/undefined_resources_test.go to acceptance test (#2106)
Add sort_blocks.py helper to deal with non-determinism.
2025-01-09 15:21:24 +00:00
Pieter Noordhuis 4b67e9f336
Pass tag to release as input to publish-winget workflow (#2107)
## Changes

This workflow only worked if it was triggered on the tag to be published
itself. This means it is not possible to release a version if the
workflow configuration at that tag is broken (as is the case for
v0.238.0 because of #2105).

This change adds a "tag" input that can be set when manually triggering
the workflow.

## Tests

* Successful run with this change:
https://github.com/databricks/cli/actions/runs/12689281843
* Pull request that the run created:
https://github.com/microsoft/winget-pkgs/pull/209220
2025-01-09 12:07:29 +00:00
Pieter Noordhuis 3b3ede6e31
Update runner for the publish-winget job (#2105)
## Changes

This action uses a token to access the release artifacts and, as such,
needs to execute on the runner that's on the allowlist.

Related PRs:
* #2098
* #2077
2025-01-09 11:21:30 +00:00
219 changed files with 5762 additions and 1331 deletions

View File

@ -1 +1 @@
a6a317df8327c9b1e5cb59a03a42ffa2aabeef6d
779817ed8d63031f5ea761fbd25ee84f38feec0d

View File

@ -140,9 +140,9 @@ func new{{.PascalName}}() *cobra.Command {
{{- end}}
{{$method := .}}
{{ if not .IsJsonOnly }}
{{range $request.Fields -}}
{{range .AllFields -}}
{{- if not .Required -}}
{{if .Entity.IsObject }}// TODO: complex arg: {{.Name}}
{{if .Entity.IsObject}}{{if not (eq . $method.RequestBodyField) }}// TODO: complex arg: {{.Name}}{{end}}
{{else if .Entity.IsAny }}// TODO: any: {{.Name}}
{{else if .Entity.ArrayValue }}// TODO: array: {{.Name}}
{{else if .Entity.MapValue }}// TODO: map via StringToStringVar: {{.Name}}

View File

@ -4,3 +4,7 @@ updates:
directory: "/"
schedule:
interval: "weekly"
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "monthly"

View File

@ -18,7 +18,7 @@ jobs:
pull-requests: write
steps:
- uses: actions/stale@v9
- uses: actions/stale@28ca1036281a5e5922ead5184a1bbf96e5fc984e # v9.0.0
with:
stale-issue-message: This issue has not received a response in a while. If you want to keep this issue open, please leave a comment below and auto-close will be canceled.
stale-pr-message: This PR has not received an update in a while. If you want to keep this PR open, please leave a comment below or push a new commit and auto-close will be canceled.
@ -31,10 +31,8 @@ jobs:
exempt-pr-labels: No Autoclose
# Issue timing
days-before-stale: 30
days-before-close: 7
days-before-stale: 60
days-before-close: 30
repo-token: ${{ secrets.GITHUB_TOKEN }}
loglevel: DEBUG
# TODO: Remove dry-run after merge when confirmed it works correctly
dry-run: true

View File

@ -25,7 +25,7 @@ jobs:
if: "${{ github.event.pull_request.head.repo.fork }}"
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Delete old comments
env:

View File

@ -20,7 +20,7 @@ jobs:
steps:
- name: Generate GitHub App Token
id: generate-token
uses: actions/create-github-app-token@v1
uses: actions/create-github-app-token@c1a285145b9d317df6ced56c09f525b5c2b6f755 # v1.11.1
with:
app-id: ${{ secrets.DECO_WORKFLOW_TRIGGER_APP_ID }}
private-key: ${{ secrets.DECO_WORKFLOW_TRIGGER_PRIVATE_KEY }}

View File

@ -23,7 +23,7 @@ jobs:
steps:
- name: Generate GitHub App Token
id: generate-token
uses: actions/create-github-app-token@v1
uses: actions/create-github-app-token@c1a285145b9d317df6ced56c09f525b5c2b6f755 # v1.11.1
with:
app-id: ${{ secrets.DECO_WORKFLOW_TRIGGER_APP_ID }}
private-key: ${{ secrets.DECO_WORKFLOW_TRIGGER_PRIVATE_KEY }}

View File

@ -2,15 +2,27 @@ name: publish-winget
on:
workflow_dispatch:
inputs:
tag:
description: 'Tag to publish'
default: ''
jobs:
publish-to-winget-pkgs:
runs-on: windows-latest
runs-on:
group: databricks-protected-runner-group
labels: windows-server-latest
environment: release
steps:
- uses: vedantmgoyal2009/winget-releaser@93fd8b606a1672ec3e5c6c3bb19426be68d1a8b0 # https://github.com/vedantmgoyal2009/winget-releaser/releases/tag/v2
- uses: vedantmgoyal2009/winget-releaser@93fd8b606a1672ec3e5c6c3bb19426be68d1a8b0 # v2
with:
identifier: Databricks.DatabricksCLI
installers-regex: 'windows_.*-signed\.zip$' # Only signed Windows releases
token: ${{ secrets.ENG_DEV_ECOSYSTEM_BOT_TOKEN }}
fork-user: eng-dev-ecosystem-bot
# Use the tag from the input, or the ref name if the input is not provided.
# The ref name is equal to the tag name when this workflow is triggered by the "sign-cli" command.
release-tag: ${{ inputs.tag || github.ref_name }}

View File

@ -45,20 +45,20 @@ jobs:
steps:
- name: Checkout repository and submodules
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Setup Go
uses: actions/setup-go@v5
uses: actions/setup-go@3041bf56c941b39c61721a86cd11f3bb1338122a # v5.2.0
with:
go-version: 1.23.4
- name: Setup Python
uses: actions/setup-python@v5
uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b # v5.3.0
with:
python-version: '3.9'
- name: Install uv
uses: astral-sh/setup-uv@v4
uses: astral-sh/setup-uv@887a942a15af3a7626099df99e897a18d9e5ab3a # v5.1.0
- name: Set go env
run: |
@ -79,8 +79,8 @@ jobs:
name: lint
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-go@v5
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/setup-go@3041bf56c941b39c61721a86cd11f3bb1338122a # v5.2.0
with:
go-version: 1.23.4
# Use different schema from regular job, to avoid overwriting the same key
@ -95,9 +95,9 @@ jobs:
# Exit with status code 1 if there are differences (i.e. unformatted files)
git diff --exit-code
- name: golangci-lint
uses: golangci/golangci-lint-action@v6
uses: golangci/golangci-lint-action@971e284b6050e8a5849b72094c50ab08da042db8 # v6.1.1
with:
version: v1.63.1
version: v1.63.4
args: --timeout=15m
validate-bundle-schema:
@ -106,10 +106,10 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Setup Go
uses: actions/setup-go@v5
uses: actions/setup-go@3041bf56c941b39c61721a86cd11f3bb1338122a # v5.2.0
with:
go-version: 1.23.4
# Use different schema from regular job, to avoid overwriting the same key

View File

@ -26,13 +26,13 @@ jobs:
steps:
- name: Checkout repository and submodules
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
fetch-tags: true
- name: Setup Go
uses: actions/setup-go@v5
uses: actions/setup-go@3041bf56c941b39c61721a86cd11f3bb1338122a # v5.2.0
with:
go-version: 1.23.4
@ -48,27 +48,27 @@ jobs:
- name: Run GoReleaser
id: releaser
uses: goreleaser/goreleaser-action@v6
uses: goreleaser/goreleaser-action@9ed2f89a662bf1735a48bc8557fd212fa902bebf # v6.1.0
with:
version: ~> v2
args: release --snapshot --skip docker
- name: Upload macOS binaries
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: cli_darwin_snapshot
path: |
dist/*_darwin_*/
- name: Upload Linux binaries
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: cli_linux_snapshot
path: |
dist/*_linux_*/
- name: Upload Windows binaries
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: cli_windows_snapshot
path: |
@ -88,7 +88,7 @@ jobs:
# Snapshot release may only be updated for commits to the main branch.
if: github.ref == 'refs/heads/main'
uses: softprops/action-gh-release@v1
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
name: Snapshot
prerelease: true

View File

@ -18,13 +18,13 @@ jobs:
steps:
- name: Checkout repository and submodules
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
fetch-tags: true
- name: Setup Go
uses: actions/setup-go@v5
uses: actions/setup-go@3041bf56c941b39c61721a86cd11f3bb1338122a # v5.2.0
with:
go-version: 1.23.4
@ -37,7 +37,7 @@ jobs:
# Log into the GitHub Container Registry. The goreleaser action will create
# the docker images and push them to the GitHub Container Registry.
- uses: "docker/login-action@v3"
- uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567 # v3.3.0
with:
registry: "ghcr.io"
username: "${{ github.actor }}"
@ -46,11 +46,11 @@ jobs:
# QEMU is required to build cross platform docker images using buildx.
# It allows virtualization of the CPU architecture at the application level.
- name: Set up QEMU dependency
uses: docker/setup-qemu-action@v3
uses: docker/setup-qemu-action@53851d14592bedcffcf25ea515637cff71ef929a # v3.3.0
- name: Run GoReleaser
id: releaser
uses: goreleaser/goreleaser-action@v6
uses: goreleaser/goreleaser-action@9ed2f89a662bf1735a48bc8557fd212fa902bebf # v6.1.0
with:
version: ~> v2
args: release
@ -71,7 +71,7 @@ jobs:
echo "VERSION=${VERSION:1}" >> $GITHUB_ENV
- name: Update setup-cli
uses: actions/github-script@v7
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
with:
github-token: ${{ secrets.DECO_GITHUB_TOKEN }}
script: |
@ -99,7 +99,7 @@ jobs:
echo "VERSION=${VERSION:1}" >> $GITHUB_ENV
- name: Update homebrew-tap
uses: actions/github-script@v7
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
with:
github-token: ${{ secrets.DECO_GITHUB_TOKEN }}
script: |
@ -140,7 +140,7 @@ jobs:
echo "VERSION=${VERSION:1}" >> $GITHUB_ENV
- name: Update CLI version in the VSCode extension
uses: actions/github-script@v7
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
with:
github-token: ${{ secrets.DECO_GITHUB_TOKEN }}
script: |

1 .gitignore vendored
View File

@ -20,6 +20,7 @@ dist/
*.log
coverage.txt
coverage-acceptance.txt
__pycache__
*.pyc

View File

@ -15,6 +15,7 @@ linters:
- intrange
- mirror
- perfsprint
- unconvert
linters-settings:
govet:
enable-all: true
@ -41,6 +42,8 @@ linters-settings:
disable:
# good check, but we have too many assert.(No)?Errorf? so excluding for now
- require-error
copyloopvar:
check-alias: true
issues:
exclude-dirs-use-default: false # recommended by docs https://golangci-lint.run/usage/false-positives/
max-issues-per-linter: 1000

View File

@ -1,5 +1,49 @@
# Version changelog
## [Release] Release v0.239.0
### New feature announcement
#### Databricks Apps support
You can now manage Databricks Apps using DABs by defining an `app` resource in your bundle configuration.
For more information see Databricks documentation https://docs.databricks.com/en/dev-tools/bundles/resources.html#app
#### Referencing complex variables in complex variables
You can now reference complex variables within other complex variables.
For more details see https://github.com/databricks/cli/pull/2157
CLI:
* Filter out system clusters in cluster picker ([#2131](https://github.com/databricks/cli/pull/2131)).
* Add command line flags for fields that are not in the API request body ([#2155](https://github.com/databricks/cli/pull/2155)).
Bundles:
* Added support for Databricks Apps in DABs ([#1928](https://github.com/databricks/cli/pull/1928)).
* Allow artifact path to be located outside the sync root ([#2128](https://github.com/databricks/cli/pull/2128)).
* Retry app deployment if there is an active deployment in progress ([#2153](https://github.com/databricks/cli/pull/2153)).
* Resolve variables in a loop ([#2164](https://github.com/databricks/cli/pull/2164)).
* Improve resolution of complex variables within complex variables ([#2157](https://github.com/databricks/cli/pull/2157)).
* Added output message to warn about slower deployments with apps ([#2161](https://github.com/databricks/cli/pull/2161)).
* Patch references to UC schemas to capture dependencies automatically ([#1989](https://github.com/databricks/cli/pull/1989)).
* Format default-python template ([#2110](https://github.com/databricks/cli/pull/2110)).
* Encourage the use of root_path in production to ensure single deployment ([#1712](https://github.com/databricks/cli/pull/1712)).
* Log warnings to stderr for "bundle validate -o json" ([#2109](https://github.com/databricks/cli/pull/2109)).
API Changes:
* Changed `databricks account federation-policy update` command with new required argument order.
* Changed `databricks account service-principal-federation-policy update` command with new required argument order.
OpenAPI commit 779817ed8d63031f5ea761fbd25ee84f38feec0d (2025-01-08)
Dependency updates:
* Upgrade TF provider to 1.63.0 ([#2162](https://github.com/databricks/cli/pull/2162)).
* Bump golangci-lint version to v1.63.4 from v1.63.1 ([#2114](https://github.com/databricks/cli/pull/2114)).
* Bump astral-sh/setup-uv from 4 to 5 ([#2116](https://github.com/databricks/cli/pull/2116)).
* Bump golang.org/x/oauth2 from 0.24.0 to 0.25.0 ([#2080](https://github.com/databricks/cli/pull/2080)).
* Bump github.com/hashicorp/hc-install from 0.9.0 to 0.9.1 ([#2079](https://github.com/databricks/cli/pull/2079)).
* Bump golang.org/x/term from 0.27.0 to 0.28.0 ([#2078](https://github.com/databricks/cli/pull/2078)).
* Bump github.com/databricks/databricks-sdk-go from 0.54.0 to 0.55.0 ([#2126](https://github.com/databricks/cli/pull/2126)).
## [Release] Release v0.238.0
Bundles:

View File

@ -1,15 +1,21 @@
default: build
default: vendor fmt lint
PACKAGES=./acceptance/... ./libs/... ./internal/... ./cmd/... ./bundle/... .
GOTESTSUM_FORMAT ?= pkgname-and-test-fails
lint:
./lint.sh ./...
golangci-lint run --fix
lintcheck:
golangci-lint run ./...
# Note: 'make lint' will do formatting as well. However, if there are compilation errors,
# formatting/goimports will not be applied by 'make lint', while it will still be applied by 'make fmt'.
# If you need to ensure that formatting & imports are always fixed, run "make fmt lint".
fmt:
golangci-lint run --enable-only="gofmt,gofumpt,goimports" --fix ./...
test:
gotestsum --format ${GOTESTSUM_FORMAT} --no-summary=skipped -- ${PACKAGES}
@ -19,6 +25,17 @@ cover:
showcover:
go tool cover -html=coverage.txt
acc-cover:
rm -fr ./acceptance/build/cover/
CLI_GOCOVERDIR=build/cover go test ./acceptance
rm -fr ./acceptance/build/cover-merged/
mkdir -p acceptance/build/cover-merged/
go tool covdata merge -i $$(printf '%s,' acceptance/build/cover/* | sed 's/,$$//') -o acceptance/build/cover-merged/
go tool covdata textfmt -i acceptance/build/cover-merged -o coverage-acceptance.txt
acc-showcover:
go tool cover -html=coverage-acceptance.txt
build: vendor
go build -mod vendor
@ -39,4 +56,4 @@ integration:
integration-short:
$(INTEGRATION) -short
.PHONY: lint lintcheck test cover showcover build snapshot vendor schema integration integration-short
.PHONY: lint lintcheck fmt test cover showcover build snapshot vendor schema integration integration-short acc-cover acc-showcover

View File

@ -1,6 +1,7 @@
package acceptance_test
import (
"context"
"errors"
"fmt"
"io"
@ -17,6 +18,7 @@ import (
"github.com/databricks/cli/internal/testutil"
"github.com/databricks/cli/libs/env"
"github.com/databricks/cli/libs/testdiff"
"github.com/databricks/databricks-sdk-go"
"github.com/stretchr/testify/require"
)
@ -35,26 +37,67 @@ var Scripts = map[string]bool{
}
func TestAccept(t *testing.T) {
execPath := BuildCLI(t)
cwd, err := os.Getwd()
require.NoError(t, err)
coverDir := os.Getenv("CLI_GOCOVERDIR")
if coverDir != "" {
require.NoError(t, os.MkdirAll(coverDir, os.ModePerm))
coverDir, err = filepath.Abs(coverDir)
require.NoError(t, err)
t.Logf("Writing coverage to %s", coverDir)
}
execPath := BuildCLI(t, cwd, coverDir)
// $CLI is what test scripts are using
t.Setenv("CLI", execPath)
server := StartServer(t)
AddHandlers(server)
// Redirect API access to local server:
t.Setenv("DATABRICKS_HOST", fmt.Sprintf("http://127.0.0.1:%d", server.Port))
t.Setenv("DATABRICKS_TOKEN", "dapi1234")
// Make helper scripts available
t.Setenv("PATH", fmt.Sprintf("%s%c%s", filepath.Join(cwd, "bin"), os.PathListSeparator, os.Getenv("PATH")))
homeDir := t.TempDir()
// Do not read user's ~/.databrickscfg
t.Setenv(env.HomeEnvVar(), homeDir)
repls := testdiff.ReplacementsContext{}
repls.Set(execPath, "$CLI")
tempHomeDir := t.TempDir()
repls.Set(tempHomeDir, "$TMPHOME")
t.Logf("$TMPHOME=%v", tempHomeDir)
// Prevent CLI from downloading terraform in each test:
t.Setenv("DATABRICKS_TF_EXEC_PATH", tempHomeDir)
ctx := context.Background()
cloudEnv := os.Getenv("CLOUD_ENV")
if cloudEnv == "" {
server := StartServer(t)
AddHandlers(server)
// Redirect API access to local server:
t.Setenv("DATABRICKS_HOST", server.URL)
t.Setenv("DATABRICKS_TOKEN", "dapi1234")
homeDir := t.TempDir()
// Do not read user's ~/.databrickscfg
t.Setenv(env.HomeEnvVar(), homeDir)
}
workspaceClient, err := databricks.NewWorkspaceClient()
require.NoError(t, err)
user, err := workspaceClient.CurrentUser.Me(ctx)
require.NoError(t, err)
require.NotNil(t, user)
testdiff.PrepareReplacementsUser(t, &repls, *user)
testdiff.PrepareReplacementsWorkspaceClient(t, &repls, workspaceClient)
testDirs := getTests(t)
require.NotEmpty(t, testDirs)
for _, dir := range testDirs {
t.Run(dir, func(t *testing.T) {
testName := strings.ReplaceAll(dir, "\\", "/")
t.Run(testName, func(t *testing.T) {
t.Parallel()
runTest(t, dir)
runTest(t, dir, coverDir, repls)
})
}
}
@ -79,7 +122,7 @@ func getTests(t *testing.T) []string {
return testDirs
}
func runTest(t *testing.T, dir string) {
func runTest(t *testing.T, dir, coverDir string, repls testdiff.ReplacementsContext) {
var tmpDir string
var err error
if KeepTmp {
@ -102,11 +145,20 @@ func runTest(t *testing.T, dir string) {
args := []string{"bash", "-euo", "pipefail", EntryPointScript}
cmd := exec.Command(args[0], args[1:]...)
if coverDir != "" {
// Create an individual coverage directory for each test, because writing to the same one
// results in sporadic failures like this one (only when tests run in parallel):
// +error: coverage meta-data emit failed: writing ... rename .../tmp.covmeta.b3f... .../covmeta.b3f2c...: no such file or directory
coverDir = filepath.Join(coverDir, strings.ReplaceAll(dir, string(os.PathSeparator), "--"))
err := os.MkdirAll(coverDir, os.ModePerm)
require.NoError(t, err)
cmd.Env = append(os.Environ(), "GOCOVERDIR="+coverDir)
}
cmd.Dir = tmpDir
outB, err := cmd.CombinedOutput()
out := formatOutput(string(outB), err)
out = strings.ReplaceAll(out, os.Getenv("CLI"), "$CLI")
out = repls.Replace(out)
doComparison(t, filepath.Join(dir, "output.txt"), "script output", out)
for key := range outputs {
@ -125,7 +177,8 @@ func runTest(t *testing.T, dir string) {
continue
}
pathExpected := filepath.Join(dir, key)
doComparison(t, pathExpected, pathNew, string(newValBytes))
newVal := repls.Replace(string(newValBytes))
doComparison(t, pathExpected, pathNew, newVal)
}
// Make sure there are no unaccounted-for new files
@ -146,6 +199,7 @@ func runTest(t *testing.T, dir string) {
// Show the contents & support overwrite mode for it:
pathNew := filepath.Join(tmpDir, name)
newVal := testutil.ReadFile(t, pathNew)
newVal = repls.Replace(newVal)
doComparison(t, filepath.Join(dir, name), filepath.Join(tmpDir, name), newVal)
}
}
@ -171,6 +225,11 @@ func doComparison(t *testing.T, pathExpected, pathNew, valueNew string) {
// Note: cleanups are not executed if the main script fails; that's not a huge issue, since it runs in a temp dir.
func readMergedScriptContents(t *testing.T, dir string) string {
scriptContents := testutil.ReadFile(t, filepath.Join(dir, EntryPointScript))
// Wrap script contents in a subshell such that changing the working
// directory only affects the main script and not cleanup.
scriptContents = "(\n" + scriptContents + ")\n"
prepares := []string{}
cleanups := []string{}
@ -199,16 +258,30 @@ func readMergedScriptContents(t *testing.T, dir string) string {
return strings.Join(prepares, "\n")
}
func BuildCLI(t *testing.T) string {
cwd, err := os.Getwd()
require.NoError(t, err)
func BuildCLI(t *testing.T, cwd, coverDir string) string {
execPath := filepath.Join(cwd, "build", "databricks")
if runtime.GOOS == "windows" {
execPath += ".exe"
}
start := time.Now()
args := []string{"go", "build", "-mod", "vendor", "-o", execPath}
args := []string{
"go", "build",
"-mod", "vendor",
"-o", execPath,
}
if coverDir != "" {
args = append(args, "-cover")
}
if runtime.GOOS == "windows" {
// I get this error on my local Windows machine:
// error obtaining VCS status: exit status 128
// Use -buildvcs=false to disable VCS stamping.
args = append(args, "-buildvcs=false")
}
cmd := exec.Command(args[0], args[1:]...)
cmd.Dir = ".."
out, err := cmd.CombinedOutput()

21 acceptance/bin/sort_blocks.py Executable file
View File

@ -0,0 +1,21 @@
#!/usr/bin/env python3
"""
Helper to sort blocks in a text file. A block is a set of lines separated
from others by an empty line. This is to work around non-determinism in the output.
"""
import sys

blocks = []

for line in sys.stdin:
    if not line.strip():
        # Blank line: close the current block (if any) and start a new one.
        if blocks and blocks[-1]:
            blocks.append('')
        continue
    if not blocks:
        blocks.append('')
    # Accumulate the line (including its newline) into the current block.
    blocks[-1] += line

# Sort blocks lexicographically and print them back, separated by blank lines.
blocks.sort()
print("\n".join(blocks))

View File

@ -0,0 +1,6 @@
{
"project_name": "my_dbt_sql",
"http_path": "/sql/2.0/warehouses/f00dcafe",
"default_catalog": "main",
"personal_schemas": "yes, use a schema based on the current user name during development"
}

View File

@ -0,0 +1,32 @@
>>> $CLI bundle init dbt-sql --config-file ./input.json
Welcome to the dbt template for Databricks Asset Bundles!
A workspace was selected based on your current profile. For information about how to change this, see https://docs.databricks.com/dev-tools/cli/profiles.html.
workspace_host: $DATABRICKS_URL
📊 Your new project has been created in the 'my_dbt_sql' directory!
If you already have dbt installed, just type 'cd my_dbt_sql; dbt init' to get started.
Refer to the README.md file for full "getting started" guide and production setup instructions.
>>> $CLI bundle validate -t dev
Name: my_dbt_sql
Target: dev
Workspace:
Host: $DATABRICKS_URL
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/my_dbt_sql/dev
Validation OK!
>>> $CLI bundle validate -t prod
Name: my_dbt_sql
Target: prod
Workspace:
Host: $DATABRICKS_URL
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/my_dbt_sql/prod
Validation OK!

View File

@ -0,0 +1,5 @@
trace $CLI bundle init dbt-sql --config-file ./input.json
cd my_dbt_sql
trace $CLI bundle validate -t dev
trace $CLI bundle validate -t prod

View File

@ -0,0 +1 @@
rm -fr my_dbt_sql

View File

@ -0,0 +1,6 @@
{
"project_name": "my_default_python",
"include_notebook": "yes",
"include_dlt": "yes",
"include_python": "yes"
}

View File

@ -0,0 +1,30 @@
>>> $CLI bundle init default-python --config-file ./input.json
Welcome to the default Python template for Databricks Asset Bundles!
Workspace to use (auto-detected, edit in 'my_default_python/databricks.yml'): $DATABRICKS_URL
✨ Your new project has been created in the 'my_default_python' directory!
Please refer to the README.md file for "getting started" instructions.
See also the documentation at https://docs.databricks.com/dev-tools/bundles/index.html.
>>> $CLI bundle validate -t dev
Name: my_default_python
Target: dev
Workspace:
Host: $DATABRICKS_URL
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/my_default_python/dev
Validation OK!
>>> $CLI bundle validate -t prod
Name: my_default_python
Target: prod
Workspace:
Host: $DATABRICKS_URL
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/my_default_python/prod
Validation OK!

View File

@ -0,0 +1,5 @@
trace $CLI bundle init default-python --config-file ./input.json
cd my_default_python
trace $CLI bundle validate -t dev
trace $CLI bundle validate -t prod

View File

@ -0,0 +1 @@
rm -fr my_default_python

View File

@ -0,0 +1,6 @@
{
"project_name": "my_default_sql",
"http_path": "/sql/2.0/warehouses/f00dcafe",
"default_catalog": "main",
"personal_schemas": "yes, automatically use a schema based on the current user name during development"
}

View File

@ -0,0 +1,32 @@
>>> $CLI bundle init default-sql --config-file ./input.json
Welcome to the default SQL template for Databricks Asset Bundles!
A workspace was selected based on your current profile. For information about how to change this, see https://docs.databricks.com/dev-tools/cli/profiles.html.
workspace_host: $DATABRICKS_URL
✨ Your new project has been created in the 'my_default_sql' directory!
Please refer to the README.md file for "getting started" instructions.
See also the documentation at https://docs.databricks.com/dev-tools/bundles/index.html.
>>> $CLI bundle validate -t dev
Name: my_default_sql
Target: dev
Workspace:
Host: $DATABRICKS_URL
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/my_default_sql/dev
Validation OK!
>>> $CLI bundle validate -t prod
Name: my_default_sql
Target: prod
Workspace:
Host: $DATABRICKS_URL
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/my_default_sql/prod
Validation OK!

View File

@ -0,0 +1,5 @@
trace $CLI bundle init default-sql --config-file ./input.json
cd my_default_sql
trace $CLI bundle validate -t dev
trace $CLI bundle validate -t prod

View File

@ -0,0 +1 @@
rm -fr my_default_sql

View File

@ -4,7 +4,7 @@
"foo": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/development/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/override_job_cluster/development/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",
@ -32,7 +32,7 @@
"foo": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/staging/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/override_job_cluster/staging/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",

View File

@ -20,7 +20,6 @@ targets:
jobs:
foo:
job_clusters:
# This does not work because merging is done before resolution
- job_cluster_key: "${var.mykey}"
new_cluster:
node_type_id: i3.xlarge

View File

@ -4,22 +4,17 @@
"foo": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/development/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/override_job_cluster/development/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",
"job_clusters": [
{
"job_cluster_key": "key",
"new_cluster": {
"spark_version": "13.3.x-scala2.12"
}
},
{
"job_cluster_key": "key",
"new_cluster": {
"node_type_id": "i3.xlarge",
"num_workers": 1
"num_workers": 1,
"spark_version": "13.3.x-scala2.12"
}
}
],
@ -36,8 +31,8 @@
Name: override_job_cluster
Target: development
Workspace:
User: tester@databricks.com
Path: /Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/development
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/override_job_cluster/development
Validation OK!
@ -46,22 +41,17 @@ Validation OK!
"foo": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/staging/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/override_job_cluster/staging/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",
"job_clusters": [
{
"job_cluster_key": "key",
"new_cluster": {
"spark_version": "13.3.x-scala2.12"
}
},
{
"job_cluster_key": "key",
"new_cluster": {
"node_type_id": "i3.2xlarge",
"num_workers": 4
"num_workers": 4,
"spark_version": "13.3.x-scala2.12"
}
}
],
@ -78,7 +68,7 @@ Validation OK!
Name: override_job_cluster
Target: staging
Workspace:
User: tester@databricks.com
Path: /Workspace/Users/tester@databricks.com/.bundle/override_job_cluster/staging
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/override_job_cluster/staging
Validation OK!

View File

@ -0,0 +1,6 @@
>>> errcode $CLI bundle validate -o json -t development
Error: file ./test1.py not found
Exit code: 1

View File

@ -1,8 +1,3 @@
>>> errcode $CLI bundle validate -o json -t development
Error: file ./test1.py not found
Exit code: 1
{
"name": "job",
"queue": {
@ -36,6 +31,7 @@ Exit code: 1
>>> errcode $CLI bundle validate -o json -t staging
Error: file ./test1.py not found
Exit code: 1
{
"name": "job",
@ -66,3 +62,16 @@ Exit code: 1
}
]
}
>>> errcode $CLI bundle validate -t staging
Error: file ./test1.py not found
Name: override_job_tasks
Target: staging
Workspace:
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/override_job_tasks/staging
Found 1 error
Exit code: 1

View File

@ -1,2 +1,3 @@
trace errcode $CLI bundle validate -o json -t development | jq .resources.jobs.foo
trace errcode $CLI bundle validate -o json -t development 2> out.development.stderr.txt | jq .resources.jobs.foo
trace errcode $CLI bundle validate -o json -t staging | jq .resources.jobs.foo
trace errcode $CLI bundle validate -t staging

View File

@ -1,5 +1,9 @@
>>> $CLI bundle validate -o json -t dev
Warning: expected map, found string
at resources.clusters.my_cluster
in databricks.yml:6:17
{
"clusters": {
"my_cluster": {
@ -17,7 +21,7 @@ Warning: expected map, found string
Name: merge-string-map
Target: dev
Workspace:
User: tester@databricks.com
Path: /Workspace/Users/tester@databricks.com/.bundle/merge-string-map/dev
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/merge-string-map/dev
Found 1 warning

View File

@ -14,7 +14,7 @@
],
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_pipeline_cluster/development/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/override_pipeline_cluster/development/state/metadata.json"
},
"name": "job",
"permissions": []
@ -36,7 +36,7 @@
],
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/override_pipeline_cluster/staging/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/override_pipeline_cluster/staging/state/metadata.json"
},
"name": "job",
"permissions": []

View File

@ -0,0 +1,19 @@
Error: experiment undefined-experiment is not defined
at resources.experiments.undefined-experiment
in databricks.yml:11:26
Error: job undefined-job is not defined
at resources.jobs.undefined-job
in databricks.yml:6:19
Error: pipeline undefined-pipeline is not defined
at resources.pipelines.undefined-pipeline
in databricks.yml:14:24
Found 3 errors
Name: undefined-job
Target: default
Exit code: 1

View File

@ -0,0 +1,2 @@
# We need sort_blocks.py because the order of diagnostics is currently randomized
$CLI bundle validate 2>&1 | sort_blocks.py

View File

@ -0,0 +1,12 @@
bundle:
name: complex-cross-ref
variables:
a:
default:
a_1: 500
a_2: ${var.b.b_2}
b:
default:
b_1: ${var.a.a_1}
b_2: 2.5

View File

@ -0,0 +1,22 @@
{
"a": {
"default": {
"a_1": 500,
"a_2": 2.5
},
"value": {
"a_1": 500,
"a_2": 2.5
}
},
"b": {
"default": {
"b_1": 500,
"b_2": 2.5
},
"value": {
"b_1": 500,
"b_2": 2.5
}
}
}

View File

@ -0,0 +1 @@
$CLI bundle validate -o json | jq .variables

View File

@ -0,0 +1,7 @@
bundle:
name: cycle
variables:
a:
default:
hello: ${var.a}

View File

@ -0,0 +1,9 @@
Warning: Detected unresolved variables after 11 resolution rounds
Name: cycle
Target: default
Workspace:
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/cycle/default
Found 1 warning

View File

@ -0,0 +1 @@
$CLI bundle validate
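To illustrate the round-capped resolution behind the warning above, here is a small self-contained Go sketch. It is illustrative only: the CLI resolves typed configuration trees rather than flat strings, and the helper names and the cap of 11 are assumptions taken from the warning text.

```go
package main

import (
	"fmt"
	"regexp"
)

var ref = regexp.MustCompile(`\$\{var\.([A-Za-z0-9_]+)\}`)

// resolveOnce substitutes each ${var.NAME} reference a single time.
func resolveOnce(v string, vars map[string]string) string {
	return ref.ReplaceAllStringFunc(v, func(m string) string {
		name := ref.FindStringSubmatch(m)[1]
		if val, ok := vars[name]; ok {
			return val
		}
		return m // leave unknown references untouched
	})
}

// resolveAll loops until a round makes no progress (a fixpoint) or the
// round cap is hit, then reports whether any references remain unresolved.
func resolveAll(vars map[string]string, maxRounds int) bool {
	for round := 0; round < maxRounds; round++ {
		changed := false
		for k, v := range vars {
			if nv := resolveOnce(v, vars); nv != v {
				vars[k] = nv
				changed = true
			}
		}
		if !changed {
			break
		}
	}
	for _, v := range vars {
		if ref.MatchString(v) {
			return false // a cycle (or missing variable) left references behind
		}
	}
	return true
}

func main() {
	vars := map[string]string{"a": "${var.b}", "b": "${var.a}"}
	if !resolveAll(vars, 11) {
		fmt.Println("Warning: Detected unresolved variables after 11 resolution rounds")
	}
}
```

Both the self-referencing and the mutually referencing configurations in these tests leave references behind and produce the warning shown in the output above.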

View File

@ -0,0 +1,10 @@
bundle:
name: cycle
variables:
a:
default:
hello: ${var.b}
b:
default:
hello: ${var.a}

View File

@ -0,0 +1,9 @@
Warning: Detected unresolved variables after 11 resolution rounds
Name: cycle
Target: default
Workspace:
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/cycle/default
Found 1 warning

View File

@ -0,0 +1 @@
$CLI bundle validate

View File

@ -0,0 +1,27 @@
# This example works and properly merges resources.jobs.job1.job_clusters.new_cluster and ${var.cluster},
# retaining num_workers and spark_version while overriding node_type_id.
bundle:
name: TestResolveComplexVariable
variables:
cluster:
type: "complex"
value:
node_type_id: "Standard_DS3_v2"
num_workers: 2
resources:
jobs:
job1:
job_clusters:
- new_cluster:
node_type_id: "random"
spark_version: 13.3.x-scala2.12
targets:
dev:
resources:
jobs:
job1:
job_clusters:
- new_cluster: ${var.cluster}

View File

@ -0,0 +1,10 @@
[
{
"job_cluster_key": "",
"new_cluster": {
"node_type_id": "Standard_DS3_v2",
"num_workers": 2,
"spark_version": "13.3.x-scala2.12"
}
}
]

View File

@ -0,0 +1 @@
$CLI bundle validate -o json | jq .resources.jobs.job1.job_clusters

View File

@ -0,0 +1,21 @@
bundle:
name: complex-transitive
variables:
catalog:
default: hive_metastore
spark_conf_1:
default:
"spark.databricks.sql.initial.catalog.name": ${var.catalog}
spark_conf:
default: ${var.spark_conf_1}
etl_cluster_config:
type: complex
default:
spark_version: 14.3.x-scala2.12
runtime_engine: PHOTON
spark_conf: ${var.spark_conf}
resources:
clusters:
my_cluster: ${var.etl_cluster_config}

View File

@ -0,0 +1,3 @@
{
"spark.databricks.sql.initial.catalog.name": "hive_metastore"
}

View File

@ -0,0 +1,2 @@
# Currently, this incorrectly outputs variable reference instead of resolved value
$CLI bundle validate -o json | jq '.resources.clusters.my_cluster.spark_conf'

View File

@ -0,0 +1,22 @@
bundle:
name: complex-transitive-deeper
variables:
catalog_1:
default:
name: hive_metastore
catalog:
default: ${var.catalog_1}
spark_conf:
default:
"spark.databricks.sql.initial.catalog.name": ${var.catalog.name}
etl_cluster_config:
type: complex
default:
spark_version: 14.3.x-scala2.12
runtime_engine: PHOTON
spark_conf: ${var.spark_conf}
resources:
clusters:
my_cluster: ${var.etl_cluster_config}

View File

@ -0,0 +1,7 @@
Error: expected a map to index "variables.catalog.value.name", found string
{
"my_cluster": "${var.etl_cluster_config}"
}
Exit code: 1

View File

@ -0,0 +1,2 @@
# Currently, this errors instead of interpolating variables
$CLI bundle validate -o json | jq '.resources.clusters'

View File

@ -1,3 +1,3 @@
{
"spark.databricks.sql.initial.catalog.name": "${var.catalog}"
"spark.databricks.sql.initial.catalog.name": "hive_metastore"
}

View File

@ -0,0 +1,17 @@
bundle:
name: TestResolveComplexVariableWithVarReference
variables:
package_version:
default: "1.0.0"
cluster_libraries:
type: "complex"
default:
- pypi:
package: "cicd_template==${var.package_version}"
resources:
jobs:
job1:
tasks:
- libraries: ${var.cluster_libraries}

View File

@ -0,0 +1,12 @@
[
{
"libraries": [
{
"pypi": {
"package": "cicd_template==1.0.0"
}
}
],
"task_key": ""
}
]

View File

@ -0,0 +1 @@
$CLI bundle validate -o json | jq .resources.jobs.job1.tasks

View File

@ -0,0 +1,34 @@
# Does not work currently, explicitly disabled, even though it works if you remove 'type: "complex"' lines
# Also fails to merge clusters.
bundle:
name: TestResolveComplexVariableReferencesWithComplexVariablesError
variables:
cluster:
type: "complex"
value:
node_type_id: "Standard_DS3_v2"
num_workers: 2
spark_conf: "${var.spark_conf}"
spark_conf:
type: "complex"
value:
spark.executor.memory: "4g"
spark.executor.cores: "2"
resources:
jobs:
job1:
job_clusters:
- job_cluster_key: my_cluster
new_cluster:
node_type_id: "random"
targets:
dev:
resources:
jobs:
job1:
job_clusters:
- job_cluster_key: my_cluster
new_cluster: ${var.cluster}

View File

@ -0,0 +1,17 @@
Warning: unknown field: node_type_id
at resources.jobs.job1.job_clusters[0]
in databricks.yml:25:11
[
{
"job_cluster_key": "my_cluster",
"new_cluster": {
"node_type_id": "Standard_DS3_v2",
"num_workers": 2,
"spark_conf": {
"spark.executor.cores": "2",
"spark.executor.memory": "4g"
}
}
}
]

View File

@ -0,0 +1 @@
$CLI bundle validate -o json | jq .resources.jobs.job1.job_clusters

View File

@ -4,7 +4,7 @@
"my_job": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/complex-variables/default/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/complex-variables/default/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",

View File

@ -4,7 +4,7 @@
"my_job": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/complex-variables/dev/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/complex-variables/dev/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",

View File

@ -4,7 +4,7 @@
"my_job": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/complex-variables-multiple-files/dev/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/complex-variables-multiple-files/dev/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",

View File

@ -0,0 +1,8 @@
bundle:
name: cycle
variables:
a:
default: ${var.b}
b:
default: ${var.a}

View File

@ -0,0 +1,14 @@
Error: cycle detected in field resolution: variables.a.default -> var.b -> var.a -> var.b
{
"a": {
"default": "${var.b}",
"value": "${var.b}"
},
"b": {
"default": "${var.a}",
"value": "${var.a}"
}
}
Exit code: 1

View File

@ -0,0 +1 @@
$CLI bundle validate -o json | jq .variables

View File

@ -3,8 +3,8 @@ Error: no value assigned to required variable a. Assignment can be done through
Name: empty${var.a}
Target: default
Workspace:
User: tester@databricks.com
Path: /Workspace/Users/tester@databricks.com/.bundle/empty${var.a}/default
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/empty${var.a}/default
Found 1 error

View File

@ -14,8 +14,8 @@ Error: no value assigned to required variable b. Assignment can be done through
Name: test bundle
Target: env-missing-a-required-variable-assignment
Workspace:
User: tester@databricks.com
Path: /Workspace/Users/tester@databricks.com/.bundle/test bundle/env-missing-a-required-variable-assignment
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/test bundle/env-missing-a-required-variable-assignment
Found 1 error

View File

@ -0,0 +1,19 @@
bundle:
name: git
git:
# This is currently not supported
branch: ${var.deployment_branch}
variables:
deployment_branch:
# By setting deployment_branch to "" we set bundle.git.branch to "" which is the same unsetting it.
# This this should make CLI read branch from git and update bundle.git.branch accordingly. It should
# Also set bundle.git.inferred to true.
default: ""
targets:
prod:
default: true
dev:
variables:
deployment_branch: dev-branch

View File

@ -0,0 +1,98 @@
>>> $CLI bundle validate -o json
{
"bundle": {
"environment": "prod",
"git": {
"actual_branch": "main",
"branch": "",
"bundle_root_path": ".",
},
"name": "git",
"target": "prod",
"terraform": {
"exec_path": "$TMPHOME"
}
},
"sync": {
"paths": [
"."
]
},
"targets": null,
"variables": {
"deployment_branch": {
"default": "",
"value": ""
}
},
"workspace": {
"artifact_path": "/Workspace/Users/$USERNAME/.bundle/git/prod/artifacts",
"current_user": {
"short_name": "$USERNAME",
"userName": "$USERNAME"
},
"file_path": "/Workspace/Users/$USERNAME/.bundle/git/prod/files",
"resource_path": "/Workspace/Users/$USERNAME/.bundle/git/prod/resources",
"root_path": "/Workspace/Users/$USERNAME/.bundle/git/prod",
"state_path": "/Workspace/Users/$USERNAME/.bundle/git/prod/state"
}
}
>>> $CLI bundle validate
Name: git
Target: prod
Workspace:
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/git/prod
Validation OK!
>>> $CLI bundle validate -o json -t dev
{
"bundle": {
"environment": "dev",
"git": {
"actual_branch": "main",
"branch": "dev-branch",
"bundle_root_path": ".",
},
"name": "git",
"target": "dev",
"terraform": {
"exec_path": "$TMPHOME"
}
},
"sync": {
"paths": [
"."
]
},
"targets": null,
"variables": {
"deployment_branch": {
"default": "dev-branch",
"value": "dev-branch"
}
},
"workspace": {
"artifact_path": "/Workspace/Users/$USERNAME/.bundle/git/dev/artifacts",
"current_user": {
"short_name": "$USERNAME",
"userName": "$USERNAME"
},
"file_path": "/Workspace/Users/$USERNAME/.bundle/git/dev/files",
"resource_path": "/Workspace/Users/$USERNAME/.bundle/git/dev/resources",
"root_path": "/Workspace/Users/$USERNAME/.bundle/git/dev",
"state_path": "/Workspace/Users/$USERNAME/.bundle/git/dev/state"
}
}
>>> $CLI bundle validate -t dev
Name: git
Target: dev
Workspace:
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/git/dev
Validation OK!

View File

@ -0,0 +1,6 @@
git-repo-init
trace $CLI bundle validate -o json | grep -v '"commit"'
trace $CLI bundle validate
trace $CLI bundle validate -o json -t dev | grep -v '"commit"'
trace $CLI bundle validate -t dev | grep -v '"commit"'
rm -fr .git

View File

@ -0,0 +1,10 @@
bundle:
name: host
variables:
host:
default: https://nonexistent123.staging.cloud.databricks.com
workspace:
# This is currently not supported
host: ${var.host}

View File

@ -0,0 +1,38 @@
>>> errcode $CLI bundle validate -o json
Error: failed during request visitor: parse "https://${var.host}": invalid character "{" in host name
{
"bundle": {
"environment": "default",
"name": "host",
"target": "default"
},
"sync": {
"paths": [
"."
]
},
"targets": null,
"variables": {
"host": {
"default": "https://nonexistent123.staging.cloud.databricks.com"
}
},
"workspace": {
"host": "${var.host}"
}
}
Exit code: 1
>>> errcode $CLI bundle validate
Error: failed during request visitor: parse "https://${var.host}": invalid character "{" in host name
Name: host
Target: default
Workspace:
Host: ${var.host}
Found 1 error
Exit code: 1

View File

@ -0,0 +1,2 @@
trace errcode $CLI bundle validate -o json
trace errcode $CLI bundle validate

View File

@ -1,8 +1,8 @@
{
"artifact_path": "TestResolveVariableReferences/bar/artifacts",
"current_user": {
"short_name": "tester",
"userName": "tester@databricks.com"
"short_name": "$USERNAME",
"userName": "$USERNAME"
},
"file_path": "TestResolveVariableReferences/bar/baz",
"resource_path": "TestResolveVariableReferences/bar/resources",

View File

@ -0,0 +1,23 @@
bundle:
name: TestResolveVariableReferencesForPrimitiveNonStringFields
variables:
no_alert_for_canceled_runs: {}
no_alert_for_skipped_runs: {}
min_workers: {}
max_workers: {}
spot_bid_max_price: {}
resources:
jobs:
job1:
notification_settings:
no_alert_for_canceled_runs: ${var.no_alert_for_canceled_runs}
no_alert_for_skipped_runs: ${var.no_alert_for_skipped_runs}
tasks:
- new_cluster:
autoscale:
min_workers: ${var.min_workers}
max_workers: ${var.max_workers}
azure_attributes:
spot_bid_max_price: ${var.spot_bid_max_price}

View File

@ -0,0 +1,52 @@
{
"variables": {
"max_workers": {
"value": "2"
},
"min_workers": {
"value": "1"
},
"no_alert_for_canceled_runs": {
"value": "true"
},
"no_alert_for_skipped_runs": {
"value": "false"
},
"spot_bid_max_price": {
"value": "0.5"
}
},
"jobs": {
"job1": {
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/TestResolveVariableReferencesForPrimitiveNonStringFields/default/state/metadata.json"
},
"edit_mode": "UI_LOCKED",
"format": "MULTI_TASK",
"notification_settings": {
"no_alert_for_canceled_runs": true,
"no_alert_for_skipped_runs": false
},
"permissions": [],
"queue": {
"enabled": true
},
"tags": {},
"tasks": [
{
"new_cluster": {
"autoscale": {
"max_workers": 2,
"min_workers": 1
},
"azure_attributes": {
"spot_bid_max_price": 0.5
}
},
"task_key": ""
}
]
}
}
}

View File

@ -0,0 +1,4 @@
export BUNDLE_VAR_no_alert_for_skipped_runs=false
export BUNDLE_VAR_max_workers=2
export BUNDLE_VAR_min_workers=3 # shadowed by --var below
$CLI bundle validate -o json --var no_alert_for_canceled_runs=true --var min_workers=1 --var spot_bid_max_price=0.5 | jq '{ variables, jobs: .resources.jobs }'

View File

@ -0,0 +1,9 @@
bundle:
name: TestResolveVariableReferencesToBundleVariables
workspace:
root_path: "${bundle.name}/${var.foo}"
variables:
foo:
value: "bar"

View File

@ -0,0 +1,11 @@
{
"artifact_path": "TestResolveVariableReferencesToBundleVariables/bar/artifacts",
"current_user": {
"short_name": "$USERNAME",
"userName": "$USERNAME"
},
"file_path": "TestResolveVariableReferencesToBundleVariables/bar/files",
"resource_path": "TestResolveVariableReferencesToBundleVariables/bar/resources",
"root_path": "TestResolveVariableReferencesToBundleVariables/bar",
"state_path": "TestResolveVariableReferencesToBundleVariables/bar/state"
}

View File

@ -0,0 +1 @@
$CLI bundle validate -o json | jq .workspace

View File

@ -8,8 +8,8 @@ Error: no value assigned to required variable b. Assignment can be done through
Name: ${var.a} ${var.b}
Target: default
Workspace:
User: tester@databricks.com
Path: /Workspace/Users/tester@databricks.com/.bundle/${var.a} ${var.b}/default
User: $USERNAME
Path: /Workspace/Users/$USERNAME/.bundle/${var.a} ${var.b}/default
Found 1 error

View File

@ -12,7 +12,7 @@
"continuous": true,
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/foobar/use-default-variable-values/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/foobar/use-default-variable-values/state/metadata.json"
},
"name": "a_string",
"permissions": []
@ -33,7 +33,7 @@
"continuous": true,
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/foobar/override-string-variable/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/foobar/override-string-variable/state/metadata.json"
},
"name": "overridden_string",
"permissions": []
@ -54,7 +54,7 @@
"continuous": true,
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/foobar/override-int-variable/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/foobar/override-int-variable/state/metadata.json"
},
"name": "a_string",
"permissions": []
@ -75,7 +75,7 @@
"continuous": false,
"deployment": {
"kind": "BUNDLE",
"metadata_file_path": "/Workspace/Users/tester@databricks.com/.bundle/foobar/override-both-bool-and-string-variables/state/metadata.json"
"metadata_file_path": "/Workspace/Users/$USERNAME/.bundle/foobar/override-both-bool-and-string-variables/state/metadata.json"
},
"name": "overridden_string",
"permissions": []

View File

@ -1,6 +1,3 @@
# Prevent CLI from downloading terraform in each test:
export DATABRICKS_TF_EXEC_PATH=/tmp/
errcode() {
# Temporarily disable 'set -e' to prevent the script from exiting on error
set +e
@ -34,3 +31,12 @@ trace() {
return $?
}
git-repo-init() {
git init -qb main
git config --global core.autocrlf false
git config user.name "Tester"
git config user.email "tester@databricks.com"
git add databricks.yml
git commit -qm 'Add databricks.yml'
}

View File

@ -2,11 +2,11 @@ package acceptance_test
import (
"encoding/json"
"net"
"net/http"
"net/http/httptest"
"testing"
"github.com/databricks/databricks-sdk-go/service/catalog"
"github.com/databricks/databricks-sdk-go/service/compute"
"github.com/databricks/databricks-sdk-go/service/iam"
"github.com/databricks/databricks-sdk-go/service/workspace"
@ -14,8 +14,7 @@ import (
type TestServer struct {
*httptest.Server
Mux *http.ServeMux
Port int
Mux *http.ServeMux
}
type HandlerFunc func(r *http.Request) (any, error)
@ -23,12 +22,10 @@ type HandlerFunc func(r *http.Request) (any, error)
func NewTestServer() *TestServer {
mux := http.NewServeMux()
server := httptest.NewServer(mux)
port := server.Listener.Addr().(*net.TCPAddr).Port
return &TestServer{
Server: server,
Mux: mux,
Port: port,
}
}
@ -126,4 +123,27 @@ func AddHandlers(server *TestServer) {
ResourceId: "1001",
}, nil
})
server.Handle("/api/2.1/unity-catalog/current-metastore-assignment", func(r *http.Request) (any, error) {
return catalog.MetastoreAssignment{
DefaultCatalogName: "main",
}, nil
})
server.Handle("/api/2.0/permissions/directories/1001", func(r *http.Request) (any, error) {
return workspace.WorkspaceObjectPermissions{
ObjectId: "1001",
ObjectType: "DIRECTORY",
AccessControlList: []workspace.WorkspaceObjectAccessControlResponse{
{
UserName: "tester@databricks.com",
AllPermissions: []workspace.WorkspaceObjectPermission{
{
PermissionLevel: "CAN_MANAGE",
},
},
},
},
}, nil
})
}

View File

@ -0,0 +1,50 @@
package apps
import (
"context"
"github.com/databricks/cli/bundle"
"github.com/databricks/cli/bundle/config"
"github.com/databricks/cli/libs/diag"
"github.com/databricks/cli/libs/dyn"
"github.com/databricks/cli/libs/dyn/dynvar"
)
type interpolateVariables struct{}
func (i *interpolateVariables) Apply(ctx context.Context, b *bundle.Bundle) diag.Diagnostics {
pattern := dyn.NewPattern(
dyn.Key("resources"),
dyn.Key("apps"),
dyn.AnyKey(),
dyn.Key("config"),
)
tfToConfigMap := map[string]string{}
for k, r := range config.SupportedResources() {
tfToConfigMap[r.TerraformResourceName] = k
}
err := b.Config.Mutate(func(root dyn.Value) (dyn.Value, error) {
return dyn.MapByPattern(root, pattern, func(p dyn.Path, v dyn.Value) (dyn.Value, error) {
return dynvar.Resolve(v, func(path dyn.Path) (dyn.Value, error) {
key, ok := tfToConfigMap[path[0].Key()]
if ok {
path = dyn.NewPath(dyn.Key("resources"), dyn.Key(key)).Append(path[1:]...)
}
return dyn.GetByPath(root, path)
})
})
})
return diag.FromErr(err)
}
func (i *interpolateVariables) Name() string {
return "apps.InterpolateVariables"
}
func InterpolateVariables() bundle.Mutator {
return &interpolateVariables{}
}

View File

@ -0,0 +1,49 @@
package apps
import (
"context"
"testing"
"github.com/databricks/cli/bundle"
"github.com/databricks/cli/bundle/config"
"github.com/databricks/cli/bundle/config/resources"
"github.com/databricks/databricks-sdk-go/service/apps"
"github.com/stretchr/testify/require"
)
func TestAppInterpolateVariables(t *testing.T) {
b := &bundle.Bundle{
Config: config.Root{
Resources: config.Resources{
Apps: map[string]*resources.App{
"my_app_1": {
App: &apps.App{
Name: "my_app_1",
},
Config: map[string]any{
"command": []string{"echo", "hello"},
"env": []map[string]string{
{"name": "JOB_ID", "value": "${databricks_job.my_job.id}"},
},
},
},
"my_app_2": {
App: &apps.App{
Name: "my_app_2",
},
},
},
Jobs: map[string]*resources.Job{
"my_job": {
ID: "123",
},
},
},
},
}
diags := bundle.Apply(context.Background(), b, InterpolateVariables())
require.Empty(t, diags)
require.Equal(t, []any{map[string]any{"name": "JOB_ID", "value": "123"}}, b.Config.Resources.Apps["my_app_1"].Config["env"])
require.Nil(t, b.Config.Resources.Apps["my_app_2"].Config)
}

View File

@ -0,0 +1,29 @@
package apps
import (
"context"
"github.com/databricks/cli/bundle"
"github.com/databricks/cli/libs/cmdio"
"github.com/databricks/cli/libs/diag"
)
type slowDeployMessage struct{}
// TODO: needs to be removed when the no_compute option becomes available in the TF provider and is used in DABs
// See https://github.com/databricks/cli/pull/2144
func (v *slowDeployMessage) Apply(ctx context.Context, b *bundle.Bundle) diag.Diagnostics {
if len(b.Config.Resources.Apps) > 0 {
cmdio.LogString(ctx, "Note: Databricks apps included in this bundle may increase initial deployment time due to compute provisioning.")
}
return nil
}
func (v *slowDeployMessage) Name() string {
return "apps.SlowDeployMessage"
}
func SlowDeployMessage() bundle.Mutator {
return &slowDeployMessage{}
}

View File

@ -0,0 +1,97 @@
package apps
import (
"bytes"
"context"
"fmt"
"path"
"strings"
"sync"
"github.com/databricks/cli/bundle"
"github.com/databricks/cli/bundle/config/resources"
"github.com/databricks/cli/bundle/deploy"
"github.com/databricks/cli/libs/diag"
"github.com/databricks/cli/libs/filer"
"golang.org/x/sync/errgroup"
"gopkg.in/yaml.v3"
)
type uploadConfig struct {
filerFactory deploy.FilerFactory
}
func (u *uploadConfig) Apply(ctx context.Context, b *bundle.Bundle) diag.Diagnostics {
var diags diag.Diagnostics
errGroup, ctx := errgroup.WithContext(ctx)
mu := sync.Mutex{}
for key, app := range b.Config.Resources.Apps {
// If the app has a config, we need to deploy it first.
// This means we need to write an app.yml file with the contents of the config field
// to the remote source code path of the app.
if app.Config != nil {
appPath := strings.TrimPrefix(app.SourceCodePath, b.Config.Workspace.FilePath)
buf, err := configToYaml(app)
if err != nil {
return diag.FromErr(err)
}
f, err := u.filerFactory(b)
if err != nil {
return diag.FromErr(err)
}
errGroup.Go(func() error {
err := f.Write(ctx, path.Join(appPath, "app.yml"), buf, filer.OverwriteIfExists)
if err != nil {
mu.Lock()
diags = append(diags, diag.Diagnostic{
Severity: diag.Error,
Summary: "Failed to save config",
Detail: fmt.Sprintf("Failed to write %s file: %s", path.Join(app.SourceCodePath, "app.yml"), err),
Locations: b.Config.GetLocations("resources.apps." + key),
})
mu.Unlock()
}
return nil
})
}
}
if err := errGroup.Wait(); err != nil {
return diags.Extend(diag.FromErr(err))
}
return diags
}
// Name implements bundle.Mutator.
func (u *uploadConfig) Name() string {
return "apps:UploadConfig"
}
func UploadConfig() bundle.Mutator {
return &uploadConfig{
filerFactory: func(b *bundle.Bundle) (filer.Filer, error) {
return filer.NewWorkspaceFilesClient(b.WorkspaceClient(), b.Config.Workspace.FilePath)
},
}
}
func configToYaml(app *resources.App) (*bytes.Buffer, error) {
buf := bytes.NewBuffer(nil)
enc := yaml.NewEncoder(buf)
enc.SetIndent(2)
err := enc.Encode(app.Config)
defer enc.Close()
if err != nil {
return nil, fmt.Errorf("failed to encode app config to yaml: %w", err)
}
return buf, nil
}

View File

@ -0,0 +1,75 @@
package apps
import (
"bytes"
"context"
"os"
"path/filepath"
"testing"
"github.com/databricks/cli/bundle"
"github.com/databricks/cli/bundle/config"
"github.com/databricks/cli/bundle/config/mutator"
"github.com/databricks/cli/bundle/config/resources"
"github.com/databricks/cli/bundle/internal/bundletest"
mockfiler "github.com/databricks/cli/internal/mocks/libs/filer"
"github.com/databricks/cli/libs/dyn"
"github.com/databricks/cli/libs/filer"
"github.com/databricks/cli/libs/vfs"
"github.com/databricks/databricks-sdk-go/service/apps"
"github.com/stretchr/testify/mock"
"github.com/stretchr/testify/require"
)
func TestAppUploadConfig(t *testing.T) {
root := t.TempDir()
err := os.MkdirAll(filepath.Join(root, "my_app"), 0o700)
require.NoError(t, err)
b := &bundle.Bundle{
BundleRootPath: root,
SyncRootPath: root,
SyncRoot: vfs.MustNew(root),
Config: config.Root{
Workspace: config.Workspace{
RootPath: "/Workspace/Users/foo@bar.com/",
},
Resources: config.Resources{
Apps: map[string]*resources.App{
"my_app": {
App: &apps.App{
Name: "my_app",
},
SourceCodePath: "./my_app",
Config: map[string]any{
"command": []string{"echo", "hello"},
"env": []map[string]string{
{"name": "MY_APP", "value": "my value"},
},
},
},
},
},
},
}
mockFiler := mockfiler.NewMockFiler(t)
mockFiler.EXPECT().Write(mock.Anything, "my_app/app.yml", bytes.NewBufferString(`command:
- echo
- hello
env:
- name: MY_APP
value: my value
`), filer.OverwriteIfExists).Return(nil)
u := uploadConfig{
filerFactory: func(b *bundle.Bundle) (filer.Filer, error) {
return mockFiler, nil
},
}
bundletest.SetLocation(b, ".", []dyn.Location{{File: filepath.Join(root, "databricks.yml")}})
diags := bundle.Apply(context.Background(), b, bundle.Seq(mutator.TranslatePaths(), &u))
require.NoError(t, diags.Error())
}

53 bundle/apps/validate.go Normal file
View File

@ -0,0 +1,53 @@
package apps

import (
	"context"
	"fmt"
	"path"
	"strings"

	"github.com/databricks/cli/bundle"
	"github.com/databricks/cli/libs/diag"
)

type validate struct{}

func (v *validate) Apply(ctx context.Context, b *bundle.Bundle) diag.Diagnostics {
	var diags diag.Diagnostics
	possibleConfigFiles := []string{"app.yml", "app.yaml"}
	usedSourceCodePaths := make(map[string]string)

	for key, app := range b.Config.Resources.Apps {
		// Two apps sharing a source code path would overwrite each other's
		// uploaded configuration, so flag duplicates as errors.
		if _, ok := usedSourceCodePaths[app.SourceCodePath]; ok {
			diags = append(diags, diag.Diagnostic{
				Severity:  diag.Error,
				Summary:   "Duplicate app source code path",
				Detail:    fmt.Sprintf("app resource '%s' has the same source code path as app resource '%s'; their app configurations will be overridden by each other", key, usedSourceCodePaths[app.SourceCodePath]),
				Locations: b.Config.GetLocations(fmt.Sprintf("resources.apps.%s.source_code_path", key)),
			})
		}
		usedSourceCodePaths[app.SourceCodePath] = key

		// A local app.yml / app.yaml would conflict with the file generated
		// from the 'config' property, so require one or the other.
		for _, configFile := range possibleConfigFiles {
			appPath := strings.TrimPrefix(app.SourceCodePath, b.Config.Workspace.FilePath)
			cf := path.Join(appPath, configFile)
			if _, err := b.SyncRoot.Stat(cf); err == nil {
				diags = append(diags, diag.Diagnostic{
					Severity: diag.Error,
					Summary:  configFile + " detected",
					Detail:   fmt.Sprintf("remove %s and use 'config' property for app resource '%s' instead", cf, app.Name),
				})
			}
		}
	}

	return diags
}

func (v *validate) Name() string {
	return "apps.Validate"
}

func Validate() bundle.Mutator {
	return &validate{}
}
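
The duplicate check is a plain "first writer wins" seen-map. A toy, self-contained restatement (app keys and paths invented for illustration; note that Go map iteration order is random, so which resource gets flagged as the duplicate can vary between runs):

package main

import "fmt"

func main() {
	// Hypothetical app key -> source_code_path pairs.
	apps := map[string]string{
		"app1": "./my_app",
		"app2": "./my_app", // same path as app1
		"app3": "./other_app",
	}

	// Remember the first app seen for each path; a later hit is a duplicate.
	usedSourceCodePaths := make(map[string]string)
	for key, path := range apps {
		if other, ok := usedSourceCodePaths[path]; ok {
			fmt.Printf("app resource %q has the same source code path as app resource %q\n", key, other)
		}
		usedSourceCodePaths[path] = key
	}
}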


@ -0,0 +1,97 @@
package apps

import (
	"context"
	"path/filepath"
	"testing"

	"github.com/databricks/cli/bundle"
	"github.com/databricks/cli/bundle/config"
	"github.com/databricks/cli/bundle/config/mutator"
	"github.com/databricks/cli/bundle/config/resources"
	"github.com/databricks/cli/bundle/internal/bundletest"
	"github.com/databricks/cli/internal/testutil"
	"github.com/databricks/cli/libs/dyn"
	"github.com/databricks/cli/libs/vfs"
	"github.com/databricks/databricks-sdk-go/service/apps"
	"github.com/stretchr/testify/require"
)

func TestAppsValidate(t *testing.T) {
	tmpDir := t.TempDir()
	testutil.Touch(t, tmpDir, "app1", "app.yml")
	testutil.Touch(t, tmpDir, "app2", "app.py")

	b := &bundle.Bundle{
		BundleRootPath: tmpDir,
		SyncRootPath:   tmpDir,
		SyncRoot:       vfs.MustNew(tmpDir),
		Config: config.Root{
			Workspace: config.Workspace{
				FilePath: "/foo/bar/",
			},
			Resources: config.Resources{
				Apps: map[string]*resources.App{
					"app1": {
						App: &apps.App{
							Name: "app1",
						},
						SourceCodePath: "./app1",
					},
					"app2": {
						App: &apps.App{
							Name: "app2",
						},
						SourceCodePath: "./app2",
					},
				},
			},
		},
	}

	bundletest.SetLocation(b, ".", []dyn.Location{{File: filepath.Join(tmpDir, "databricks.yml")}})

	diags := bundle.Apply(context.Background(), b, bundle.Seq(mutator.TranslatePaths(), Validate()))
	require.Len(t, diags, 1)
	require.Equal(t, "app.yml detected", diags[0].Summary)
	require.Contains(t, diags[0].Detail, "app.yml and use 'config' property for app resource")
}

func TestAppsValidateSameSourcePath(t *testing.T) {
	tmpDir := t.TempDir()
	testutil.Touch(t, tmpDir, "app1", "app.py")

	b := &bundle.Bundle{
		BundleRootPath: tmpDir,
		SyncRootPath:   tmpDir,
		SyncRoot:       vfs.MustNew(tmpDir),
		Config: config.Root{
			Workspace: config.Workspace{
				FilePath: "/foo/bar/",
			},
			Resources: config.Resources{
				Apps: map[string]*resources.App{
					"app1": {
						App: &apps.App{
							Name: "app1",
						},
						SourceCodePath: "./app1",
					},
					"app2": {
						App: &apps.App{
							Name: "app2",
						},
						SourceCodePath: "./app1",
					},
				},
			},
		},
	}

	bundletest.SetLocation(b, ".", []dyn.Location{{File: filepath.Join(tmpDir, "databricks.yml")}})

	diags := bundle.Apply(context.Background(), b, bundle.Seq(mutator.TranslatePaths(), Validate()))
	require.Len(t, diags, 1)
	require.Equal(t, "Duplicate app source code path", diags[0].Summary)
	require.Contains(t, diags[0].Detail, "has the same source code path as app resource")
}


@ -57,6 +57,9 @@ type Bundle struct {
	// It is loaded from the bundle configuration files and mutators may update it.
	Config config.Root

	// Target stores a snapshot of the Root.Bundle.Target configuration when it was selected by SelectTarget.
	Target *config.Target `json:"target_config,omitempty" bundle:"internal"`

	// Metadata about the bundle deployment. This is the interface Databricks services
	// rely on to integrate with bundles when they need additional information about
	// a bundle deployment.


@ -0,0 +1,37 @@
package generate

import (
	"github.com/databricks/cli/libs/dyn"
	"github.com/databricks/cli/libs/dyn/convert"
	"github.com/databricks/databricks-sdk-go/service/apps"
)

func ConvertAppToValue(app *apps.App, sourceCodePath string, appConfig map[string]any) (dyn.Value, error) {
	ac, err := convert.FromTyped(appConfig, dyn.NilValue)
	if err != nil {
		return dyn.NilValue, err
	}

	ar, err := convert.FromTyped(app.Resources, dyn.NilValue)
	if err != nil {
		return dyn.NilValue, err
	}

	// The majority of fields of the app struct are read-only.
	// We copy the relevant fields manually. The synthetic, increasing
	// line numbers keep the keys in this order when the value is
	// rendered back into configuration.
	dv := map[string]dyn.Value{
		"name":             dyn.NewValue(app.Name, []dyn.Location{{Line: 1}}),
		"description":      dyn.NewValue(app.Description, []dyn.Location{{Line: 2}}),
		"source_code_path": dyn.NewValue(sourceCodePath, []dyn.Location{{Line: 3}}),
	}

	if ac.Kind() != dyn.KindNil {
		dv["config"] = ac.WithLocations([]dyn.Location{{Line: 4}})
	}

	if ar.Kind() != dyn.KindNil {
		dv["resources"] = ar.WithLocations([]dyn.Location{{Line: 5}})
	}

	return dyn.V(dv), nil
}
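
A minimal usage sketch follows; the caller context and field values are invented for illustration, and it assumes this package is importable as github.com/databricks/cli/bundle/config/generate:

package main

import (
	"fmt"

	"github.com/databricks/cli/bundle/config/generate"
	"github.com/databricks/databricks-sdk-go/service/apps"
)

func main() {
	// Hypothetical app as returned by the Apps API.
	app := &apps.App{
		Name:        "my_app",
		Description: "Example app",
	}

	v, err := generate.ConvertAppToValue(app, "src/app", map[string]any{
		"command": []string{"python", "app.py"},
	})
	if err != nil {
		panic(err)
	}

	// v carries the name, description, source_code_path, and config keys
	// that end up under resources.apps.<key> in generated configuration.
	fmt.Println(v)
}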


@ -221,6 +221,8 @@ func (m *applyPresets) Apply(ctx context.Context, b *bundle.Bundle) diag.Diagnostics {
		dashboard.DisplayName = prefix + dashboard.DisplayName
	}

	// Apps: No presets

	return diags
}

Some files were not shown because too many files have changed in this diff.