## Changes
Added a warning when the `config` section is used in apps
## Why
To avoid confusion between using apps inside and outside of DABs, we
want to provide only one way of configuring an app's runtime
configuration: an `app.yml` file in the root of the app.
## Tests
Added acceptance tests
## Changes
Lock setuptools version to 75.8.2 (latest as of March 3, 2025)
## Why
As part of the tests, `uv install` was installing the latest version of
setuptools. All tests started to fail on Feb 25 when setuptools 75.8.1
was released, because it changed the naming of the built output
artifacts:
https://setuptools.pypa.io/en/stable/history.html#v75-8-1
Pinning the version protects us from breakages like this.
## Tests
Existing tests pass
## Changes
The same-name libraries check is only valid for local libraries, and
local libraries are only supported for the Whl and Jar types. Hence we
can restrict the matching pattern to these library types.
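A minimal sketch of the narrowed check, with illustrative type and field names (not the actual bundle code):
```
package libraries

// Library mirrors the relevant shape of a job library; field names are
// illustrative.
type Library struct {
	Whl string
	Jar string
}

// canBeLocal reports whether a library can reference a local file.
// Only whl and jar libraries can, so the same-name check is restricted
// to these two types.
func canBeLocal(lib Library) bool {
	return lib.Whl != "" || lib.Jar != ""
}
```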
## Tests
Existing acceptance tests pass
## Changes
This PR adds the auth.EnvVars function, which returns a list of all
environment variables that the SDK uses to read auth configuration.
This is useful for spawning subprocesses, since you can unset all auth
environment variables to clean up the environment before configuring
auth.
It's used in #2278 today and will also be useful for the `databricks
bundle exec` command.
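A hedged sketch of the intended use; the import path and the exact `EnvVars` signature are assumptions based on the description above:
```
package cmdexec

import (
	"os"
	"os/exec"
	"slices"
	"strings"

	"github.com/databricks/cli/libs/auth" // assumed import path
)

// cleanEnv returns the current environment with every auth-related
// variable removed, so a subprocess starts from a known-clean state.
func cleanEnv() []string {
	names := auth.EnvVars() // assumed to return []string of variable names
	var out []string
	for _, kv := range os.Environ() {
		name, _, _ := strings.Cut(kv, "=")
		if !slices.Contains(names, name) {
			out = append(out, kv)
		}
	}
	return out
}

// spawn runs a CLI subprocess whose auth configuration is fully
// controlled by the caller rather than inherited from the parent.
func spawn() *exec.Cmd {
	cmd := exec.Command("databricks", "current-user", "me")
	cmd.Env = append(cleanEnv(), "DATABRICKS_CONFIG_PROFILE=my-profile")
	return cmd
}
```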
## Tests
Unit test.
## Changes
Add mutex synchronization in the cmdio logger's Log() method.
## Why
Since we issue multiple calls to the underlying writer, we should lock
the whole method; otherwise we can get broken messages. One instance
that can be easily reproduced today:
```
hyperfine -m 100 --show-output 'go test ./acceptance -run ^TestAccept$/^bundle$/^artifacts$/^whl_multiple$ -count=1'
...
-Uploading my_test_code-0.0.1-py3-none-any.whl...
-Uploading my_test_code_2-0.0.1-py3-none-any.whl...
+Uploading my_test_code-0.0.1-py3-none-any.whl...Uploading my_test_code_2-0.0.1-py3-none-any.whl...
Error: Command terminated with non-zero exit code 1 in benchmark iteration 54. Use the '-i'/'--ignore-failure' option if you want to ignore this. Alternatively, use the '--show-output' option to debug what went wrong.
```
An alternative would be to prepare the message fully in a local buffer
and write it in one call (assuming the underlying writer is itself
synchronized). However, that is more complicated, and it is unclear
whether it is worth it perf-wise.
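The fix, sketched with illustrative types (the actual cmdio logger carries more state):
```
package cmdio

import (
	"io"
	"sync"
)

// Logger is a minimal stand-in for the cmdio logger.
type Logger struct {
	mu  sync.Mutex
	out io.Writer
}

// Log holds the mutex across all writes that make up one message, so
// concurrent calls can no longer interleave their output.
func (l *Logger) Log(msg string) {
	l.mu.Lock()
	defer l.mu.Unlock()
	io.WriteString(l.out, msg)
	io.WriteString(l.out, "\n")
}
```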
## Tests
With this change, I'm running the same hyperfine command with 1000
iterations and seeing no failures.
## Changes
- Remove bundle.Parallel & bundle.ReadOnlyBundle.
- Add bundle.ApplyParallel, as a helper to migrate from bundle.Parallel.
- Keep ReadOnlyMutator as a separate type, but it is now a subtype of
Mutator, so it works on a regular *Bundle. Keeping it as a separate type
prevents non-readonly mutators from being passed to ApplyParallel (see
the sketch below).
- validate.Validate becomes a function (was Mutator).
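A minimal sketch of the type relationship and helper described above (the error type is simplified; the real code uses diag.Diagnostics):
```
package bundle

import (
	"context"
	"sync"
)

type Bundle struct{ /* configuration, state, ... */ }

// Mutator is the regular mutator interface.
type Mutator interface {
	Apply(ctx context.Context, b *Bundle) error
}

// ReadOnlyMutator is a subtype of Mutator; only values implementing it
// type-check as arguments to ApplyParallel.
type ReadOnlyMutator interface {
	Mutator
	readOnly() // marker method
}

// ApplyParallel runs the given read-only mutators concurrently against
// the same bundle and collects their errors.
func ApplyParallel(ctx context.Context, b *Bundle, ms ...ReadOnlyMutator) []error {
	errs := make([]error, len(ms))
	var wg sync.WaitGroup
	for i, m := range ms {
		wg.Add(1)
		go func(i int, m ReadOnlyMutator) {
			defer wg.Done()
			errs[i] = m.Apply(ctx, b)
		}(i, m)
	}
	wg.Wait()
	return errs
}
```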
## Why
This is a follow-up to #2390, where we removed most of the tools to
construct chains of mutators. The same motivation applies here.
When it comes to read-only bundles, the abstraction is leaky: since it
is a shallow copy, it does not actually guarantee or enforce read-only
access to the bundle. A better approach would be to run parallel
operations on independent, narrowly focused, deep-copied structs with
just enough information to carry out the task (this is not implemented
here, but it is the eventual goal). Now that we can write regular code
in phases and are not limited to the mutator interface, we can switch to
that approach.
## Tests
Existing tests.
---------
Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
## Changes
Rewrite bundle/tests/python_wheel_test.go into acceptance tests. The
same configs are used, but the test now runs 'bundle deploy' and, in
addition to checking the files on the file system, checks that the
files were uploaded and records the jobs/create request.
There is a new test helper, bin/find.py, which filters paths based on a
regex and asserts on the number of expected results. I've added it
because 'find' on Windows behaves differently, so this helps avoid
cross-platform differences.
Bumps
[actions/upload-artifact](https://github.com/actions/upload-artifact)
from 4.6.0 to 4.6.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/actions/upload-artifact/releases">actions/upload-artifact's
releases</a>.</em></p>
<blockquote>
<h2>v4.6.1</h2>
<h2>What's Changed</h2>
<ul>
<li>Update to use artifact 2.2.2 package by <a
href="https://github.com/yacaovsnc"><code>@yacaovsnc</code></a> in <a
href="https://redirect.github.com/actions/upload-artifact/pull/673">actions/upload-artifact#673</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/actions/upload-artifact/compare/v4...v4.6.1">https://github.com/actions/upload-artifact/compare/v4...v4.6.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4cec3d8aa0"><code>4cec3d8</code></a>
Merge pull request <a
href="https://redirect.github.com/actions/upload-artifact/issues/673">#673</a>
from actions/yacaovsnc/artifact_2.2.2</li>
<li><a
href="e9fad966cc"><code>e9fad96</code></a>
license cache update for artifact</li>
<li><a
href="b26fd06e9d"><code>b26fd06</code></a>
Update to use artifact 2.2.2 package</li>
<li>See full diff in <a
href="65c4c4a1dd...4cec3d8aa0">compare
view</a></li>
</ul>
</details>
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
## Changes
1. Add a Why section to the pull request template.
2. Slightly improve the language in the existing sections.
## Why
Providing the right context for the reviewer in the PR description is
important, as it usually cannot be inferred from the code itself. The
new section directly prompts the requester to provide such context.
## Tests
Checked that the markdown is still rendered correctly in the local
viewer
## Changes
1. Remove the t.Skip directive from the TestAuthDescribeSuccess
integration test.
2. Change the test code to account for environments where the host
value can be prefixed with the https:// scheme (see the sketch below).
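A hedged sketch of the adjustment in point 2; the helper is illustrative, not the actual test code:
```
package testutil

import "strings"

// normalizeHost strips an optional https:// prefix so the assertion
// compares bare hostnames regardless of how the environment sets the
// host value.
func normalizeHost(host string) string {
	return strings.TrimPrefix(host, "https://")
}
```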
## Why
This enables the integration test that exercises the `databricks auth
describe` command, which was previously throwing errors in the CI/CD
pipeline.
## Tests
Integration test is passing
Ignore output files using gitignore syntax.
## Changes
A new Ignore setting in test.toml ignores the specified files (the
syntax is gitignore).
## Why
I'm using it in #2396 to ignore the virtual env, which includes a lot of
files. The regular 'rm -fr .venv' approach only works if the script gets
to that point; due to errors it might abort early. In that case the test
runner prints all unexpected files, polluting the output. Ignoring those
files at the test-runner level ensures you never see them.
## Tests
Updated selftest/basic.
## Changes
1. Refactored `TestSparkJarTaskDeployAndRunOnVolumes` and
`TestSparkJarTaskDeployAndRunOnWorkspace` to use a table-driven approach
for better organization of similar tests
2. Implemented `testutil.HasJDK()` to replace `testutil.RequireJDK`, so
that tests can be skipped instead of failing
3. Ensured the test suite properly fails if no compatible Java version
is found
## Why
It can be tricky to have Java 8 installed on modern dev environments
(e.g. a Mac with an Apple M3 chip), and its absence previously caused
the Spark Jar task tests to fail when run locally. This refactoring
allows such environments to run the "SparkJar" tests using a newer
Databricks Runtime.
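A hedged sketch of the table-driven layout described above; the helper, runtime versions, and field names are assumptions, not the actual test code:
```
package bundle_test

import (
	"os/exec"
	"strings"
	"testing"
)

// hasJDK is a stand-in for testutil.HasJDK: it reports whether a JDK
// matching the given version is available instead of failing the test.
func hasJDK(t *testing.T, version string) bool {
	t.Helper()
	out, err := exec.Command("javac", "-version").CombinedOutput()
	return err == nil && strings.Contains(string(out), version)
}

func TestSparkJarTask(t *testing.T) {
	tests := []struct {
		name         string
		sparkVersion string // Databricks Runtime for the cluster
		javaVersion  string // JDK required to build the JAR locally
	}{
		// illustrative matrix, not the actual versions under test
		{"older runtime", "13.3.x-scala2.12", "1.8"},
		{"newer runtime", "15.4.x-scala2.12", "11"},
	}
	ran := false
	for _, tc := range tests {
		t.Run(tc.name, func(t *testing.T) {
			if !hasJDK(t, tc.javaVersion) {
				t.Skipf("JDK %s not installed", tc.javaVersion)
			}
			ran = true
			// deploy the bundle and run the Spark JAR task here
		})
	}
	// fail outright if no compatible Java version was found
	if !ran {
		t.Fatal("no compatible JDK found; at least one variant must run")
	}
}
```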
## Tests
1. Ran `TestSparkJarTaskDeployAndRunOnVolumes` and
`TestSparkJarTaskDeployAndRunOnWorkspace` locally on a Mac with Java 11
installed.
2. Checked that tests against older runtimes are still being run and
passing in CI/CD environments.
## Changes
This PR adds a warning that gives users clear guidance when they try to
use variable interpolation for an auth field.
## Tests
Modify existing acceptance test.
## Changes
Defining an include section in config files other than the main
`databricks.yml` file fails silently. With this PR, users get a warning
when they try this.
## Tests
Acceptance test.
## Tests
Manually, I have a test that fails.
Before:
```
=== NAME TestAccept
server.go:195:
----------------------------------------
No stub found for pattern: GET /api/2.1/clusters/get
To stub a response for this request, you can add
the following to test.toml:
[[Server]]
Pattern = "GET /api/2.1/clusters/get"
Response.Body = '''
<response body here>
'''
Response.StatusCode = <response status-code here>
----------------------------------------
```
After:
```
server.go:203: No handler for URL: /api/2.1/clusters/get?cluster_id=0717-132531-5opeqon1
Body: [0 bytes]
For acceptance tests, add this to test.toml:
[[Server]]
Pattern = "GET /api/2.1/clusters/get"
Response.Body = '<response body here>'
# Response.StatusCode = <response code if not 200>
```
## Changes
- Instead of constructing chains of mutators and then executing them,
execute them directly.
- Remove functionality related to chain-building: Seq, If, Defer,
newPhase, logString.
- Phases become functions that apply the changes directly rather than
construct mutator chains that will be called later.
- Add a helper ApplySeq to call multiple mutators; use it where
Apply+Seq were used before.
This is intended to be a refactoring without functional changes, but
there are a few behaviour changes:
- Since defer() is used to call unlock instead of bundle.Defer(),
unlocking will now happen even in case of panics.
- In --debug, the phase names are still logged once before the start of
the phase, but each entry no longer has 'seq' or the phase name in it.
- The message "Deployment complete!" was printed even if the
terraform.Apply() mutator had an error. It no longer is.
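A minimal sketch of the ApplySeq helper and the defer-based unlock (types simplified; the real code returns diag.Diagnostics and uses the bundle locker):
```
package phases

import "context"

type Bundle struct{}

type Mutator interface {
	Apply(ctx context.Context, b *Bundle) error
}

// ApplySeq applies mutators in order, stopping at the first error.
func ApplySeq(ctx context.Context, b *Bundle, ms ...Mutator) error {
	for _, m := range ms {
		if err := m.Apply(ctx, b); err != nil {
			return err
		}
	}
	return nil
}

// deploy shows the unlock pattern: a plain Go defer releases the lock
// even if a mutator panics, unlike the old bundle.Defer() chaining.
func deploy(ctx context.Context, b *Bundle, acquire func() error, release func(), ms ...Mutator) error {
	if err := acquire(); err != nil {
		return err
	}
	defer release()
	return ApplySeq(ctx, b, ms...)
}
```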
## Motivation
The use of chains was necessary when mutators returned a list of other
mutators instead of calling them directly. That has since been removed,
so the chain machinery no longer serves a purpose.
Using direct functions simplifies the logic and makes bugs more apparent
and easier to fix.
Other improvements that this unlocks:
- Simpler stacktraces/debugging (breakpoints).
- Use of functions with a narrowly scoped API: instead of mutators that
receive the full bundle config, we can use focused functions that only
deal with the sections they care about, e.g.
prepareGitSettings(currentGitSection) -> updatedGitSection. This makes
the data flow more apparent.
- Parallel computation across mutators (within a phase): launch
goroutines fetching data from APIs at the beginning and process the
results once they are ready.
## Tests
Existing tests.
## Changes
Instead of LocalOnly with its non-composable semantics, there are two
composable options:
- Local - enable the test locally
- Cloud - enable the test on the cloud
By default, Cloud is switched off everywhere except in bundle (but not
in bundle/variables and bundle/help).
## Tests
Using this in #2383 to have a test that runs on the cloud but not
locally.
## Changes
Since at this moment the default is set to 'no', it should also default
to 'no' interactively. However, it currently just uses the first option.
## Tests
Manually running `cli bundle init default-python`
## Changes
- Add a 'serverless' prompt to the default-python template (the default
is currently set to "no").
- This is a simplified version of
https://github.com/databricks/cli/pull/2348 with the 'auto'
functionality removed.
## Tests
- Split default-python into default-python/classic,
default-python/serverless, and default-python/serverless-customcatalog.
- Manually checked that "bundle init default-python" with serverless=yes
can be deployed and run on dogfood and the test env.
## Changes
Now when the `profile` flag is used, we no longer pick up the host from
the bundle; instead we use the one from the profile provided by the -p
flag.
Previous behaviour in the context of a bundle:
```
databricks current-user me -p profile_name
Error: cannot resolve bundle auth configuration: config host mismatch: profile uses host https://non-existing-subdomain.databricks.com, but CLI configured to use https://foo.com
```
New behaviour (makes an API call):
```
databricks current-user me -p profile_name
{
email: "foo@bar.com"
...
}
```
We still load the bundle configuration when the `-t` flag is provided,
because we want to load host information from the target.
Fixes #1358
## Tests
Added acceptance test
## Changes
1. Change the **default-python** bundle template to set
`data_security_mode` of a cluster to SINGLE_USER
2. Change the **experimental-jobs-as-code** bundle template to set
`data_security_mode` of a cluster to SINGLE_USER
## Why
Explicitly adding this field saves experienced users from confusion
about which security mode is applied to the cluster.
## Tests
Changed existing unit and integration tests to pass with this change
## Changes
This PR:
1. No longer sets the `DATABRICKS_CLI_PARENT_PID` environment variable,
since it was never required in the first place and was mistakenly merged
in the initial PR.
2. Performs minor cleanup based on post merge feedback in
https://github.com/databricks/cli/pull/2354.
## Tests
N/A
## Changes
Added PyPI and Maven library tests.
These are needed for https://github.com/databricks/cli/pull/2382, since
we don't currently have any coverage for PyPI or Maven libraries.
## Changes
Previously, using Python wheel tasks with compute referring to an
interactive cluster defined in the same bundle would produce a warning
like the one below:
```
GET /api/2.1/clusters/get?cluster_id=${resources.clusters.development_cluster.id}
< HTTP/2.0 400 Bad Request
< {
< "error_code": "INVALID_PARAMETER_VALUE",
< "message": "Cluster ${resources.clusters.development_cluster.id} does not exist"
< } pid=14465 mutator=seq mutator=initialize mutator=seq mutator=PythonWrapperWarning sdk=true
```
This PR fixes it by making sure that we check the spark version for such
clusters based on the bundle configuration and don't make API calls.
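An illustrative sketch of the kind of check involved (not the actual implementation, which reads the cluster's spark version from the bundle configuration instead):
```
package validate

import "strings"

// isBundleReference reports whether a cluster ID is still an
// uninterpolated bundle reference such as
// ${resources.clusters.development_cluster.id}, in which case the
// clusters API must not be called with it.
func isBundleReference(clusterID string) bool {
	return strings.HasPrefix(clusterID, "${") && strings.HasSuffix(clusterID, "}")
}
```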
## Tests
Added acceptance test
## Changes
The CLI generation template was using RequiredPathField from the
incorrect request entity (the body field of the request rather than the
request itself). Thus, for some commands, required path parameters were
not required when --json was specified.
## Tests
Regenerated commands work correctly
## Changes
I'm not completely sure what is lowercasing it; we can investigate this
later. This unblocks tests on main.
> Name: project_name_kLvfBngEcUEI
> Uploading
project_name_klvfbngecuei-0.0.1+[NUMID].[NUMID]-py3-none-any.whl...
## Tests
`CLOUD_ENV=aws go test --timeout 3h -v -run TestDefaultPython/3.9
./integration/bundle`
## Changes
- Get rid of artifacts.DetectPackages, which is a thin wrapper around
artifacts/whl.DetectPackage.
- Get rid of parsing the name out of setup.py. Do not randomize it
either; use a static one.
## Tests
Existing tests.
## Changes
Added missing .gitignore files to templates
## Tests
There were some incorrect snapshots of .gitignore files in the
acceptance tests, probably generated by the testing infra. Updated them
to match the new files.
---------
Co-authored-by: Lennart Kats (databricks) <lennart.kats@databricks.com>
## Changes
Previously, one could not set `LocalOnly=true` in a parent directory and
then override it with `LocalOnly=false` in a child directory, because
`false` is considered an empty value by mergo.
In order to distinguish between 'explicitly set to false' and 'not set',
I've changed all simple variables in the config to be pointers. Now one
can always override them, because non-nil pointers are not treated as
empty (with the mergo.WithoutDereference option).
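A runnable sketch of the behaviour described above (the struct is illustrative; the import path may differ depending on the mergo version):
```
package main

import (
	"fmt"

	"dario.cat/mergo" // formerly github.com/imdario/mergo
)

// TestConfig stands in for the acceptance-test config: a pointer field
// distinguishes "explicitly false" from "not set".
type TestConfig struct {
	LocalOnly *bool
}

func main() {
	no, yes := false, true
	child := TestConfig{LocalOnly: &no}   // child dir: explicitly false
	parent := TestConfig{LocalOnly: &yes} // parent dir default

	// With WithoutDereference, mergo considers the pointers themselves:
	// the child's non-nil pointer is not "empty", so the parent value
	// no longer clobbers an explicit false.
	_ = mergo.Merge(&child, parent, mergo.WithoutDereference)
	fmt.Println(*child.LocalOnly) // false
}
```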
## Tests
Manually:
```
~/work/cli/acceptance/bundle/templates/default-python % cat test.toml # add this new file
LocalOnly = false
~/work/cli/acceptance/bundle/templates/default-python % CLOUD_ENV=aws go test ../../.. -run ^TestAccept$/^bundle$/^templates$/^default-python$ -v
(the test is no longer skipped)
```