## Changes
This PR allows us to define custom server stubs in a `test.toml` file.
Note: A followup PR will add functionality to do assertions on the API
request itself.
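For illustration, a stub could look like the following; the exact schema belongs to the acceptance-test harness, so the table and field names here (`Server`, `Pattern`, `Response.Body`) are an approximation, not the authoritative format:
```toml
# Hypothetical test.toml stub; names are approximate.
[[Server]]
Pattern = "GET /api/2.0/preview/scim/v2/Me"
Response.Body = '''
{"id": "123", "userName": "tester@databricks.com"}
'''
```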
## Tests
New acceptance test.
## Changes
Raise the Python formatter's line-length limit: the default is 88, which reformats too much. This has no effect on templates, but it affects the Python script in this PR: https://github.com/databricks/cli/pull/2267
For context, we do not set any line length for Go, and we have 177 .go files with a max line length of 150 or more.
## Changes
This PR registers the `server.Close()` function to be run during test
cleanup in the server initialization function. This ensures that all
test servers are closed as soon as the test they are scoped to finishes.
Motivated by https://github.com/databricks/cli/pull/2255/files, which
introduced a regression where the test server was not closed.
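A minimal sketch of the pattern, not the CLI's actual initialization code: `t.Cleanup` ties the server's lifetime to the test that created it.
```go
package example

import (
	"net/http"
	"net/http/httptest"
	"testing"
)

// startTestServer creates a stub server and registers its shutdown with the
// test's cleanup list, so it is closed as soon as the owning test finishes.
func startTestServer(t *testing.T) *httptest.Server {
	server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	}))
	t.Cleanup(server.Close)
	return server
}
```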
## Tests
N/A
## Changes
Followup from
https://github.com/databricks/cli/pull/2209#pullrequestreview-2580308075.
This PR adds an integration test to validate that the API type bindings
work against the telemetry endpoint.
## Tests
N/A
---------
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
## Changes
Fix relative path errors in the Python mutator, which had been failing
during deployment since v0.239.1.
Before this fix:
```
% databricks bundle deploy
Deploying resources...
Updating deployment state...
Error: failed to compute relative path for job jobs_as_code_project_job: Rel: can't make resources/jobs_as_code_project_job.py relative to /Users/$USER/jobs_as_code_project
```
As a result, the bundle was deployed, but the deployment state wasn't
updated.
## Tests
Unit tests, adding acceptance tests in
https://github.com/databricks/cli/pull/2254
## Changes
Added support for double-underscore variable references.
Previously we enforced a stricter restriction for no particular reason;
the TF provider supports multiple underscores, so DABs should as well.
Fixes #1753
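For illustration, a reference like the following is now accepted (the variable and resource names are hypothetical):
```yaml
variables:
  catalog__dev:
    default: main

resources:
  jobs:
    my_job:
      tags:
        catalog: ${var.catalog__dev}
```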
## Tests
Added acceptance and integration tests
## Changes
These types correspond to the telemetry protobufs defined in universe.
## Tests
No tests are needed since this PR only adds the type bindings.
---------
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
## Changes
This change is required to enable tracking execution-time telemetry for
bundle commands. To track execution time for a command in general, we
need to have the Databricks auth configuration available at this section
of the code:
41bbd89257/cmd/root/root.go (L99)
To do this, we can rely on the `configUsed` context key.
Most commands rely on the `root.MustWorkspaceClient` function which
automatically sets the client config in the `configUsed` context key.
Bundle commands, however, do not do so. They instead store their
workspace clients in the `&bundle.Bundle{}` object.
With this PR, the `configUsed` context key will be set for all `bundle`
commands. Functionally nothing changes.
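The underlying mechanism is the standard Go context-value pattern. A minimal generic sketch follows, with hypothetical names rather than the CLI's actual identifiers:
```go
package example

import "context"

// Config stands in for the resolved Databricks auth configuration.
type Config struct{ Host string }

type configUsedKey struct{}

// withConfigUsed records the auth config in the context for later readers.
func withConfigUsed(ctx context.Context, cfg *Config) context.Context {
	return context.WithValue(ctx, configUsedKey{}, cfg)
}

// configUsed retrieves the config, e.g. when emitting telemetry at exit.
func configUsed(ctx context.Context) (*Config, bool) {
	cfg, ok := ctx.Value(configUsedKey{}).(*Config)
	return cfg, ok
}
```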
## Tests
Existing tests. Also manually verified that either
`root.MustConfigureBundle` or `utils.ConfigureBundleWithVariables` is
called for all bundle commands (except `bundle init`), thus ensuring this
context key is set for all bundle commands.
Refs for the functions:
1. `root.MustConfigureBundle`:
41bbd89257/cmd/root/bundle.go (L88)
2. `utils.ConfigureBundleWithVariables`:
41bbd89257/cmd/bundle/utils/utils.go (L19)
---------
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
## Changes
This allows DABs to avoid waiting for the compute to start when an app is
initially created as part of `bundle deploy`, which significantly
improves deploy time. We now always set `no_compute` to true for apps.
## Tests
Covered by `TestDeployBundleWithApp`; it currently fails until the TF
provider is upgraded to a version supporting the `no_compute` option.
## Changes
Noticed this when working on
https://github.com/databricks/cli/pull/2221. `<` is a special HTML
character that is encoded during text replacement when using
`AssertEqualTexts`.
## Tests
N/A
## Changes
- If CLOUD_ENV is set, do not override it with a dummy value. This allows
running acceptance tests as integration tests.
- Needed for https://github.com/databricks/cli/pull/2242
## Tests
Manually ran the test suite against dogfood: `CLOUD_ENV=aws go test
./acceptance`
## Changes
- Do not start replacement / comparison if a file is too large or not
valid UTF-8.
- This helps prevent replacements when a large binary (e.g. terraform)
accidentally ends up in the output.
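A minimal sketch of such a guard, with assumed function names and an arbitrary size cap (the CLI's actual thresholds may differ):
```go
package example

import (
	"fmt"
	"os"
	"unicode/utf8"
)

// maxFileSize is an illustrative threshold, not the CLI's actual limit.
const maxFileSize = 1 << 20 // 1 MiB

// readTextFile returns file contents only if the file is small enough and
// valid UTF-8; otherwise it reports an error instead of attempting
// replacements on binary data.
func readTextFile(path string) (string, error) {
	info, err := os.Stat(path)
	if err != nil {
		return "", err
	}
	if info.Size() > maxFileSize {
		return "", fmt.Errorf("%s: file too large for replacements (%d bytes)", path, info.Size())
	}
	data, err := os.ReadFile(path)
	if err != nil {
		return "", err
	}
	if !utf8.Valid(data) {
		return "", fmt.Errorf("%s: not valid utf-8, skipping replacements", path)
	}
	return string(data), nil
}
```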
## Tests
Found this problem when working on
https://github.com/databricks/cli/pull/2242 -- the tests tried to apply
replacements on the terraform binary and crashed. With this change, an
error is reported instead.
## Changes
- Replace development version with $DEV_VERSION
- Update experimental-jobs-as-code to make use of it.
## Tests
- Existing tests.
- Using this in https://github.com/databricks/cli/pull/2213
There is a speedup of 0.5s, but it is still 4.4s, so something else is
slow there.
Benchmarking bundle/templates/experimental-jobs-as-code:
```
# Without UV_CACHE_DIR
~/work/cli/acceptance/bundle/templates/experimental-jobs-as-code % hyperfine --warmup 2 'testme -count=1'
Benchmark 1: testme -count=1
Time (mean ± σ): 4.950 s ± 0.079 s [User: 2.730 s, System: 8.524 s]
Range (min … max): 4.838 s … 5.076 s 10 runs
# With UV_CACHE_DIR
~/work/cli/acceptance/bundle/templates/experimental-jobs-as-code % hyperfine --warmup 2 'testme -count=1'
Benchmark 1: testme -count=1
Time (mean ± σ): 4.410 s ± 0.049 s [User: 2.669 s, System: 8.710 s]
Range (min … max): 4.324 s … 4.467 s 10 runs
```
## Changes
- Remove DetectInterpreters from the DetectExecutable call: python3 or
python should always be on the PATH. We don't need to detect
non-standard situations where python3.10 is present but python3 is not.
- Moved DetectInterpreters to cmd/labs, where it is still used.
This is a follow up to https://github.com/databricks/cli/pull/2034
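A sketch of the simplified lookup under that assumption; the function name is illustrative, not the CLI's actual API:
```go
package example

import (
	"fmt"
	"os/exec"
)

// findPython returns the first of python3/python found on PATH, without
// probing versioned names like python3.10.
func findPython() (string, error) {
	for _, name := range []string{"python3", "python"} {
		if path, err := exec.LookPath(name); err == nil {
			return path, nil
		}
	}
	return "", fmt.Errorf("no python3 or python executable found on PATH")
}
```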
## Tests
Existing tests.
## Changes
- Ability to extend the list of replacements via test.toml (see the
example below).
- Modify selftest both to demo this feature and to get rid of sed on
Windows.
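For example, a test could add an entry like this; the key names are my reading of the harness config and may be approximate:
```toml
# Hypothetical extra replacement; key names are an assumption.
[[Repls]]
Old = '\d+'
New = "[NUMID]"
```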
## Tests
Acceptance tests. I'm also using it in
https://github.com/databricks/cli/pull/2213 for things like the pid.
## Changes
- When file comparisons fail in an acceptance test, print the contents of
all applied replacements. Do it once per test.
- Remove duplicate entries in the replacement list.
## Tests
Manually: change the output files of an existing test and you'll get this
printed once, after the first assertion:
```
acceptance_test.go:307: Available replacements:
REPL /Users/denis\.bilenko/work/cli/acceptance/build/databricks => $$CLI
REPL /private/var/folders/5y/9kkdnjw91p11vsqwk0cvmk200000gp/T/TestAccept598522733/001 => $$TMPHOME
...
```
## Changes
- New test covering failures in reading .git. One case results in an
error; some result in a warning (not shown).
- New helper `withdir` runs commands in a subdirectory.
## Tests
New acceptance test.
## Changes
For the most recent release, I had to re-run the "publish-winget" action
a couple of times before it passed. The underlying issue that causes the
failure should be solved by the latest version of the action, but upon
inspection of the latest version, I found that it always installs the
latest version of [Komac](https://github.com/russellbanks/Komac). To
both fix the issue and lock this down further, I updated our action to
call Komac directly instead of relying on a separate action to do this
for us.
## Tests
Successful run in
https://github.com/databricks/cli/actions/runs/12951529979.
## Changes
- Acceptance tests load test.toml to configure test behaviour.
- If the file is not found in the test directory, parent directories are
searched up to the test root (see the sketch below).
- Currently there is one option, keyed by `runtime.GOOS`, to switch off
tests per OS.
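A sketch of the parent search, with assumed names (it presumes the test directory sits at or below the test root):
```go
package example

import (
	"os"
	"path/filepath"
)

// findTestConfig looks for test.toml in dir and each of its parents,
// stopping at root. It returns "" when no config file exists on that path.
func findTestConfig(dir, root string) string {
	for {
		candidate := filepath.Join(dir, "test.toml")
		if _, err := os.Stat(candidate); err == nil {
			return candidate
		}
		if dir == root {
			return ""
		}
		dir = filepath.Dir(dir)
	}
}
```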
## Tests
Using it in https://github.com/databricks/cli/pull/2223 to disable test
on Windows that cannot be run there.
## Changes
When adding a path, a few things need to be taken care of:
- symlink expansion
- forward/backward slashes, so that tests can do `sed 's/\\\\/\//g'` to
make them pass on Windows (see
acceptance/bundle/syncroot/dotdot-git/script)
The `SetPath()` function takes care of both; a sketch of what that
involves follows below. This PR uses `SetPath()` on all paths
consistently.
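Only the `SetPath()` name comes from this PR; the body below is an assumed sketch of the two normalization steps:
```go
package example

import "path/filepath"

// normalizePath resolves symlinks (e.g. /tmp -> /private/tmp on macOS) and
// converts backslashes to forward slashes so Windows paths compare equal.
func normalizePath(path string) (string, error) {
	resolved, err := filepath.EvalSymlinks(path)
	if err != nil {
		return "", err
	}
	return filepath.ToSlash(resolved), nil
}
```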
## Tests
Existing tests.
## Changes
New source of default values for variables: the variable file
`.databricks/bundle/<target>/variable-overrides.json`.
The CLI tries to stat and read that file every time during the variable
initialization phase.
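For illustration, the file maps variable names to override values; the names below are hypothetical:
```json
{
  "catalog": "main",
  "num_workers": 2
}
```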
## Tests
Acceptance tests
## Changes
The `include` section is used only to include other bundle configuration
YAML files. If any other file type is used, we raise an error and guide
users to use `sync.include` instead, as illustrated below.
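To illustrate the distinction (the paths are hypothetical):
```yaml
include:
  - resources/*.yml   # bundle configuration YAML only

sync:
  include:
    - scripts/**      # arbitrary files to sync belong here instead
```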
## Tests
Added acceptance test
---------
Co-authored-by: Julia Crawford (Databricks) <julia.crawford@databricks.com>
## Changes
- Propagate the env vars `USE_SDK_V2_RESOURCES` and
`USE_SDK_V2_DATA_SOURCES` to terraform.
- These are troubleshooting helpers for resources migrated to the new
plugin framework, recommended here:
https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/troubleshooting#plugin-framework-migration-problems
- This currently unblocks deploying quality monitors, see
https://github.com/databricks/terraform-provider-databricks/issues/4229#issuecomment-2520344690
## Tests
Manually tested that I can deploy a quality monitor after this change
with `USE_SDK_V2_RESOURCES="databricks_quality_monitor"` set.
### Main branch:
```
~/work/databricks_quality_monitor_repro % USE_SDK_V2_RESOURCES="databricks_quality_monitor" ../cli/cli-main bundle deploy
Uploading bundle files to /Workspace/Users/denis.bilenko@databricks.com/.bundle/quality_monitor_bundle/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
Error: terraform apply: exit status 1
Error: Provider produced inconsistent result after apply
When applying changes to databricks_quality_monitor.monitor_trips, provider
"provider[\"registry.terraform.io/databricks/databricks\"]" produced an
unexpected new value: .data_classification_config: block count changed from 0
to 1.
This is a bug in the provider, which should be reported in the provider's own
issue tracker.
```
### This branch:
```
~/work/databricks_quality_monitor_repro % USE_SDK_V2_RESOURCES="databricks_quality_monitor" ../cli/cli bundle deploy
Uploading bundle files to /Workspace/Users/denis.bilenko@databricks.com/.bundle/quality_monitor_bundle/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
```
### Config:
```
~/work/databricks_quality_monitor_repro % cat databricks.yml
bundle:
  name: quality_monitor_bundle

resources:
  quality_monitors:
    monitor_trips:
      table_name: main.denis-bilenko-cuj-pe34.trips_sanitized_1
      output_schema_name: main.denis-bilenko-cuj-pe34
      assets_dir: /Workspace/Users/${workspace.current_user.userName}/quality_monitor_issue
      snapshot: {}
```
## Changes
If there are unreadable files in a directory, raise an error but
continue with further diagnostics, because the answer is in the script
output.
## Tests
Manually. I'm working on some tests that create unreadable files; the
report is much better with this change.
## Changes
Allow custom untyped fields in the root config in the JSON schema so it
doesn't highlight errors when using YAML anchors.
Example use case:
```
tags: &job-tags
  environment: ${bundle.target}

resources:
  jobs:
    db1:
      tags:
        <<: *job-tags
    db2:
      tags:
        <<: *job-tags
```
One downside is that we don't highlight any unknown top-level properties
anymore (but they will still fail during CLI validation).
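Schematically, the change corresponds to relaxing the root object in the published schema; this is an illustration of the mechanism, not the actual schema file:
```json
{
  "type": "object",
  "additionalProperties": true,
  "properties": {
    "bundle": { "type": "object" },
    "resources": { "type": "object" }
  }
}
```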
## Tests
Manually checked behavior in VSCode: it doesn't show a validation error.
Also checked that other typed properties are still suggested.