## Changes
Fixes an `Error: no value assigned to required variable <variable>.` error
that occurred when a complex variable was defined in one file but its target
override was defined in a separate file that is included in the main one.
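For illustration, a minimal sketch of the layout that triggered the error
(file names and values are hypothetical):
```yaml
# databricks.yml
include:
  - overrides.yml

variables:
  cluster:
    type: complex
    default:
      spark_version: "13.3.x-scala2.12"
      num_workers: 1

# overrides.yml (included file holding the target override)
targets:
  dev:
    variables:
      cluster:
        spark_version: "13.3.x-scala2.12"
        num_workers: 4
```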
## Tests
Added regression test
## Changes
DLT pipeline recreations are destructive. They can lead to lost history of
previous updates, temporary outages of the underlying tables, and are
potentially computationally expensive. Thus we make a breaking change where a
prompt is shown to the user if their configuration changes will lead to a DLT
pipeline recreation.
Users can skip the prompt by specifying the `--auto-approve` flag.
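For example, a change to a field that cannot be updated in place, such as a
pipeline's `catalog`, forces a recreate. A hypothetical sketch:
```yaml
resources:
  pipelines:
    foo:
      name: foo
      catalog: main   # changing this value forces the pipeline to be recreated
      target: default
      libraries:
        - notebook:
            path: ./foo.py
```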
This PR also fixes an issue with our test runner where logs from the
`cmdio.Logger` would not get propagated to the reader returned by our cobra
test runner.
## Tests
Manually, and new unit and integration tests.
```
➜ bundle-playground-3 cli bundle deploy
Uploading bundle files to /Users/63ec021d-b0c6-49c0-93a0-5123953a1cb2/.bundle/test/development/files...
The following DLT pipelines will be recreated. Underlying tables will be unavailable for a transient period until the newly recreated pipelines are run once successfully. History of previous pipeline update runs will be lost because of recreation:
recreate pipeline foo
Would you like to proceed? [y/n]: n
Deployment cancelled!
```
## Changes
With https://github.com/databricks/cli/pull/1370 we started to error if
a profile name was not provided in a non-tty setting. The Databricks
VSCode extension, however, uses the `auth login` command to simply
refresh the tokens. Thus, this PR is a regression fix for that use case.
## Tests
Manually, `databricks auth login --host
https://e2-dogfood.staging.cloud.databricks.com` no longer errors.
Instead it successfully refreshes the credentials.
## Changes
This updates the templates to include a `permissions` section. Having a
permissions section is a best practice, helps users understand the notion of
permissions, and makes it easier to diagnose permission errors
(https://github.com/databricks/cli/pull/1386).
This is a cherry-pick from https://github.com/databricks/cli/pull/1387.
This change was verified to work both in dev and prod. Existing unit tests
validate the templates in these modes.
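For reference, a minimal sketch of such a section in `databricks.yml`
(principal names are placeholders):
```yaml
permissions:
  - level: CAN_VIEW
    group_name: users
  - level: CAN_MANAGE
    user_name: someone@example.com
  - level: CAN_RUN
    service_principal_name: some-service-principal-application-id
```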
## Changes
We were not using the readers and writers set in the test fixtures in
the progress logger. This PR fixes that. It also modifies
`TestAccAbortBind`, which was implicitly relying on the bug.
I encountered this bug while working on
https://github.com/databricks/cli/pull/1672.
## Tests
Manually.
From non-tty:
```
Error: failed to bind the resource, err: This bind operation requires user confirmation, but the current console does not support prompting. Please specify --auto-approve if you would like to skip prompts and proceed.
```
From tty, bind works as expected.
```
Confirm import changes? Changes will be remotely applied only after running 'bundle deploy'. [y/n]: y
Updating deployment state...
Successfully bound databricks_pipeline with an id '9d2dedbb-f522-4503-96ba-4bc4d5bfa77d'. Run 'bundle deploy' to deploy changes to your workspace
```
## Changes
Explain the error when the `databricks-pydabs` package is not installed
or the Python environment isn't correctly activated.
Example output:
```
Error: python mutator process failed: ".venv/bin/python3 -m databricks.bundles.build --phase load --input .../input.json --output .../output.json --diagnostics .../diagnostics.json: exit status 1", use --debug to enable logging
.../.venv/bin/python3: Error while finding module specification for 'databricks.bundles.build' (ModuleNotFoundError: No module named 'databricks')
Explanation: 'databricks-pydabs' library is not installed in the Python environment.
If using Python wheels, ensure that 'databricks-pydabs' is included in the dependencies,
and that the wheel is installed in the Python environment:
$ .venv/bin/pip install -e .
If using a virtual environment, ensure it is specified as the venv_path property in databricks.yml,
or activate the environment before running CLI commands:
  experimental:
    pydabs:
      venv_path: .venv
```
## Tests
Unit tests
## Changes
Preserve diagnostics if there are any errors or warnings when `PythonMutator`
normalizes output. If anything goes wrong during conversion, diagnostics
contain the relevant location and path.
## Tests
Unit tests
## Changes
This ensures that the CLI and Terraform can both use an Azure CLI
session configured under a non-standard path. This is the default
behavior on Azure DevOps when using the AzureCLI@2 task.
Fixes #1722.
## Tests
Unit test.
## Changes
If not explicitly quoted, the YAML loader interprets a value like
`2024-08-29` as a timestamp. Such a value is usually intended to be a
string instead. Our normalization logic was not able to turn a time
value back into the original string.
This change boxes the time value to include its original string
representation. Normalization of one of these values into a string can
now use the original input value.
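To illustrate (a minimal sketch):
```yaml
# Interpreted by the YAML loader as a timestamp:
date: 2024-08-29

# Explicitly quoted, so it remains a string:
date: "2024-08-29"
```
With this change, the first form can still be normalized back to the string
`2024-08-29` because the boxed time value retains its original
representation.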
## Tests
Unit tests in `libs/dyn/convert`.
## Changes
Consider serverless clusters compatible with Python wheel tasks. Fixes a
`Python wheel tasks require compute with DBR 13.3+ to include
local libraries` warning that was shown for serverless clusters.
## Changes
* Provide a more helpful error when using an `artifact_path` based on
`/Volumes`
* Allow the use of `short_name` in `/Volumes` paths
## Example cases
Example of a valid /Volumes artifact_path:
* `artifact_path:
/Volumes/catalog/schema/${workspace.current_user.short_name}/libs`
Example of an invalid /Volumes path (when using `mode: development`):
* `artifact_path: /Volumes/catalog/schema/libs`
* Resulting error: `artifact_path should contain the current username or
${workspace.current_user.short_name} to ensure uniqueness when using
'mode: development'`
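Expressed in `databricks.yml`, a valid development-mode configuration might
look like the following sketch (catalog and schema names are placeholders):
```yaml
targets:
  dev:
    mode: development
    workspace:
      artifact_path: /Volumes/catalog/schema/${workspace.current_user.short_name}/libs
```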
## Changes
Fixes an issue introduced in https://github.com/databricks/cli/pull/1699
where PyPI packages were treated as local libraries.
The reason is that `libraryPath` returned an empty string as the path for
PyPI packages, and `IsLibraryLocal` then treated the empty string as a local
path.
Both of these functions are fixed in this PR.
## Tests
Added regression test
## Changes
This change makes sure we ignore the CLI version check on development
builds of the CLI.
Before:
```
$ cat databricks.yml | grep cli_version
databricks_cli_version: ">= 0.223.1"
$ cli bundle deploy
Error: Databricks CLI version constraint not satisfied. Required: >= 0.223.1, current: 0.0.0-dev+06b169284737
```
After:
```
...
$ cli bundle deploy
...
Warning: Ignoring Databricks CLI version constraint for development build. Required: >= 0.223.1, current: 0.0.0-dev+d52d6f08fcd5
```
## Tests
## Changes
With hc-install version `0.8.0` there was a regression where debug logs
would be leaked into stderr. Reported upstream in
https://github.com/hashicorp/hc-install/issues/239.
In the meantime, we revert and pin to version `0.7.0`. This PR also
includes a regression test.
## Tests
Regression test.
## Changes
This field allows a user to configure paths to synchronize to the
workspace.
Allowed values are relative paths to files and directories anchored at
the directory where the field is set. If one or more values traverse up
the directory tree (to an ancestor of the bundle root directory), the
CLI will dynamically determine the root path to use to ensure that the
file tree structure remains intact.
For example, given a `databricks.yml` in `my_bundle` that includes:
```yaml
sync:
  paths:
    - ../common
    - .
```
Then upon synchronization, the workspace will look like:
```
.
├── common
│   └── lib.py
└── my_bundle
    ├── databricks.yml
    └── notebook.py
```
If not set, the behavior remains identical.
## Tests
* Newly added unit tests for the mutators and under `bundle/tests`.
* Manually confirmed a bundle without this configuration works the same.
* Manually confirmed a bundle with this configuration works.
## Changes
Fixes #1701
## Tests
```
Usage:
  databricks clusters list [flags]

Flags:
      --cluster-sources []string   Filter clusters by source
      --cluster-states []string    Filter clusters by states
  -h, --help                       help for list
      --is-pinned                  Filter clusters by pinned status
      --page-size int              Use this field to specify the maximum number of results to be returned by the server.
      --page-token string          Use next_page_token or prev_page_token returned from the previous request to list the next or previous page of clusters respectively.
      --policy-id string           Filter clusters by policy id
```
## Changes
In https://github.com/databricks/cli/pull/1490 we regressed and started
using the development mode prefix for UC schemas regardless of the mode
of the bundle target.
This PR fixes the regression and adds a regression test.
## Tests
Failing integration tests pass now.
## Changes
While experimenting with DABs, I discovered that requirements libraries were
being ignored.
One thing worth mentioning is that `bundle validate` runs successfully,
but `bundle deploy` fails. This PR only covers the second part.
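For context, a requirements-file library is declared on a task as in the
following sketch (resource and file names are placeholders):
```yaml
resources:
  jobs:
    my_job:
      tasks:
        - task_key: my_task
          libraries:
            - requirements: ./requirements.txt
```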
## Tests
Added a unit test
## Changes
`TestAccFilerWorkspaceFilesExtensionsErrorsOnDupName` recently started
failing in our nightlies because the upstream `import` API was changed
to [prohibit conflicting file
paths](https://docs.databricks.com/en/release-notes/product/2024/august.html#files-can-no-longer-have-identical-names-in-workspace-folders).
Because existing conflicting file paths can still be grandfathered in,
we need to retain coverage for the test. To do this, this PR:
1. Removes the failing
`TestAccFilerWorkspaceFilesExtensionsErrorsOnDupName`
2. Adds an equivalent unit test with the `list` and `get-status` API
calls mocked.
## Changes
Make `pydabs/venv_path` optional. When not specified, the CLI detects the
Python interpreter using `python.DetectExecutable`, the same way as for
`artifacts`. `python.DetectExecutable` works correctly if a virtual
environment is activated or `python3` is available on PATH through other
means.
Extract the venv detection code from PyDABs into `libs/python/detect`.
This code will be used when we implement the `python/venv_path` section
in `databricks.yml`.
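For illustration, a sketch of the now-optional setting:
```yaml
experimental:
  pydabs:
    enabled: true
    # venv_path is now optional; when omitted, the CLI detects the
    # interpreter (e.g. an activated virtual environment or python3 on PATH).
    # venv_path: .venv
```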
## Tests
Unit tests and manually
---------
Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
## Changes
These tests inadvertently re-ran mutators, the first time through
`loadTarget` and the second time by running `phases.Initialize()`
themselves. Some of the mutators that are executed in
`phases.Initialize()` are also run as part of `loadTarget`. This area is
overdue for a refactor to make it unambiguous what runs when. Until then,
this change removes the duplicated execution.
## Tests
Unit tests pass.
## Changes
This PR addresses post-merge feedback from
https://github.com/databricks/cli/pull/1673.
## Tests
Unit tests, and manually.
```
Error: experiment undefined-experiment is not defined
at resources.experiments.undefined-experiment
in databricks.yml:11:26
Error: job undefined-job is not defined
at resources.jobs.undefined-job
in databricks.yml:6:19
Error: pipeline undefined-pipeline is not defined
at resources.pipelines.undefined-pipeline
in databricks.yml:14:24
Name: undefined-job
Target: default
Found 3 errors
```
## Changes
This adds configurable transformations based on the transformations
currently seen in `mode: development`.
Example `databricks.yml` showcasing some of these transformations:
```
bundle:
  name: my_bundle

targets:
  dev:
    presets:
      prefix: "myprefix_"          # prefix all resource names with myprefix_
      pipelines_development: true  # set development to true by default for pipelines
      trigger_pause_status: PAUSED # set pause_status to PAUSED by default for all triggers and schedules
      jobs_max_concurrent_runs: 10 # set max_concurrent_runs to 10 by default for all jobs
      tags:
        dev: true
```
## Tests
* Existing `process_target_mode` tests were adapted to use this new
code
* Unit tests specific for the new mutator
* Unit tests for config loading and merging
* Manual e2e testing
## Changes
Before this change, the fileset library would take a single root path
and list all files in it. To support an allowlist of paths to list (much
like a Git pathspec without patterns; see [pathspec]), this change
introduces an optional argument to `fileset.New` where the caller can
specify paths to list. If not specified, this argument defaults to
listing `.` (i.e. all files in the root).
The motivation for this change is that we wish to expose this pattern in
bundles. Users should be able to specify which paths to synchronize
instead of always only synchronizing the bundle root directory.
[pathspec]:
https://git-scm.com/docs/gitglossary#Documentation/gitglossary.txt-aiddefpathspecapathspec
## Tests
New and existing unit tests.
CLI:
* Add command line autocomplete to the fs commands
([#1622](https://github.com/databricks/cli/pull/1622)).
* Add trailing slash to directory to produce completions for
([#1666](https://github.com/databricks/cli/pull/1666)).
* Fix ability to import the CLI repository as module
([#1671](https://github.com/databricks/cli/pull/1671)).
* Fix host resolution order in `auth login`
([#1370](https://github.com/databricks/cli/pull/1370)).
* Print text logs in `import-dir` and `export-dir` commands
([#1682](https://github.com/databricks/cli/pull/1682)).
Bundles:
* Expand and upload local wheel libraries for all task types
([#1649](https://github.com/databricks/cli/pull/1649)).
* Clarify file format required for the `config-file` flag in `bundle
init` ([#1651](https://github.com/databricks/cli/pull/1651)).
* Fixed incorrectly cleaning up python wheel dist folder
([#1656](https://github.com/databricks/cli/pull/1656)).
* Merge job parameters based on their name
([#1659](https://github.com/databricks/cli/pull/1659)).
* Fix glob expansion after running a generic build command
([#1662](https://github.com/databricks/cli/pull/1662)).
* Upload local libraries even if they don't have artifact defined
([#1664](https://github.com/databricks/cli/pull/1664)).
Internal:
* Fix python wheel task integration tests
([#1648](https://github.com/databricks/cli/pull/1648)).
* Skip pushing Terraform state after destroy
([#1667](https://github.com/databricks/cli/pull/1667)).
* Enable Spark JAR task test
([#1658](https://github.com/databricks/cli/pull/1658)).
* Run Spark JAR task test on multiple DBR versions
([#1665](https://github.com/databricks/cli/pull/1665)).
* Stop tracking file path locations in bundle resources
([#1673](https://github.com/databricks/cli/pull/1673)).
* Update VS Code settings to match latest value from IDE plugin
([#1677](https://github.com/databricks/cli/pull/1677)).
* Use `service.NamedIdMap` to make lookup generation deterministic
([#1678](https://github.com/databricks/cli/pull/1678)).
* [Internal] Remove dependency to the `openapi` package of the Go SDK
([#1676](https://github.com/databricks/cli/pull/1676)).
* Upgrade TF provider to 1.50.0
([#1681](https://github.com/databricks/cli/pull/1681)).
* Upgrade Go SDK to 0.44.0
([#1679](https://github.com/databricks/cli/pull/1679)).
API Changes:
* Changed `databricks account budgets create` command . New request type
is .
* Changed `databricks account budgets create` command to return .
* Changed `databricks account budgets delete` command . New request type
is .
* Changed `databricks account budgets delete` command to return .
* Changed `databricks account budgets get` command . New request type is
.
* Changed `databricks account budgets get` command to return .
* Changed `databricks account budgets list` command to require request
of .
* Changed `databricks account budgets list` command to return .
* Changed `databricks account budgets update` command . New request type
is .
* Changed `databricks account budgets update` command to return .
* Added `databricks account usage-dashboards` command group.
* Changed `databricks model-versions get` command to return .
* Changed `databricks cluster-policies create` command with new required
argument order.
* Changed `databricks cluster-policies edit` command with new required
argument order.
* Added `databricks clusters update` command.
* Added `databricks genie` command group.
* Changed `databricks permission-migration migrate-permissions` command
. New request type is .
* Changed `databricks permission-migration migrate-permissions` command
to return .
* Changed `databricks account workspace-assignment delete` command to
return .
* Changed `databricks account workspace-assignment update` command with
new required argument order.
* Changed `databricks account custom-app-integration create` command
with new required argument order.
* Changed `databricks account custom-app-integration list` command to
require request of .
* Changed `databricks account published-app-integration list` command to
require request of .
* Removed `databricks apps` command group.
* Added `databricks notification-destinations` command group.
* Changed `databricks shares list` command to require request of .
* Changed `databricks alerts create` command . New request type is .
* Changed `databricks alerts delete` command . New request type is .
* Changed `databricks alerts delete` command to return .
* Changed `databricks alerts get` command with new required argument
order.
* Changed `databricks alerts list` command to require request of .
* Changed `databricks alerts list` command to return .
* Changed `databricks alerts update` command . New request type is .
* Changed `databricks alerts update` command to return .
* Changed `databricks queries create` command . New request type is .
* Changed `databricks queries delete` command . New request type is .
* Changed `databricks queries delete` command to return .
* Changed `databricks queries get` command with new required argument
order.
* Changed `databricks queries list` command to return .
* Removed `databricks queries restore` command.
* Changed `databricks queries update` command . New request type is .
* Added `databricks queries list-visualizations` command.
* Changed `databricks query-visualizations create` command . New request
type is .
* Changed `databricks query-visualizations delete` command . New request
type is .
* Changed `databricks query-visualizations delete` command to return .
* Changed `databricks query-visualizations update` command . New request
type is .
* Changed `databricks statement-execution execute-statement` command to
return .
* Changed `databricks statement-execution get-statement` command to
return .
* Added `databricks alerts-legacy` command group.
* Added `databricks queries-legacy` command group.
* Added `databricks query-visualizations-legacy` command group.
OpenAPI commit f98c07f9c71f579de65d2587bb0292f83d10e55d (2024-08-12)
Dependency updates:
* Bump github.com/hashicorp/hc-install from 0.7.0 to 0.8.0
([#1652](https://github.com/databricks/cli/pull/1652)).
* Bump golang.org/x/sync from 0.7.0 to 0.8.0
([#1655](https://github.com/databricks/cli/pull/1655)).
* Bump golang.org/x/mod from 0.19.0 to 0.20.0
([#1654](https://github.com/databricks/cli/pull/1654)).
* Bump golang.org/x/oauth2 from 0.21.0 to 0.22.0
([#1653](https://github.com/databricks/cli/pull/1653)).
* Bump golang.org/x/text from 0.16.0 to 0.17.0
([#1670](https://github.com/databricks/cli/pull/1670)).
* Bump golang.org/x/term from 0.22.0 to 0.23.0
([#1669](https://github.com/databricks/cli/pull/1669)).
## Changes
These two tests failed:
* `TestAccAlertsCreateErrWhenNoArguments` -> switched to the legacy command
for now; the new one does not have a required request body (might be an
OpenAPI spec issue,
https://github.com/databricks/databricks-sdk-go/blob/main/service/sql/model.go#L595),
will follow up later
* `TestAccClustersList` -> increased the channel size because the new
clusters API returns more clusters
## Tests
Tests are green now
## Changes
In https://github.com/databricks/cli/pull/1202 the semantics of
`cmdio.RenderJson` were changed to always render the JSON object. Before,
we would only render it if `--output json` was specified.
This PR fixes the logs to print human-readable log lines instead of a
JSON object.
This PR also removes the now unused `cmdio.Render` method.
## Tests
Manually:
```
➜ bundle-playground git:(master) ✗ cli workspace import-dir ./tmp /Users/shreyas.goenka@databricks.com/test-import-1 -p aws-prod-ucws
Importing files from ./tmp
a -> /Users/shreyas.goenka@databricks.com/test-import-1/a
Import complete. The files are available at /Users/shreyas.goenka@databricks.com/test-import-1
```
```
➜ bundle-playground git:(master) ✗ cli workspace export-dir /Users/shreyas.goenka@databricks.com/test-export-1 ./tmp-2 -p aws-prod-ucws
Exporting files from /Users/shreyas.goenka@databricks.com/test-export-1
/Users/shreyas.goenka@databricks.com/test-export-1/b -> tmp-2/b
Export complete. The files are available at ./tmp-2
```
## Changes
This PR removes the dependency on the `databricks-sdk-go/openapi`
package by copying the structs and functions that are needed into a new
`schema/spec.go` file.
The reason to remove this dependency is that it is being deprecated.
Copying the code into the `cli` repo seems reasonable given that it only
uses a couple of very small structs.
## Tests
Verified that CLI code can be properly generated after this change.
## Changes
This updates the `python.envFile` property from VS Code's settings file
to use the value that is set by the latest version of the IDE plugin.
This change will make it a bit easier for contributors who work on the
CLI code base with the plugin enabled.
## Changes
The `auth login` command today prefers a host URL specified in a profile
over the one explicitly provided by the user as a command-line argument.
This PR fixes this bug and refactors the code to make it more linear and
easier to read. Note that the same issue exists in the `auth token`
command and is fixed here as well.
## Tests
Unit tests, and manual testing.