## Changes
This ensures that the CLI and Terraform can both use an Azure CLI
session configured under a non-standard path. This is the default
behavior on Azure DevOps when using the AzureCLI@2 task.
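The gist of the fix, as a simplified Go sketch (assuming the non-standard
location is conveyed via the `AZURE_CONFIG_DIR` environment variable; the
helper below is illustrative, not the CLI's actual code):
```go
// Illustrative sketch, assuming the non-standard path is carried by the
// AZURE_CONFIG_DIR environment variable; the CLI's actual implementation differs.
package main

import (
	"os"
	"os/exec"
)

// passthroughEnv copies selected variables from the parent process into the
// environment used for the Terraform subprocess.
func passthroughEnv(keys ...string) []string {
	var env []string
	for _, k := range keys {
		if v, ok := os.LookupEnv(k); ok {
			env = append(env, k+"="+v)
		}
	}
	return env
}

func main() {
	cmd := exec.Command("terraform", "version")
	// Include AZURE_CONFIG_DIR so Terraform uses the same Azure CLI session,
	// e.g. the one configured by the AzureCLI@2 task on Azure DevOps.
	cmd.Env = passthroughEnv("PATH", "HOME", "AZURE_CONFIG_DIR")
	cmd.Stdout = os.Stdout
	_ = cmd.Run()
}
```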
Fixes #1722.
## Tests
Unit test.
## Changes
If not explicitly quoted, the YAML loader interprets a value like
`2024-08-29` as a timestamp. Such a value is usually intended to be a
string instead. Our normalization logic was not able to turn a time
value back into the original string.
This change boxes the time value to include its original string
representation. Normalization of one of these values into a string can
now use the original input value.
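A minimal sketch of the boxing idea (hypothetical type names; the actual
implementation lives in `libs/dyn`):
```go
// Hypothetical sketch of boxing a YAML timestamp together with its original
// string representation; names do not match the actual libs/dyn types.
package main

import (
	"fmt"
	"time"
)

type boxedTime struct {
	t   time.Time
	src string // the literal as it appeared in the YAML input
}

func newBoxedTime(src string) (boxedTime, error) {
	t, err := time.Parse(time.DateOnly, src)
	if err != nil {
		return boxedTime{}, err
	}
	return boxedTime{t: t, src: src}, nil
}

// String returns the original input, so normalizing back to a string is lossless.
func (b boxedTime) String() string { return b.src }

func main() {
	b, _ := newBoxedTime("2024-08-29")
	fmt.Println(b.t.Year(), b.String()) // 2024 2024-08-29
}
```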
## Tests
Unit tests in `libs/dyn/convert`.
## Changes
Consider serverless clusters as compatible for Python wheel tasks.
Fixes a `Python wheel tasks require compute with DBR 13.3+ to include
local libraries` warning shown for serverless clusters.
## Changes
* Provide a more helpful error when using an artifact_path based on
/Volumes
* Allow the use of short_names in /Volumes paths
## Example cases
Example of a valid /Volumes artifact_path:
* `artifact_path:
/Volumes/catalog/schema/${workspace.current_user.short_name}/libs`
Example of an invalid /Volumes path (when using `mode: development`):
* `artifact_path: /Volumes/catalog/schema/libs`
* Resulting error: `artifact_path should contain the current username or
${workspace.current_user.short_name} to ensure uniqueness when using
'mode: development'`
## Changes
Fixes an issue introduced in https://github.com/databricks/cli/pull/1699
where PyPI packages were treated as local libraries.
The reason is that `libraryPath` returned an empty string as the path for
PyPI packages and `IsLibraryLocal` treated the empty string as a local
path.
Both of these functions are fixed in this PR.
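A simplified sketch of both fixes (illustrative names and signatures only,
not the CLI's actual code):
```go
// Illustrative sketch only; the real functions live in the CLI's libraries
// package and have different signatures.
package main

import (
	"fmt"
	"strings"
)

// libraryPath returns the local path of a library reference, or an error for
// references (such as PyPI package specs) that have no local path at all.
func libraryPath(ref string) (string, error) {
	if strings.HasPrefix(ref, "pypi:") {
		return "", fmt.Errorf("reference %q has no local path", ref)
	}
	return ref, nil
}

// isLibraryLocal no longer treats an empty path as local.
func isLibraryLocal(path string) bool {
	if path == "" {
		return false
	}
	return !strings.HasPrefix(path, "dbfs:/") && !strings.HasPrefix(path, "/Workspace/")
}

func main() {
	p, err := libraryPath("pypi:some-package")
	fmt.Println(p, err, isLibraryLocal(p)) // "" <error> false
}
```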
## Tests
Added regression test
## Changes
This change makes sure we ignore the CLI version check for development
builds of the CLI.
Before:
```
$ cat databricks.yml | grep cli_version
databricks_cli_version: ">= 0.223.1"
$ cli bundle deploy
Error: Databricks CLI version constraint not satisfied. Required: >= 0.223.1, current: 0.0.0-dev+06b169284737
```
After:
```
...
$ cli bundle deploy
...
Warning: Ignoring Databricks CLI version constraint for development build. Required: >= 0.223.1, current: 0.0.0-dev+d52d6f08fcd5
```
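A sketch of the logic, assuming `github.com/hashicorp/go-version` for
constraint checking (the CLI's actual implementation may differ):
```go
// Sketch only: skip the version constraint for development builds and emit a
// warning instead of an error.
package main

import (
	"fmt"

	"github.com/hashicorp/go-version"
)

func checkCLIVersion(constraint, current string) error {
	v, err := version.NewVersion(current)
	if err != nil {
		return err
	}
	c, err := version.NewConstraint(constraint)
	if err != nil {
		return err
	}
	// Development builds are versioned as 0.0.0-dev+<sha>; warn and continue.
	if v.Prerelease() == "dev" {
		fmt.Printf("Warning: Ignoring Databricks CLI version constraint for development build. Required: %s, current: %s\n", constraint, current)
		return nil
	}
	if !c.Check(v) {
		return fmt.Errorf("Databricks CLI version constraint not satisfied. Required: %s, current: %s", constraint, current)
	}
	return nil
}

func main() {
	_ = checkCLIVersion(">= 0.223.1", "0.0.0-dev+06b169284737")
}
```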
## Tests
## Changes
With hc-install version `0.8.0` there was a regression where debug logs
would be leaked into stderr. Reported upstream in
https://github.com/hashicorp/hc-install/issues/239.
Meanwhile, we need to revert and pin to version `0.7.0`. This PR also
includes a regression test.
## Tests
Regression test.
## Changes
This field allows a user to configure paths to synchronize to the
workspace.
Allowed values are relative paths to files and directories anchored at
the directory where the field is set. If one or more values traverse up
the directory tree (to an ancestor of the bundle root directory), the
CLI will dynamically determine the root path to use to ensure that the
file tree structure remains intact.
For example, given a `databricks.yml` in `my_bundle` that includes:
```yaml
sync:
  paths:
    - ../common
    - .
```
Then upon synchronization, the workspace will look like:
```
.
├── common
│   └── lib.py
└── my_bundle
    ├── databricks.yml
    └── notebook.py
```
If not set, the behavior remains identical.
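A simplified sketch of how such a root could be determined (hypothetical
helper, not the CLI's actual mutator): walk up from the bundle root until
every configured sync path fits under the candidate root.
```go
// Hypothetical helper illustrating the idea; the CLI's actual logic differs.
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// contains reports whether p is located at or below root.
func contains(root, p string) bool {
	rel, err := filepath.Rel(root, p)
	return err == nil && rel != ".." && !strings.HasPrefix(rel, ".."+string(filepath.Separator))
}

// syncRoot walks up from the bundle root until all sync paths fit under it.
func syncRoot(bundleRoot string, paths []string) (string, error) {
	root, err := filepath.Abs(bundleRoot)
	if err != nil {
		return "", err
	}
	var abs []string
	for _, p := range paths {
		abs = append(abs, filepath.Clean(filepath.Join(root, p)))
	}
	for {
		ok := true
		for _, a := range abs {
			if !contains(root, a) {
				ok = false
				break
			}
		}
		if ok {
			return root, nil
		}
		parent := filepath.Dir(root)
		if parent == root {
			return "", fmt.Errorf("no common root for sync paths")
		}
		root = parent
	}
}

func main() {
	root, _ := syncRoot("/home/user/my_bundle", []string{"../common", "."})
	fmt.Println(root) // /home/user
}
```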
## Tests
* Newly added unit tests for the mutators and under `bundle/tests`.
* Manually confirmed a bundle without this configuration works the same.
* Manually confirmed a bundle with this configuration works.
## Changes
Fixes #1701.
## Tests
```
Usage:
  databricks clusters list [flags]

Flags:
      --cluster-sources []string   Filter clusters by source
      --cluster-states []string    Filter clusters by states
  -h, --help                       help for list
      --is-pinned                  Filter clusters by pinned status
      --page-size int              Use this field to specify the maximum number of results to be returned by the server.
      --page-token string          Use next_page_token or prev_page_token returned from the previous request to list the next or previous page of clusters respectively.
      --policy-id string           Filter clusters by policy id
```
## Changes
In https://github.com/databricks/cli/pull/1490 we regressed and started
using the development mode prefix for UC schemas regardless of the mode
of the bundle target.
This PR fixes the regression and adds a regression test.
## Tests
Failing integration tests pass now.
## Changes
While experimenting with DABs, I discovered that requirements libraries
are being ignored.
One thing worth mentioning is that `bundle validate` runs successfully,
but `bundle deploy` fails. This PR only covers the second part.
## Tests
Added a unit test
## Changes
`TestAccFilerWorkspaceFilesExtensionsErrorsOnDupName` recently started
failing in our nightlies because the upstream `import` API was changed
to [prohibit conflicting file
paths](https://docs.databricks.com/en/release-notes/product/2024/august.html#files-can-no-longer-have-identical-names-in-workspace-folders).
Because existing conflicting file paths can still be grandfathered in,
we need to retain coverage for the test. To do this, this PR:
1. Removes the failing
`TestAccFilerWorkspaceFilesExtensionsErrorsOnDupName`
2. Adds an equivalent unit test with the `list` and `get-status` API
calls mocked.
## Changes
Make `pydabs/venv_path` optional. When it is not specified, the CLI
detects the Python interpreter using `python.DetectExecutable`, the same
way as for
`artifacts`. `python.DetectExecutable` works correctly if a virtual
environment is activated or `python3` is available on PATH through other
means.
Extract the venv detection code from PyDABs into `libs/python/detect`.
This code will be used when we implement the `python/venv_path` section
in `databricks.yml`.
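A simplified sketch of the detection approach (not the actual
`libs/python/detect` code): prefer an activated virtual environment, then
fall back to `python3` on PATH.
```go
// Sketch only; the real detection code lives in libs/python.
package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
)

// detectExecutable returns a Python interpreter: the one from an activated
// virtual environment if present, otherwise python3 found on PATH.
func detectExecutable() (string, error) {
	if venv := os.Getenv("VIRTUAL_ENV"); venv != "" {
		candidate := filepath.Join(venv, "bin", "python3")
		if _, err := os.Stat(candidate); err == nil {
			return candidate, nil
		}
	}
	return exec.LookPath("python3")
}

func main() {
	path, err := detectExecutable()
	fmt.Println(path, err)
}
```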
## Tests
Unit tests and manually
---------
Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
## Changes
These tests inadvertently re-ran mutators, the first time through
`loadTarget` and the second time by running `phases.Initialize()`
themselves. Some of the mutators that are executed in
`phases.Initialize()` are also run as part of `loadTarget`. This is
overdue for a refactor to make it unambiguous what runs when. Until then,
this removes the duplicated execution.
## Tests
Unit tests pass.
## Changes
This PR addresses post-merge feedback from
https://github.com/databricks/cli/pull/1673.
## Tests
Unit tests, and manually.
```
Error: experiment undefined-experiment is not defined
  at resources.experiments.undefined-experiment
  in databricks.yml:11:26

Error: job undefined-job is not defined
  at resources.jobs.undefined-job
  in databricks.yml:6:19

Error: pipeline undefined-pipeline is not defined
  at resources.pipelines.undefined-pipeline
  in databricks.yml:14:24

Name: undefined-job
Target: default

Found 3 errors
```
## Changes
This adds configurable transformations based on the transformations
currently seen in `mode: development`.
Example `databricks.yml` showcasing some of these transformations:
```yaml
bundle:
  name: my_bundle

targets:
  dev:
    presets:
      prefix: "myprefix_"          # prefix all resource names with myprefix_
      pipelines_development: true  # set development to true by default for pipelines
      trigger_pause_status: PAUSED # set pause_status to PAUSED by default for all triggers and schedules
      jobs_max_concurrent_runs: 10 # set max_concurrent_runs to 10 by default for all jobs
      tags:
        dev: true
```
## Tests
* Existing process_target_mode tests that were adapted to use this new
code
* Unit tests specific for the new mutator
* Unit tests for config loading and merging
* Manual e2e testing
## Changes
Before this change, the fileset library would take a single root path
and list all files in it. To support an allowlist of paths to list (much
like a Git [pathspec] without patterns), this
change introduces an optional argument to `fileset.New` where the caller
can specify paths to list. If not specified, this argument defaults to
list `.` (i.e. list all files in the root).
The motivation for this change is that we wish to expose this pattern in
bundles. Users should be able to specify which paths to synchronize
instead of always synchronizing only the bundle root directory.
[pathspec]: https://git-scm.com/docs/gitglossary#Documentation/gitglossary.txt-aiddefpathspecapathspec
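A self-contained sketch of the pattern (hypothetical types, not the actual
`fileset` API): an optional allowlist of paths that defaults to listing `.`.
```go
// Sketch of an optional path allowlist defaulting to "."; not the fileset
// package's real API.
package main

import (
	"fmt"
	"io/fs"
	"path/filepath"
)

type FileSet struct {
	root  string
	paths []string
}

// New lists everything under root unless specific paths are given.
func New(root string, paths ...string) *FileSet {
	if len(paths) == 0 {
		paths = []string{"."}
	}
	return &FileSet{root: root, paths: paths}
}

// Files returns all regular files under the configured paths.
func (f *FileSet) Files() ([]string, error) {
	var out []string
	for _, p := range f.paths {
		err := filepath.WalkDir(filepath.Join(f.root, p), func(path string, d fs.DirEntry, err error) error {
			if err != nil || d.IsDir() {
				return err
			}
			out = append(out, path)
			return nil
		})
		if err != nil {
			return nil, err
		}
	}
	return out, nil
}

func main() {
	files, _ := New(".", "libs", "bundle").Files()
	fmt.Println(len(files))
}
```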
## Tests
New and existing unit tests.
CLI:
* Add command line autocomplete to the fs commands
([#1622](https://github.com/databricks/cli/pull/1622)).
* Add trailing slash to directory to produce completions for
([#1666](https://github.com/databricks/cli/pull/1666)).
* Fix ability to import the CLI repository as module
([#1671](https://github.com/databricks/cli/pull/1671)).
* Fix host resolution order in `auth login`
([#1370](https://github.com/databricks/cli/pull/1370)).
* Print text logs in `import-dir` and `export-dir` commands
([#1682](https://github.com/databricks/cli/pull/1682)).
Bundles:
* Expand and upload local wheel libraries for all task types
([#1649](https://github.com/databricks/cli/pull/1649)).
* Clarify file format required for the `config-file` flag in `bundle
init` ([#1651](https://github.com/databricks/cli/pull/1651)).
* Fixed incorrectly cleaning up python wheel dist folder
([#1656](https://github.com/databricks/cli/pull/1656)).
* Merge job parameters based on their name
([#1659](https://github.com/databricks/cli/pull/1659)).
* Fix glob expansion after running a generic build command
([#1662](https://github.com/databricks/cli/pull/1662)).
* Upload local libraries even if they don't have artifact defined
([#1664](https://github.com/databricks/cli/pull/1664)).
Internal:
* Fix python wheel task integration tests
([#1648](https://github.com/databricks/cli/pull/1648)).
* Skip pushing Terraform state after destroy
([#1667](https://github.com/databricks/cli/pull/1667)).
* Enable Spark JAR task test
([#1658](https://github.com/databricks/cli/pull/1658)).
* Run Spark JAR task test on multiple DBR versions
([#1665](https://github.com/databricks/cli/pull/1665)).
* Stop tracking file path locations in bundle resources
([#1673](https://github.com/databricks/cli/pull/1673)).
* Update VS Code settings to match latest value from IDE plugin
([#1677](https://github.com/databricks/cli/pull/1677)).
* Use `service.NamedIdMap` to make lookup generation deterministic
([#1678](https://github.com/databricks/cli/pull/1678)).
* [Internal] Remove dependency to the `openapi` package of the Go SDK
([#1676](https://github.com/databricks/cli/pull/1676)).
* Upgrade TF provider to 1.50.0
([#1681](https://github.com/databricks/cli/pull/1681)).
* Upgrade Go SDK to 0.44.0
([#1679](https://github.com/databricks/cli/pull/1679)).
API Changes:
* Changed `databricks account budgets create` command . New request type
is .
* Changed `databricks account budgets create` command to return .
* Changed `databricks account budgets delete` command . New request type
is .
* Changed `databricks account budgets delete` command to return .
* Changed `databricks account budgets get` command . New request type is
.
* Changed `databricks account budgets get` command to return .
* Changed `databricks account budgets list` command to require request
of .
* Changed `databricks account budgets list` command to return .
* Changed `databricks account budgets update` command . New request type
is .
* Changed `databricks account budgets update` command to return .
* Added `databricks account usage-dashboards` command group.
* Changed `databricks model-versions get` command to return .
* Changed `databricks cluster-policies create` command with new required
argument order.
* Changed `databricks cluster-policies edit` command with new required
argument order.
* Added `databricks clusters update` command.
* Added `databricks genie` command group.
* Changed `databricks permission-migration migrate-permissions` command
. New request type is .
* Changed `databricks permission-migration migrate-permissions` command
to return .
* Changed `databricks account workspace-assignment delete` command to
return .
* Changed `databricks account workspace-assignment update` command with
new required argument order.
* Changed `databricks account custom-app-integration create` command
with new required argument order.
* Changed `databricks account custom-app-integration list` command to
require request of .
* Changed `databricks account published-app-integration list` command to
require request of .
* Removed `databricks apps` command group.
* Added `databricks notification-destinations` command group.
* Changed `databricks shares list` command to require request of .
* Changed `databricks alerts create` command . New request type is .
* Changed `databricks alerts delete` command . New request type is .
* Changed `databricks alerts delete` command to return .
* Changed `databricks alerts get` command with new required argument
order.
* Changed `databricks alerts list` command to require request of .
* Changed `databricks alerts list` command to return .
* Changed `databricks alerts update` command . New request type is .
* Changed `databricks alerts update` command to return .
* Changed `databricks queries create` command . New request type is .
* Changed `databricks queries delete` command . New request type is .
* Changed `databricks queries delete` command to return .
* Changed `databricks queries get` command with new required argument
order.
* Changed `databricks queries list` command to return .
* Removed `databricks queries restore` command.
* Changed `databricks queries update` command . New request type is .
* Added `databricks queries list-visualizations` command.
* Changed `databricks query-visualizations create` command . New request
type is .
* Changed `databricks query-visualizations delete` command . New request
type is .
* Changed `databricks query-visualizations delete` command to return .
* Changed `databricks query-visualizations update` command . New request
type is .
* Changed `databricks statement-execution execute-statement` command to
return .
* Changed `databricks statement-execution get-statement` command to
return .
* Added `databricks alerts-legacy` command group.
* Added `databricks queries-legacy` command group.
* Added `databricks query-visualizations-legacy` command group.
OpenAPI commit f98c07f9c71f579de65d2587bb0292f83d10e55d (2024-08-12)
Dependency updates:
* Bump github.com/hashicorp/hc-install from 0.7.0 to 0.8.0
([#1652](https://github.com/databricks/cli/pull/1652)).
* Bump golang.org/x/sync from 0.7.0 to 0.8.0
([#1655](https://github.com/databricks/cli/pull/1655)).
* Bump golang.org/x/mod from 0.19.0 to 0.20.0
([#1654](https://github.com/databricks/cli/pull/1654)).
* Bump golang.org/x/oauth2 from 0.21.0 to 0.22.0
([#1653](https://github.com/databricks/cli/pull/1653)).
* Bump golang.org/x/text from 0.16.0 to 0.17.0
([#1670](https://github.com/databricks/cli/pull/1670)).
* Bump golang.org/x/term from 0.22.0 to 0.23.0
([#1669](https://github.com/databricks/cli/pull/1669)).
## Changes
These two tests failed:
* `TestAccAlertsCreateErrWhenNoArguments`: switched to the legacy command
for now; the new command does not have a required request body (might be
an OpenAPI spec issue, see
https://github.com/databricks/databricks-sdk-go/blob/main/service/sql/model.go#L595).
Will follow up later.
* `TestAccClustersList`: increased the channel size because the new
clusters API returns more clusters.
## Tests
Tests are green now
## Changes
In https://github.com/databricks/cli/pull/1202 the semantics of
`cmdio.RenderJson` were changed to always render the JSON object. Before,
we would only render it if `--output json` was specified.
This PR fixes the logs to print human-readable log lines instead of a
JSON object.
This PR also removes the now unused `cmdio.Render` method.
## Tests
Manually:
```
➜ bundle-playground git:(master) ✗ cli workspace import-dir ./tmp /Users/shreyas.goenka@databricks.com/test-import-1 -p aws-prod-ucws
Importing files from ./tmp
a -> /Users/shreyas.goenka@databricks.com/test-import-1/a
Import complete. The files are available at /Users/shreyas.goenka@databricks.com/test-import-1
```
```
➜ bundle-playground git:(master) ✗ cli workspace export-dir /Users/shreyas.goenka@databricks.com/test-export-1 ./tmp-2 -p aws-prod-ucws
Exporting files from /Users/shreyas.goenka@databricks.com/test-export-1
/Users/shreyas.goenka@databricks.com/test-export-1/b -> tmp-2/b
Exported complete. The files are available at ./tmp-2
```
## Changes
This PR removes the dependency to the `databricks-sdk-go/openapi`
package by copying the struct and functions that are needed in a new
`schema/spec.go` file.
The reason to remove this dependency is that it is being deprecated.
Copying the code in the `cli` repo seems reasonable given that it only
uses a couple of very small structs.
## Tests
Verified that CLI code can be properly generated after this change.
## Changes
This updates the `python.envFile` property from VS Code's settings file
to use the value that is set by the latest version of the IDE plugin.
This change will make it a bit easier for contributors who work on the
CLI code base with the plugin enabled.
## Changes
The `auth login` command today prefers a host URL specified in a profile
over the one explicitly provided by the user as a command-line
argument.
This PR fixes this bug and refactors the code to make it more linear and
easy to read. Note that the same issue exists in the `auth token`
command and is fixed here as well.
## Tests
Unit tests, and manual testing.
## Changes
Previously, DABs required a corresponding artifact section for every
library referenced in the configuration.
This is neither necessary nor flexible, because local libraries might be
built outside of the DABs context.
It also created hard-to-follow logic in the code, where libraries were
back-referenced to artifacts.
This PR does three things:
1. Allows all local libraries referenced in the DABs config to be
uploaded to the remote workspace.
2. Simplifies the upload and glob reference expansion logic by doing it
in a single place.
3. Speeds things up by uploading each library only once and doing so in
parallel.
## Tests
Added unit and integration tests, and made sure the change is backward
compatible (no changes to existing tests).
---------
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
## Changes
Since locations are already tracked in the dynamic value tree, we no
longer need to track them at the resource/artifact level. This PR:
1. Removes use of `paths.Paths`. Uses dyn.Location instead.
2. Refactors the validation of resources not being empty valued to be
generic across all resource types.
## Tests
Existing unit tests.
## Changes
While investigating #1629, I found that Go doesn't allow module file
paths to contain characters outside the set documented at
https://pkg.go.dev/golang.org/x/mod/module#CheckFilePath.
To fix this, I changed the relevant test case to create the fixtures it
needs instead of loading them from the `testdata` directory (in
`renderer_test.go`).
Some test cases in `config_test.go` depended on templated paths without
needing to do so. In the process of fixing this, I refactored these
tests slightly to reduce dependencies between them.
This change also adds a test case to ensure that all files in the
repository are allowed to be part of a module (per the earlier
`CheckFilePath` function).
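For reference, a minimal sketch of such a check using `module.CheckFilePath`
from `golang.org/x/mod` (not the repository's actual test):
```go
// Minimal sketch: walk the working tree and verify every path is acceptable in
// a Go module, assuming the golang.org/x/mod dependency is available.
package main

import (
	"fmt"
	"io/fs"
	"path/filepath"

	"golang.org/x/mod/module"
)

func main() {
	err := filepath.WalkDir(".", func(path string, d fs.DirEntry, err error) error {
		if err != nil || d.IsDir() {
			return err
		}
		// CheckFilePath rejects characters that cannot appear in module file paths.
		if err := module.CheckFilePath(filepath.ToSlash(path)); err != nil {
			return fmt.Errorf("invalid file path %q: %w", path, err)
		}
		return nil
	})
	if err != nil {
		fmt.Println(err)
	}
}
```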
Fixes #1629.
## Tests
I manually confirmed I could import the repository as a Go module.
Bumps [golang.org/x/term](https://github.com/golang/term) from 0.22.0 to
0.23.0.
Commits:
* `d598954`: go.mod: update golang.org/x dependencies
* `d4346f0`: LICENSE: update per Google Legal
* See the full diff in the [compare
view](https://github.com/golang/term/compare/v0.22.0...v0.23.0).
Bumps [golang.org/x/text](https://github.com/golang/text) from 0.16.0 to
0.17.0.
Commits:
* `b2bec85`: go.mod: update golang.org/x dependencies
* `ae0cf96`: LICENSE: update per Google Legal
* See the full diff in the [compare
view](https://github.com/golang/text/compare/v0.16.0...v0.17.0).
## Changes
The integration test for `fs ls` tried to produce completions for the
temporary directory without a trailing slash. The command will list the
parent directory to see if there are more directories with the same
prefix that are also valid completions. The test intends to test
completions for files inside the temporary directory, so we need that
trailing slash.
This test was hanging because the parent directory of the temporary DBFS
directory has more than 1k entries (on our integration testing
workspaces) and the line buffer in the test runner has a capacity of 1k
lines. To avoid the same hang in the future, this change modifies the
test runner to panic if the line buffer is full.
## Tests
Confirmed the integration test no longer hangs.
## Changes
This PR adds autocomplete for cat, cp, ls, mkdir and rm.
The new completer can do completion for any `Filer`. The command
completion for the `sync` command can be moved to use this general
completer as a follow-up.
## Tests
- Tested manually against a workspace
- Unit tests
## Changes
This didn't work as expected because the generic build mutator called
into the type-specific build mutator in the middle of the function. This
invalidated the `config.Artifact` pointer that was being mutated later
on, effectively hiding these mutations from its caller.
To fix this, I turned glob expansion into its own mutator. It now works
as expected, _and_ produces better errors if the glob patterns are
invalid or do not match files.
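A simplified illustration of the original pitfall (hypothetical types, not
the CLI's actual mutators):
```go
// The callee replaces the map entry, so later writes through the stale pointer
// are invisible to anyone reading the config afterwards.
package main

import "fmt"

type Artifact struct{ Files []string }

type Config struct{ Artifacts map[string]*Artifact }

// typeSpecificBuild stands in for the type-specific mutator that replaces the entry.
func typeSpecificBuild(cfg *Config, name string) {
	cfg.Artifacts[name] = &Artifact{} // replaces the pointer held by the caller
}

func genericBuild(cfg *Config, name string) {
	a := cfg.Artifacts[name]
	typeSpecificBuild(cfg, name)              // invalidates `a`
	a.Files = append(a.Files, "dist/lib.whl") // mutation lost: `a` is no longer in the config
}

func main() {
	cfg := &Config{Artifacts: map[string]*Artifact{"sbt_example": {}}}
	genericBuild(cfg, "sbt_example")
	fmt.Println(cfg.Artifacts["sbt_example"].Files) // prints [] rather than [dist/lib.whl]
}
```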
## Tests
Unit tests.
Manual verification:
```
% databricks bundle deploy
Building sbt_example...
Error: target/scala-2.12/sbt-e[xam22ple*.jar: syntax error in pattern
  at artifacts.sbt_example.files[1].source
  in databricks.yml:15:17
```
## Changes
This change enables overriding the default value of job parameters in
target overrides.
This is the same approach we already take for job clusters and job
tasks.
Closes #1620.
## Tests
Mutator unit tests and lightweight end-to-end tests.