Commit Graph

970 Commits

Author SHA1 Message Date
dependabot[bot] e6bc4c9876
Bump github.com/hashicorp/hc-install from 0.6.1 to 0.6.2 (#1054)
Bumps
[github.com/hashicorp/hc-install](https://github.com/hashicorp/hc-install)
from 0.6.1 to 0.6.2.
Release notes, sourced from
[github.com/hashicorp/hc-install's releases](https://github.com/hashicorp/hc-install/releases):

## v0.6.2

## What's Changed
* build(deps): bump actions/checkout from 4.1.0 to 4.1.1 by @dependabot in [hashicorp/hc-install#162](https://redirect.github.com/hashicorp/hc-install/pull/162)
* build(deps): bump github.com/go-git/go-git/v5 from 5.10.0 to 5.10.1 by @dependabot in [hashicorp/hc-install#168](https://redirect.github.com/hashicorp/hc-install/pull/168)
* build(deps): bump github.com/go-git/go-git/v5 from 5.9.0 to 5.10.0 by @dependabot in [hashicorp/hc-install#163](https://redirect.github.com/hashicorp/hc-install/pull/163)
* build(deps): bump github.com/google/go-cmp from 0.5.9 to 0.6.0 by @dependabot in [hashicorp/hc-install#160](https://redirect.github.com/hashicorp/hc-install/pull/160)
* build(deps): bump golang.org/x/mod from 0.13.0 to 0.14.0 by @dependabot in [hashicorp/hc-install#165](https://redirect.github.com/hashicorp/hc-install/pull/165)
* build(deps): bump golang.org/x/net from 0.15.0 to 0.17.0 by @dependabot in [hashicorp/hc-install#161](https://redirect.github.com/hashicorp/hc-install/pull/161)
* build(deps): Bump workflows to latest trusted versions by @hashicorp-tsccr in [hashicorp/hc-install#167](https://redirect.github.com/hashicorp/hc-install/pull/167)
* ci: Automate releasing via bob by @radeksimko in [hashicorp/hc-install#99](https://redirect.github.com/hashicorp/hc-install/pull/99)
* github: Replace dependabot with internal tooling by @radeksimko in [hashicorp/hc-install#166](https://redirect.github.com/hashicorp/hc-install/pull/166)
* go: bump version to 1.21.3 by @radeksimko in [hashicorp/hc-install#164](https://redirect.github.com/hashicorp/hc-install/pull/164)
* go: bump version to 1.21.4 by @radeksimko in [hashicorp/hc-install#169](https://redirect.github.com/hashicorp/hc-install/pull/169)

## New Contributors
* @hashicorp-tsccr made their first contribution in [hashicorp/hc-install#167](https://redirect.github.com/hashicorp/hc-install/pull/167)

**Full Changelog**: https://github.com/hashicorp/hc-install/compare/v0.6.1...v0.6.2

Commits:
* `b00cdaf` Set VERSION to 0.6.2
* `9bbc98c` ci: Add release workflow (#99)
* `b22ec09` go: bump version to 1.21.4 (#169)
* `fd6075b` build(deps): bump github.com/go-git/go-git/v5 from 5.10.0 to 5.10.1 (#168)
* `9de7b57` Result of tsccr-helper -log-level=info -pin-all-workflows . (#167)
* `1626fa4` github: Disable dependabot for GHA (#166)
* `0ee87ea` build(deps): bump golang.org/x/mod from 0.13.0 to 0.14.0 (#165)
* `ed6709c` go: bump version to 1.21.3 (#164)
* `0e6b3da` build(deps): bump actions/checkout from 4.1.0 to 4.1.1 (#162)
* `cb4ec80` build(deps): bump github.com/go-git/go-git/v5 from 5.9.0 to 5.10.0 (#163)
* Additional commits viewable in the [compare view](https://github.com/hashicorp/hc-install/compare/v0.6.1...v0.6.2)


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/hashicorp/hc-install&package-manager=go_modules&previous-version=0.6.1&new-version=0.6.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.


---

**Dependabot commands and options**

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2023-12-13 10:12:50 +00:00
Pieter Noordhuis 37671d9f54
Fix passthrough of pipeline notifications (#1058)
## Changes

Notifications weren't passed along because of a plural vs singular
mismatch.

## Tests

* Added unit test coverage.
* Manually confirmed it now works in an example bundle.
2023-12-12 11:36:06 +00:00
shreyas-goenka b479a7cf67
Upgrade Terraform schema version to v1.31.1 (#1055)
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-12-11 21:53:21 +00:00
Lennart Kats (databricks) 8b9930a49a
Improve default template (#1046)
## Changes
- Tweak strings, documentation in template
- Extend requirements-dev.txt with setuptools/wheel for building whl
files
- Clarify what the "_job.yml" file is for for users who are only
interested in DLT pipelines (answering a question that came up recently)

## Tests
Existing tests exercise this template
2023-12-11 19:13:14 +00:00
Serge Smertin 42c06267eb
Stub out Python virtual environment installation for `labs` commands (#1057)
This PR removes 15 seconds from the `make test` runtime.
2023-12-11 16:30:19 +00:00
Andrew Nester 2d829678a0
Release v0.210.2 (#1044)
CLI:
* Add documentation for positional args in commands generated from the
Databricks OpenAPI specification
([#1033](https://github.com/databricks/cli/pull/1033)).
* Ask for host when .databrickscfg doesn't exist
([#1041](https://github.com/databricks/cli/pull/1041)).
* Add list of supported values for flags that represent an enum field
([#1036](https://github.com/databricks/cli/pull/1036)).

Bundles:
* Fix panic when bundle auth resolution fails
([#1002](https://github.com/databricks/cli/pull/1002)).
* Add versioning for bundle templates
([#972](https://github.com/databricks/cli/pull/972)).
* Add support for conditional prompting in bundle init
([#971](https://github.com/databricks/cli/pull/971)).
* Pass parameters to task when run with `--python-params` and
`python_wheel_wrapper` is true
([#1037](https://github.com/databricks/cli/pull/1037)).
* Change default_python template to auto-update version on each wheel
build ([#1034](https://github.com/databricks/cli/pull/1034)).

Internal:
* Rewrite the friendly log handler
([#1038](https://github.com/databricks/cli/pull/1038)).
* Move bundle schema update to an internal module
([#1012](https://github.com/databricks/cli/pull/1012)).


Dependency updates:
* Bump github.com/databricks/databricks-sdk-go from 0.26.0 to 0.26.1
([#1040](https://github.com/databricks/cli/pull/1040)).
2023-12-06 14:37:38 +00:00
shreyas-goenka 6002f49c87
Move bundle schema update to an internal module (#1012)
## Changes

This PR:
1. Move code to load bundle JSON Schema descriptions from the OpenAPI
spec to an internal Go module
2. Remove command line flags from the `bundle schema` command. These
flags were meant for internal processes and at no point were meant for
customer use.
3. Regenerate `bundle_descriptions.json`
4. Add support for `bundle: "deprecated"`. The `environments` field is
tagged as deprecated in this PR and consequently will no longer be a
part of the bundle schema.

## Tests
Tested by regenerating the CLI against its current OpenAPI spec (as
defined in `__openapi_sha`). The `bundle_descriptions.json` in this PR
was generated from the code generator.

Manually checked that the autocompletion / descriptions from the new
bundle schema are correct.
2023-12-06 10:45:18 +00:00
shreyas-goenka a6752a5388
Add list of supported values for flags that represent an enum field (#1036)
## Changes
This PR adds the list of supported values to the documentation of flags
that represent an enum field.
2023-12-06 08:12:17 +00:00
dependabot[bot] e9ed828119
Bump github.com/databricks/databricks-sdk-go from 0.26.0 to 0.26.1 (#1040)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.26.0 to 0.26.1.
Changelog, sourced from
[github.com/databricks/databricks-sdk-go's changelog](https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md):

## 0.26.1

Minor changes:
* Support overriding DatabricksEnvironment ([#723](https://redirect.github.com/databricks/databricks-sdk-go/pull/723)).
* Detect `Accept` header in `httpclient.WithResponseUnmarshal` ([#710](https://redirect.github.com/databricks/databricks-sdk-go/pull/710)).
* Detect `Content-Type` header in `newRequestBody` for `httpclient` ([#711](https://redirect.github.com/databricks/databricks-sdk-go/pull/711)).

Bug fixes:
* Retry request on `REQUEST_LIMIT_EXCEEDED` error returned by the SCIM API ([#721](https://redirect.github.com/databricks/databricks-sdk-go/pull/721)).
* Match retry logic of pre-refactor SDK ([#722](https://redirect.github.com/databricks/databricks-sdk-go/pull/722)).

Commits:
* `e86cbfd` Release v0.26.1 (#725)
* `89952ab` Wrap url.Error in APIError (#722)
* `6f60032` Detect `Content-Type` header in `newRequestBody` for `httpclient` (#711)
* `9527c7e` Detect `Accept` header in `httpclient.WithResponseUnmarshal` (#710)
* `6fd1ca7` Support overriding DatabricksEnvironment (#723)
* `379b6e9` Retry request on `REQUEST_LIMIT_EXCEEDED` error returned by the SCIM API (#721)
* See full diff in the [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.26.0...v0.26.1)


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.26.0&new-version=0.26.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-05 10:36:14 +00:00
Fabian Jakobs 66e923261d
Ask for host when .databrickscfg doesn't exist (#1041)
## Changes
Ask for host when .databrickscfg doesn't exist

This fixes a regression introduced by
https://github.com/databricks/cli/pull/1003
2023-12-04 15:40:52 +00:00
Andrew Nester cdf29da27b
Change default_python template to auto-update version on each wheel build (#1034)
## Changes
Change default_python template to auto-update version on each wheel
build
2023-12-01 13:24:55 +00:00
Pieter Noordhuis 60a8abdcd7
Rewrite the friendly log handler (#1038)
## Changes

It wasn't working because it deferred to the regular `slog.TextHandler`
for the `WithAttrs` and `WithGroup` functions. Neither of these functions
mutates the handler; both return a new one. When the top-level logger
called one of these, log records in that context used the standard
handler instead of ours.

To implement tracking of attributes and groups, I followed the guide at
https://github.com/golang/example/blob/master/slog-handler-guide/README.md
for writing custom handlers.

## Tests

The new tests demonstrate formatting through `t.Log` and look good.
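For readers unfamiliar with the `log/slog` handler contract, here is a minimal sketch (not the CLI's actual handler) of a custom handler that tracks its own attributes and groups. The important detail from this fix is that `WithAttrs` and `WithGroup` return a new handler value rather than mutating the receiver; group handling is simplified here for brevity.

```
package main

import (
	"context"
	"fmt"
	"log/slog"
	"os"
	"strings"
)

// friendlyHandler tracks attributes and groups itself instead of delegating
// to slog.TextHandler. Group semantics are simplified: all attrs get the
// group prefix, which is close enough for illustration.
type friendlyHandler struct {
	attrs  []slog.Attr
	groups []string
}

func (h *friendlyHandler) Enabled(_ context.Context, level slog.Level) bool {
	return level >= slog.LevelInfo
}

func (h *friendlyHandler) Handle(_ context.Context, r slog.Record) error {
	var sb strings.Builder
	fmt.Fprintf(&sb, "%s %s", r.Level, r.Message)
	prefix := strings.Join(h.groups, ".")
	emit := func(a slog.Attr) {
		key := a.Key
		if prefix != "" {
			key = prefix + "." + key
		}
		fmt.Fprintf(&sb, " %s=%s", key, a.Value.String())
	}
	for _, a := range h.attrs {
		emit(a)
	}
	r.Attrs(func(a slog.Attr) bool { emit(a); return true })
	_, err := fmt.Fprintln(os.Stderr, sb.String())
	return err
}

// WithAttrs returns a copy with the extra attributes; it does not mutate h.
func (h *friendlyHandler) WithAttrs(attrs []slog.Attr) slog.Handler {
	nh := *h
	nh.attrs = append(append([]slog.Attr{}, h.attrs...), attrs...)
	return &nh
}

// WithGroup returns a copy with the extra group; it does not mutate h.
func (h *friendlyHandler) WithGroup(name string) slog.Handler {
	nh := *h
	nh.groups = append(append([]string{}, h.groups...), name)
	return &nh
}

func main() {
	logger := slog.New(&friendlyHandler{})
	logger.With("component", "bundle").Info("deploying", "target", "dev")
}
```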
2023-12-01 12:17:04 +00:00
Andrew Nester 83d50001fc
Pass parameters to task when run with `--python-params` and `python_wheel_wrapper` is true (#1037)
## Changes
It makes the behaviour consistent, with or without `python_wheel_wrapper`
set, when a job is run with the `--python-params` flag.

In `python_wheel_wrapper` mode, it converts the dynamic `python_params`
into a specially named notebook parameter; the wrapper reads it with
`dbutils` and passes the values on to `sys.argv`.

Fixes #1000

## Tests
Added an integration test.

Integration tests pass.
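As a rough illustration of the conversion described above, a sketch in Go; the parameter key `__python_params` below is a hypothetical placeholder, not the CLI's actual field name.

```
package main

import (
	"encoding/json"
	"fmt"
)

// Hypothetical key used only for illustration; the real CLI uses its own name.
const wrapperParamsKey = "__python_params"

// pythonParamsToNotebookParams encodes the --python-params values as JSON and
// stores them under a single notebook parameter, so the wheel wrapper notebook
// can decode them with dbutils and forward them to sys.argv.
func pythonParamsToNotebookParams(pythonParams []string) (map[string]string, error) {
	b, err := json.Marshal(pythonParams)
	if err != nil {
		return nil, err
	}
	return map[string]string{wrapperParamsKey: string(b)}, nil
}

func main() {
	np, err := pythonParamsToNotebookParams([]string{"--env", "dev", "--dry-run"})
	if err != nil {
		panic(err)
	}
	fmt.Println(np) // map[__python_params:["--env","dev","--dry-run"]]
}
```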
2023-12-01 10:35:20 +00:00
shreyas-goenka 76840176e3
Add documentation for positional args in commands generated from the Databricks OpenAPI specification (#1033)
## Changes
This PR adds documentation for positional arguments in commands that are
generated from the openapi spec.

Note: the changes to `.gitattributes` will be reverted / properly fixed in
https://github.com/databricks/cli/pull/1012
2023-11-30 16:22:23 +00:00
shreyas-goenka bdef0f7b23
Add support for conditional prompting in bundle init (#971)
## Changes
This PR introduces the `skip_prompt_if` extension to the jsonschema
library. If the inputs provided by the user match the JSON schema then
the prompt for that property is skipped.

Right now only constant checks are supported, but if more complicated
conditionals are required in the future, this can be extended to support
`allOf`, `oneOf`, `anyOf` etc., allowing template authors to specify
conditionals of arbitrary complexity.

## Tests
Unit tests and manually.
2023-11-30 16:07:45 +00:00
shreyas-goenka 1f1ed6db53
Add versioning for bundle templates (#972)
## Changes
This PR adds versioning for bundle templates. Right now there's only
logic for the maximum version of templates supported. At some point in
the future if we make a breaking template change we can also include a
minimum version of template supported by the CLI.

## Tests
Unit tests.
2023-11-30 14:28:51 +00:00
shreyas-goenka 677926b78b
Fix panic when bundle auth resolution fails (#1002)
## Changes
The CLI would panic if invalid bundle auth was set up when running CLI
commands. This PR removes the panic and shows the error message directly
instead.

## Tests
The CWD is a bundle with:
```
workspace:
  profile: DEFAULT
```

Before:
```
shreyas.goenka@THW32HFW6T bundle-playground % cli clusters list
panic: resolve: /Users/shreyas.goenka/.databrickscfg has no DEFAULT profile configured. Config: profile=DEFAULT

goroutine 1 [running]:
```

After:
```
shreyas.goenka@THW32HFW6T bundle-playground % cli clusters list
Error: cannot resolve bundle auth configuration: resolve: /Users/shreyas.goenka/.databrickscfg has no DEFAULT profile configured. Config: profile=DEFAULT
```

```
shreyas.goenka@THW32HFW6T bundle-playground % DATABRICKS_CONFIG_FILE=/dev/null cli bundle deploy
Error:  cannot resolve bundle auth configuration: resolve: /dev/null has no DEFAULT profile configured. Config: profile=DEFAULT, config_file=/dev/null. Env: DATABRICKS_CONFIG_FILE
```
2023-11-30 14:28:01 +00:00
Pieter Noordhuis 1a1f1b1b4d
Release v0.210.1 (#1032)
This is a bugfix release to address issues with v0.210.0.

CLI:
* Fix `panic: $HOME is not set`
([#1027](https://github.com/databricks/cli/pull/1027)).
* Fix `databricks configure` if new profile is specified
([#1030](https://github.com/databricks/cli/pull/1030)).
* Filter out system clusters for `--configure-cluster`
([#1031](https://github.com/databricks/cli/pull/1031)).

Bundles:
* Fixed panic when job has trigger and in development mode
([#1026](https://github.com/databricks/cli/pull/1026)).

Internal:
* Use `fetch-tags` option in release workflows
([#1025](https://github.com/databricks/cli/pull/1025)).
2023-11-30 10:49:22 +00:00
Pieter Noordhuis 10c9eca06f
Filter out system clusters for `--configure-cluster` (#1031)
## Changes

Only clusters with their source attribute equal to `UI` or `API` should
be presented in the dropdown.

## Tests

Unit test and manual confirmation.
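A minimal sketch of that filtering rule, using a simplified cluster shape rather than the SDK's actual types:

```
package main

import "fmt"

// Simplified cluster shape for illustration only.
type clusterInfo struct {
	Name   string
	Source string // e.g. "UI", "API", "JOB", "PIPELINE"
}

// selectableClusters keeps only clusters a user created via the UI or API;
// job/pipeline-managed ("system") clusters are dropped from the dropdown.
func selectableClusters(all []clusterInfo) []clusterInfo {
	var out []clusterInfo
	for _, c := range all {
		if c.Source == "UI" || c.Source == "API" {
			out = append(out, c)
		}
	}
	return out
}

func main() {
	clusters := []clusterInfo{
		{Name: "shared-dev", Source: "UI"},
		{Name: "job-123-run", Source: "JOB"},
		{Name: "api-created", Source: "API"},
	}
	fmt.Println(selectableClusters(clusters))
}
```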
2023-11-30 09:59:11 +00:00
Pieter Noordhuis 4a228e6f12
Fix `databricks configure` if new profile is specified (#1030)
## Changes

The code included the to-be-created profile in the configuration and
that triggered the SDK to try and load it. Instead, we must use the
specified host and token directly.

## Tests

Manually. More integration test coverage tbd.
2023-11-30 09:51:52 +00:00
Serge Smertin 65458cbde6
Fix `panic: $HOME is not set` (#1027)
This PR adds an error return to `env.UserHomeDir(ctx)`.

Fixes https://github.com/databricks/setup-cli/issues/73
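The shape of the change, sketched with only the standard library (the real fix is in the CLI's `env` package):

```
package main

import (
	"fmt"
	"os"
)

// userHomeDir surfaces the lookup failure as an error instead of panicking
// when $HOME (or the platform equivalent) is not set.
func userHomeDir() (string, error) {
	home, err := os.UserHomeDir()
	if err != nil {
		return "", fmt.Errorf("cannot determine home directory: %w", err)
	}
	return home, nil
}

func main() {
	home, err := userHomeDir()
	if err != nil {
		fmt.Fprintln(os.Stderr, "Error:", err)
		os.Exit(1)
	}
	fmt.Println("home:", home)
}
```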

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-11-29 19:08:27 +00:00
Andrew Nester 4d8d825746
Fixed panic when job has trigger and in development mode (#1026)
## Changes
Fixed a panic that occurred when a job has a trigger and the bundle is in
development mode.
2023-11-29 16:32:42 +00:00
Pieter Noordhuis f2969e91bd
Use `fetch-tags` option in release workflows (#1025)
## Changes

The manual unshallow step is superfluous and can be done as part of the
`actions/checkout` step.

Companion to #1022.

## Tests

Manual trigger of the snapshot build workflow.
2023-11-29 15:24:01 +00:00
Pieter Noordhuis 09991da534
Release v0.210.0 (#1023)
This release includes the new `databricks labs` command to install,
manage, and run Databricks Labs projects.

CLI:
* Add `--debug` as shortcut for `--log-level debug`
([#964](https://github.com/databricks/cli/pull/964)).
* Improved usability of `databricks auth login ... --configure-cluster`
([#956](https://github.com/databricks/cli/pull/956)).
* Make `databricks configure` save only explicit fields
([#973](https://github.com/databricks/cli/pull/973)).
* Add `databricks labs` command group
([#914](https://github.com/databricks/cli/pull/914)).
* Tolerate missing .databrickscfg file during `databricks auth login`
([#1003](https://github.com/databricks/cli/pull/1003)).
* Add `--configure-cluster` flag to configure command
([#1005](https://github.com/databricks/cli/pull/1005)).
* Fix bug where the account or workspace client could be `nil`
([#1020](https://github.com/databricks/cli/pull/1020)).

Bundles:
* Do not allow empty descriptions for bundle template inputs
([#967](https://github.com/databricks/cli/pull/967)).
* Added support for top-level permissions
([#928](https://github.com/databricks/cli/pull/928)).
* Allow jobs to be manually unpaused in development mode
([#885](https://github.com/databricks/cli/pull/885)).
* Fix template initialization from current working directory
([#976](https://github.com/databricks/cli/pull/976)).
* Add `--tag` and `--branch` options to bundle init command
([#975](https://github.com/databricks/cli/pull/975)).
* Work around DLT issue with `$PYTHONPATH` not being set correctly
([#999](https://github.com/databricks/cli/pull/999)).
* Enable `spark_jar_task` with local JAR libraries
([#993](https://github.com/databricks/cli/pull/993)).
* Pass `USERPROFILE` environment variable to Terraform
([#1001](https://github.com/databricks/cli/pull/1001)).
* Improve error message when path is not a bundle template
([#985](https://github.com/databricks/cli/pull/985)).
* Correctly overwrite local state if remote state is newer
([#1008](https://github.com/databricks/cli/pull/1008)).
* Add mlops-stacks to the default `databricks bundle init` prompt
([#988](https://github.com/databricks/cli/pull/988)).
* Do not add wheel content hash in uploaded Python wheel path
([#1015](https://github.com/databricks/cli/pull/1015)).
* Do not replace pipeline libraries if there are no matches for pattern
([#1021](https://github.com/databricks/cli/pull/1021)).

Internal:
* Update CLI version in the VS Code extension during release
([#1014](https://github.com/databricks/cli/pull/1014)).

API Changes:
* Changed `databricks functions create` command.
* Changed `databricks metastores create` command with new required
argument order.
* Removed `databricks metastores enable-optimization` command.
* Removed `databricks account o-auth-enrollment` command group.
* Removed `databricks apps delete` command.
* Removed `databricks apps get` command.
* Added `databricks apps delete-app` command.
* Added `databricks apps get-app` command.
* Added `databricks apps get-app-deployment-status` command.
* Added `databricks apps get-apps` command.
* Added `databricks apps get-events` command.
* Added `databricks account network-connectivity` command group.

OpenAPI commit 22f09783eb8a84d52026f856be3b2068f9498db3 (2023-11-23)

Dependency updates:
* Bump golang.org/x/term from 0.13.0 to 0.14.0
([#981](https://github.com/databricks/cli/pull/981)).
* Bump github.com/hashicorp/terraform-json from 0.17.1 to 0.18.0
([#979](https://github.com/databricks/cli/pull/979)).
* Bump golang.org/x/oauth2 from 0.13.0 to 0.14.0
([#982](https://github.com/databricks/cli/pull/982)).
* Bump github.com/databricks/databricks-sdk-go from 0.24.0 to 0.25.0
([#980](https://github.com/databricks/cli/pull/980)).
* Bump github.com/databricks/databricks-sdk-go from 0.25.0 to 0.26.0
([#1019](https://github.com/databricks/cli/pull/1019)).
2023-11-29 14:19:20 +00:00
Pieter Noordhuis 3338cfc455
Discontinue 32-bit Windows build (#1024)
## Changes

Build failure for 32-bit Windows binary due to integer overflow in the
SDK.

We don't test 32-bit anywhere. I propose we stop publishing these builds
until we receive evidence they are still useful.

## Tests

n/a
2023-11-29 14:06:51 +00:00
Pieter Noordhuis 94a9fe4385
No need to fetch repository history when running tests (#1022)
Test runs don't need access to the repository history and only need the
commit being tested.
2023-11-29 13:50:17 +00:00
Pieter Noordhuis 0cd3bb072d
Bump Go SDK to v0.26.0 (#1019)
## Changes

Bump Go SDK to v0.26.0.

Changelog at
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.26.0.

## Tests

Integration tests pass.
2023-11-29 13:29:31 +00:00
Pieter Noordhuis deb062c489
Fix bug where the account or workspace client could be `nil` (#1020)
## Changes

We didn't return the error upon creating a workspace or account client.
If there is an error, it must always propagate up the stack. The result
of this bug was that we were setting a `nil` account or workspace
client, which in turn caused SIGSEGVs.

Fixes #913.

## Tests

Manually confirmed this fixes the linked issue. The CLI now correctly
returns an error when the client cannot be constructed.

The issue was reproducible using a `.databrickscfg` with a single,
incorrectly configured profile.
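A sketch of the corrected pattern, assuming the Go SDK's `databricks.NewWorkspaceClient` constructor; the point is that the error must be propagated instead of silently keeping a `nil` client around.

```
package main

import (
	"fmt"
	"os"

	"github.com/databricks/databricks-sdk-go"
)

// newWorkspaceClient never swallows the constructor error; otherwise a nil
// client would be stored and later dereferenced (SIGSEGV). Configuration is
// taken from the environment / .databrickscfg as usual.
func newWorkspaceClient() (*databricks.WorkspaceClient, error) {
	w, err := databricks.NewWorkspaceClient()
	if err != nil {
		// Propagate instead of returning (nil, nil) or ignoring err.
		return nil, fmt.Errorf("cannot configure workspace client: %w", err)
	}
	return w, nil
}

func main() {
	if _, err := newWorkspaceClient(); err != nil {
		fmt.Fprintln(os.Stderr, "Error:", err)
		os.Exit(1)
	}
}
```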
2023-11-29 13:29:17 +00:00
Andrew Nester 833746cbdd
Do not replace pipeline libraries if there are no matches for pattern (#1021)
## Changes
If the Glob call for a defined pipeline library returns no matches, leave
the entry as is.
The next mutators in the chain will detect that the file is missing and
the error will be more user friendly.


Before the change

```
Starting resource deployment
Error: terraform apply: exit status 1

Error: cannot create pipeline: libraries must contain at least one element
```

After

```
Error: notebook ./non-existent not found
```


## Tests
Added regression unit tests
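A sketch of the glob behaviour described above, using `filepath.Glob` for illustration (the CLI's own path expansion may differ):

```
package main

import (
	"fmt"
	"path/filepath"
)

// expandLibraryPath expands a library path pattern, but if the glob yields no
// matches it keeps the original entry so a later mutator can report
// "notebook ./non-existent not found" instead of Terraform failing with
// "libraries must contain at least one element".
func expandLibraryPath(pattern string) []string {
	matches, err := filepath.Glob(pattern)
	if err != nil || len(matches) == 0 {
		return []string{pattern} // leave the entry as is
	}
	return matches
}

func main() {
	fmt.Println(expandLibraryPath("./notebooks/*.py"))
	fmt.Println(expandLibraryPath("./non-existent")) // [./non-existent]
}
```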
2023-11-29 13:20:13 +00:00
Andrew Nester 5431174302
Do not add wheel content hash in uploaded Python wheel path (#1015)
## Changes
Removed the hash from the upload path since it's not useful anyway.

The main reason for the change was to make it work on all-purpose
clusters. But in order to make that work, the wheel version needs to be
increased anyway, so having only a hash in the path is useless.

Note: using the --build-number (build tag) flag does not help with
re-installing libraries on all-purpose clusters. The reason is that
`pip` ignores the build tag when upgrading a library and only looks at
the wheel version.
The build tag is only used for sorting versions, and the one with the
higher build tag takes priority when installed. It only works if no
library is installed yet.
See
a15dd75d98/src/pip/_internal/index/package_finder.py (L522-L556)
https://github.com/pypa/pip/issues/4781

Thus, the only way to reinstall the library on an all-purpose cluster is
to increase the wheel version manually or use automatic version
generation, e.g.
```
setup(
  version=datetime.datetime.utcnow().strftime("%Y%m%d.%H%M%S"),
  ...
)
```

## Tests
Integration tests passed.
2023-11-29 10:40:12 +00:00
shreyas-goenka 5f88af54fd
Revert automation for bundle schema documentation generation (#1018)
## Changes

Introduced in #1007 but doesn't work well yet. This will be automated
again as part of #1012.
2023-11-29 09:53:07 +00:00
Andrew Nester b5f34a1181
Removed unused `ToHttpsUrl` method and corresponding library (#1017)
## Changes
Removed unused ToHttpsUrl method and corresponding library
2023-11-28 16:08:27 +00:00
Ilia Babanov 1932da0a87
Update cli version in the vscode extension during release (#1014)
Similar to how we do it for setup-cli and homebrew-tap repos.

The PR on the extension side:
https://github.com/databricks/databricks-vscode/pull/948
2023-11-28 10:50:16 +00:00
shreyas-goenka dd1d540429
Add mlops-stacks to the default `databricks bundle init` prompt (#988)
## Changes
This makes mlops-stacks more discoverable and makes the UX of
initialising the mlops-stacks template better.

## Tests
Manually

Dropdown UI:
```
shreyas.goenka@THW32HFW6T projects % cli bundle init
Template to use:
  ▸ default-python
    mlops-stacks
```

Help message:
```
shreyas.goenka@THW32HFW6T bricks % cli bundle init -h
Initialize using a bundle template.

TEMPLATE_PATH optionally specifies which template to use. It can be one of the following:
- default-python: The default Python template
- mlops-stacks: The Databricks MLOps Stacks template. More information can be found at: https://github.com/databricks/mlops-stacks
```
2023-11-28 09:04:06 +00:00
shreyas-goenka 96e9545cf0
Automate the generation of bundle schema descriptions (#1007)
## Changes
This PR makes changes required to automatically update the bundle docs
during the CLI release process. We rely on `post_generate` scripts that
are executed after code generation with CWD as the CLI repo root.

The new `output-file` flag is introduced because stdout redirect does
not work here and would otherwise require changes to our release
automation CLI (deco CLI)

## Tests
Manually. Regenerated the CLI and the descriptions were indeed generated
for the CLI from the provided openapi spec.
2023-11-27 10:42:39 +00:00
Pieter Noordhuis f5f57b6bf9
Populate struct field with `config.Value` instance if possible (#1010)
## Changes

If a struct has a field of type `config.Value`, then we set it to the
source value while converting a `config.Value` instance to a struct as
part of a call to `convert.ToTyped`.

This is convenient when dealing with deeply nested structs where
functions on inner structs need access to the metadata provided by their
corresponding `config.Value` (e.g. where they were defined).

## Tests

The added unit tests pass.
2023-11-27 10:06:29 +00:00
Pieter Noordhuis ef97e249ec
Add function to check if `config.Value` is valid (#1009)
## Changes

Small function broken out from other work in progress.
2023-11-24 13:21:47 +00:00
Pieter Noordhuis 6187803007
Correctly overwrite local state if remote state is newer (#1008)
## Changes

A bug in the code that pulls the remote state could cause the local
state to be empty instead of a copy of the remote state. This happened
only if the local state was present and stale when compared to the
remote version.

We correctly checked for the state serial to see if the local state had
to be replaced but didn't seek back on the remote state before writing
it out. Because the staleness check would read the remote state in full,
copying from the same reader would immediately yield an EOF.

## Tests

* Unit tests for state pull and push mutators that rely on a mocked
filer.
* An integration test that deploys the same bundle from multiple paths,
triggering the staleness logic.

Both failed prior to the fix and now pass.
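To make the root cause concrete, a standard-library sketch of the reader-reuse bug and the seek-back fix (not the actual mutator code):

```
package main

import (
	"bytes"
	"fmt"
	"io"
	"os"
	"strings"
)

// overwriteLocalState reads the remote state once for the staleness check,
// leaving the reader at EOF. Seeking back to the start before writing it out
// is what keeps the local copy from ending up empty.
func overwriteLocalState(remote io.ReadSeeker, localPath string) error {
	// Staleness check reads the remote state in full (details elided here).
	if _, err := io.ReadAll(remote); err != nil {
		return err
	}

	// Without this Seek, the subsequent copy reads zero bytes and the local
	// state file ends up empty.
	if _, err := remote.Seek(0, io.SeekStart); err != nil {
		return err
	}

	f, err := os.Create(localPath)
	if err != nil {
		return err
	}
	defer f.Close()
	_, err = io.Copy(f, remote)
	return err
}

func main() {
	remote := strings.NewReader(`{"serial": 7, "resources": {}}`)
	if err := overwriteLocalState(remote, "terraform.tfstate"); err != nil {
		fmt.Println("error:", err)
		return
	}
	b, _ := os.ReadFile("terraform.tfstate")
	fmt.Println(bytes.Equal(b, []byte(`{"serial": 7, "resources": {}}`))) // true
}
```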
2023-11-24 11:15:46 +00:00
Pieter Noordhuis d985601d30
Add `--configure-cluster` flag to configure command (#1005)
## Changes

This breaks out the flags into a separate struct to make it easier to
pass around.

If specified, the flag calls into the `cfgpicker` to prompt for a
cluster to associate with the profile.

## Tests

Existing tests pass; added one for host validation.

---------

Co-authored-by: Miles Yucht <miles@databricks.com>
2023-11-23 19:56:48 +00:00
Miles Yucht 07c4c90772
Tolerate missing .databrickscfg file during `databricks auth login` (#1003)
## Changes
`databricks configure` creates a new .databrickscfg if one doesn't
already exist, but `databricks auth login` fails in this case. Because
`databricks auth login` writes out the config file anyway, we
gracefully handle this error and continue.

## Tests
Unit test.

```
$ ls ~/.databrickscfg*    
/Users/miles/.databrickscfg.bak

$ ./cli auth login
Databricks Profile Name: test
Databricks Host: https://<HOST>
Profile test was successfully saved

$ ls ~/.databrickscfg*
/Users/miles/.databrickscfg     /Users/miles/.databrickscfg.bak

$ cat ~/.databrickscfg
; The profile defined in the DEFAULT section is to be used as a fallback when no profile is explicitly specified.
[DEFAULT]

[test]
host      = https://<HOST>
auth_type = databricks-cli
```
2023-11-23 09:04:54 +00:00
shreyas-goenka d9fe2ab43d
Improve error message when path is not a bundle template (#985)
Adds a better error message when the input path is not a bundle template

before:
```
shreyas.goenka@THW32HFW6T bricks % cli bundle init ~/bricks
Error: open /Users/shreyas.goenka/bricks/databricks_template_schema.json: no such file or directory
```

after:
```
shreyas.goenka@THW32HFW6T bricks % cli bundle init ~/bricks
Error: expected to find a template schema file at /Users/shreyas.goenka/bricks/databricks_template_schema.json
```
2023-11-22 12:25:16 +00:00
Andrew Nester 48e293c72c
Pass `USERPROFILE` environment variable to Terraform (#1001)
## Changes
It appears that the `USERPROFILE` env variable indicates where the Azure
CLI stores configuration data (aka the `.azure` folder).

https://learn.microsoft.com/en-us/cli/azure/azure-cli-configuration#cli-configuration-file

Passing it to the terraform executable allows it to correctly
authenticate using the Azure CLI.

Fixes #983 

## Tests
Ran a deployment on a Windows VM before and after the fix.
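A sketch of the environment passthrough idea; the allowlist below is illustrative, not the CLI's actual list.

```
package main

import (
	"os"
	"os/exec"
)

// terraformEnv builds the child process environment from an allowlist and
// includes USERPROFILE so the Azure CLI credential flow can find its .azure
// configuration directory on Windows.
func terraformEnv() []string {
	passthrough := []string{"PATH", "HOME", "TMPDIR", "USERPROFILE"}
	var env []string
	for _, k := range passthrough {
		if v, ok := os.LookupEnv(k); ok {
			env = append(env, k+"="+v)
		}
	}
	return env
}

func main() {
	cmd := exec.Command("terraform", "version")
	cmd.Env = terraformEnv()
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	_ = cmd.Run()
}
```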
2023-11-22 09:16:28 +00:00
Andrew Nester fa89db57e9
Enable `spark_jar_task` with local JAR libraries (#993)
## Changes
Previously, local JAR paths were transformed to remote paths during
initialisation, and thus the artifact-building logic did not recognise
such libraries as local files to be handled and uploaded.

Now it's possible to use `spark_jar_task` with local JAR libraries on
14.1+ DBR clusters.

Example configuration
```
bundle:
  name: spark-jar

workspace:
  host: ***

artifacts:
  my_java_code:
    path: ./sample-java
    build: "javac PrintArgs.java && jar cvfm PrintArgs.jar META-INF/MANIFEST.MF PrintArgs.class"
    files:
      - source: "/Users/andrew.nester/dabs/wheel/sample-java/PrintArgs.jar"

resources:
  jobs:
    print_args:
      name: "Print Args"
      tasks:
        - task_key: Print
          new_cluster:
            num_workers: 0
            spark_version: 14.2.x-scala2.12
            node_type_id: i3.xlarge
            spark_conf:
              "spark.databricks.cluster.profile": "singleNode"
              "spark.master": "local[*]"
            custom_tags:
              ResourceClass: "SingleNode"
          spark_jar_task:
            main_class_name: PrintArgs
          libraries:
            - jar: ./sample-java/PrintArgs.jar
```

## Tests
Manually ran `bundle deploy` and `bundle run`.
2023-11-21 10:15:09 +00:00
Lennart Kats (databricks) 92539d4b9b
Work around DLT issue with `$PYTHONPATH` not being set correctly (#999)
## Changes

DLT currently doesn't always set `$PYTHONPATH` correctly (ES-947370).
This restores the original workaround to make new pipelines work while
that issue is being addressed. The workaround was removed in #832.

Manually tested.
2023-11-20 19:25:43 +00:00
Serge Smertin 1b7558cd7d
Add `databricks labs` command group (#914)
## Command group
<img width="688" alt="image"
src="https://github.com/databricks/cli/assets/259697/51a3d309-2244-40ff-b6c3-4f151da6a6ec">

## Installed versions
<img width="388" alt="image"
src="https://github.com/databricks/cli/assets/259697/0873e8ac-20cc-4bab-bb32-e064eddc05f2">

## Project commands
<img width="479" alt="image"
src="https://github.com/databricks/cli/assets/259697/618949e2-99f1-466b-9288-97e88c715518">

## Installer hook

![image](https://github.com/databricks/cli/assets/259697/3ce0d355-039a-445f-bff7-6dfc1a2e3288)

## Update notifications

![image](https://github.com/databricks/cli/assets/259697/10724627-3606-49e1-9722-00ae37afed12)

# Downstream work

- https://github.com/databrickslabs/ucx/pull/517
- https://github.com/databrickslabs/dlt-meta/pull/19
- https://github.com/databrickslabs/discoverx/pull/84
2023-11-17 12:47:37 +00:00
Pieter Noordhuis 489d6fa1b8
Replace direct calls with `bundle.Apply` (#990)
## Changes

Some test call sites called directly into the mutator's `Apply` function
instead of `bundle.Apply`. Calling into `bundle.Apply` is preferred
because that's where we can run pre/post logic common across all
mutators.

## Tests

Pass.
2023-11-15 14:19:18 +00:00
Pieter Noordhuis d80c35f66a
Rename variable `bundle -> b` (#989)
## Changes

All calls to apply a mutator must go through `bundle.Apply`. This
conflicts with the existing use of the variable `bundle`. This change
un-aliases the variable from the package name by renaming all variables
to `b`.

## Tests

Pass.
2023-11-15 14:03:36 +00:00
shreyas-goenka 0c837e5772
Make `file_path` and `artifact_path` fields consistent with json tag (#987)
## Changes
This PR:
1. Renames `FilesPath` -> `FilePath` and `ArtifactsPath` ->
`ArtifactPath` in the bundle and metadata configuration to make them
consistent with the json tags.
2. Fixes development / production mode error messages to point to
`file_path` and `artifact_path`

## Tests
Existing unit tests. This is a straightforward renaming of the fields.
2023-11-15 13:37:26 +00:00
Pieter Noordhuis 2c908f8fea
Function to convert Go struct back to `config.Value` (#935)
## Changes

This PR is the counterpart to #904. With this change, we are able to
convert a `config.Value` into a Go struct, make modifications to the Go
struct, and reflect those changes in a new `config.Value`.

This functionality allows us to incrementally introduce this
configuration representation to existing bundle mutators. Bundle
mutators expect a `*bundle.Bundle` argument and mutate its configuration
directly. These mutations are not reflected in the corresponding
`config.Value` (once introduced), which means we cannot use the
`config.Value` as source of truth until we update _all_ mutators. To
address this, we can run `convert.ToTyped` and `convert.FromTyped` at
the mutator boundary (from `bundle.Apply`) and capture changes made to
the Go struct. Then we can incrementally make mutators aware of the
`config.Value` configuration and have them mutate that structure
directly.

## Tests

New unit tests pass.

Manual spot checks against the bundle configuration type.
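A toy stand-in for this round-trip at the mutator boundary, using `encoding/json` instead of the CLI's actual `convert.ToTyped` / `convert.FromTyped` helpers:

```
package main

import (
	"encoding/json"
	"fmt"
)

// bundleConfig is a tiny typed configuration used only for illustration.
type bundleConfig struct {
	Name   string `json:"name"`
	Target string `json:"target"`
}

// toTyped converts a dynamic value (here just a map) into a typed struct.
func toTyped(dyn map[string]any, out *bundleConfig) error {
	b, err := json.Marshal(dyn)
	if err != nil {
		return err
	}
	return json.Unmarshal(b, out)
}

// fromTyped converts the (possibly mutated) struct back to a dynamic value.
func fromTyped(in *bundleConfig) (map[string]any, error) {
	b, err := json.Marshal(in)
	if err != nil {
		return nil, err
	}
	var dyn map[string]any
	return dyn, json.Unmarshal(b, &dyn)
}

func main() {
	dyn := map[string]any{"name": "my_bundle", "target": "dev"}

	// Mutator boundary: convert to a typed struct, mutate, convert back so
	// the dynamic representation stays in sync with the mutation.
	var cfg bundleConfig
	if err := toTyped(dyn, &cfg); err != nil {
		panic(err)
	}
	cfg.Target = "prod" // the "mutator" edits the typed struct

	dyn, err := fromTyped(&cfg)
	if err != nil {
		panic(err)
	}
	fmt.Println(dyn) // map[name:my_bundle target:prod]
}
```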
2023-11-15 09:19:51 +00:00
shreyas-goenka a25f10f247
Add `--tag` and `--branch` options to bundle init command (#975)
## Tests
Tested manually. The specified branch / tag is indeed cloned and used by
bundle init.
2023-11-14 22:27:58 +00:00