Commit Graph

893 Commits

Author SHA1 Message Date
Miles Yucht b65ce75c1f
Use Go SDK Iterators when listing resources with the CLI (#1202)
## Changes
Currently, when the CLI runs a list API call (like list jobs), it uses
the `List*All` methods from the SDK, which list all resources in the
collection. This is very slow for large collections: if you need to list
all jobs from a workspace that has 10,000+ jobs, you'll be waiting for
at least 100 RPCs to complete before seeing any output.

Instead of using the `List*All()` methods, the SDK recently added an iterator
data structure that allows traversing the collection without needing to
completely list it first. New pages are fetched lazily if the next
requested item belongs to the next page. Using the List() methods that
return these iterators, the CLI can proactively print out some of the
response before the complete collection has been fetched.
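
For context, a minimal sketch of consuming such an iterator with the Go
SDK (assuming the `listing.Iterator` interface exposes `HasNext`/`Next`
as in SDK v0.33):

```
package main

import (
	"context"
	"fmt"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/jobs"
)

func main() {
	ctx := context.Background()
	w := databricks.Must(databricks.NewWorkspaceClient())

	// List returns a listing.Iterator[jobs.BaseJob]. Pages are fetched
	// lazily, so the first items are available after a single RPC.
	it := w.Jobs.List(ctx, jobs.ListJobsRequest{})
	for it.HasNext(ctx) {
		job, err := it.Next(ctx)
		if err != nil {
			panic(err)
		}
		fmt.Println(job.JobId)
	}
}
```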

This involves a pretty major rewrite of the rendering logic in `cmdio`.
The idea there is to define custom rendering logic based on the type of
the provided resource. There are three renderer interfaces:

1. textRenderer: supports printing something in a textual format (i.e.
not JSON, and not templated).
2. jsonRenderer: supports printing something in a pretty-printed JSON
format.
3. templateRenderer: supports printing something using a text template.
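
As a sketch, the interfaces might look like this (the names come from
the list above; the exact signatures are illustrative, not the actual
cmdio definitions):

```
package cmdio

import (
	"context"
	"io"
	"text/template"
)

type textRenderer interface {
	// renderText writes a plain-text representation (not JSON, not templated).
	renderText(ctx context.Context, w io.Writer) error
}

type jsonRenderer interface {
	// renderJSON writes a pretty-printed JSON representation.
	renderJSON(ctx context.Context, w io.Writer) error
}

type templateRenderer interface {
	// renderTemplate renders output through a text template.
	renderTemplate(ctx context.Context, t *template.Template, w io.Writer) error
}
```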

There are also three renderer implementations:

1. readerRenderer: supports printing a reader. This only implements the
textRenderer interface.
2. iteratorRenderer: supports printing a `listing.Iterator` from the Go
SDK. This implements jsonRenderer and templateRenderer, buffering 20
resources at a time before writing them to the output.
3. defaultRenderer: supports printing arbitrary resources (the previous
implementation).

Callers use `cmdio.Render()` for rendering individual resources or an
`io.Reader`, and `cmdio.RenderIterator()` for rendering an iterator. The
separate method is needed to safely match on the type of the iterator,
since Go does not allow runtime type matches on generic types with an
existential type parameter.
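
A sketch of why the dedicated entry point helps (illustrative signature
and body, not the actual cmdio implementation): a type switch arm like
`case listing.Iterator[T]:` is invalid Go because `T` is unbound there,
but a generic function binds `T` at the call site:

```
package cmdio

import (
	"context"
	"fmt"

	"github.com/databricks/databricks-sdk-go/listing"
)

func RenderIterator[T any](ctx context.Context, it listing.Iterator[T]) error {
	for it.HasNext(ctx) {
		item, err := it.Next(ctx)
		if err != nil {
			return err
		}
		// The real implementation buffers ~20 items and renders them via
		// the iteratorRenderer described above.
		fmt.Printf("%v\n", item)
	}
	return nil
}
```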

One other change that needs to happen is to split the templates used for
text representation of list resources into a header template and a row
template. The template is now executed multiple times for List API
calls, but the header should only be printed once. To support this, I
have added `headerTemplate` to `cmdIO`, and I have also changed
`RenderWithTemplate` to include a `headerTemplate` parameter everywhere.
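
A minimal sketch of the split using Go's text/template (the template
strings and row type are made up for illustration):

```
package main

import (
	"os"
	"text/template"
)

type row struct {
	JobId int64
	Name  string
}

func main() {
	header := template.Must(template.New("h").Parse("ID\tName\n"))
	item := template.Must(template.New("r").Parse("{{.JobId}}\t{{.Name}}\n"))

	// The header template runs exactly once.
	header.Execute(os.Stdout, nil)

	// The row template runs once per item, so for a List call output can
	// begin before the full collection has been fetched.
	for _, r := range []row{{1, "ingest"}, {2, "report"}} {
		item.Execute(os.Stdout, r)
	}
}
```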

## Tests
- [x] Unit tests for text rendering logic
- [x] Unit test for reflection-based iterator construction.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-21 14:16:36 +00:00
Andrew Nester 5309e0fc2a
Improved error message when no .databrickscfg (#1223)
## Changes
Fixes #1060
2024-02-21 14:15:26 +00:00
shreyas-goenka 4ac1c1655b
Fix CLI nightlies on our UC workspaces (#1225)
This PR fixes some test helper functions so that they work properly on
our test environments for AWS and Azure UC workspaces.
2024-02-21 13:06:03 +00:00
Andrew Nester 26833418c3
Release v0.214.0 (#1226)
CLI:
* Add support for UC Volumes to the `databricks fs` commands
([#1209](https://github.com/databricks/cli/pull/1209)).

Bundles:
* Use dynamic configuration model in bundles
([#1098](https://github.com/databricks/cli/pull/1098)).
* Allow use of variables references in primitive non-string fields
([#1219](https://github.com/databricks/cli/pull/1219)).
* Add an experimental default-sql template
([#1051](https://github.com/databricks/cli/pull/1051)).
* Add an experimental dbt-sql template
([#1059](https://github.com/databricks/cli/pull/1059)).

Internal:
* Add fork-user to winget release workflow
([#1214](https://github.com/databricks/cli/pull/1214)).
* Use `any` as type for data sources and resources in `tf/schema`
([#1216](https://github.com/databricks/cli/pull/1216)).
* Avoid infinite recursion when normalizing a recursive type
([#1213](https://github.com/databricks/cli/pull/1213)).
* Fix issue where interpolating a new ref would rewrite unrelated fields
([#1217](https://github.com/databricks/cli/pull/1217)).
* Use `dyn.Value` as input to generating Terraform JSON
([#1218](https://github.com/databricks/cli/pull/1218)).

API Changes:
* Changed `databricks lakehouse-monitors update` command with new
required argument order.
* Added `databricks online-tables` command group.

OpenAPI commit cdd76a98a4fca7008572b3a94427566dd286c63b (2024-02-19)
Dependency updates:
* Bump Terraform provider to v1.36.2
([#1215](https://github.com/databricks/cli/pull/1215)).
* Bump github.com/databricks/databricks-sdk-go from 0.32.0 to 0.33.0
([#1222](https://github.com/databricks/cli/pull/1222)).
2024-02-20 20:53:59 +00:00
shreyas-goenka 5ba0aaa5c5
Add support for UC Volumes to the `databricks fs` commands (#1209)
## Changes
```
shreyas.goenka@THW32HFW6T cli % databricks fs -h
Commands to do file system operations on DBFS and UC Volumes.

Usage:
  databricks fs [command]

Available Commands:
  cat         Show file content.
  cp          Copy files and directories.
  ls          Lists files.
  mkdir       Make directories.
  rm          Remove files and directories.
```

This PR adds support for UC Volumes to the fs commands. The fs commands
for UC Volumes work the same as they currently do for DBFS. This is
ensured by running the same test matrix across both the DBFS and UC
Volumes versions of the fs commands.
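
For example (illustrative paths; UC Volumes are addressed under the
`dbfs:/Volumes/<catalog>/<schema>/<volume>` scheme):

```
databricks fs ls dbfs:/Volumes/main/default/my-volume
databricks fs cp ./data.csv dbfs:/Volumes/main/default/my-volume/data.csv
```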

## Tests
Support for UC Volumes is tested by running the same tests as we did
originally for DBFS commands. The tests require a `main` catalog to
exist in the workspace, which it does in our test workspace environments
that have the `TEST_METASTORE_ID` environment variable set.

For the Files API filer, we do the same by running a mostly shared set
of tests to ensure the filers for "local", "wsfs", "dbfs", and "files
API" are consistent.

The tests are also made to all run in parallel to reduce the time taken.
To ensure the separation of the tests, each test creates its own UC
schema (for UC volumes tests) or DBFS directories (for DBFS tests).
2024-02-20 16:14:37 +00:00
dependabot[bot] d9f34e6b22
Bump github.com/databricks/databricks-sdk-go from 0.32.0 to 0.33.0 (#1222)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.32.0 to 0.33.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.33.0</h2>
<p>Internal Changes:</p>
<ul>
<li>Add helper function to get header fields (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/822">#822</a>).</li>
<li>Add Int64 to header type injection (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/819">#819</a>).</li>
</ul>
<p>API Changes:</p>
<ul>
<li>Changed <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#LakehouseMonitorsAPI">w.LakehouseMonitors</a>
workspace-level service with new required argument order.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTablesAPI">w.OnlineTables</a>
workspace-level service.</li>
<li>Removed <code>AssetsDir</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateMonitor">catalog.UpdateMonitor</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ContinuousUpdateStatus">catalog.ContinuousUpdateStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#DeleteOnlineTableRequest">catalog.DeleteOnlineTableRequest</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#FailedStatus">catalog.FailedStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetOnlineTableRequest">catalog.GetOnlineTableRequest</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTable">catalog.OnlineTable</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTableSpec">catalog.OnlineTableSpec</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTableState">catalog.OnlineTableState</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTableStatus">catalog.OnlineTableStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#PipelineProgress">catalog.PipelineProgress</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ProvisioningStatus">catalog.ProvisioningStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#TriggeredUpdateStatus">catalog.TriggeredUpdateStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ViewData">catalog.ViewData</a>.</li>
<li>Added <code>ContentLength</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#DownloadResponse">files.DownloadResponse</a>.</li>
<li>Added <code>ContentType</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#DownloadResponse">files.DownloadResponse</a>.</li>
<li>Added <code>LastModified</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#DownloadResponse">files.DownloadResponse</a>.</li>
<li>Changed <code>LastModified</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#GetMetadataResponse">files.GetMetadataResponse</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#LastModifiedHttpDate">files.LastModifiedHttpDate</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/files#LastModifiedHttpDate">files.LastModifiedHttpDate</a>.</li>
<li>Removed <code>Config</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>Ai21labsConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>AnthropicConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>AwsBedrockConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>CohereConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>DatabricksModelServingConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>OpenaiConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Added <code>PalmConfig</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModel">serving.ExternalModel</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExternalModelConfig">serving.ExternalModelConfig</a>.</li>
<li>Added <code>MaxProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityInput">serving.ServedEntityInput</a>.</li>
<li>Added <code>MinProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityInput">serving.ServedEntityInput</a>.</li>
<li>Added <code>MaxProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityOutput">serving.ServedEntityOutput</a>.</li>
<li>Added <code>MinProvisionedThroughput</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityOutput">serving.ServedEntityOutput</a>.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="eba5c8b3ae"><code>eba5c8b</code></a>
Release v0.33.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/823">#823</a>)</li>
<li><a
href="6846045a98"><code>6846045</code></a>
Add Int64 to header type injection (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/819">#819</a>)</li>
<li><a
href="c6a803ae18"><code>c6a803a</code></a>
Add helper function to get header fields (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/822">#822</a>)</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.32.0...v0.33.0">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-19 14:30:06 +00:00
Lennart Kats (databricks) 162b115e19
Add an experimental default-sql template (#1051)
## Changes

This adds a `default-sql` template! 

In this latest revision, I've hidden the new template from the list so
we can merge it, iterate over it, and properly release the template at
the right time.

- [x] WorkspaceFS support for .sql files is in prod
- [x] SQL extension is preconfigured based on extension settings (if
possible)
- [ ] Streaming tables support is either ungated or the template
provides instructions about signup
- _Mitigation for now: this template is hidden from the list of
templates._
- [x] Support non-UC workspaces

## Tests
- [x] Unit tests
- [x] Manual testing
- [x] More manual testing
- [x] Reviewer testing

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
2024-02-19 12:01:11 +00:00
Pieter Noordhuis a2a4948047
Allow use of variables references in primitive non-string fields (#1219)
## Changes

This change enables the use of bundle variables for boolean, integer,
and floating point fields.
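
For example, a bundle configuration along these lines now works (an
illustrative snippet; the variable and resource names are made up):

```
variables:
  num_workers:
    default: 2

resources:
  jobs:
    example_job:
      job_clusters:
        - job_cluster_key: default
          new_cluster:
            num_workers: ${var.num_workers}
```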

## Tests

* Unit tests.
* I ran a manual test to confirm parameterizing the number of workers in
a cluster definition works.
2024-02-19 10:44:51 +00:00
Lennart Kats (databricks) 1c680121c8
Add an experimental dbt-sql template (#1059)
## Changes

This adds a new dbt-sql template. This work requires the new WorkspaceFS
support for dbt tasks.

In this latest revision, I've hidden the new template from the list so
we can merge it, iterate over it, and properly release the template at
the right time.

Blockers:
- [x] WorkspaceFS support for dbt projects is in prod
- [x] Move dbt files into a subdirectory
- [ ] Wait until the next (>1.7.4) release of the dbt plugin which will
have major improvements!
- _Rather than wait, this template is hidden from the list of
templates._
- [x] SQL extension is preconfigured based on extension settings (if
possible)
- MV / streaming tables:
  - [x] Add to template
- [x] Fix https://github.com/databricks/dbt-databricks/issues/535 (to be
released in 1.7.4)
- [x] Merge https://github.com/databricks/dbt-databricks/pull/338 (to be
released in 1.7.4)
- [ ] Fix "too many 503 errors" issue
(https://github.com/databricks/dbt-databricks/issues/570, internal
tracker: ES-1009215, ES-1014138)
  - [x] Support ANSI mode in the template
- [ ] Streaming tables support is either ungated or the template
provides instructions about signup
- _Mitigation for now: this template is hidden from the list of
templates._
- [x] Support non-workspace-admin deployment
- [x] Make sure `data_security_mode: SINGLE_USER` works on non-UC
workspaces (it's required to be explicitly specified on UC workspaces
with single-node clusters)
- [x] Support non-UC workspaces

## Tests

- [x] Unit tests
- [x] Manual testing
- [x] More manual testing
- [ ] Reviewer manual testing
  - _I'd like to do a small bug bash post-merging._
2024-02-19 09:15:17 +00:00
Pieter Noordhuis f70ec359dc
Use `dyn.Value` as input to generating Terraform JSON (#1218)
## Changes

This builds on #1098 and uses the `dyn.Value` representation of the
bundle configuration to generate the Terraform JSON definition of
resources in the bundle.

The existing code (in `BundleToTerraform`) was not great and in an
effort to slightly improve this, I added a package `tfdyn` that includes
dedicated files for each resource type. Every resource type has its own
conversion type that takes the `dyn.Value` of the bundle-side resource
and converts it into Terraform resources (e.g. a job and optionally its
permissions).

Because we now use a `dyn.Value` as input, we can represent and emit
zero-values that have so far been omitted. For example, setting
`num_workers: 0` in your bundle configuration now propagates all the way
to the Terraform JSON definition.

## Tests

* Unit tests for every converter. I reused the test inputs from
`convert_test.go`.
* Equivalence tests in every existing test case checks that the
resulting JSON is identical.
* I manually compared the TF JSON file generated by the CLI from the
main branch and from this PR on all of our bundles and bundle examples
(internal and external) and found the output doesn't change (with the
exception of the odd zero-value being included by the version in this
PR).
2024-02-16 20:54:38 +00:00
Pieter Noordhuis 87dd46a3f8
Use dynamic configuration model in bundles (#1098)
## Changes

This is a fundamental change to how we load and process bundle
configuration. We now depend on the configuration being represented as a
`dyn.Value`. This representation is functionally equivalent to Go's
`any` (it can hold a value of any type) and allows us to capture metadata associated with
a value, such as where it was defined (e.g. file, line, and column). It
also allows us to represent Go's zero values properly (e.g. empty
string, integer equal to 0, or boolean false).
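
Conceptually (an illustrative sketch, not the actual `libs/dyn` API):

```
// A dynamic value carries both the data and metadata about its origin.
type Location struct {
	File   string
	Line   int
	Column int
}

type Value struct {
	v   any      // nil, bool, int64, float64, string, map, or sequence
	loc Location // where the value was defined in the configuration
}

// An explicit zero (e.g. num_workers: 0) becomes Value{v: int64(0), ...},
// which is distinguishable from the key being absent altogether.
```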

Using this representation allows us to let the configuration model
deviate from the typed structure we have been relying on so far
(`config.Root`). We need to deviate from these types when using
variables for fields that are not a string themselves. For example,
using `${var.num_workers}` for an integer `workers` field was impossible
until now (though not implemented in this change).

The loader for a `dyn.Value` includes functionality to capture any and
all type mismatches between the user-defined configuration and the
expected types. These mismatches can be surfaced as validation errors in
future PRs.

Given that many mutators expect the typed struct to be the source of
truth, this change converts between the dynamic representation and the
typed representation on mutator entry and exit. Existing mutators can
continue to modify the typed representation and these modifications are
reflected in the dynamic representation (see `MarkMutatorEntry` and
`MarkMutatorExit` in `bundle/config/root.go`).

Required changes included in this change:
* The existing interpolation package is removed in favor of
`libs/dyn/dynvar`.
* Functionality to merge job clusters, job tasks, and pipeline clusters
are now all broken out into their own mutators.

To be implemented later:
* Allow variable references for non-string types.
* Surface diagnostics about the configuration provided by the user in
the validation output.
* Some mutators use a resource's configuration file path to resolve
related relative paths. These depend on `bundle/config/paths.Path` being
set and populated through `ConfigureConfigFilePath`. Instead, they
should interact with the dynamically typed configuration directly. Doing
this also unlocks being able to differentiate different base paths used
within a job (e.g. a task override with a relative path defined in a
directory other than the base job).

## Tests

* Existing unit tests pass (some have been modified to accommodate)
* Integration tests pass
2024-02-16 19:41:58 +00:00
Pieter Noordhuis 5f59572cb3
Fix issue where interpolating a new ref would rewrite unrelated fields (#1217)
## Changes

When resolving a value returned by the lookup function, the code would
call into `resolveRef` with the key that `resolveKey` was called with.
In doing so, it would cache the _new_ ref under that key.

We fix this by caching ref resolution only at the top level and relying
on lookup caching to avoid duplicate work.
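
A sketch of that separation (illustrative; the field names are made up):

```
type resolver struct {
	// Memoized results of the lookup function, keyed by looked-up path.
	lookups map[string]string
	// Memoized fully-resolved references, populated only at the top
	// level, so a ref returned *by* a lookup is never cached under the
	// outer key.
	resolved map[string]string
}
```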

This came up while testing #1098.

## Tests

Unit test.
2024-02-16 16:19:40 +00:00
Pieter Noordhuis ea8daf1f97
Avoid infinite recursion when normalizing a recursive type (#1213)
## Changes

This is a follow-up to #1211 prompted by the addition of a recursive
type in the Go SDK v0.31.0 (`jobs.ForEachTask`).

When populating missing fields with their zero values we must not
inadvertently recurse into a recursive type.
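
A sketch of such a guard (illustrative; the actual normalization code
differs): track the struct types on the current path and stop before
revisiting one.

```
package main

import "reflect"

func populateZeros(t reflect.Type, seen map[reflect.Type]bool) {
	// Unwrap pointers and slices to reach the underlying struct type.
	for t.Kind() == reflect.Pointer || t.Kind() == reflect.Slice {
		t = t.Elem()
	}
	if t.Kind() != reflect.Struct || seen[t] {
		return // already on the current path: recursing would never end
	}
	seen[t] = true
	defer delete(seen, t)
	for i := 0; i < t.NumField(); i++ {
		populateZeros(t.Field(i).Type, seen)
	}
}

func main() {
	type Task struct{ ForEach *Task } // recursive, like jobs.ForEachTask
	populateZeros(reflect.TypeOf(Task{}), map[reflect.Type]bool{})
}
```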

## Tests

The new unit test fails with a stack overflow if the check added in
this fix is disabled.
2024-02-16 12:56:02 +00:00
Pieter Noordhuis 788ec81785
Use `any` as type for data sources and resources in `tf/schema` (#1216)
## Changes

We plan to use the any-equivalent of a `dyn.Value` such that we can use
variable references for non-string fields (e.g.
`${databricks_job.some_job.id}` where an integer is expected), as well
as properly emit zero values for primitive types (e.g. 0 for integers or
false for booleans).

This change is in preparation for the above.

## Tests

Unit tests.
2024-02-16 12:46:24 +00:00
Pieter Noordhuis ffae10d904
Bump Terraform provider to v1.36.2 (#1215)
## Changes

* Update `go.mod` with latest dependencies
* Update `go.mod` to require Go 1.21 to match root `go.mod`
* Regenerate structs for Terraform provider v1.36.2

## Tests

n/a
2024-02-16 07:05:45 +00:00
Miles Yucht ed00c85843
Add fork-user to winget release workflow (#1214)
## Changes
From https://github.com/vedantmgoyal2009/winget-releaser:

> If you are forking
[winget-pkgs](https://github.com/microsoft/winget-pkgs) on a different
account (e.g. bot/personal account), you can use the fork-user input to
specify the username of the account where the fork is present.

This PR makes that change.

## Tests
2024-02-16 06:35:52 +00:00
Andrew Nester 961d04d4f0
Release v0.213.0 (#1212)
CLI:
* Ignore environment variables for `auth profiles`
([#1189](https://github.com/databricks/cli/pull/1189)).
* Update LICENSE file to match Databricks license language
([#1013](https://github.com/databricks/cli/pull/1013)).

Bundles:
* Added `bundle deployment bind` and `unbind` command
([#1131](https://github.com/databricks/cli/pull/1131)).
* Use allowlist for Git-related fields to include in metadata
([#1187](https://github.com/databricks/cli/pull/1187)).
* Added `--restart` flag for `bundle run` command
([#1191](https://github.com/databricks/cli/pull/1191)).
* Generate correct YAML if `custom_tags` or `spark_conf` is used for
pipeline or job cluster configuration
([#1210](https://github.com/databricks/cli/pull/1210)).

Internal:
* Move folders package into libs
([#1184](https://github.com/databricks/cli/pull/1184)).
* Log time it takes for profile to load
([#1186](https://github.com/databricks/cli/pull/1186)).
* Use mockery to generate mocks compatible with testify/mock
([#1190](https://github.com/databricks/cli/pull/1190)).
* Retain partially valid structs in `convert.Normalize`
([#1203](https://github.com/databricks/cli/pull/1203)).
* Skip `for_each_task` when generating the bundle schema
([#1204](https://github.com/databricks/cli/pull/1204)).
* Regenerate the CLI using the same OpenAPI spec as the SDK
([#1205](https://github.com/databricks/cli/pull/1205)).
* Avoid race-conditions while executing sub-commands
([#1201](https://github.com/databricks/cli/pull/1201)).

API Changes:
* Added `databricks tables exists` command.
* Added `databricks lakehouse-monitors` command group.
* Removed `databricks files get-status` command.
* Added `databricks files create-directory` command.
* Added `databricks files delete-directory` command.
* Added `databricks files get-directory-metadata` command.
* Added `databricks files get-metadata` command.
* Added `databricks files list-directory-contents` command.
* Removed `databricks pipelines reset` command.
* Changed `databricks account settings delete-personal-compute-setting`
command with new required argument order.
* Removed `databricks account settings read-personal-compute-setting`
command.
* Changed `databricks account settings update-personal-compute-setting`
command with new required argument order.
* Added `databricks account settings get-personal-compute-setting`
command.
* Removed `databricks settings delete-default-workspace-namespace`
command.
* Removed `databricks settings read-default-workspace-namespace`
command.
* Removed `databricks settings update-default-workspace-namespace`
command.
* Added `databricks settings delete-default-namespace-setting` command.
* Added `databricks settings delete-restrict-workspace-admins-setting`
command.
* Added `databricks settings get-default-namespace-setting` command.
* Added `databricks settings get-restrict-workspace-admins-setting`
command.
* Added `databricks settings update-default-namespace-setting` command.
* Added `databricks settings update-restrict-workspace-admins-setting`
command.
* Changed `databricks token-management create-obo-token` command with
new required argument order.
* Changed `databricks token-management get` command to return .
* Changed `databricks dashboards create` command . New request type is .
* Added `databricks dashboards update` command.

OpenAPI commit c40670f5a2055c92cf0a6aac92a5bccebfb80866 (2024-02-14)
Dependency updates:
* Bump github.com/hashicorp/hc-install from 0.6.2 to 0.6.3
([#1200](https://github.com/databricks/cli/pull/1200)).
* Bump golang.org/x/term from 0.16.0 to 0.17.0
([#1197](https://github.com/databricks/cli/pull/1197)).
* Bump golang.org/x/oauth2 from 0.16.0 to 0.17.0
([#1198](https://github.com/databricks/cli/pull/1198)).
* Bump github.com/databricks/databricks-sdk-go from 0.30.1 to 0.32.0
([#1199](https://github.com/databricks/cli/pull/1199)).

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-02-15 17:34:17 +00:00
Pieter Noordhuis 5063c48e83
Trim trailing whitespace (#1206) 2024-02-15 16:23:48 +00:00
Pieter Noordhuis 18166f5b47
Add option to include fields present in the type but not in the value (#1211)
## Changes

This feature supports variable lookups in a `dyn.Value` that are present
in the type but haven't been initialized with a value.

For example: `${bundle.git.origin_url}` is present in the `dyn.Value`
only if it was assigned a value. If it wasn't assigned a value it should
resolve to the empty string. This normalization option, when set,
ensures that all fields that are represented in the specified type are
present in the return value.

This change is in support of #1098.

## Tests

Added unit test.
2024-02-15 15:16:40 +00:00
Andrew Nester e474948a4b
Generate correct YAML if custom_tags or spark_conf is used for pipeline or job cluster configuration (#1210)
These fields (keys and values) need to be double-quoted in order for
the YAML loader to read, parse, and unmarshal them into the Go struct
correctly, because these fields are of type `map[string]string`.
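
For example, the generated YAML now quotes both keys and values (an
illustrative snippet), so that numeric- or boolean-looking entries still
unmarshal as strings:

```
new_cluster:
  custom_tags:
    "cost-center": "1234"
  spark_conf:
    "spark.speculation": "true"
```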

## Tests
Added regression unit and E2E tests
2024-02-15 15:03:19 +00:00
dependabot[bot] 299e9b56a6
Bump github.com/databricks/databricks-sdk-go from 0.30.1 to 0.32.0 (#1199)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.30.1 to 0.32.0.

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-15 14:52:17 +00:00
Andrew Nester 80670eceed
Added `bundle deployment bind` and `unbind` command (#1131)
## Changes
Added `bundle deployment bind` and `unbind` command.

This command allows binding bundle-defined resources to existing
resources in a Databricks workspace so that they become DABs-managed.

## Tests
Manually + added E2E test
2024-02-14 18:04:45 +00:00
Miles Yucht e8b0698e19
Regenerate the CLI using the same OpenAPI spec as the SDK (#1205)
## Changes
The OpenAPI spec used to generate the CLI doesn't match the version used
for the SDK version that the CLI currently depends on. This PR
regenerates the CLI based on the same version of the OpenAPI spec used
by the SDK on v0.30.1.

## Tests
2024-02-13 14:33:59 +00:00
shreyas-goenka 52b813bd8e
Skip `for_each_task` when generating the bundle schema (#1204)
## Changes
Bundle schema generation does not support recursive API fields. This PR
skips generation for `for_each_task` until we add proper support for
recursive types in the bundle schema.

## Tests
Manually. This fixes the generation of the CLI and the bundle schema
command works as expected, with the sub-schema for `for_each_task` being
set to null in the output.
```
"for_each_task": null,
```
2024-02-13 14:13:47 +00:00
Pieter Noordhuis aa0c715930
Retain partially valid structs in `convert.Normalize` (#1203)
## Changes

Before this change, any error in a subtree would cause the entire
subtree to be dropped from the output.

This is not ideal when debugging, so instead we drop only the values
that cannot be normalized. Note that this doesn't change behavior if the
caller is properly checking the returned diagnostics for errors.

Note: this includes a change to use `dyn.InvalidValue` as opposed to
`dyn.NilValue` when returning errors.

## Tests

Added unit tests for the case where nested struct, map, or slice
elements contain an error.
2024-02-13 14:12:19 +00:00
dependabot[bot] 36241ee55e
Bump golang.org/x/oauth2 from 0.16.0 to 0.17.0 (#1198)
Bumps [golang.org/x/oauth2](https://github.com/golang/oauth2) from
0.16.0 to 0.17.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="ebe81ad837"><code>ebe81ad</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="adffd94437"><code>adffd94</code></a>
google/internal/externalaccount: update serviceAccountImpersonationRE to
supp...</li>
<li><a
href="deefa7e836"><code>deefa7e</code></a>
google/downscope: add DownscopingConfig.UniverseDomain to support
TPC</li>
<li>See full diff in <a
href="https://github.com/golang/oauth2/compare/v0.16.0...v0.17.0">compare
view</a></li>
</ul>
</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-12 15:08:35 +00:00
Ilia Babanov cbf75b157d
Avoid race-conditions while executing sub-commands (#1201)
## Changes
`executor.Exec` now uses `cmd.CombinedOutput`. The previous
implementation was hanging on my Windows VM during `bundle deploy` on
the `ReadAll(MultiReader(stdout, stderr))` line.

The problem is that `MultiReader` reads sequentially, and `stdout` is
first in line. Even a simple `io.ReadAll(stdout)` hangs, seemingly
because the command we spawn (python wheel build) waits for the error
stream to be finished before closing stdout on its own side. Reading
`stderr` (or `out`) in a separate goroutine fixes the deadlock, but
`cmd.CombinedOutput` feels like a simpler solution.

Also noticed that Exec was not removing `scriptFile` after itself; fixed
that too.
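
A minimal sketch of the fixed approach (the shell and script path are
made up for illustration):

```
package main

import (
	"fmt"
	"os"
	"os/exec"
)

func run(scriptFile string) ([]byte, error) {
	// Also fixed: remove the temporary script when done.
	defer os.Remove(scriptFile)

	// CombinedOutput wires stdout and stderr into one buffer and drains
	// them continuously, so neither pipe backs up and deadlocks the child.
	cmd := exec.Command("bash", scriptFile)
	return cmd.CombinedOutput()
}

func main() {
	out, err := run("/tmp/example.sh")
	fmt.Println(string(out), err)
}
```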

## Tests
Unit tests and manually
2024-02-12 15:04:14 +00:00
dependabot[bot] feb20d59a4
Bump golang.org/x/term from 0.16.0 to 0.17.0 (#1197)
Bumps [golang.org/x/term](https://github.com/golang/term) from 0.16.0 to
0.17.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="353276a841"><code>353276a</code></a>
go.mod: update golang.org/x dependencies</li>
<li>See full diff in <a
href="https://github.com/golang/term/compare/v0.16.0...v0.17.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golang.org/x/term&package-manager=go_modules&previous-version=0.16.0&new-version=0.17.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-12 13:12:41 +00:00
dependabot[bot] 2c752e7cb7
Bump github.com/hashicorp/hc-install from 0.6.2 to 0.6.3 (#1200)
Bumps
[github.com/hashicorp/hc-install](https://github.com/hashicorp/hc-install)
from 0.6.2 to 0.6.3.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/hashicorp/hc-install/releases">github.com/hashicorp/hc-install's
releases</a>.</em></p>
<blockquote>
<h2>v0.6.3</h2>
<p>DEPENDENCIES:</p>
<ul>
<li>build(deps): bump github.com/go-git/go-git/v5 from 5.10.1 to 5.11.0
by <a href="https://github.com/dependabot"><code>@​dependabot</code></a>
in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/171">hashicorp/hc-install#171</a></li>
<li>build(deps): bump github.com/ProtonMail/go-crypto from
0.0.0-20230828082145-3c4c8a2d2371 to 1.0.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/179">hashicorp/hc-install#179</a></li>
<li>build(deps): bump github.com/ProtonMail/go-crypto from 1.0.0 to
v1.1.0-alpha.0 by <a
href="https://github.com/radeksimko"><code>@​radeksimko</code></a> in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/183">hashicorp/hc-install#183</a></li>
<li>build(deps): bump golang.org/x/crypto from 0.16.0 to 0.17.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/175">hashicorp/hc-install#175</a></li>
<li>build(deps): bump golang.org/x/mod from 0.14.0 to 0.15.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/184">hashicorp/hc-install#184</a></li>
<li>build(deps): Bump workflows to latest trusted versions by <a
href="https://github.com/hashicorp-tsccr"><code>@​hashicorp-tsccr</code></a>
in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/173">hashicorp/hc-install#173</a></li>
<li>build(deps): Bump workflows to latest trusted versions by <a
href="https://github.com/hashicorp-tsccr"><code>@​hashicorp-tsccr</code></a>
in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/177">hashicorp/hc-install#177</a></li>
<li>build(deps): Bump workflows to latest trusted versions by <a
href="https://github.com/hashicorp-tsccr"><code>@​hashicorp-tsccr</code></a>
in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/180">hashicorp/hc-install#180</a></li>
<li>build(deps): Bump workflows to latest trusted versions by <a
href="https://github.com/hashicorp-tsccr"><code>@​hashicorp-tsccr</code></a>
in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/182">hashicorp/hc-install#182</a></li>
<li>build(deps): Replace mitchellh/cli w/ hashicorp/cli by <a
href="https://github.com/radeksimko"><code>@​radeksimko</code></a> in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/176">hashicorp/hc-install#176</a></li>
<li>go: bump version to 1.21.5 by <a
href="https://github.com/radeksimko"><code>@​radeksimko</code></a> in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/174">hashicorp/hc-install#174</a></li>
</ul>
<p>INTERNAL:</p>
<ul>
<li>github/release: Ensure production workflow doesn't start before
staging by <a
href="https://github.com/radeksimko"><code>@​radeksimko</code></a> in <a
href="https://redirect.github.com/hashicorp/hc-install/pull/170">hashicorp/hc-install#170</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="2c6cd569fd"><code>2c6cd56</code></a>
Update VERSION to 0.6.3</li>
<li><a
href="dbd7e10782"><code>dbd7e10</code></a>
build(deps): bump golang.org/x/mod from 0.14.0 to 0.15.0 (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/184">#184</a>)</li>
<li><a
href="1ffdbbf558"><code>1ffdbbf</code></a>
deps: Bump ProtonMail/go-crypto to v1.1.0-alpha.0 (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/183">#183</a>)</li>
<li><a
href="f2f9c0b14f"><code>f2f9c0b</code></a>
Result of tsccr-helper -log-level=info gha update -latest . (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/182">#182</a>)</li>
<li><a
href="82fbd96950"><code>82fbd96</code></a>
Result of tsccr-helper -log-level=info gha update -latest . (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/180">#180</a>)</li>
<li><a
href="c4ced0f9e6"><code>c4ced0f</code></a>
build(deps): bump github.com/ProtonMail/go-crypto (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/179">#179</a>)</li>
<li><a
href="33c1a7754c"><code>33c1a77</code></a>
Result of tsccr-helper -log-level=info gha update -latest . (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/177">#177</a>)</li>
<li><a
href="bf0a37952f"><code>bf0a379</code></a>
deps: Replace mitchellh/cli w/ hashicorp/cli (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/176">#176</a>)</li>
<li><a
href="02d1410485"><code>02d1410</code></a>
build(deps): bump golang.org/x/crypto from 0.16.0 to 0.17.0 (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/175">#175</a>)</li>
<li><a
href="946b925dd1"><code>946b925</code></a>
go: bump version to 1.21.5 (<a
href="https://redirect.github.com/hashicorp/hc-install/issues/174">#174</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/hashicorp/hc-install/compare/v0.6.2...v0.6.3">compare
view</a></li>
</ul>
</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-12 11:57:27 +00:00
Andrew Nester bc30c9ed4a
Added `--restart` flag for `bundle run` command (#1191)
## Changes
Added `--restart` flag for `bundle run` command

When running with this flag, `bundle run` will cancel all existing runs
before starting a new one

## Tests
Manually
2024-02-09 14:33:14 +00:00
Serge Smertin cac112c5bc
Update LICENSE (#1013)
See https://www.databricks.com/legal/db-license
2024-02-08 20:26:41 +00:00
Pieter Noordhuis 4073e45d4b
Use mockery to generate mocks compatible with testify/mock (#1190)
## Changes

This is the same approach we use in the Go SDK.

## Tests

Tests pass.
2024-02-08 15:18:53 +00:00
shreyas-goenka d638262665
Add spinner when downloading templates for bundle init (#1188)
## Changes
Templates can take a long time to download. This PR adds a spinner to
give feedback to users.

## Tests
Manually


https://github.com/databricks/cli/assets/88374338/b453982c-3233-40f4-8d6f-f31606ff0195
2024-02-08 12:52:53 +00:00
Pieter Noordhuis a835a3e564
Ignore environment variables for `auth profiles` (#1189)
## Changes

If environment variables related to unified authentication are set and a
user runs `auth profiles`, the environment variables will interfere with
the output. This change only takes profile data into account for the
output.

## Tests

Added a unit test.
2024-02-08 12:25:51 +00:00
Pieter Noordhuis f7d1a5862d
Use allowlist for Git-related fields to include in metadata (#1187)
## Changes

When new fields are added they should not automatically propagate to the
bundle metadata.

## Tests

Test passes.
2024-02-08 12:23:14 +00:00
Pieter Noordhuis b1b5ad8acd
Log time it takes for profile to load (#1186)
## Changes

Aids debugging why `auth profiles` may take longer than expected.

## Tests

Confirmed manually that timing information shows up in the log output.
2024-02-08 11:10:52 +00:00
Pieter Noordhuis 8e58e04e8f
Move folders package into libs (#1184)
## Changes

This is the last top-level package that doesn't need to be top-level.
2024-02-07 16:33:18 +00:00
Andrew Nester f6cdc75825
Release v0.212.4 (#1183)
Bundles:
* Allow specifying executable in artifact section and skip bash from WSL
([#1169](https://github.com/databricks/cli/pull/1169)).
* Added warning when trying to deploy bundle with `--fail-if-running`
and running resources
([#1163](https://github.com/databricks/cli/pull/1163)).
* Group bundle run flags by job and pipeline types
([#1174](https://github.com/databricks/cli/pull/1174)).
* Make sure grouped flags are added to the command flag set
([#1180](https://github.com/databricks/cli/pull/1180)).
* Add short_name helper function to bundle init templates
([#1167](https://github.com/databricks/cli/pull/1167)).

Internal:
* Fix dynamic representation of zero values in maps and slices
([#1154](https://github.com/databricks/cli/pull/1154)).
* Refactor library to artifact matching to not use pointers
([#1172](https://github.com/databricks/cli/pull/1172)).
* Harden `dyn.Value` equality check
([#1173](https://github.com/databricks/cli/pull/1173)).
* Ensure every variable reference is passed to lookup function
([#1176](https://github.com/databricks/cli/pull/1176)).
* Empty struct should yield empty map in `convert.FromTyped`
([#1177](https://github.com/databricks/cli/pull/1177)).
* Zero destination struct in `convert.ToTyped`
([#1178](https://github.com/databricks/cli/pull/1178)).
* Fix integration test with invalid configuration
([#1182](https://github.com/databricks/cli/pull/1182)).
* Use `acc.WorkspaceTest` helper from bundle integration tests
([#1181](https://github.com/databricks/cli/pull/1181)).
2024-02-07 15:05:03 +00:00
Pieter Noordhuis f8b0f783ea
Use `acc.WorkspaceTest` helper from bundle integration tests (#1181)
## Changes

This helper:
* Constructs a context
* Constructs a `*databricks.WorkspaceClient`
* Ensures required environment variables are present to run an
integration test
* Enables debugging integration tests from VS Code

Debugging integration tests (from VS Code) is made possible by a prelude
in the helper that checks if the calling process is a debug binary, and
if so, sources environment variables from
`~/.databricks/debug-env.json` (if present).

## Tests

Integration tests still pass.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-07 11:18:56 +00:00
Andrew Nester 6edab93233
Added warning when trying to deploy bundle with `--fail-if-running` and running resources (#1163)
## Changes
Deploying a bundle while its resources are running can be disruptive for jobs and pipelines in progress.

With this change, during the deployment phase (before uploading any resources), if `--fail-if-running` is specified, DABs check whether any resources are running and, if so, fail the deployment.
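A hedged sketch of the check, assuming the Go SDK's `Jobs.ListRunsAll` and eliding how job IDs are collected from deployed state:

```
package deploy

import (
	"context"
	"fmt"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/jobs"
)

// checkNothingRunning fails the deployment if any of the bundle's jobs
// have active runs.
func checkNothingRunning(ctx context.Context, w *databricks.WorkspaceClient, jobIDs []int64) error {
	for _, id := range jobIDs {
		runs, err := w.Jobs.ListRunsAll(ctx, jobs.ListRunsRequest{
			JobId:      id,
			ActiveOnly: true,
		})
		if err != nil {
			return err
		}
		if len(runs) > 0 {
			return fmt.Errorf("job %d has %d active run(s), deployment aborted", id, len(runs))
		}
	}
	return nil
}
```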

## Tests
Manual + added tests
2024-02-07 11:17:17 +00:00
Pieter Noordhuis b64e11304c
Fix integration test with invalid configuration (#1182)
## Changes

The indentation mistake on the `path` field under `notebook` meant the
pipeline had a single entry with a `nil` notebook field. This was
allowed but incorrect.

While working on the `dyn.Value` approach, this yielded a non-nil but
zeroed `notebook` field and a failure to translate an empty path.

## Tests

Correcting the indentation made the test fail because the file is not a
notebook. I changed it to a `file` reference and the test now passes.
2024-02-07 10:53:50 +00:00
Andrew Nester de363faa53
Make sure grouped flags are added to the command flag set (#1180)
## Changes
Make sure grouped flags are added to the command flag set

## Tests
Added regression tests
2024-02-07 10:27:13 +00:00
Pieter Noordhuis 0b5fdcc346
Zero destination struct in `convert.ToTyped` (#1178)
## Changes

Not doing this means that the output struct is not a true representation
of the `dyn.Value` and unrepresentable state (e.g. unexported fields)
can be carried over across `convert.ToTyped` calls.
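A minimal sketch of the technique (not the `convert` package's actual code; assumes `int64` fields):

```
package sketch

import "reflect"

// populate sets struct fields from src. Zeroing the destination first
// (the fix) guarantees that fields absent from src cannot survive from
// whatever value the struct held before. dst must be a struct pointer.
func populate(dst any, src map[string]int64) {
	v := reflect.ValueOf(dst).Elem()
	v.Set(reflect.Zero(v.Type())) // start from a clean struct
	for name, val := range src {
		if f := v.FieldByName(name); f.IsValid() && f.CanSet() {
			f.SetInt(val)
		}
	}
}
```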

## Tests

Unit tests.
2024-02-07 09:25:53 +00:00
Pieter Noordhuis dcb9c85201
Empty struct should yield empty map in `convert.FromTyped` (#1177)
## Changes

This was an issue in cases where the typed structure contains a non-nil
pointer to an empty struct. After conversion to a `dyn.Value` and back
to the typed structure, the pointer became nil.
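A simplified illustration of the invariant (not the `dyn` package's actual code):

```
package sketch

// Empty stands in for any empty struct in the configuration.
type Empty struct{}

// fromTyped maps a non-nil pointer to an empty map rather than nil.
func fromTyped(p *Empty) any {
	if p == nil {
		return nil
	}
	return map[string]any{}
}

// toTyped reconstructs the pointer: a non-nil map value yields a
// non-nil pointer, so the round trip preserves p != nil.
func toTyped(v any) *Empty {
	if v == nil {
		return nil
	}
	return &Empty{}
}
```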

## Tests

Unit tests.
2024-02-07 09:25:07 +00:00
Pieter Noordhuis 6e075e8cf8
Revert "Filter current user from resource permissions (#1145)" (#1179)
## Changes

This reverts commit 4131069a4b.

The integration test for metadata computation failed. The back and forth
to `dyn.Value` erases unexported fields that the code currently still
depends on. We'll have to retry on top of #1098.
2024-02-07 09:22:44 +00:00
Pieter Noordhuis f54e790a3b
Ensure every variable reference is passed to lookup function (#1176)
## Changes

References to keys that themselves are also variable references were
shortcircuited in the previous approach. This meant that certain fields
were resolved even if the lookup function would have instructed to skip
resolution.

To fix this we separate the memoization of resolved variable references
from the memoization of lookups. Now, every variable reference is passed
through the lookup function.

## Tests

Before this change, the new test failed with:
```
=== RUN   TestResolveWithSkipEverything
    [...]/libs/dyn/dynvar/resolve_test.go:208: 
        	Error Trace:	[...]/libs/dyn/dynvar/resolve_test.go:208
        	Error:      	Not equal: 
        	            	expected: "${d} ${c} ${c} ${d}"
        	            	actual  : "${b} ${a} ${a} ${b}"
        	            	
        	            	Diff:
        	            	--- Expected
        	            	+++ Actual
        	            	@@ -1 +1 @@
        	            	-${d} ${c} ${c} ${d}
        	            	+${b} ${a} ${a} ${b}
        	Test:       	TestResolveWithSkipEverything
```
2024-02-06 15:01:49 +00:00
Andrew Nester 2bbb644749
Group bundle run flags by job and pipeline types (#1174)
## Changes
Group bundle run flags by job and pipeline types

## Tests
```
Run a resource (e.g. a job or a pipeline)

Usage:
  databricks bundle run [flags] KEY

Job Flags:
      --dbt-commands strings                 A list of commands to execute for jobs with DBT tasks.
      --jar-params strings                   A list of parameters for jobs with Spark JAR tasks.
      --notebook-params stringToString       A map from keys to values for jobs with notebook tasks. (default [])
      --params stringToString                comma separated k=v pairs for job parameters (default [])
      --pipeline-params stringToString       A map from keys to values for jobs with pipeline tasks. (default [])
      --python-named-params stringToString   A map from keys to values for jobs with Python wheel tasks. (default [])
      --python-params strings                A list of parameters for jobs with Python tasks.
      --spark-submit-params strings          A list of parameters for jobs with Spark submit tasks.
      --sql-params stringToString            A map from keys to values for jobs with SQL tasks. (default [])

Pipeline Flags:
      --full-refresh strings   List of tables to reset and recompute.
      --full-refresh-all       Perform a full graph reset and recompute.
      --refresh strings        List of tables to update.
      --refresh-all            Perform a full graph update.

Flags:
  -h, --help      help for run
      --no-wait   Don't wait for the run to complete.

Global Flags:
      --debug            enable debug logging
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
      --var strings      set values for variables defined in bundle config. Example: --var="foo=bar"
```
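A hedged sketch of the grouping mechanics with cobra/pflag (not the CLI's actual code); note that each grouped set must also be added to the command's own flag set, which is the subject of the follow-up fix above:

```
package run

import (
	"github.com/spf13/cobra"
	"github.com/spf13/pflag"
)

// newRunCommand keeps each group in its own pflag.FlagSet so help can
// render the groups separately, then registers every set on the
// command's flag set so the flags actually parse.
func newRunCommand() *cobra.Command {
	cmd := &cobra.Command{Use: "run [flags] KEY"}

	jobGroup := pflag.NewFlagSet("job", pflag.ContinueOnError)
	jobGroup.StringSlice("jar-params", nil, "A list of parameters for jobs with Spark JAR tasks.")

	pipelineGroup := pflag.NewFlagSet("pipeline", pflag.ContinueOnError)
	pipelineGroup.Bool("refresh-all", false, "Perform a full graph update.")

	// Without these calls the grouped flags would render in help but
	// be unknown to the parser.
	cmd.Flags().AddFlagSet(jobGroup)
	cmd.Flags().AddFlagSet(pipelineGroup)
	return cmd
}
```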
2024-02-06 14:51:02 +00:00
shreyas-goenka 4131069a4b
Filter current user from resource permissions (#1145)
## Changes
The Databricks Terraform provider does not allow changing the permissions of
the current user. Instead, the current identity is implicitly set to be
the owner of all resources on the platform side.

This PR introduces a mutator to filter permissions from the bundle
configuration, allowing users to define permissions for their own
identities in their bundle config.

This allows configurations like the following, where both alice and bob
collaborate on the same DAB:
```
permissions:
  - level: CAN_MANAGE
    user_name: alice

  - level: CAN_MANAGE
    user_name: bob
```
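A sketch of such a filtering mutator (`Permission` is a simplified stand-in for the bundle's type):

```
package permissions

// Permission is a simplified stand-in for the bundle's permission type.
type Permission struct {
	Level    string
	UserName string
}

// filterCurrentUser drops entries matching the current identity, since
// the Terraform provider cannot manage them and the platform makes the
// deploying identity the owner anyway.
func filterCurrentUser(perms []Permission, currentUser string) []Permission {
	out := make([]Permission, 0, len(perms))
	for _, p := range perms {
		if p.UserName == currentUser {
			continue
		}
		out = append(out, p)
	}
	return out
}
```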

## Tests
Unit test and manually
2024-02-06 12:45:08 +00:00
Pieter Noordhuis 20e45b87ae
Harden `dyn.Value` equality check (#1173)
## Changes

This function could panic when either side of the comparison is a nil or
empty slice. This logic is triggered when comparing the input value to
the output value when calling `dyn.Map`.
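A minimal sketch of the hardened slice comparison (element type simplified to `int`):

```
package dyn

// sliceEqual checks lengths first, so nil and empty slices compare
// equal and no index can go out of range; the function cannot panic.
func sliceEqual(a, b []int) bool {
	if len(a) != len(b) {
		return false
	}
	for i := range a {
		if a[i] != b[i] {
			return false
		}
	}
	return true
}
```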

## Tests

Unit tests.
2024-02-05 16:54:41 +00:00
Pieter Noordhuis 33c446dadd
Refactor library to artifact matching to not use pointers (#1172)
## Changes

The previous approach was to:
1. Iterate over all libraries in all job tasks
2. Find references to local libraries
3. Store pointer to `compute.Library` in the matching artifact file to
signal it should be uploaded

This breaks down when introducing #1098 because we can no longer track
unexported state across mutators. The approach in this PR performs the
path matching twice; once in the matching mutator where we check if each
referenced file has an artifacts section, and once during artifact
upload to rewrite the library path from a local file reference to an
absolute Databricks path.
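A sketch of the stateless matching (names are illustrative):

```
package artifacts

import "path/filepath"

// matches compares cleaned paths, once when marking libraries for
// upload and again when rewriting them, so no pointer needs to survive
// between mutators.
func matches(libraryPath, artifactFile string) bool {
	return filepath.Clean(libraryPath) == filepath.Clean(artifactFile)
}

// rewrite replaces a matching local reference with the uploaded remote
// path at upload time (remotePath is illustrative).
func rewrite(libraryPath, artifactFile, remotePath string) string {
	if matches(libraryPath, artifactFile) {
		return remotePath
	}
	return libraryPath
}
```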

## Tests

Integration tests pass.
2024-02-05 15:29:45 +00:00