Commit Graph

304 Commits

Andrew Nester c7818560ca
Add usage string when command fails with incorrect arguments (#1276)
## Changes
Add usage string when command fails with incorrect arguments

Fixes #1119

## Tests
Example output

```
> databricks libraries cluster-status 
Error: accepts 1 arg(s), received 0

Usage:
  databricks libraries cluster-status CLUSTER_ID [flags]

Flags:
  -h, --help   help for cluster-status

Global Flags:
      --debug            enable debug logging
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
```
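A minimal sketch of how this behavior can be wired up with cobra (the helper name is hypothetical; the CLI's actual implementation may differ):

```
package main

import (
	"fmt"

	"github.com/spf13/cobra"
)

// exactArgsWithUsage behaves like cobra.ExactArgs but appends the command's
// usage string to the error, so an incorrect argument count also shows how
// to invoke the command.
func exactArgsWithUsage(n int) cobra.PositionalArgs {
	return func(cmd *cobra.Command, args []string) error {
		if err := cobra.ExactArgs(n)(cmd, args); err != nil {
			return fmt.Errorf("%w\n\n%s", err, cmd.UsageString())
		}
		return nil
	}
}

func main() {
	cmd := &cobra.Command{
		Use:  "cluster-status CLUSTER_ID",
		Args: exactArgsWithUsage(1),
		RunE: func(cmd *cobra.Command, args []string) error { return nil },
	}
	cmd.SilenceUsage = true // usage is embedded in the error instead
	if err := cmd.Execute(); err != nil {
		fmt.Println("Error:", err)
	}
}
```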
2024-03-12 14:12:34 +00:00
Pieter Noordhuis 4a9a12af19
Retain location annotation when expanding globs for pipeline libraries (#1274)
## Changes

We now keep location metadata associated with every configuration value.
When expanding globs for pipeline libraries, this annotation was erased
because of the conversion to/from the typed structure. This change
modifies the expansion mutator to work with `dyn.Value` and retain the
location of the value that holds the glob pattern.
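A self-contained sketch of the idea, using a simplified stand-in for `dyn.Value` (the real type lives in `libs/dyn`):

```
package main

import (
	"fmt"
	"path/filepath"
)

// Location records where a configuration value was defined.
type Location struct {
	File string
	Line int
}

// Value pairs a configuration value with its source location; a simplified
// stand-in for the CLI's dyn.Value.
type Value struct {
	V   any
	Loc Location
}

// expandGlob replaces a value holding a glob pattern with one value per
// match, propagating the location of the original pattern to every match.
func expandGlob(v Value) ([]Value, error) {
	pattern, ok := v.V.(string)
	if !ok {
		return nil, fmt.Errorf("expected string, got %T", v.V)
	}
	matches, err := filepath.Glob(pattern)
	if err != nil {
		return nil, err
	}
	out := make([]Value, 0, len(matches))
	for _, m := range matches {
		out = append(out, Value{V: m, Loc: v.Loc}) // retain the glob's location
	}
	return out, nil
}

func main() {
	vs, _ := expandGlob(Value{V: "*.go", Loc: Location{File: "pipeline.yml", Line: 12}})
	for _, v := range vs {
		fmt.Printf("%s (defined at %s:%d)\n", v.V, v.Loc.File, v.Loc.Line)
	}
}
```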

## Tests

Unit tests pass.
2024-03-11 21:59:36 +00:00
shreyas-goenka d5dc2bd1ca
Filter current user from resource permissions (#1262)
## Changes
The databricks terraform provider does not allow changing permissions of
the current user. Instead, the current identity is implicitly set to be
the owner of all resources on the platform side.

This PR introduces a mutator to filter permissions from the bundle
configuration at deploy time, allowing users to define permissions for
their own identities in their bundle config.

This allows configurations where, for example, both alice and bob
collaborate on the same DAB:
```
permissions:
  - level: CAN_MANAGE
    user_name: alice
  - level: CAN_MANAGE
    user_name: bob
```
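A sketch of what such a filter might look like (simplified types; the real mutator operates on the bundle configuration tree at deploy time):

```
package main

import "fmt"

// Permission is a simplified stand-in for a bundle permission entry.
type Permission struct {
	Level    string
	UserName string
}

// filterCurrentUser drops permission entries that refer to the deploying
// identity, since the Terraform provider rejects permissions for self.
func filterCurrentUser(perms []Permission, currentUser string) []Permission {
	out := make([]Permission, 0, len(perms))
	for _, p := range perms {
		if p.UserName == currentUser {
			continue
		}
		out = append(out, p)
	}
	return out
}

func main() {
	perms := []Permission{
		{Level: "CAN_MANAGE", UserName: "alice"},
		{Level: "CAN_MANAGE", UserName: "bob"},
	}
	fmt.Println(filterCurrentUser(perms, "alice")) // [{CAN_MANAGE bob}]
}
```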

This PR is a reincarnation of
https://github.com/databricks/cli/pull/1145. The earlier attempt had to
be reverted due to metadata loss converting to and from the dynamic
configuration representation (reverted here:
https://github.com/databricks/cli/pull/1179)

## Tests
Unit test and manually
2024-03-11 15:05:15 +00:00
Pieter Noordhuis c05c0cd941
Include `dyn.Path` as argument to the visit callback function (#1260)
## Changes

This change means the callback supplied to `dyn.Foreach` can introspect
the path of the value it is being called for. It also prepares for
allowing visiting path patterns where the exact path is not known
upfront.
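A simplified analogue of the callback shape (not the actual `dyn` API):

```
package main

import "fmt"

// Foreach visits every leaf of a nested map and passes both the path and
// the value to the callback, a simplified analogue of dyn.Foreach.
func Foreach(prefix []string, m map[string]any, visit func(path []string, v any)) {
	for k, v := range m {
		path := append(append([]string{}, prefix...), k)
		if child, ok := v.(map[string]any); ok {
			Foreach(path, child, visit)
			continue
		}
		visit(path, v)
	}
}

func main() {
	cfg := map[string]any{
		"resources": map[string]any{
			"jobs": map[string]any{"my_job": map[string]any{"name": "My Job"}},
		},
	}
	Foreach(nil, cfg, func(path []string, v any) {
		fmt.Printf("%v = %v\n", path, v)
	})
}
```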

## Tests

Unit tests.
2024-03-07 13:56:50 +00:00
Pieter Noordhuis 74b1e05ed7
Update Go SDK to v0.34.0 (#1256)
## Changes

SDK release
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.34.0

This incorporates two changes to the generation code:
* Use explicit empty check for response types (see
https://github.com/databricks/databricks-sdk-go/pull/831)
* Support subservices for the settings commands (see
https://github.com/databricks/databricks-sdk-go/pull/826)

As part of the subservices support, this change also updates how methods
are registered with their services. This used to be done with `init`
functions and now through inline function calls. This should have a
(negligible) positive impact on binary start time because we no longer
have to call as many `init` functions.
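A sketch of the difference (illustrative names, not the generated code):

```
package main

import "fmt"

var commands = map[string]func(){}

func register(name string, fn func()) { commands[name] = fn }

// Before: every generated file registered its commands in an init function,
// all of which the runtime executes before main starts:
//
//	func init() { register("clusters list", listClusters) }
//
// After: registration happens through an explicit call at construction time,
// so fewer init functions need to run on startup.
func newClustersCommands() {
	register("clusters list", func() { fmt.Println("listing clusters") })
}

func main() {
	newClustersCommands()
	commands["clusters list"]()
}
```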

## Tests

tbd
2024-03-06 09:53:44 +00:00
Andrew Nester ecf9c52f61
Support relative paths in artifact files source section and always upload all artifact files (#1247)
Support relative paths in artifact files source section and always
upload all artifact files

Fixes #1156

## Tests
Added unit tests
2024-03-04 20:28:15 +00:00
Andrew Nester 09d1846e13
Return `application_id` for service principal lookups (#1245)
## Changes
Return `ApplicationId` for service principal lookups

Fixes #1234 

## Tests
Added (regression) tests
2024-03-04 16:12:10 +00:00
Andrew Nester 29ab96f327
Only transform wheel libraries when using trampoline (#1248)
## Changes
Only transform wheel libraries when using trampoline

## Tests
Added regression test
2024-03-04 12:34:03 +00:00
Pieter Noordhuis 04827688fb
Add `--validate-only` flag to run validate-only pipeline update (#1251)
## Changes

This flag starts a "validation-only" update.

## Tests

Unit and manual confirmation it does what it should.
2024-03-04 08:38:32 +00:00
Ilia Babanov d12f88e24d
Fix summary command when internal terraform config doesn't exist (#1242)
Create `bundle.tf.json` if it doesn't exist before executing
`terraform init` (inside `terraform.Load`).

Fixes a problem where `terraform.Load` fails with:
```
Error: Failed to load plugin schemas

Error while loading schemas for plugin components: Failed to obtain provider
schema: Could not load the schema for provider
registry.terraform.io/databricks/databricks: failed to instantiate provider
"registry.terraform.io/databricks/databricks" to obtain schema: unavailable
provider "registry.terraform.io/databricks/databricks"..
```
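A sketch of the fix, assuming plain `os` calls (the CLI may route this through its own filer):

```
package terraform

import (
	"errors"
	"os"
	"path/filepath"
)

// ensureRootConfig writes an empty Terraform JSON configuration if
// bundle.tf.json is missing, so `terraform init` can load provider schemas.
func ensureRootConfig(workingDir string) error {
	path := filepath.Join(workingDir, "bundle.tf.json")
	_, err := os.Stat(path)
	if errors.Is(err, os.ErrNotExist) {
		return os.WriteFile(path, []byte("{}"), 0o644)
	}
	return err
}
```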
2024-03-01 08:25:12 +00:00
Andrew Nester 0839e6f66a
Added test to verify scripts.Execute mutator works correctly (#1237)
## Changes
Follow up to https://github.com/databricks/cli/pull/1232
2024-02-26 10:08:03 +00:00
Andrew Nester 1dbc086e5a
Upgrade Terraform provider to 1.37.0 (#1235)
## Changes
Upgrade Terraform provider to 1.37.0

We currently use version 1.36.2, which uses Go SDK 0.30 and thus does
not have U2M enabled for all clouds.
Upgrading to 1.37.0 allows the TF provider (and thus DABs) to use U2M.

Fixes #1231
2024-02-23 10:41:42 +00:00
Andrew Nester 1588a14d07
Add correct tag value for models in dev mode (#1230)
## Changes
Fixes #922

## Tests
Added regression test case
2024-02-22 14:52:49 +00:00
Miles Yucht b65ce75c1f
Use Go SDK Iterators when listing resources with the CLI (#1202)
## Changes
Currently, when the CLI runs a list API call (like list jobs), it uses
the `List*All` methods from the SDK, which list all resources in the
collection. This is very slow for large collections: if you need to list
all jobs from a workspace that has 10,000+ jobs, you'll be waiting for
at least 100 RPCs to complete before seeing any output.

Instead of using `List*All()` methods, the SDK recently added an iterator
data structure that allows traversing the collection without needing to
completely list it first. New pages are fetched lazily if the next
requested item belongs to the next page. Using the List() methods that
return these iterators, the CLI can proactively print out some of the
response before the complete collection has been fetched.

This involves a pretty major rewrite of the rendering logic in `cmdio`.
The idea there is to define custom rendering logic based on the type of
the provided resource. There are three renderer interfaces:

1. textRenderer: supports printing something in a textual format (i.e.
not JSON, and not templated).
2. jsonRenderer: supports printing something in a pretty-printed JSON
format.
3. templateRenderer: supports printing something using a text template.

There are also three renderer implementations:

1. readerRenderer: supports printing a reader. This only implements the
textRenderer interface.
2. iteratorRenderer: supports printing a `listing.Iterator` from the Go
SDK. This implements jsonRenderer and templateRenderer, buffering 20
resources at a time before writing them to the output.
3. defaultRenderer: supports printing arbitrary resources (the previous
implementation).

Callers use `cmdio.Render()` for rendering individual resources or an
`io.Reader`, and `cmdio.RenderIterator()` for rendering an iterator. The
separate method is needed to safely match on
the type of the iterator, since Go does not allow runtime type matches
on generic types with an existential type parameter.

One other change that needs to happen is to split the templates used for
text representation of list resources into a header template and a row
template. The template is now executed multiple times for List API
calls, but the header should only be printed once. To support this, I
have added `headerTemplate` to `cmdIO`, and I have also changed
`RenderWithTemplate` to include a `headerTemplate` parameter everywhere.
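A minimal sketch of iterator-driven rendering, using the SDK's `listing.Iterator` interface (the real renderer buffers 20 resources at a time and supports templates; this version just streams JSON lines):

```
package render

import (
	"context"
	"encoding/json"
	"io"

	"github.com/databricks/databricks-sdk-go/listing"
)

// renderIterator writes items as they arrive instead of waiting for the
// full collection, so output starts before all pages have been fetched.
func renderIterator[T any](ctx context.Context, w io.Writer, it listing.Iterator[T]) error {
	enc := json.NewEncoder(w)
	for it.HasNext(ctx) {
		item, err := it.Next(ctx)
		if err != nil {
			return err
		}
		if err := enc.Encode(item); err != nil {
			return err
		}
	}
	return nil
}
```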

## Tests
- [x] Unit tests for text rendering logic
- [x] Unit test for reflection-based iterator construction.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-21 14:16:36 +00:00
dependabot[bot] d9f34e6b22
Bump github.com/databricks/databricks-sdk-go from 0.32.0 to 0.33.0 (#1222)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.32.0 to 0.33.0.
Release notes for v0.33.0, sourced from
[github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases)
(the changelog entry is identical, so it is not repeated here):

Internal Changes:

* Add helper function to get header fields ([#822](https://github.com/databricks/databricks-sdk-go/pull/822)).
* Add Int64 to header type injection ([#819](https://github.com/databricks/databricks-sdk-go/pull/819)).

API Changes:

* Changed `Update` method for `w.LakehouseMonitors` workspace-level service with new required argument order.
* Added `w.OnlineTables` workspace-level service.
* Removed `AssetsDir` field for `catalog.UpdateMonitor`.
* Added `catalog.ContinuousUpdateStatus`, `catalog.DeleteOnlineTableRequest`,
`catalog.FailedStatus`, `catalog.GetOnlineTableRequest`, `catalog.OnlineTable`,
`catalog.OnlineTableSpec`, `catalog.OnlineTableState`, `catalog.OnlineTableStatus`,
`catalog.PipelineProgress`, `catalog.ProvisioningStatus`,
`catalog.TriggeredUpdateStatus`, and `catalog.ViewData`.
* Added `ContentLength`, `ContentType`, and `LastModified` fields for `files.DownloadResponse`.
* Changed `LastModified` field for `files.GetMetadataResponse` to `files.LastModifiedHttpDate`.
* Added `files.LastModifiedHttpDate`.
* Removed `Config` field for `serving.ExternalModel`.
* Added `Ai21labsConfig`, `AnthropicConfig`, `AwsBedrockConfig`, `CohereConfig`,
`DatabricksModelServingConfig`, `OpenaiConfig`, and `PalmConfig` fields for
`serving.ExternalModel`.
* Removed `serving.ExternalModelConfig`.
* Added `MaxProvisionedThroughput` and `MinProvisionedThroughput` fields for
`serving.ServedEntityInput` and `serving.ServedEntityOutput`.

OpenAPI SHA: cdd76a98a4fca7008572b3a94427566dd286c63b, Date: 2024-02-19

Commits:

* `eba5c8b` Release v0.33.0 ([#823](https://github.com/databricks/databricks-sdk-go/pull/823))
* `6846045` Add Int64 to header type injection ([#819](https://github.com/databricks/databricks-sdk-go/pull/819))
* `c6a803a` Add helper function to get header fields ([#822](https://github.com/databricks/databricks-sdk-go/pull/822))
* Full diff: https://github.com/databricks/databricks-sdk-go/compare/v0.32.0...v0.33.0

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-19 14:30:06 +00:00
Pieter Noordhuis a2a4948047
Allow use of variables references in primitive non-string fields (#1219)
## Changes

This change enables the use of bundle variables for boolean, integer,
and floating point fields.
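For example, a reference like `num_workers: ${var.num_workers}` resolves to a string before conversion; a sketch of the coercion step (illustrative, not the CLI's exact code):

```
package vars

import "strconv"

// coerce converts the string form of a resolved variable into the primitive
// type expected by the target field.
func coerce(s, want string) (any, error) {
	switch want {
	case "bool":
		return strconv.ParseBool(s)
	case "int":
		return strconv.ParseInt(s, 10, 64)
	case "float":
		return strconv.ParseFloat(s, 64)
	default:
		return s, nil
	}
}
```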

## Tests

* Unit tests.
* I ran a manual test to confirm parameterizing the number of workers in
a cluster definition works.
2024-02-19 10:44:51 +00:00
Pieter Noordhuis f70ec359dc
Use `dyn.Value` as input to generating Terraform JSON (#1218)
## Changes

This builds on #1098 and uses the `dyn.Value` representation of the
bundle configuration to generate the Terraform JSON definition of
resources in the bundle.

The existing code (in `BundleToTerraform`) was not great and in an
effort to slightly improve this, I added a package `tfdyn` that includes
dedicated files for each resource type. Every resource type has its own
conversion type that takes the `dyn.Value` of the bundle-side resource
and converts it into Terraform resources (e.g. a job and optionally its
permissions).
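A sketch of the converter shape (simplified types; the real converters operate on `dyn.Value` and the generated Terraform schema structs):

```
package tfdyn

// Value is a stand-in for dyn.Value; Resources stands in for the root of
// the generated Terraform JSON schema.
type Value struct{ V any }
type Resources struct{ Job map[string]any }

// Converter turns the dyn.Value of one bundle resource into its Terraform
// counterpart. Each resource type registers its own implementation.
type Converter interface {
	Convert(key string, v Value, out *Resources) error
}

var converters = map[string]Converter{}

// Register associates a converter with a resource type (e.g. "jobs").
func Register(resourceType string, c Converter) {
	converters[resourceType] = c
}
```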

Because we now use a `dyn.Value` as input, we can represent and emit
zero-values that have so far been omitted. For example, setting
`num_workers: 0` in your bundle configuration now propagates all the way
to the Terraform JSON definition.

## Tests

* Unit tests for every converter. I reused the test inputs from
`convert_test.go`.
* Equivalence tests in every existing test case checks that the
resulting JSON is identical.
* I manually compared the TF JSON file generated by the CLI from the
main branch and from this PR on all of our bundles and bundle examples
(internal and external) and found the output doesn't change (with the
exception of the odd zero-value being included by the version in this
PR).
2024-02-16 20:54:38 +00:00
Pieter Noordhuis 87dd46a3f8
Use dynamic configuration model in bundles (#1098)
## Changes

This is a fundamental change to how we load and process bundle
configuration. We now depend on the configuration being represented as a
`dyn.Value`. This representation is functionally equivalent to Go's
`any` (it can hold a value of any type) and allows us to capture metadata
associated with
a value, such as where it was defined (e.g. file, line, and column). It
also allows us to represent Go's zero values properly (e.g. empty
string, integer equal to 0, or boolean false).
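A simplified picture of what such a value carries (stand-in types; the real implementation lives in `libs/dyn`):

```
package dyn

// Location records where a value was defined in the source configuration.
type Location struct {
	File   string
	Line   int
	Column int
}

// Value can hold any configuration value while carrying metadata, and can
// represent zero values ("", 0, false) distinctly from absent values.
type Value struct {
	v   any // nil, string, bool, int64, float64, map[string]Value, or []Value
	loc Location
}

// Location returns where the value was defined.
func (v Value) Location() Location { return v.loc }
```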

Using this representation allows us to let the configuration model
deviate from the typed structure we have been relying on so far
(`config.Root`). We need to deviate from these types when using
variables for fields that are not a string themselves. For example,
using `${var.num_workers}` for an integer `workers` field was impossible
until now (though not implemented in this change).

The loader for a `dyn.Value` includes functionality to capture any and
all type mismatches between the user-defined configuration and the
expected types. These mismatches can be surfaced as validation errors in
future PRs.

Given that many mutators expect the typed struct to be the source of
truth, this change converts between the dynamic representation and the
typed representation on mutator entry and exit. Existing mutators can
continue to modify the typed representation and these modifications are
reflected in the dynamic representation (see `MarkMutatorEntry` and
`MarkMutatorExit` in `bundle/config/root.go`).

Required changes included in this change:
* The existing interpolation package is removed in favor of
`libs/dyn/dynvar`.
* Functionality to merge job clusters, job tasks, and pipeline clusters
are now all broken out into their own mutators.

To be implemented later:
* Allow variable references for non-string types.
* Surface diagnostics about the configuration provided by the user in
the validation output.
* Some mutators use a resource's configuration file path to resolve
related relative paths. These depend on `bundle/config/paths.Path` being
set and populated through `ConfigureConfigFilePath`. Instead, they
should interact with the dynamically typed configuration directly. Doing
this also unlocks being able to differentiate different base paths used
within a job (e.g. a task override with a relative path defined in a
directory other than the base job).

## Tests

* Existing unit tests pass (some have been modified to accommodate this change)
* Integration tests pass
2024-02-16 19:41:58 +00:00
Pieter Noordhuis 788ec81785
Use `any` as type for data sources and resources in `tf/schema` (#1216)
## Changes

We plan to use the any-equivalent of a `dyn.Value` such that we can use
variable references for non-string fields (e.g.
`${databricks_job.some_job.id}` where an integer is expected), as well
as properly emit zero values for primitive types (e.g. 0 for integers or
false for booleans).

This change is in preparation for the above.

## Tests

Unit tests.
2024-02-16 12:46:24 +00:00
Pieter Noordhuis ffae10d904
Bump Terraform provider to v1.36.2 (#1215)
## Changes

* Update `go.mod` with latest dependencies
* Update `go.mod` to require Go 1.21 to match root `go.mod`
* Regenerate structs for Terraform provider v1.36.2

## Tests

n/a
2024-02-16 07:05:45 +00:00
dependabot[bot] 299e9b56a6
Bump github.com/databricks/databricks-sdk-go from 0.30.1 to 0.32.0 (#1199)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.30.1 to 0.32.0.

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-15 14:52:17 +00:00
Andrew Nester 80670eceed
Added `bundle deployment bind` and `unbind` command (#1131)
## Changes
Added `bundle deployment bind` and `unbind` commands.

These commands allow binding bundle-defined resources to existing
resources in the Databricks workspace so they become DABs-managed.

## Tests
Manually + added E2E test
2024-02-14 18:04:45 +00:00
Miles Yucht e8b0698e19
Regenerate the CLI using the same OpenAPI spec as the SDK (#1205)
## Changes
The OpenAPI spec used to generate the CLI doesn't match the version used
for the SDK version that the CLI currently depends on. This PR
regenerates the CLI based on the same version of the OpenAPI spec used
by the SDK on v0.30.1.

## Tests
2024-02-13 14:33:59 +00:00
shreyas-goenka 52b813bd8e
Skip `for_each_task` when generating the bundle schema (#1204)
## Changes
Bundle schema generation does not support recursive API fields. This PR
skips generation for `for_each_task` until we add proper support for
recursive types in the bundle schema.

## Tests
Manually. This fixes the generation of the CLI and the bundle schema
command works as expected, with the sub-schema for `for_each_task` being
set to null in the output.
```
"for_each_task": null,
```
2024-02-13 14:13:47 +00:00
Andrew Nester bc30c9ed4a
Added `--restart` flag for `bundle run` command (#1191)
## Changes
Added `--restart` flag for `bundle run` command

When running with this flag, `bundle run` cancels all existing runs
before starting a new one.
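A sketch of the job flow under `--restart`, assuming the Go SDK's `CancelAllRuns` and `RunNow` calls:

```
package run

import (
	"context"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/jobs"
)

// runWithRestart cancels all active runs of a job before starting a new one.
func runWithRestart(ctx context.Context, w *databricks.WorkspaceClient, jobID int64) error {
	if err := w.Jobs.CancelAllRuns(ctx, jobs.CancelAllRuns{JobId: jobID}); err != nil {
		return err
	}
	_, err := w.Jobs.RunNow(ctx, jobs.RunNow{JobId: jobID})
	return err
}
```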

## Tests
Manually
2024-02-09 14:33:14 +00:00
Pieter Noordhuis 4073e45d4b
Use mockery to generate mocks compatible with testify/mock (#1190)
## Changes

This is the same approach we use in the Go SDK.

## Tests

Tests pass.
2024-02-08 15:18:53 +00:00
Pieter Noordhuis f7d1a5862d
Use allowlist for Git-related fields to include in metadata (#1187)
## Changes

When new fields are added, they should not automatically propagate to
the bundle metadata.
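The shape of the allowlist approach (field names illustrative):

```
package metadata

// GitDetails is a stand-in for the bundle's git configuration section.
type GitDetails struct {
	Branch    string
	OriginURL string
	Commit    string
	Inferred  bool // example of a field that should NOT leak into metadata
}

// gitForMetadata copies only allowlisted fields into deployment metadata,
// so newly added fields stay out until explicitly included here.
func gitForMetadata(in GitDetails) GitDetails {
	return GitDetails{
		Branch:    in.Branch,
		OriginURL: in.OriginURL,
		Commit:    in.Commit,
	}
}
```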

## Tests

Test passes.
2024-02-08 12:23:14 +00:00
Pieter Noordhuis 8e58e04e8f
Move folders package into libs (#1184)
## Changes

This is the last top-level package that doesn't need to be top-level.
2024-02-07 16:33:18 +00:00
Andrew Nester 6edab93233
Added warning when trying to deploy bundle with `--fail-if-running` and running resources (#1163)
## Changes
Deploying a bundle while its resources are running can be disruptive
for jobs and pipelines in progress.

With this change, during the deployment phase (before uploading any
resources), if `--fail-if-running` is specified, DABs checks whether any
resources are running and, if so, fails the deployment.

## Tests
Manual + add tests
2024-02-07 11:17:17 +00:00
Andrew Nester de363faa53
Make sure grouped flags are added to the command flag set (#1180)
## Changes
Make sure grouped flags are added to the command flag set

## Tests
Added regression tests
2024-02-07 10:27:13 +00:00
Pieter Noordhuis 6e075e8cf8
Revert "Filter current user from resource permissions (#1145)" (#1179)
## Changes

This reverts commit 4131069a4b.

The integration test for metadata computation failed. The back and forth
to `dyn.Value` erases unexported fields that the code currently still
depends on. We'll have to retry on top of #1098.
2024-02-07 09:22:44 +00:00
Andrew Nester 2bbb644749
Group bundle run flags by job and pipeline types (#1174)
## Changes
Group bundle run flags by job and pipeline types

## Tests
```
Run a resource (e.g. a job or a pipeline)

Usage:
  databricks bundle run [flags] KEY

Job Flags:
      --dbt-commands strings                 A list of commands to execute for jobs with DBT tasks.
      --jar-params strings                   A list of parameters for jobs with Spark JAR tasks.
      --notebook-params stringToString       A map from keys to values for jobs with notebook tasks. (default [])
      --params stringToString                comma separated k=v pairs for job parameters (default [])
      --pipeline-params stringToString       A map from keys to values for jobs with pipeline tasks. (default [])
      --python-named-params stringToString   A map from keys to values for jobs with Python wheel tasks. (default [])
      --python-params strings                A list of parameters for jobs with Python tasks.
      --spark-submit-params strings          A list of parameters for jobs with Spark submit tasks.
      --sql-params stringToString            A map from keys to values for jobs with SQL tasks. (default [])

Pipeline Flags:
      --full-refresh strings   List of tables to reset and recompute.
      --full-refresh-all       Perform a full graph reset and recompute.
      --refresh strings        List of tables to update.
      --refresh-all            Perform a full graph update.

Flags:
  -h, --help      help for run
      --no-wait   Don't wait for the run to complete.

Global Flags:
      --debug            enable debug logging
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
      --var strings      set values for variables defined in bundle config. Example: --var="foo=bar"
   ```
2024-02-06 14:51:02 +00:00
shreyas-goenka 4131069a4b
Filter current user from resource permissions (#1145)
## Changes
The databricks terraform provider does not allow changing permissions of
the current user. Instead, the current identity is implicitly set to be
the owner of all resources on the platform side.

This PR introduces a mutator to filter permissions from the bundle
configuration, allowing users to define permissions for their own
identities in their bundle config.

This allows configurations where, for example, both alice and bob
collaborate on the same DAB:
```
permissions:
  - level: CAN_MANAGE
    user_name: alice
  - level: CAN_MANAGE
    user_name: bob
```

## Tests
Unit test and manually
2024-02-06 12:45:08 +00:00
Pieter Noordhuis 33c446dadd
Refactor library to artifact matching to not use pointers (#1172)
## Changes

The previous approach was to:
1. Iterate over all libraries in all job tasks
2. Find references to local libraries
3. Store a pointer to `compute.Library` in the matching artifact file to
signal it should be uploaded

This breaks down when introducing #1098 because we can no longer track
unexported state across mutators. The approach in this PR performs the
path matching twice; once in the matching mutator where we check if each
referenced file has an artifacts section, and once during artifact
upload to rewrite the library path from a local file reference to an
absolute Databricks path.

## Tests

Integration tests pass.
2024-02-05 15:29:45 +00:00
shreyas-goenka cb3ad737f1
Add short_name helper function to bundle init templates (#1167)
## Changes
Adds the `short_name` helper function. `short_name` is useful when
templates do not want to print the full userName (typically an email or
service principal application id) of the current user.
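A rough sketch of what such a helper might do, assuming it simply truncates at the `@` (the actual helper may normalize further):

```
package template

import "strings"

// shortName derives a short handle from a workspace userName, which is
// typically an email address or a service principal application id.
func shortName(userName string) string {
	name, _, _ := strings.Cut(userName, "@")
	return name
}
```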

## Tests
Integration test. Also adds integration tests for other helper functions
that interact with the Databricks API.
2024-02-01 16:46:07 +00:00
Andrew Nester 0b3eeb8e54
Allow specifying executable in artifact section and skip bash from WSL (#1169)
## Changes
Allow specifying an executable in the artifact section:

```
artifacts:
  test:
    type: whl
    executable: bash
    ...
```

We also skip bash found on Windows if it comes from WSL because it won't
be executed correctly (see the linked issue).

Fixes #1159
2024-02-01 14:10:04 +00:00
Andrew Nester f269f8015d
Added `bundle generate pipeline` command (#1139)
## Changes
Added `bundle generate pipeline` command

Usage is as follows:

```
databricks bundle generate pipeline --existing-pipeline-id f3b8c580-0a88-4b55-xxxx-yyyyyyyyyy
```

## Tests
Manually + added E2E test
2024-01-25 11:35:14 +00:00
Ilia Babanov 9c3e4fda7c
Add "bundle summary" command (#1123)
The plan is to use the new command in the Databricks VSCode extension to
render "modified" UI state in the bundle resource tree elements, plus
use resource IDs to generate links for the resources.

### New revision
- Renamed `remote-state` to `summary`
- Added "modified statuses" to all resources. Currently we don't set
"updated" status - it's either nothing, or created/deleted
- Added tests for the `TerraformToBundle` command
2024-01-25 11:32:47 +00:00
shreyas-goenka cf2a1c38ba
Set run_as permissions after variable interpolation (#1141)
## Changes

This PR sets `run_as` permissions after variable interpolation.

Terraform does not allow specifying permissions for the current user.

The following configuration would fail because we would assign a
permission block for self, bypassing this check here:
4ee926b885/bundle/config/mutator/run_as.go (L47)

```
run_as:
  user_name: ${workspace.current_user.userName}
```



## Tests
Manually; setting `run_as` to `${workspace.current_user.userName}` works now.
2024-01-24 12:22:04 +00:00
Andrew Nester 1b6241746e
Use MockWorkspaceClient from SDK instead of WithImpl mocking (#1134)
## Changes
Use MockWorkspaceClient from SDK instead of WithImpl mocking
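For example, a test using the SDK's mock client might look like this (a sketch; exact expectations depend on the code under test):

```
package acc_test

import (
	"context"
	"testing"

	"github.com/databricks/databricks-sdk-go/experimental/mocks"
	"github.com/databricks/databricks-sdk-go/service/jobs"
	"github.com/stretchr/testify/mock"
	"github.com/stretchr/testify/require"
)

func TestGetJob(t *testing.T) {
	// MockWorkspaceClient wires testify mocks behind the real client surface.
	m := mocks.NewMockWorkspaceClient(t)
	m.GetMockJobsAPI().EXPECT().
		Get(mock.Anything, jobs.GetJobRequest{JobId: 123}).
		Return(&jobs.Job{JobId: 123}, nil)

	job, err := m.WorkspaceClient.Jobs.Get(context.Background(), jobs.GetJobRequest{JobId: 123})
	require.NoError(t, err)
	require.Equal(t, int64(123), job.JobId)
}
```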
2024-01-19 14:12:58 +00:00
Andrew Nester 70fe0e36ef
Added `databricks bundle generate job` command (#1043)
## Changes
Now it's possible to generate bundle configuration for an existing job.
For now it only supports jobs with notebook tasks.

It will download notebooks referenced in the job tasks and generate
bundle YAML config for this job, which can be included in a larger bundle.

Example of generated config:
```
resources:
  jobs:
    job_128737545467921:
      name: Notebook job
      format: MULTI_TASK
      tasks:
        - task_key: as_notebook
          existing_cluster_id: 0704-xxxxxx-yyyyyyy
          notebook_task:
            base_parameters:
              bundle_root: /Users/andrew.nester@databricks.com/.bundle/job_with_module_imports/development/files
            notebook_path: ./entry_notebook.py
            source: WORKSPACE
          run_if: ALL_SUCCESS
      max_concurrent_runs: 1
 ```

## Tests
Manual (on our last 100 jobs) + added end-to-end test

```
--- PASS: TestAccGenerateFromExistingJobAndDeploy (50.91s)
PASS
coverage: 61.5% of statements in ./...
ok github.com/databricks/cli/internal/bundle 51.209s coverage: 61.5% of
statements in ./...
```
2024-01-17 14:26:33 +00:00
Andrew Nester ef67b1755e
Do not require positional arguments if they should be provided in JSON (#1125)
## Changes
Do not require positional arguments if they should be provided in JSON

Fixes #1122
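A sketch of the argument check (helper name hypothetical):

```
package flags

import "github.com/spf13/cobra"

// maybeExactArgs requires n positional arguments only when the request body
// is not supplied via --json; with --json, fields come from the payload.
func maybeExactArgs(n int) cobra.PositionalArgs {
	return func(cmd *cobra.Command, args []string) error {
		if cmd.Flags().Changed("json") {
			return nil
		}
		return cobra.ExactArgs(n)(cmd, args)
	}
}
```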
2024-01-17 10:53:50 +00:00
Pieter Noordhuis 06b50670e1
Support passing job parameters to bundle run (#1115)
## Changes

This change adds support for job parameters. If job parameters are
specified for a job that doesn't define job parameters, it returns an
error. Conversely, if task parameters are specified for a job that
defines job parameters, it also returns an error.

This change moves the options structs and their functions to separate
files and backfills test coverage for them.

Job parameters can now be specified with `--params foo=bar,bar=qux`.

## Tests

Unit tests and manual integration testing.
2024-01-15 07:42:36 +00:00
Pieter Noordhuis 3c76a11d00
Upgrade Go SDK to v0.29.0 (#1111)
## Changes

See:
* https://github.com/databricks/databricks-sdk-go/releases/tag/v0.29.0
* https://github.com/databricks/databricks-sdk-go/releases/tag/v0.28.0

## Tests

Unit and integration tests pass.
2024-01-11 08:16:25 +00:00
Pieter Noordhuis f5c46478f4
Upgrade golang.org/x/crypto to v0.17.0 in internal module (#1110)
## Changes

This addresses https://github.com/databricks/cli/security/dependabot/12.
2024-01-10 13:53:01 +00:00
Andrew Nester 4b01fff03d
Fixed instance pool resolving by name (#1102)
## Changes
Fixed instance pool resolving by name

## Tests
Added regression test
2024-01-05 10:50:53 +00:00
Andrew Nester 5fb40f9d07
Allow referencing bundle resources by name (#872)
## Changes
Now we can define variables with values that reference Databricks
resources by name.
When a variable is referenced like this, DABs automatically looks up the
resource by name and replaces the reference with the ID of the resource.
Thus, when the variable is used in the configuration, it contains the
correctly resolved ID of that resource.

The resolvers are code-generated, so DABs supports referencing all
resources that have `GetByName`-like methods in the Go SDK.

### Example

```
variables:
  my_cluster_id:
    description: An existing cluster.
    lookup: 
      cluster: "12.2 shared"

resources:
  jobs:
    my_job:
      name: "My Job"
      tasks:
        - task_key: TestTask
          existing_cluster_id: ${var.my_cluster_id}

targets:
  dev:
    variables:
      my_cluster_id:
        lookup: 
           cluster: "dev-cluster"
```

## Tests
Added unit test + manual testing

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2024-01-04 21:04:42 +00:00
Lennart Kats (databricks) 167deec8c3
Change recommended production deployment path from /Shared to /Users (#1091)
## Changes

This PR changes the default and `mode: production` recommendation to
target `/Users` for deployment. Previously, we used `/Shared`, but
because of a lack of POSIX-like permissions in WorkspaceFS this meant
that files inside would be readable and writable by other users in the
workspace.

Detailed change:
* `default-python` no longer uses a path that starts with `/Shared`
* `mode: production` no longer requires a path that starts with
`/Shared`
 
## Related PRs

Docs: https://github.com/databricks/docs/pull/14585
Examples: https://github.com/databricks/bundle-examples/pull/17

## Tests

* Manual tests
* Template unit tests (with an extra check to avoid /Shared)
2024-01-02 19:58:24 +00:00
Lennart Kats (databricks) 9a1f078bd9
Improve error when bundle root is not writable (#1093)
## Changes

This improves the error when deploying to a bundle root that the current
user doesn't have write access to. This can come up slightly more often
since the change in https://github.com/databricks/cli/pull/1091.

Before this change:

```
$ databricks bundle deploy --target prod
Building my_project...
Error: no such directory: /Users/lennart.kats@databricks.com/.bundle/my_project/prod/state
```

After this change:

```
$ databricks bundle deploy --target prod
Building my_project...
Error: cannot write to deployment root (this can indicate a previous deploy was done with a different identity): /Users/lennart.kats@databricks.com/.bundle/my_project/prod
```

Note that this change uses the "no such directory" error returned from
the filer.
2023-12-28 13:15:21 +00:00
Pieter Noordhuis fa3c8b1017
Use resource key as name in permissions code (#1087)
## Changes

The code relied on the `Name` property being accessible for every
resource. This is generally true, but because these property structs are
embedded as pointers, they can be nil. This is also why the tests had to
initialize the embedded struct to pass. This changes the approach to use
the keys from the resource map instead, so that we no longer rely on the
non-nil embedded struct.
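The gist of the change (simplified types):

```
package permissions

// Job is a stand-in for a bundle resource whose embedded settings struct,
// which holds Name, is a pointer and may therefore be nil.
type Job struct {
	Name *string
}

// applyToJobs identifies each resource by its key in the resource map
// rather than by its possibly-nil Name field.
func applyToJobs(jobs map[string]*Job, apply func(key string)) {
	for key := range jobs {
		apply(key) // the key is always present, even when Name is nil
	}
}
```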

Note: we should evaluate whether we should turn these into values
instead of pointers. I don't recall if we get value from them being
pointers.

## Tests

Unit tests pass.
2023-12-22 14:45:53 +00:00