Commit Graph

861 Commits

Author SHA1 Message Date
shreyas-goenka d638262665
Add spinner when downloading templates for bundle init (#1188)
## Changes
Templates can take a long time to download. This PR adds a spinner to
give feedback to users.

## Tests
Manually


https://github.com/databricks/cli/assets/88374338/b453982c-3233-40f4-8d6f-f31606ff0195
2024-02-08 12:52:53 +00:00
Pieter Noordhuis a835a3e564
Ignore environment variables for `auth profiles` (#1189)
## Changes

If environment variables related to unified authentication are set and a
user runs `auth profiles`, the environment variables will interfere with
the output. This change only takes profile data into account for the
output.

## Tests

Added a unit test.
2024-02-08 12:25:51 +00:00
Pieter Noordhuis f7d1a5862d
Use allowlist for Git-related fields to include in metadata (#1187)
## Changes

When new fields are added they should not automatically propagate to the
bundle metadata.

## Tests

Test passes.
2024-02-08 12:23:14 +00:00
Pieter Noordhuis b1b5ad8acd
Log time it takes for profile to load (#1186)
## Changes

Aids debugging why `auth profiles` may take longer than expected.

## Tests

Confirmed manually that timing information shows up in the log output.
2024-02-08 11:10:52 +00:00
Pieter Noordhuis 8e58e04e8f
Move folders package into libs (#1184)
## Changes

This is the last top-level package that doesn't need to be top-level.
2024-02-07 16:33:18 +00:00
Andrew Nester f6cdc75825
Release v0.212.4 (#1183)
Bundles:
* Allow specifying executable in artifact section and skip bash from WSL
([#1169](https://github.com/databricks/cli/pull/1169)).
* Added warning when trying to deploy bundle with `--fail-if-running`
and running resources
([#1163](https://github.com/databricks/cli/pull/1163)).
* Group bundle run flags by job and pipeline types
([#1174](https://github.com/databricks/cli/pull/1174)).
* Make sure grouped flags are added to the command flag set
([#1180](https://github.com/databricks/cli/pull/1180)).
* Add short_name helper function to bundle init templates
([#1167](https://github.com/databricks/cli/pull/1167)).

Internal:
* Fix dynamic representation of zero values in maps and slices
([#1154](https://github.com/databricks/cli/pull/1154)).
* Refactor library to artifact matching to not use pointers
([#1172](https://github.com/databricks/cli/pull/1172)).
* Harden `dyn.Value` equality check
([#1173](https://github.com/databricks/cli/pull/1173)).
* Ensure every variable reference is passed to lookup function
([#1176](https://github.com/databricks/cli/pull/1176)).
* Empty struct should yield empty map in `convert.FromTyped`
([#1177](https://github.com/databricks/cli/pull/1177)).
* Zero destination struct in `convert.ToTyped`
([#1178](https://github.com/databricks/cli/pull/1178)).
* Fix integration test with invalid configuration
([#1182](https://github.com/databricks/cli/pull/1182)).
* Use `acc.WorkspaceTest` helper from bundle integration tests
([#1181](https://github.com/databricks/cli/pull/1181)).
2024-02-07 15:05:03 +00:00
Pieter Noordhuis f8b0f783ea
Use `acc.WorkspaceTest` helper from bundle integration tests (#1181)
## Changes

This helper:
* Constructs a context
* Constructs a `*databricks.WorkspaceClient`
* Ensures required environment variables are present to run an
integration test
* Enables debugging integration tests from VS Code

Debugging integration tests (from VS Code) is made possible by a prelude
in the helper that checks if the calling process is a debug binary, and
if so, sources environment variables from
`~/.databricks/debug-env.json` (if present).
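
For illustration, a minimal sketch of an integration test using this helper
(assuming `acc.WorkspaceTest(t)` returns a context and a wrapper whose `W`
field is the configured `*databricks.WorkspaceClient`; the package and test
body are illustrative):

```
package bundle_test

import (
	"testing"

	"github.com/databricks/cli/internal/acc"
	"github.com/stretchr/testify/require"
)

func TestAccExampleBundle(t *testing.T) {
	// Verifies the required environment variables are present and, when
	// running under a debugger, sources debug-env.json first.
	ctx, wt := acc.WorkspaceTest(t)

	// wt.W is assumed to be the workspace client constructed by the helper.
	me, err := wt.W.CurrentUser.Me(ctx)
	require.NoError(t, err)
	t.Logf("running as %s", me.UserName)
}
```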

## Tests

Integration tests still pass.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-07 11:18:56 +00:00
Andrew Nester 6edab93233
Added warning when trying to deploy bundle with `--fail-if-running` and running resources (#1163)
## Changes
Deploying a bundle while its resources are running at the same time can be
disruptive for jobs and pipelines in progress.

With this change, if `--fail-if-running` is specified, DABs check during the
deployment phase (before uploading any resources) whether any resources are
running and, if so, fail the deployment.
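
For example, the flag is passed to the deploy command:

```
databricks bundle deploy --fail-if-running
```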

## Tests
Manual + added tests
2024-02-07 11:17:17 +00:00
Pieter Noordhuis b64e11304c
Fix integration test with invalid configuration (#1182)
## Changes

The indentation mistake on the `path` field under `notebook` meant the
pipeline had a single entry with a `nil` notebook field. This was
allowed but incorrect.

While working on the `dyn.Value` approach, this yielded a non-nil but
zeroed `notebook` field and a failure to translate an empty path.

## Tests

Correcting the indentation made the test fail because the file is not a
notebook. I changed it to a `file` reference and the test now passes.
2024-02-07 10:53:50 +00:00
Andrew Nester de363faa53
Make sure grouped flags are added to the command flag set (#1180)
## Changes
Make sure grouped flags are added to the command flag set

## Tests
Added regression tests
2024-02-07 10:27:13 +00:00
Pieter Noordhuis 0b5fdcc346
Zero destination struct in `convert.ToTyped` (#1178)
## Changes

Not doing this means that the output struct is not a true representation
of the `dyn.Value` and unrepresentable state (e.g. unexported fields)
can be carried over across `convert.ToTyped` calls.

## Tests

Unit tests.
2024-02-07 09:25:53 +00:00
Pieter Noordhuis dcb9c85201
Empty struct should yield empty map in `convert.FromTyped` (#1177)
## Changes

This was an issue in cases where the typed structure contains a non-nil
pointer to an empty struct. After conversion to a `dyn.Value` and back
to the typed structure, the pointer became nil.
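
A minimal sketch of the case in question (assuming the
`convert.FromTyped(src, ref)` signature from `libs/dyn/convert` and `json`
struct tags; the type names are illustrative):

```
package main

import (
	"fmt"

	"github.com/databricks/cli/libs/dyn"
	"github.com/databricks/cli/libs/dyn/convert"
)

type inner struct{}

type outer struct {
	Ptr *inner `json:"ptr,omitempty"`
}

func main() {
	// Before this change, the non-nil pointer to an empty struct was dropped
	// (became nil); now it yields an empty map under "ptr", so a round trip
	// through convert.ToTyped keeps Ptr non-nil.
	v, err := convert.FromTyped(outer{Ptr: &inner{}}, dyn.NilValue)
	fmt.Println(v, err)
}
```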

## Tests

Unit tests.
2024-02-07 09:25:07 +00:00
Pieter Noordhuis 6e075e8cf8
Revert "Filter current user from resource permissions (#1145)" (#1179)
## Changes

This reverts commit 4131069a4b.

The integration test for metadata computation failed. The back and forth
to `dyn.Value` erases unexported fields that the code currently still
depends on. We'll have to retry on top of #1098.
2024-02-07 09:22:44 +00:00
Pieter Noordhuis f54e790a3b
Ensure every variable reference is passed to lookup function (#1176)
## Changes

References to keys that themselves are also variable references were
short-circuited in the previous approach. This meant that certain fields
were resolved even if the lookup function would have instructed to skip
resolution.

To fix this we separate the memoization of resolved variable references
from the memoization of lookups. Now, every variable reference is passed
through the lookup function.
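
A hypothetical sketch of a lookup function that skips resolution for certain
references, given an input tree `in` of type `dyn.Value` (the `dynvar.Resolve`
signature and the `ErrSkipResolution` sentinel are assumptions about this
package; names may differ):

```
out, err := dynvar.Resolve(in, func(path dyn.Path) (dyn.Value, error) {
	// Skip resolution for references rooted at "c" or "d"; they are left
	// verbatim in the output, e.g. "${c}" stays "${c}".
	if s := path.String(); s == "c" || s == "d" {
		return dyn.InvalidValue, dynvar.ErrSkipResolution
	}
	// Otherwise look the reference up in the input tree itself.
	return dyn.GetByPath(in, path)
})
```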

## Tests

Before this change, the new test failed with:
```
=== RUN   TestResolveWithSkipEverything
    [...]/libs/dyn/dynvar/resolve_test.go:208: 
        	Error Trace:	[...]/libs/dyn/dynvar/resolve_test.go:208
        	Error:      	Not equal: 
        	            	expected: "${d} ${c} ${c} ${d}"
        	            	actual  : "${b} ${a} ${a} ${b}"
        	            	
        	            	Diff:
        	            	--- Expected
        	            	+++ Actual
        	            	@@ -1 +1 @@
        	            	-${d} ${c} ${c} ${d}
        	            	+${b} ${a} ${a} ${b}
        	Test:       	TestResolveWithSkipEverything
```
2024-02-06 15:01:49 +00:00
Andrew Nester 2bbb644749
Group bundle run flags by job and pipeline types (#1174)
## Changes
Group bundle run flags by job and pipeline types

## Tests
```
Run a resource (e.g. a job or a pipeline)

Usage:
  databricks bundle run [flags] KEY

Job Flags:
      --dbt-commands strings                 A list of commands to execute for jobs with DBT tasks.
      --jar-params strings                   A list of parameters for jobs with Spark JAR tasks.
      --notebook-params stringToString       A map from keys to values for jobs with notebook tasks. (default [])
      --params stringToString                comma separated k=v pairs for job parameters (default [])
      --pipeline-params stringToString       A map from keys to values for jobs with pipeline tasks. (default [])
      --python-named-params stringToString   A map from keys to values for jobs with Python wheel tasks. (default [])
      --python-params strings                A list of parameters for jobs with Python tasks.
      --spark-submit-params strings          A list of parameters for jobs with Spark submit tasks.
      --sql-params stringToString            A map from keys to values for jobs with SQL tasks. (default [])

Pipeline Flags:
      --full-refresh strings   List of tables to reset and recompute.
      --full-refresh-all       Perform a full graph reset and recompute.
      --refresh strings        List of tables to update.
      --refresh-all            Perform a full graph update.

Flags:
  -h, --help      help for run
      --no-wait   Don't wait for the run to complete.

Global Flags:
      --debug            enable debug logging
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
      --var strings      set values for variables defined in bundle config. Example: --var="foo=bar"
   ```
2024-02-06 14:51:02 +00:00
shreyas-goenka 4131069a4b
Filter current user from resource permissions (#1145)
## Changes
The Databricks Terraform provider does not allow changing the permissions of
the current user. Instead, the current identity is implicitly set to be
the owner of all resources on the platform side.

This PR introduces a mutator to filter the current user's permissions from
the bundle configuration, allowing users to define permissions for their own
identities in their bundle config.

This allows configurations like the following, where both alice and bob
can collaborate on the same DAB:
```
permissions:
  - level: CAN_MANAGE
    user_name: alice

  - level: CAN_MANAGE
    user_name: bob
```

## Tests
Unit test and manually
2024-02-06 12:45:08 +00:00
Pieter Noordhuis 20e45b87ae
Harden `dyn.Value` equality check (#1173)
## Changes

This function could panic when either side of the comparison is a nil or
empty slice. This logic is triggered when comparing the input value to
the output value when calling `dyn.Map`.

## Tests

Unit tests.
2024-02-05 16:54:41 +00:00
Pieter Noordhuis 33c446dadd
Refactor library to artifact matching to not use pointers (#1172)
## Changes

The previous approach was to:
1. Iterate over all libraries in all job tasks
2. Find references to local libraries
3. Store pointer to `compute.Library` in the matching artifact file to
signal it should be uploaded

This breaks down when introducing #1098 because we can no longer track
unexported state across mutators. The approach in this PR performs the
path matching twice; once in the matching mutator where we check if each
referenced file has an artifacts section, and once during artifact
upload to rewrite the library path from a local file reference to an
absolute Databricks path.

## Tests

Integration tests pass.
2024-02-05 15:29:45 +00:00
shreyas-goenka cb3ad737f1
Add short_name helper function to bundle init templates (#1167)
## Changes
Adds the short_name helper function. short_name is useful when templates
do not want to print the full userName (typically email or service
principal application-id) of the current user.
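
For example, a template file might use it to build a short, user-scoped
resource name (the snippet and surrounding keys are illustrative):

```
# Hypothetical databricks.yml.tmpl fragment in a bundle template:
resources:
  jobs:
    my_job:
      # For user@example.com, {{short_name}} renders "user" rather than the
      # full userName.
      name: "[dev {{short_name}}] my_job"
```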

## Tests
Integration test. Also adds integration tests for other helper functions
that interact with the Databricks API.
2024-02-01 16:46:07 +00:00
Andrew Nester 0b3eeb8e54
Allow specifying executable in artifact section and skip bash from WSL (#1169)
## Changes
Allow specifying executable in artifact section

```
artifacts:
  test:
    type: whl
    executable: bash
    ...
```

We also skip bash found on Windows if it comes from WSL because it won't be
executed correctly; see the linked issue below.

Fixes #1159
2024-02-01 14:10:04 +00:00
shreyas-goenka 6beda4405e
Fix dynamic representation of zero values in maps and slices (#1154)
## Changes
In the dynamic configuration, the nil value (dyn.NilValue) denotes a
value that should not be serialized, i.e. a value being nil is the same as
it not existing in the first place.

This is not true for zero values in maps and slices. This PR fixes the
conversion from typed values to dyn.Value, to treat zero values in maps
and slices as zero and not nil.

## Tests
Unit tests
2024-01-31 14:25:13 +00:00
Andrew Nester 359f5f4468
Release v0.212.3 (#1166)
CLI:
* Release Windows packages to winget-pkgs
([#1144](https://github.com/databricks/cli/pull/1144)).

Bundles:
* Add `--key` flag for generate commands to specify resource key
([#1165](https://github.com/databricks/cli/pull/1165)).


Dependency updates:
* Bump github.com/google/uuid from 1.5.0 to 1.6.0
([#1160](https://github.com/databricks/cli/pull/1160)).
* Update Go SDK to v0.30.1
([#1162](https://github.com/databricks/cli/pull/1162)).
2024-01-31 12:36:01 +00:00
Andrew Nester b28432afed
Add `--key` flag for generate commands to specify resource key (#1165)
## Changes
Add the `--key` flag for generate commands to specify the resource key.

Also, generated resource config files are no longer prefixed.
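
For example (the job ID is a placeholder; `--existing-job-id` is assumed to
mirror the `--existing-pipeline-id` flag shown for pipelines further down):

```
databricks bundle generate job --existing-job-id 123456789 --key my_job
```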

## Tests
Integration tests passed

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-01-31 10:23:35 +00:00
dependabot[bot] 5fda017057
Bump github.com/google/uuid from 1.5.0 to 1.6.0 (#1160)
Bumps [github.com/google/uuid](https://github.com/google/uuid) from
1.5.0 to 1.6.0.

Release notes for v1.6.0 (2024-01-16):
* Features: add Max UUID constant (#149) (c58770e)
* Bug fixes: fix typo in version 7 uuid documentation (#153) (016b199);
  Monotonicity in UUIDv7 (#150) (a2b2b32)

Commits:
* 0f11ee6 chore(master): release 1.6.0 (#151)
* 16939da chore(tests): add strict monotonicity test case for uuid v7. (#154)
* 016b199 fix: fix typo in version 7 uuid documentation (#153)
* 1d8b6ea ci: set token permissions to github workflows (#143)
* a2b2b32 fix: Monotonicity in UUIDv7 (#150)
* Full diff: https://github.com/google/uuid/compare/v1.5.0...v1.6.0

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-29 16:33:16 +00:00
hectorcast-db e3d2dbb0ab
Update Go SDK to v0.30.1 (#1162)
## Changes
Update Go SDK to 0.30.1

## Tests
```
make fmt && make test
```
2024-01-29 15:25:10 +00:00
Miles Yucht 2f1b81cba0
Release Windows packages to winget-pkgs (#1144)
## Changes
This PR adds a release workflow which will automatically publish the CLI
to winget-pkgs whenever a release is made. It uses
https://github.com/vedantmgoyal2009/winget-releaser to release the
Windows binaries. @exorcism0666 has been graciously making releases on
our behalf, but we can do this automatically ourselves after this PR.

## Tests
<!-- How is this tested? -->
2024-01-29 09:44:09 +00:00
Pieter Noordhuis 6fcf6ba76b
Release v0.212.2 (#1153)
CLI:
* Prompt for account profile only for account-level command execution
instead of during `databricks labs install` flow
([#1128](https://github.com/databricks/cli/pull/1128)).
* Bring back `--json` flag for workspace-conf set-status command
([#1151](https://github.com/databricks/cli/pull/1151)).

Bundles:
* Set `run_as` permissions after variable interpolation
([#1141](https://github.com/databricks/cli/pull/1141)).
* Add functionality to visit values in `dyn.Value` tree
([#1142](https://github.com/databricks/cli/pull/1142)).
* Add `dynvar` package for variable resolution with a `dyn.Value` tree
([#1143](https://github.com/databricks/cli/pull/1143)).
* Add support for `anyOf` to `skip_prompt_if`
([#1133](https://github.com/databricks/cli/pull/1133)).
* Added `bundle generate pipeline` command
([#1139](https://github.com/databricks/cli/pull/1139)).

Internal:
* Use MockWorkspaceClient from SDK instead of WithImpl mocking
([#1134](https://github.com/databricks/cli/pull/1134)).

Dependency updates:
* Bump github.com/databricks/databricks-sdk-go from 0.29.0 to 0.29.1
([#1137](https://github.com/databricks/cli/pull/1137)).
* Bump github.com/hashicorp/terraform-json from 0.20.0 to 0.21.0
([#1138](https://github.com/databricks/cli/pull/1138)).
* Update actions/setup-go to v5
([#1148](https://github.com/databricks/cli/pull/1148)).
* Update codecov/codecov-action to v3
([#1149](https://github.com/databricks/cli/pull/1149)).
* Use latest patch release of Go toolchain
([#1152](https://github.com/databricks/cli/pull/1152)).
2024-01-25 14:32:32 +00:00
Pieter Noordhuis 2d38d14703
Use latest patch release of Go toolchain (#1152)
## Changes

This was pinned to 1.21.0 and included a vulnerability as reported in
#1150. The vulnerability does not affect the prior CLI releases as it
requires a user to execute Go commands from within compromised module
directories.

Fixes #1150.
2024-01-25 12:18:35 +00:00
Andrew Nester 1fb15331c5
Bring back `--json` flag for workspace-conf set-status command (#1151)
## Changes
The `--json` flag was removed from this command when the MustUseJson /
CanUseJson generator functions were introduced, which did not account for
request types that are maps.

This PR brings the flag back.
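
For example (the configuration key is illustrative):

```
databricks workspace-conf set-status --json '{"enableIpAccessLists": "true"}'
```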

Relies on this Go SDK change:
https://github.com/databricks/databricks-sdk-go/pull/786
2024-01-25 11:55:17 +00:00
Andrew Nester f269f8015d
Added `bundle generate pipeline` command (#1139)
## Changes
Added `bundle generate pipeline` command

Usage is as follows:

```
databricks bundle generate pipeline --existing-pipeline-id f3b8c580-0a88-4b55-xxxx-yyyyyyyyyy
```

## Tests
Manually + added E2E test
2024-01-25 11:35:14 +00:00
Ilia Babanov 9c3e4fda7c
Add "bundle summary" command (#1123)
The plan is to use the new command in the Databricks VSCode extension to
render "modified" UI state in the bundle resource tree elements, plus
use resource IDs to generate links for the resources.
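
The new command can then be invoked against a deployed bundle, for example:

```
databricks bundle summary
```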

### New revision
- Renamed `remote-state` to `summary`
- Added "modified statuses" to all resources. Currently we don't set
"updated" status - it's either nothing, or created/deleted
- Added tests for the `TerraformToBundle` command
2024-01-25 11:32:47 +00:00
Pieter Noordhuis 8988920a3e
Update codecov/codecov-action to v3 (#1149)
## Changes

v1 has been deprecated for a long time and produces warnings in action
output.
2024-01-25 10:47:03 +00:00
Pieter Noordhuis 8922bf916c
Update actions/setup-go to v5 (#1148)
## Changes

This silences the following warning as seen in action output:

> Node.js 16 actions are deprecated. Please update the following actions
to use Node.js 20: actions/setup-go@v4.
2024-01-25 10:44:37 +00:00
Arpit Jasapara ce8cfef19d
Add support for `anyOf` to `skip_prompt_if` (#1133)
## Changes
This PR introduces `anyOf` to `skip_prompt_if`. This allows you to express
OR conditionals for skipping prompts during template initialization.
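
A hypothetical `databricks_template_schema.json` fragment (the property names
and exact matcher keywords are assumptions; it only illustrates the OR shape):

```
{
  "properties": {
    "include_notebook": {
      "type": "string",
      "default": "yes",
      "description": "Include a sample notebook?",
      "skip_prompt_if": {
        "anyOf": [
          {
            "properties": {
              "project_type": { "const": "library" }
            }
          },
          {
            "properties": {
              "language": { "const": "sql" }
            }
          }
        ]
      }
    }
  }
}
```

Here the prompt for `include_notebook` would be skipped if either earlier
answer matches.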

## Tests
Added unit test and confirmed existing ones still work. Also tested
manually.

---------

Co-authored-by: Shreyas Goenka <shreyas.goenka@databricks.com>
2024-01-25 10:09:42 +00:00
Pieter Noordhuis 14abcb3ad7
Add `dynvar` package for variable resolution with a `dyn.Value` tree (#1143)
## Changes

This is the `dyn` counterpart to the `bundle/config/interpolation`
package.

It relies on the paths in `${foo.bar}` being valid `dyn.Path` instances.
It leverages `dyn.Walk` to get a complete picture of all variable
references and uses `dyn.Get` to retrieve values pointed to by variable
references.

Depends on #1142.

## Tests

Unit test coverage. I tried to mirror the tests from
`bundle/config/interpolation` and added new ones where applicable (for
example to test type retention of referenced values).
2024-01-24 18:49:06 +00:00
Pieter Noordhuis ff6e0354b9
Add functionality to visit values in `dyn.Value` tree (#1142)
## Changes

This change adds the following functions:
* `dyn.Get(value, "foo.bar") -> (dyn.Value, error)`
* `dyn.Set(value, "foo.bar", newValue) -> (dyn.Value, error)`
* `dyn.Map(value, "foo.bar", func) -> (dyn.Value, error)`

And equivalent functions that take a previously constructed `dyn.Path`:
* `dyn.GetByPath(value, dyn.Path) -> (dyn.Value, error)`
* `dyn.SetByPath(value, dyn.Path, newValue) -> (dyn.Value, error)`
* `dyn.MapByPath(value, dyn.Path, func) -> (dyn.Value, error)`

Changes made by the "set" and "map" functions are never reflected in the
input argument; they return new `dyn.Value` instances for all nodes in
the path leading up to the changed value.
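
A short usage sketch based on the signatures above (assuming `dyn.V`
constructs values and maps are represented as `map[string]dyn.Value`):

```
package main

import (
	"fmt"

	"github.com/databricks/cli/libs/dyn"
)

func main() {
	// {"foo": {"bar": "baz"}}
	in := dyn.V(map[string]dyn.Value{
		"foo": dyn.V(map[string]dyn.Value{
			"bar": dyn.V("baz"),
		}),
	})

	// Look up a nested value by its string path.
	v, err := dyn.Get(in, "foo.bar")
	fmt.Println(v, err)

	// Set returns new nodes along the path; `in` itself is left unchanged.
	out, err := dyn.Set(in, "foo.bar", dyn.V("qux"))
	fmt.Println(out, in, err)
}
```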

## Tests

New unit tests cover all critical paths.
2024-01-24 18:38:46 +00:00
shreyas-goenka cf2a1c38ba
Set run_as permissions after variable interpolation (#1141)
## Changes

This PR sets `run_as` permissions after variable interpolation.

Terraform does not allow specifying permissions for the current user.

The following configuration would fail because we would assign a
permission block for self, bypassing this check here:
4ee926b885/bundle/config/mutator/run_as.go (L47)

```
run_as:
  user_name: ${workspace.current_user.userName}
```



## Tests
Manually, setting run_as to ${workspace.current_user.userName} works now
2024-01-24 12:22:04 +00:00
Serge Smertin deb7e67ad5
Prompt for account profile only for account-level command execution instead of during `databricks labs install` flow (#1128)
## Changes

There's a lot of end-user friction for projects that require
account-level commands. This is mainly related to the fact that, as of
January 2024, workspace administrators do not necessarily have access to
call account-level APIs. Ongoing discussions exist on how best to implement
this at the platform level.

A temporary workaround is creating a dummy ~/.databrickscfg profile with
the `account_id` field, though it doesn't remove the end-user friction.
Hence, we no longer require an account profile during installation
and just prompt for it when the context requires it. This also means that we
always prompt for account-level commands unless users specify a
`--profile` flag.

## Tests
- `go run main.go labs install ucx`, don't see an account profile prompt
- `go run main.go labs ucx sync-workspace-info`, to see a profile prompt
and have a valid auth passed
- `go run main.go labs ucx sync-workspace-info --debug --profile
profile-name` to get a concrete profile passed
2024-01-22 17:35:13 +00:00
dependabot[bot] 261f13f42c
Bump github.com/hashicorp/terraform-json from 0.20.0 to 0.21.0 (#1138)
Bumps
[github.com/hashicorp/terraform-json](https://github.com/hashicorp/terraform-json)
from 0.20.0 to 0.21.0.

Release notes for v0.21.0:
* ENHANCEMENTS: Initial support for provider-defined functions from
  `providers schema -json` by @bflad (hashicorp/terraform-json#119)
* Full changelog:
  https://github.com/hashicorp/terraform-json/compare/v0.20.0...v0.21.0

Commits:
* f2686e9 Initial support for provider-defined functions from providers
  schema -json
* Full diff:
  https://github.com/hashicorp/terraform-json/compare/v0.20.0...v0.21.0

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-22 16:48:55 +00:00
dependabot[bot] c1d0747e3a
Bump github.com/databricks/databricks-sdk-go from 0.29.0 to 0.29.1 (#1137)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.29.0 to 0.29.1.

Release notes for v0.29.1 (this patch release contains two small changes):
* Retry on Status Code 503 (#733), improving the stability of the SDK in
  light of transient API unavailability.
* Simplify mocking of iterator and waiter objects (#769, #770). See the
  Testing section of the README.md
  (https://github.com/databricks/databricks-sdk-go#testing) for usage
  information and examples.

Commits:
* 1fb6a9d Release v0.29.1 (#771)
* e1c610b Update README.md with testing documentation (#770)
* a0de560 Simplify mocking of iterator and waiter objects (#769)
* 32643d6 Retry on Status Code 503 (#733)
* Full diff:
  https://github.com/databricks/databricks-sdk-go/compare/v0.29.0...v0.29.1

Most recent ignore conditions applied to this pull request:
github.com/databricks/databricks-sdk-go [>= 0.28.a, < 0.29]

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-22 16:48:35 +00:00
Andrew Nester 1b6241746e
Use MockWorkspaceClient from SDK instead of WithImpl mocking (#1134)
## Changes
Use MockWorkspaceClient from SDK instead of WithImpl mocking
2024-01-19 14:12:58 +00:00
Andrew Nester 7067782cf1
Fixed path matching for Windows in generate job test (#1132)
## Changes
Fixed path matching for Windows in generate job test
2024-01-19 08:05:59 +00:00
Andrew Nester 57abf157cc
Release v0.212.1 (#1130)
CLI:
* Fix windows style file paths in fs cp command
([#1118](https://github.com/databricks/cli/pull/1118)).
* Do not require positional arguments if they should be provided in JSON
([#1125](https://github.com/databricks/cli/pull/1125)).
* Always require path parameters as positional arguments
([#1129](https://github.com/databricks/cli/pull/1129)).

Bundles:
* Add debug log line for when bundle init is run from non-TTY interface
([#1117](https://github.com/databricks/cli/pull/1117)).
* Added `databricks bundle generate job` command
([#1043](https://github.com/databricks/cli/pull/1043)).
* Support passing job parameters to bundle run
([#1115](https://github.com/databricks/cli/pull/1115)).

Dependency updates:
* Bump golang.org/x/oauth2 from 0.15.0 to 0.16.0
([#1124](https://github.com/databricks/cli/pull/1124)).
2024-01-17 14:44:42 +00:00
Andrew Nester 70fe0e36ef
Added `databricks bundle generate job` command (#1043)
## Changes
Now it's possible to generate bundle configuration for an existing job.
For now it only supports jobs with notebook tasks.

It will download the notebooks referenced in the job tasks and generate
bundle YAML config for this job, which can be included in a larger bundle.

## Tests
Running command manually

Example of generated config
```
resources:
  jobs:
    job_128737545467921:
      name: Notebook job
      format: MULTI_TASK
      tasks:
        - task_key: as_notebook
          existing_cluster_id: 0704-xxxxxx-yyyyyyy
          notebook_task:
            base_parameters:
              bundle_root: /Users/andrew.nester@databricks.com/.bundle/job_with_module_imports/development/files
            notebook_path: ./entry_notebook.py
            source: WORKSPACE
          run_if: ALL_SUCCESS
      max_concurrent_runs: 1
 ```

## Tests
Manual (on our last 100 jobs) + added end-to-end test

```
--- PASS: TestAccGenerateFromExistingJobAndDeploy (50.91s)
PASS
coverage: 61.5% of statements in ./...
ok github.com/databricks/cli/internal/bundle 51.209s coverage: 61.5% of
statements in ./...
```
2024-01-17 14:26:33 +00:00
Andrew Nester 98477699a0
Always require path parameters as positional arguments (#1129)
## Changes
Always require path parameters as positional arguments
Note: uses a generator with this SDK change:
https://github.com/databricks/databricks-sdk-go/pull/773

Fixes https://github.com/databricks/cli/issues/1121
2024-01-17 14:14:20 +00:00
Andrew Nester ef67b1755e
Do not require positional arguments if they should be provided in JSON (#1125)
## Changes
Do not require positional arguments if they should be provided in JSON

Fixes #1122
2024-01-17 10:53:50 +00:00
dependabot[bot] 3d359319df
Bump golang.org/x/oauth2 from 0.15.0 to 0.16.0 (#1124)
Bumps [golang.org/x/oauth2](https://github.com/golang/oauth2) from
0.15.0 to 0.16.0.

Commits:
* 39adbb7 go.mod: update golang.org/x dependencies
* 4ce7bbb google: add Credentials.GetUniverseDomain with GCE MDS support
* 1e6999b google: add UniverseDomain to CredentialsParams
* Full diff: https://github.com/golang/oauth2/compare/v0.15.0...v0.16.0

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-15 12:59:23 +00:00
Pieter Noordhuis 06b50670e1
Support passing job parameters to bundle run (#1115)
## Changes

This change adds support for job parameters. If job parameters are
specified for a job that doesn't define job parameters, it returns an
error. Conversely, if task parameters are specified for a job that
defines job parameters, it also returns an error.

This change moves the options structs and their functions to separate
files and backfills test coverage for them.

Job parameters can now be specified with `--params foo=bar,bar=qux`.
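
For example, for a bundle with a job resource keyed `my_job` (the key is
illustrative):

```
databricks bundle run my_job --params foo=bar,bar=qux
```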

## Tests

Unit tests and manual integration testing.
2024-01-15 07:42:36 +00:00
shreyas-goenka 2c0d06715c
Fix windows style file paths in fs cp command (#1118)
## Changes
Copying a local file on Windows to a remote directory in DBFS would fail
if the path was specified as a Windows-style path (as opposed to a UNIX-style
path). This PR fixes that.

Note: UNIX-style paths will continue to work because `filepath.Base`
respects both `/` and `\` as file separators. See `IsPathSeparator` in
https://go.dev/src/os/path_windows.go.

Fixes issue: https://github.com/databricks/cli/issues/1109.

## Tests
Integration test and manually
```
C:\Users\shreyas.goenka>Desktop\cli.exe fs cp .\Desktop\foo.txt dbfs:/Users/shreyas.goenka@databricks.com
.\Desktop\foo.txt -> dbfs:/Users/shreyas.goenka@databricks.com/foo.txt

C:\Users\shreyas.goenka>Desktop\cli.exe fs cat  dbfs:/Users/shreyas.goenka@databricks.com/foo.txt
hello, world
```
2024-01-11 18:49:42 +00:00
shreyas-goenka 7dcdadde79
Add debug log line for when bundle init is run from non-TTY interface (#1117) 2024-01-11 15:41:13 +00:00