Commit Graph

893 Commits

Author SHA1 Message Date
Andrew Nester c65e59751b
Release v0.205.2 (#791)
CLI:
* Prompt for profile only in interactive mode
([#788](https://github.com/databricks/cli/pull/788)).

Internal:
* Added setup Python action
([#789](https://github.com/databricks/cli/pull/789)).
2023-09-21 14:46:35 +00:00
Andrew Nester aa9c2a1eab
Prompt for profile only in interactive mode (#788)
## Changes
Do not prompt for profiles if not in interactive mode

## Tests
Running sample Go code

```
cmd := exec.Command("databricks", "auth", "login", "--host", "***")
out, err := cmd.CombinedOutput()
```
Before the change
```
Error: ^D

exit status 1
```

After
```
No error (empty output)
```
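For context, a minimal sketch of one way to gate the prompt on interactive mode is to check whether stdin and stdout are attached to a terminal (here with `golang.org/x/term`); the CLI's actual check may differ.

```go
package main

import (
	"fmt"
	"os"

	"golang.org/x/term"
)

// isInteractive reports whether both stdin and stdout are attached to a TTY.
// Only in that case should the CLI prompt for a profile.
func isInteractive() bool {
	return term.IsTerminal(int(os.Stdin.Fd())) && term.IsTerminal(int(os.Stdout.Fd()))
}

func main() {
	if !isInteractive() {
		// Non-interactive (e.g. invoked from another program): skip the prompt.
		fmt.Println("skipping profile prompt")
		return
	}
	fmt.Println("prompting for profile...")
}
```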
2023-09-21 12:38:45 +00:00
Andrew Nester 4a9dcd3231
Added setup Python action (#789)
## Changes
Added setup Python action

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-21 12:21:39 +00:00
Pieter Noordhuis 46996b884d
Release v0.205.1 (#787)
Bundles:
* Use enums for default python template
([#765](https://github.com/databricks/cli/pull/765)).
* Make bundle deploy work if no resources are defined
([#767](https://github.com/databricks/cli/pull/767)).
* Added support for experimental scripts section
([#632](https://github.com/databricks/cli/pull/632)).
* Error when unknown keys are encountered during template execution
([#766](https://github.com/databricks/cli/pull/766)).
* Fall back to full Git clone if shallow clone is not supported
([#775](https://github.com/databricks/cli/pull/775)).
* Enable environment overrides for job tasks
([#779](https://github.com/databricks/cli/pull/779)).
* Increase timeout waiting for job run to 1 day
([#786](https://github.com/databricks/cli/pull/786)).

Internal:
* Update Go SDK to v0.19.3 (unreleased)
([#778](https://github.com/databricks/cli/pull/778)).
2023-09-20 11:46:47 +00:00
Pieter Noordhuis 3a812a61e5
Increase timeout waiting for job run to 1 day (#786)
## Changes

It's not uncommon for job runs to take more than 2 hours. On the client
side, we should not stop waiting for a job to complete if it is
intentionally running for a long time. If a job isn't supposed to run
this long, the user can specify a run timeout in the job specification
itself.
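
Illustrative only: a generic sketch of the client-side waiting pattern with a 1-day bound using `context.WithTimeout`; the helper name and polling interval are assumptions, not the CLI's actual code.

```go
package main

import (
	"context"
	"errors"
	"fmt"
	"time"
)

// waitForRun polls until poll reports the run as done, or until the 1-day
// client-side timeout expires.
func waitForRun(parent context.Context, poll func(context.Context) (bool, error)) error {
	ctx, cancel := context.WithTimeout(parent, 24*time.Hour)
	defer cancel()
	ticker := time.NewTicker(30 * time.Second)
	defer ticker.Stop()
	for {
		done, err := poll(ctx)
		if err != nil {
			return err
		}
		if done {
			return nil
		}
		select {
		case <-ctx.Done():
			return errors.New("timed out waiting for job run to complete")
		case <-ticker.C:
		}
	}
}

func main() {
	err := waitForRun(context.Background(), func(context.Context) (bool, error) {
		return true, nil // stand-in for a Jobs API status check
	})
	fmt.Println(err)
}
```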

## Tests

n/a
2023-09-19 19:54:24 +00:00
Andrew Nester 43e2eefc27
Enable environment overrides for job tasks (#779)
## Changes
Follow up for https://github.com/databricks/cli/pull/658

When a job definition has multiple job tasks using the same key, it's
considered invalid. Instead we should combine those definitions with the
same key into one. This is consistent with environment overrides. This
way, the override ends up in the original job tasks, and we've got a
clear way to put them all together.
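
Illustrative only: a simplified sketch of merging task definitions that share a `task_key`; the `Task` struct and field handling are assumptions, not the bundle's actual types.

```go
package main

import "fmt"

// Task is a simplified stand-in for a job task definition.
type Task struct {
	TaskKey      string
	NotebookPath string
}

// mergeTasksByKey combines task definitions that share a task key, so that an
// override defined in an environment ends up in the original task.
func mergeTasksByKey(tasks []Task) []Task {
	index := map[string]int{}
	var out []Task
	for _, t := range tasks {
		if i, ok := index[t.TaskKey]; ok {
			// Later definitions override fields of the earlier one.
			if t.NotebookPath != "" {
				out[i].NotebookPath = t.NotebookPath
			}
			continue
		}
		index[t.TaskKey] = len(out)
		out = append(out, t)
	}
	return out
}

func main() {
	merged := mergeTasksByKey([]Task{
		{TaskKey: "main", NotebookPath: "./base.py"},
		{TaskKey: "main", NotebookPath: "./override.py"},
	})
	fmt.Printf("%+v\n", merged) // [{TaskKey:main NotebookPath:./override.py}]
}
```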

## Tests
Added unit tests
2023-09-18 14:13:50 +00:00
Pieter Noordhuis b3b00fd226
Update Go SDK to v0.19.3 (unreleased) (#778)
## Changes

This bump includes:
* A fix for token refreshes on Azure
* A fix for retrying requests without a request body (e.g. GET)

Full comparison at
https://github.com/databricks/databricks-sdk-go/compare/v0.19.1...dacb7f4fc878.

## Tests

n/a
2023-09-15 14:54:23 +00:00
shreyas-goenka 2c58deb2c5
Fall back to full Git clone if shallow clone is not supported (#775)
## Changes
Git repos hosted over HTTP do not support shallow cloning. This PR adds
retry logic if we detect shallow cloning is not supported.

Note I saw the match string `dumb http transport does not support
shallow capabilities` being reported for different hosts on the
internet, so this should work across a large class of git servers.
However, it's not strictly necessary to have the `--depth` flag, so we
can remove it if this issue is reported again.
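
Illustrative only: a sketch of the fallback idea, retrying with a full clone when the quoted error message is detected; the helper name is an assumption.

```go
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// cloneURL attempts a shallow clone first and falls back to a full clone when
// the server does not support shallow capabilities.
func cloneURL(url, dir string) error {
	out, err := exec.Command("git", "clone", "--depth=1", url, dir).CombinedOutput()
	if err == nil {
		return nil
	}
	if strings.Contains(string(out), "dumb http transport does not support shallow capabilities") {
		// Shallow clones are unsupported over dumb HTTP; retry with a full clone.
		return exec.Command("git", "clone", url, dir).Run()
	}
	return fmt.Errorf("git clone failed: %s", out)
}

func main() {
	fmt.Println(cloneURL("https://example.com/repo.git", "repo"))
}
```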

## Tests
Tested manually. `bundle init` successfully downloads the private HTTP
repo reported by an internal user.
2023-09-15 09:14:51 +00:00
shreyas-goenka 327ab0e598
Error when unknown keys are encountered during template execution (#766)
## Tests
New unit test and manually
2023-09-14 15:53:20 +00:00
Andrew Nester 953dcb4972
Added support for experimental scripts section (#632)
## Changes
Added support for experimental scripts section

It allows execution of arbitrary bash commands during certain bundle
lifecycle steps.

## Tests
Example of configuration

```yaml
bundle:
  name: wheel-task


workspace:
  host: ***

experimental:
  scripts:
    prebuild: |
      echo 'Prebuild 1'
      echo 'Prebuild 2'
    postbuild: "echo 'Postbuild 1' && echo 'Postbuild 2'" 
    predeploy: |
      echo 'Checking go version...'
      go version
    postdeploy: |
      echo 'Checking python version...'
      python --version

resources:
  jobs:
    test_job:
      name: "[${bundle.environment}] My Wheel Job"
      tasks:
        - task_key: TestTask
          existing_cluster_id: "***"
          python_wheel_task:
            package_name: "my_test_code"
            entry_point: "run"
          libraries:
          - whl: ./dist/*.whl
```

Output
```bash
andrew.nester@HFW9Y94129 wheel % databricks bundle deploy
artifacts.whl.AutoDetect: Detecting Python wheel project...
artifacts.whl.AutoDetect: Found Python wheel project at /Users/andrew.nester/dabs/wheel
'Prebuild 1'
'Prebuild 2'

artifacts.whl.Build(my_test_code): Building...
artifacts.whl.Build(my_test_code): Build succeeded
'Postbuild 1'
'Postbuild 2'

'Checking go version...'
go version go1.19.9 darwin/arm64

Starting upload of bundle files
Uploaded bundle files at /Users/andrew.nester@databricks.com/.bundle/wheel-task/default/files!

artifacts.Upload(my_test_code-0.0.0a0-py3-none-any.whl): Uploading...
artifacts.Upload(my_test_code-0.0.0a0-py3-none-any.whl): Upload succeeded
Starting resource deployment
Resource deployment completed!
'Checking python version...'
Python 2.7.18
```
2023-09-14 10:14:13 +00:00
shreyas-goenka fe32c46dc8
Make bundle deploy work if no resources are defined (#767)
## Changes

This PR sets "resource" to nil in the terraform representation if no
resources are defined in the bundle configuration. This solves two
problems:

1. Makes bundle deploy work without any resources specified. 
2. Previously if a `resources` block was removed after a deployment,
that would fail with an error. Now the resources would get destroyed as
expected.

Also removes `TerraformHasNoResources` which is no longer needed.
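
Illustrative only: a minimal demonstration of why a nil `resource` field works, using a simplified stand-in struct with `omitempty` so the field is dropped from the serialized Terraform configuration entirely.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Root is a simplified stand-in for the generated Terraform JSON representation.
type Root struct {
	Resource map[string]any `json:"resource,omitempty"`
}

func main() {
	// With no resources defined, leaving the field nil means it is omitted
	// from the output instead of producing "resource": {}.
	b, _ := json.Marshal(Root{Resource: nil})
	fmt.Println(string(b)) // {}
}
```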

## Tests
New e2e tests.
2023-09-13 22:50:37 +00:00
shreyas-goenka be55310cc9
Use enums for default python template (#765)
## Changes
This PR changes schema to use the enum type for the default template
yes/no questions.

## Tests
Manually
2023-09-13 17:57:31 +00:00
Pieter Noordhuis 96d807fb85
Release v0.205.0 (#771)
This release marks the public preview phase of Databricks Asset Bundles.

For more information, please refer to our online documentation at
https://docs.databricks.com/en/dev-tools/bundles/.

CLI:
* Prompt once for a client profile
([#727](https://github.com/databricks/cli/pull/727)).

Bundles:
* Use clearer error message when no interpolation value is found.
([#764](https://github.com/databricks/cli/pull/764)).
* Use interactive prompt to select resource to run if not specified
([#762](https://github.com/databricks/cli/pull/762)).
* Add documentation link to bundle command group description
([#770](https://github.com/databricks/cli/pull/770)).
2023-09-12 14:35:36 +00:00
shreyas-goenka 21ff71ceea
Add documentation link to bundle command group description (#770)
Help output:
```
shreyas.goenka@THW32HFW6T ~ % cli bundle -h
Databricks Asset Bundles.
Documentation URL: https://docs.databricks.com/en/dev-tools/bundles.

Usage:
  databricks bundle [command]
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-12 13:38:43 +00:00
Pieter Noordhuis 3cb74e72a8
Run environment related tests in a pristine environment (#769)
## Changes

If the caller running the test has one or more environment variables
that are used in the test already set, they can interfere and make tests
fail.

## Tests

Ran tests in `./cmd/root` with Databricks related environment variables
set.
2023-09-12 13:28:53 +00:00
Pieter Noordhuis a2775f836f
Use interactive prompt to select resource to run if not specified (#762)
## Changes

Display an interactive prompt with a list of resources to run if one
isn't specified and the command is run interactively.

## Tests

Manually confirmed:
* The new prompt works
* Shell completion still works
* Specifying a key argument still works
2023-09-11 18:03:12 +00:00
Pieter Noordhuis 0cb05d1ded
Prompt once for a client profile (#727)
## Changes

The previous implementation ran the risk of infinite looping for the
account client due to a mismatch in determining what constitutes an
account client between the CLI and SDK (see
[here](83443bae8d/libs/databrickscfg/profiles.go (L61))
and
[here](0fdc5165e5/config/config.go (L160))).

Ultimately, this code must never loop infinitely. If a user is prompted
and selects a profile that cannot be used, they should receive that
feedback immediately and try again, instead of being prompted again.

Related to #726.

## Tests
2023-09-11 15:32:24 +00:00
shreyas-goenka 373f441eb2
Use clearer error message when no interpolation value is found. (#764)
## Changes
This PR makes the error message clearer when interpolation fails.

## Tests
Existing unit test and manually
2023-09-11 15:23:25 +00:00
Pieter Noordhuis 44726d6444
Release v0.204.1 (#763)
Bundles:
* Fix conversion of job parameters
([#744](https://github.com/databricks/cli/pull/744)).
* Add schema and config validation to jsonschema package
([#740](https://github.com/databricks/cli/pull/740)).
* Support Model Serving Endpoints in bundles
([#682](https://github.com/databricks/cli/pull/682)).
* Do not include empty output in job run output
([#749](https://github.com/databricks/cli/pull/749)).
* Fixed marking libraries from DBFS as remote
([#750](https://github.com/databricks/cli/pull/750)).
* Process only Python wheel tasks which have local libraries used
([#751](https://github.com/databricks/cli/pull/751)).
* Add enum support for bundle templates
([#668](https://github.com/databricks/cli/pull/668)).
* Apply Python wheel trampoline if workspace library is used
([#755](https://github.com/databricks/cli/pull/755)).
* List available targets when incorrect target passed
([#756](https://github.com/databricks/cli/pull/756)).
* Make bundle and sync fields optional
([#757](https://github.com/databricks/cli/pull/757)).
* Consolidate environment variable interaction
([#747](https://github.com/databricks/cli/pull/747)).

Internal:
* Update Go SDK to v0.19.1
([#759](https://github.com/databricks/cli/pull/759)).
2023-09-11 11:57:21 +00:00
shreyas-goenka ad84abf415
Fix temporary directory cleanup for init repository downloading (#760)
## Changes
This PR fixes a bug where the temp directory created to download the
template would not be cleaned up.

## Tests
Tested manually. The exact process is described in a comment below.
2023-09-11 10:22:05 +00:00
Lennart Kats (databricks) a4e94e1b36
Fix author in setup.py (#761)
Fix author in setup.py showing <no value>
2023-09-11 08:59:48 +00:00
Pieter Noordhuis c836194d89
Update Go SDK to v0.19.1 (#759)
## Changes

This includes token reuse for Azure CLI based auth.

See:
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.19.1

## Tests

Confirmed manually that Azure CLI tokens are acquired only once.
2023-09-11 08:18:52 +00:00
Pieter Noordhuis 4ccc70aeac
Consolidate environment variable interaction (#747)
## Changes

There are a couple of places throughout the code base where interaction
with environment variables takes place. Moreover, more than one of these
would try to read a value from more than one environment variable as a
fallback (for backwards compatibility). This change consolidates those
accesses.

The majority of diffs in this change are mechanical (i.e. add an
argument or replace a call).

This change:
* Moves common environment variable lookups for bundles to
`bundles/env`.
* Adds a `libs/env` package that wraps `os.LookupEnv` and `os.Getenv`
and allows for overrides to take place in a `context.Context`. By
scoping overrides to a `context.Context` we can avoid `t.Setenv` in
testing and unlock parallel test execution for integration tests (see
the sketch below).
* Updates call sites to pass through a `context.Context` where needed.
* For bundles, introduces `DATABRICKS_BUNDLE_ROOT` as new primary
variable instead of `BUNDLE_ROOT`. This was the last environment
variable that did not use the `DATABRICKS_` prefix.
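
Illustrative only: a sketch of the context-scoped override pattern described above; the `WithEnv`/`Lookup` names and signatures are assumptions, not necessarily those of `libs/env`.

```go
package main

import (
	"context"
	"fmt"
	"os"
)

type envKey struct{}

// WithEnv returns a context carrying environment variable overrides. This lets
// tests set variables without t.Setenv, enabling parallel execution.
func WithEnv(ctx context.Context, overrides map[string]string) context.Context {
	return context.WithValue(ctx, envKey{}, overrides)
}

// Lookup consults the context overrides first, then falls back to the process
// environment via os.LookupEnv.
func Lookup(ctx context.Context, key string) (string, bool) {
	if m, ok := ctx.Value(envKey{}).(map[string]string); ok {
		if v, ok := m[key]; ok {
			return v, true
		}
	}
	return os.LookupEnv(key)
}

func main() {
	ctx := WithEnv(context.Background(), map[string]string{"DATABRICKS_BUNDLE_ROOT": "/tmp/bundle"})
	v, _ := Lookup(ctx, "DATABRICKS_BUNDLE_ROOT")
	fmt.Println(v)
}
```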

## Tests

Unit tests pass.
2023-09-11 08:18:43 +00:00
shreyas-goenka 9a51f72f0b
Make bundle and sync fields optional (#757)
## Changes
This PR:
1. Makes the bundle and sync properties optional in the generated
schema.
2. Fixes schema generation that was broken due to a rogue "description"
field in the bundle docs.

## Tests
Tested manually. The generated schema no longer has "bundle" and "sync"
marked as required.
2023-09-11 08:16:22 +00:00
Lennart Kats (databricks) 9e56bed593
Minor default template tweaks (#758)
Minor template tweaks, mostly making the imports section for DLT
notebooks a bit more elegant.

Tested with DAB deployment + in-workspace UI.
2023-09-11 07:36:44 +00:00
shreyas-goenka d9a276b17d
Fix minor typos in default-python template (#754)
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-09 21:55:43 +00:00
Andrew Nester b5d033d154
List available targets when incorrect target passed (#756)
## Changes
List available targets when incorrect target passed

## Tests
```
andrew.nester@HFW9Y94129 wheel % databricks bundle validate -t incorrect
Error: incorrect: no such target. Available targets: prod, development
```
2023-09-08 15:37:55 +00:00
Andrew Nester 18a5b05d82
Apply Python wheel trampoline if workspace library is used (#755)
## Changes
A workspace library will be detected by the trampoline in 2 cases:
- The user configured a local wheel file
- The user configured a remote wheel file from the workspace file system

In both of these cases we should correctly apply the Python trampoline.

## Tests
Added a regression test (also covered by Python e2e test)
2023-09-08 13:45:21 +00:00
shreyas-goenka 7c96270db8
Add enum support for bundle templates (#668)
## Changes
This PR includes:
1. Adding enum field to the json schema struct
2. Adding prompting logic for enum values. See demo for how it looks
3. Validation rules, validating the default value and config values when
an enum list is specified

This will now enable template authors to use enums for input parameters.

## Tests
Manually and new unit tests
2023-09-08 12:07:22 +00:00
Andrew Nester 368321d07d
Close python wheel directory file descriptor after read (#753)
## Changes
Close python wheel directory file descriptor after read
2023-09-08 11:24:51 +00:00
Andrew Nester 67af171a68
Process only Python wheel tasks which have local libraries used (#751)
## Changes
Process only Python wheel tasks which have local libraries used

## Tests
Updated unit test to catch the regression
2023-09-08 11:08:21 +00:00
Andrew Nester f7566b8264
Close local Terraform state file when pushing to remote (#752)
## Changes
Close local Terraform state file when pushing to remote

Should help fix E2E test cleanup
```
testing.go:1225: TempDir RemoveAll cleanup: remove 
C:\Users\RUNNER~1\AppData\Local\Temp\TestAccPythonWheelTaskDeployAndRun1395546390\001\.databricks\bundle\default\terraform\terraform.tfstate: 
The process cannot access the file because it is being used by another process.
```
2023-09-08 10:47:17 +00:00
Andrew Nester e64463ba47
Fixed marking libraries from DBFS as remote (#750)
## Changes
Fixed marking libraries from DBFS as remote

## Tests
Updated unit tests to catch the regression
2023-09-08 09:53:57 +00:00
Andrew Nester e08f419ef6
Do not include empty output in job run output (#749)
## Changes
Do not include empty output in job run output

## Tests
Running a job from CLI, the result:
```
andrew.nester@HFW9Y94129 wheel % databricks bundle run some_other_job --output json
Run URL: https://***/?o=6051921418418893#job/780620378804085/run/386695528477456

2023-09-08 11:33:24 "[default] My Wheel Job" TERMINATED SUCCESS 
{
  "task_outputs": [
    {
      "TaskKey": "TestTask",
      "Output": {
        "result": "Hello from my func\nGot arguments v2:\n['python']\n"
      },
      "EndTime": 1694165597474
    }
  ]
}
```
2023-09-08 09:52:45 +00:00
Andrew Nester 17d9f7dd2a
Use unique bundle root path for Python E2E test (#748)
## Changes
It helps to make sure jobs in the tests are deployed and executed
uniquely and in isolation.


```
Bundle remote directory is /Users/61b77d30-bc10-4214-9650-29cf5db0e941/.bundle/4b630810-5edc-4d8f-85d1-0eb5baf7bb28
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRun3933198431/001/.databricks/bundle/default/sync-snapshots/dd9db100465e3d91.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRun (346.28s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       346.976s        coverage: 93.5% of statements in ./...
```
2023-09-08 09:19:55 +00:00
Arpit Jasapara 50eaf16307
Support Model Serving Endpoints in bundles (#682)
## Changes
Add Model Serving Endpoints to Databricks Bundles

## Tests
Unit tests and manual testing via
https://github.com/databricks/bundle-examples-internal/pull/76
<img width="1570" alt="Screenshot 2023-08-28 at 7 46 23 PM"
src="https://github.com/databricks/cli/assets/87999496/7030ebd8-b0e2-4ad1-a9e3-5ff8454f1175">
<img width="747" alt="Screenshot 2023-08-28 at 7 47 01 PM"
src="https://github.com/databricks/cli/assets/87999496/fb9b54d7-54e2-43ce-9148-68fb620c809a">

Signed-off-by: Arpit Jasapara <arpit.jasapara@databricks.com>
2023-09-07 21:54:31 +00:00
Andrew Nester 5a14c7cb43
Generate unique name for a job in Python wheel test (#745)
## Changes
Generate unique name for a job in Python wheel test
2023-09-07 20:02:26 +00:00
shreyas-goenka 1a7bf4e4f1
Add schema and config validation to jsonschema package (#740)
## Changes

At a high level, this PR adds new schema validation and moves
functionality that should be present in the jsonschema package, but
resides in the template package today, to the jsonschema package. This
includes, for example, schema validation, schema instance validation,
and to/from string conversion methods.

The list below outlines all the pieces that have been moved over, and
the new validation bits added.

This PR:
1. Adds casting default value of schema properties to integers to the
jsonschema.Load method.
2. Adds validation for default value types for schema properties,
checking they are consistent with the type defined (a sketch follows
below).
3. Introduces the LoadInstance and ValidateInstance methods to the json
schema package. These methods can be used to read and validate JSON
documents against the schema.
4. Replaces validation done for template inputs to use the newly defined
JSON schema validation functions.
5. Moves to/from string and isInteger utility methods to the json schema
package.
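
Illustrative only: a sketch of validating that a property's default value is consistent with its declared type (point 2 above); this is not the jsonschema package's actual API.

```go
package main

import "fmt"

// validateDefaultType checks that a property's default value matches its
// declared JSON schema type. A simplified sketch of the validation idea.
func validateDefaultType(typ string, def any) error {
	switch typ {
	case "string":
		if _, ok := def.(string); !ok {
			return fmt.Errorf("default %v is not a string", def)
		}
	case "integer":
		switch def.(type) {
		case int, int64, float64: // JSON numbers decode as float64
		default:
			return fmt.Errorf("default %v is not an integer", def)
		}
	case "boolean":
		if _, ok := def.(bool); !ok {
			return fmt.Errorf("default %v is not a boolean", def)
		}
	}
	return nil
}

func main() {
	fmt.Println(validateDefaultType("integer", 5)) // <nil>
	fmt.Println(validateDefaultType("string", 42)) // error
}
```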

## Tests
Existing and new unit tests.
2023-09-07 14:36:06 +00:00
Andrew Nester 10e0836749
Added end-to-end test for deploying and running Python wheel task (#741)
## Changes
Added end-to-end test for deploying and running Python wheel task

## Tests
The test passed successfully on all environments and takes about 9-10
minutes to complete.

```
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRun1845899209/002/.databricks/bundle/default/sync-snapshots/1f7cc766ffe038d6.json
Successfully deleted files!
2023/09/06 17:50:50 INFO Releasing deployment lock mutator=destroy mutator=seq mutator=seq mutator=deferred mutator=lock:release
--- PASS: TestAccPythonWheelTaskDeployAndRun (508.16s)
PASS
coverage: 77.9% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       508.810s        coverage: 77.9% of statements in ./...
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-07 14:08:16 +00:00
Pieter Noordhuis c0ebfb8101
Fix conversion of job parameters (#744)
## Changes

Another example of singular/plural conversion.

The longer-term solution is a full sweep of the type using reflection
to make sure we cover all fields.

## Tests

Unit test passes.
2023-09-07 12:48:59 +00:00
Lennart Kats (databricks) 50b2c0b83b
Fix notebook showing up in template when not selected (#743)
## Changes
This fixes a typo that caused the notebook.ipynb file to show up even if
the user answered "no" to the question about including a notebook.

## Tests
We have matrix validation tests for all the yes/no combinations,
checking whether they build and validate. There is no current test for
the absence of files.
2023-09-07 08:26:43 +00:00
Lennart Kats (databricks) 3c79181148
Remove unused file (#742)
defaults.json was originally used in tests. It's no longer used and
should be removed.
2023-09-06 18:18:15 +00:00
Pieter Noordhuis c8f5990f47
Release v0.204.0 (#738)
This release includes permission related commands for a subset of
workspace
services where they apply. These complement the `permissions` command
and
do not require specification of the object type to work with, as that is
implied by the command they are nested under.

CLI:
* Group permission related commands
([#730](https://github.com/databricks/cli/pull/730)).

Bundles:
* Fixed artifact file uploading on Windows and wheel execution on DBR
13.3 ([#722](https://github.com/databricks/cli/pull/722)).
* Make resource and artifact paths in bundle config relative to config
folder ([#708](https://github.com/databricks/cli/pull/708)).
* Add support for ordering of input prompts
([#662](https://github.com/databricks/cli/pull/662)).
* Fix IsServicePrincipal() only working for workspace admins
([#732](https://github.com/databricks/cli/pull/732)).
* databricks bundle init template v1
([#686](https://github.com/databricks/cli/pull/686)).
* databricks bundle init template v2: optional stubs, DLT support
([#700](https://github.com/databricks/cli/pull/700)).
* Show 'databricks bundle init' template in CLI prompt
([#725](https://github.com/databricks/cli/pull/725)).
* Include $PATH in set of environment variables to pass along.
([#736](https://github.com/databricks/cli/pull/736)).

Internal:
* Update Go SDK to v0.19.0
([#729](https://github.com/databricks/cli/pull/729)).
* Replace API call to test configuration with dummy authenticate call
([#728](https://github.com/databricks/cli/pull/728)).

API Changes:
* Changed `databricks account storage-credentials create` command to
return .
* Changed `databricks account storage-credentials get` command to return
.
* Changed `databricks account storage-credentials list` command to
return .
* Changed `databricks account storage-credentials update` command to
return .
* Changed `databricks connections create` command with new required
argument order.
* Changed `databricks connections update` command with new required
argument order.
* Changed `databricks volumes create` command with new required argument
order.
 * Added `databricks artifact-allowlists` command group.
 * Added `databricks model-versions` command group.
 * Added `databricks registered-models` command group.
 * Added `databricks cluster-policies get-permission-levels` command.
 * Added `databricks cluster-policies get-permissions` command.
 * Added `databricks cluster-policies set-permissions` command.
 * Added `databricks cluster-policies update-permissions` command.
 * Added `databricks clusters get-permission-levels` command.
 * Added `databricks clusters get-permissions` command.
 * Added `databricks clusters set-permissions` command.
 * Added `databricks clusters update-permissions` command.
 * Added `databricks instance-pools get-permission-levels` command.
 * Added `databricks instance-pools get-permissions` command.
 * Added `databricks instance-pools set-permissions` command.
 * Added `databricks instance-pools update-permissions` command.
 * Added `databricks files` command group.
 * Changed `databricks permissions set` command to start returning .
 * Changed `databricks permissions update` command to start returning .
 * Added `databricks users get-permission-levels` command.
 * Added `databricks users get-permissions` command.
 * Added `databricks users set-permissions` command.
 * Added `databricks users update-permissions` command.
 * Added `databricks jobs get-permission-levels` command.
 * Added `databricks jobs get-permissions` command.
 * Added `databricks jobs set-permissions` command.
 * Added `databricks jobs update-permissions` command.
 * Changed `databricks experiments get-by-name` command to return .
 * Changed `databricks experiments get-experiment` command to return .
 * Added `databricks experiments delete-runs` command.
 * Added `databricks experiments get-permission-levels` command.
 * Added `databricks experiments get-permissions` command.
 * Added `databricks experiments restore-runs` command.
 * Added `databricks experiments set-permissions` command.
 * Added `databricks experiments update-permissions` command.
 * Added `databricks model-registry get-permission-levels` command.
 * Added `databricks model-registry get-permissions` command.
 * Added `databricks model-registry set-permissions` command.
 * Added `databricks model-registry update-permissions` command.
 * Added `databricks pipelines get-permission-levels` command.
 * Added `databricks pipelines get-permissions` command.
 * Added `databricks pipelines set-permissions` command.
 * Added `databricks pipelines update-permissions` command.
 * Added `databricks serving-endpoints get-permission-levels` command.
 * Added `databricks serving-endpoints get-permissions` command.
 * Added `databricks serving-endpoints set-permissions` command.
 * Added `databricks serving-endpoints update-permissions` command.
 * Added `databricks token-management get-permission-levels` command.
 * Added `databricks token-management get-permissions` command.
 * Added `databricks token-management set-permissions` command.
 * Added `databricks token-management update-permissions` command.
* Changed `databricks dashboards create` command with new required
argument order.
 * Added `databricks warehouses get-permission-levels` command.
 * Added `databricks warehouses get-permissions` command.
 * Added `databricks warehouses set-permissions` command.
 * Added `databricks warehouses update-permissions` command.
 * Added `databricks dashboard-widgets` command group.
 * Added `databricks query-visualizations` command group.
 * Added `databricks repos get-permission-levels` command.
 * Added `databricks repos get-permissions` command.
 * Added `databricks repos set-permissions` command.
 * Added `databricks repos update-permissions` command.
 * Added `databricks secrets get-secret` command.
 * Added `databricks workspace get-permission-levels` command.
 * Added `databricks workspace get-permissions` command.
 * Added `databricks workspace set-permissions` command.
 * Added `databricks workspace update-permissions` command.

OpenAPI commit 09a7fa63d9ae243e5407941f200960ca14d48b07 (2023-09-04)
2023-09-06 11:46:21 +00:00
Lennart Kats (databricks) f9e521b43e
databricks bundle init template v2: optional stubs, DLT support (#700)
## Changes

This follows up on https://github.com/databricks/cli/pull/686. This PR
makes our stubs optional + it adds DLT stubs:

```
$ databricks bundle init
Template to use [default-python]: default-python
Unique name for this project [my_project]: my_project
Include a stub (sample) notebook in 'my_project/src' [yes]: yes
Include a stub (sample) DLT pipeline in 'my_project/src' [yes]: yes
Include a stub (sample) Python package 'my_project/src' [yes]: yes
 Successfully initialized template
```

## Tests
Manual testing, matrix tests.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-06 09:52:31 +00:00
Andrew Nester a41b9e8bf2
Added description for version command (#737)
## Changes
Added description for version command

## Tests
```
databricks help

...

Additional Commands:
  account              Databricks Account Commands
  api                  Perform Databricks API call
  auth                 Authentication related commands
  bundle               Databricks Asset Bundles
  completion           Generate the autocompletion script for the specified shell
  fs                   Filesystem related commands
  help                 Help about any command
  sync                 Synchronize a local directory to a workspace directory
  version              Retrieve information about the current version of CLI
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-06 08:41:47 +00:00
Pieter Noordhuis fabe8e88b8
Include $PATH in set of environment variables to pass along. (#736)
## Changes

This is necessary to ensure that our Terraform provider can use the same
auxiliary programs (e.g. `az`, or `gcloud`) as the CLI.
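
Illustrative only: a sketch of forwarding `PATH` to a child process while otherwise restricting its environment; the child command and variable set are assumptions.

```go
package main

import (
	"os"
	"os/exec"
)

func main() {
	// Start from a restricted set of variables, but always include PATH so
	// auxiliary programs (az, gcloud, ...) can still be found by the child.
	env := []string{"HOME=" + os.Getenv("HOME")}
	env = append(env, "PATH="+os.Getenv("PATH"))

	cmd := exec.Command("terraform", "version") // illustrative child process
	cmd.Env = env
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	_ = cmd.Run()
}
```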

## Tests

Unit test and manual verification.
2023-09-06 07:54:35 +00:00
shreyas-goenka 9194418ac1
Fix regex error check in mkdir integration test (#735)
## Changes
Fixes the test for all cloud providers after the Go SDK bump, which
introduces the `non retryable error` prefix to errors. The test passes
now.
2023-09-05 14:25:26 +00:00
Lennart Kats (databricks) e533f9109a
Show 'databricks bundle init' template in CLI prompt (#725)
~(this should be changed to target `main`)~

This reveals the template from
https://github.com/databricks/cli/pull/686 in CLI prompts once #686 and
#708 are merged.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-05 13:57:01 +00:00
Lennart Kats (databricks) 8c2cc07f7b
databricks bundle init template v1 (#686)
## Changes

This adds a built-in "default-python" template to the CLI. This is based
on the new default-template support of
https://github.com/databricks/cli/pull/685.

The goal here is to offer an experience where customers can simply type
`databricks bundle init` to get a default template:

```
$ databricks bundle init
Template to use [default-python]: default-python
Unique name for this project [my_project]: my_project
 Successfully initialized template
```

The present template:
- [x] Works well with VS Code
- [x] Works well with the workspace
- [x] Works well with DB Connect
- [x] Uses minimal stubs rather than boiler-plate-heavy examples

I'll have a followup with tests + DLT support.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-05 11:58:34 +00:00
Lennart Kats (databricks) 947d5b1e5c
Fix IsServicePrincipal() only working for workspace admins (#732)
## Changes

The latest rendition of isServicePrincipal no longer worked for
non-admin users as it used the "principals get" API.

This new version relies on the property that service principals always
have a UUID as their userName. This was tested with the eng-jaws
principal (8b948b2e-d2b5-4b9e-8274-11b596f3b652).
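
Illustrative only: a sketch of the UUID-based heuristic using `github.com/google/uuid`; the function name is an assumption.

```go
package main

import (
	"fmt"

	"github.com/google/uuid"
)

// isServicePrincipal sketches the heuristic described above: service principals
// always have a UUID as their userName, while human users have an email address.
func isServicePrincipal(userName string) bool {
	_, err := uuid.Parse(userName)
	return err == nil
}

func main() {
	fmt.Println(isServicePrincipal("8b948b2e-d2b5-4b9e-8274-11b596f3b652")) // true
	fmt.Println(isServicePrincipal("someone@example.com"))                  // false
}
```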
2023-09-05 11:20:55 +00:00