Commit Graph

767 Commits

Author SHA1 Message Date
Andrew Nester b5d033d154
List available targets when incorrect target passed (#756)
## Changes
List available targets when incorrect target passed

## Tests
```
andrew.nester@HFW9Y94129 wheel % databricks bundle validate -t incorrect
Error: incorrect: no such target. Available targets: prod, development
```
2023-09-08 15:37:55 +00:00
Andrew Nester 18a5b05d82
Apply Python wheel trampoline if workspace library is used (#755)
## Changes
A workspace library will be detected by the trampoline in 2 cases:
- The user configured a local wheel file
- The user configured a remote wheel file from the Workspace file system

In both of these cases we should correctly apply the Python trampoline

## Tests
Added a regression test (also covered by Python e2e test)
2023-09-08 13:45:21 +00:00
shreyas-goenka 7c96270db8
Add enum support for bundle templates (#668)
## Changes
This PR includes:
1. Adding enum field to the json schema struct
2. Adding prompting logic for enum values. See demo for how it looks
3. Validation rules, validating the default value and config values when
an enum list is specified

This will now enable template authors to use enums for input parameters.
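
As a rough illustration (not code from this PR; `validateEnum` is a hypothetical helper), checking a default or configured value against an enum list comes down to a membership test:

```go
package main

import (
	"fmt"
	"slices"
)

// validateEnum is a hypothetical helper: it checks that value is one of
// the allowed enum values declared for a template input parameter.
func validateEnum(value string, enum []string) error {
	if len(enum) == 0 {
		return nil // no enum list specified; nothing to validate
	}
	if slices.Contains(enum, value) {
		return nil
	}
	return fmt.Errorf("expected value to be one of %v, found: %q", enum, value)
}

func main() {
	// Both the default value and user-provided config values would be
	// checked against the enum list.
	fmt.Println(validateEnum("yes", []string{"yes", "no"}))   // <nil>
	fmt.Println(validateEnum("maybe", []string{"yes", "no"})) // error
}
```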

## Tests
Manually and new unit tests
2023-09-08 12:07:22 +00:00
Andrew Nester 368321d07d
Close python wheel directory file descriptor after read (#753)
## Changes
Close python wheel directory file descriptor after read
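
A minimal sketch of the pattern, not the PR's actual code (`readWheelDir` is a hypothetical helper): open the wheel directory, read its entries, and make sure the directory handle gets closed.

```go
package main

import (
	"fmt"
	"os"
)

// readWheelDir is a hypothetical helper illustrating the fix: the directory
// file descriptor is closed once its entries have been read.
func readWheelDir(dir string) ([]os.DirEntry, error) {
	f, err := os.Open(dir)
	if err != nil {
		return nil, err
	}
	defer f.Close() // close the directory file descriptor after read
	return f.ReadDir(-1)
}

func main() {
	entries, err := readWheelDir("./dist")
	if err != nil {
		fmt.Println(err)
		return
	}
	for _, e := range entries {
		fmt.Println(e.Name())
	}
}
```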
2023-09-08 11:24:51 +00:00
Andrew Nester 67af171a68
Process only Python wheel tasks which have local libraries used (#751)
## Changes
Process only Python wheel tasks which have local libraries used

## Tests
Updated unit test to catch the regression
2023-09-08 11:08:21 +00:00
Andrew Nester f7566b8264
Close local Terraform state file when pushing to remote (#752)
## Changes
Close local Terraform state file when pushing to remote

Should help fix E2E test cleanup
```
testing.go:1225: TempDir RemoveAll cleanup: remove 
C:\Users\RUNNER~1\AppData\Local\Temp\TestAccPythonWheelTaskDeployAndRun1395546390\001\.databricks\bundle\default\terraform\terraform.tfstate: 
The process cannot access the file because it is being used by another process.
```
2023-09-08 10:47:17 +00:00
Andrew Nester e64463ba47
Fixed marking libraries from DBFS as remote (#750)
## Changes
Fixed marking libraries from DBFS as remote

## Tests
Updated unit tests to catch the regression
2023-09-08 09:53:57 +00:00
Andrew Nester e08f419ef6
Do not include empty output in job run output (#749)
## Changes
Do not include empty output in job run output

## Tests
Running a job from CLI, the result:
```
andrew.nester@HFW9Y94129 wheel % databricks bundle run some_other_job --output json
Run URL: https://***/?o=6051921418418893#job/780620378804085/run/386695528477456

2023-09-08 11:33:24 "[default] My Wheel Job" TERMINATED SUCCESS 
{
  "task_outputs": [
    {
      "TaskKey": "TestTask",
      "Output": {
        "result": "Hello from my func\nGot arguments v2:\n['python']\n"
      },
      "EndTime": 1694165597474
    }
  ]
```
2023-09-08 09:52:45 +00:00
Andrew Nester 17d9f7dd2a
Use unique bundle root path for Python E2E test (#748)
## Changes
This helps make sure jobs in the tests are deployed and executed
uniquely and in isolation
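
One hedged way to picture the uniqueness (not the test's actual helper; assumes the github.com/google/uuid dependency): derive the remote bundle root from a fresh UUID per run.

```go
package main

import (
	"fmt"

	"github.com/google/uuid"
)

func main() {
	// A unique remote root per test run keeps deployed jobs isolated
	// from one another.
	root := fmt.Sprintf("~/.bundle/%s", uuid.New().String())
	fmt.Println("Bundle remote directory is", root)
}
```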


```
Bundle remote directory is /Users/61b77d30-bc10-4214-9650-29cf5db0e941/.bundle/4b630810-5edc-4d8f-85d1-0eb5baf7bb28
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRun3933198431/001/.databricks/bundle/default/sync-snapshots/dd9db100465e3d91.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRun (346.28s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       346.976s        coverage: 93.5% of statements in ./...
```
2023-09-08 09:19:55 +00:00
Arpit Jasapara 50eaf16307
Support Model Serving Endpoints in bundles (#682)
## Changes
Add Model Serving Endpoints to Databricks Bundles

## Tests
Unit tests and manual testing via
https://github.com/databricks/bundle-examples-internal/pull/76
<img width="1570" alt="Screenshot 2023-08-28 at 7 46 23 PM"
src="https://github.com/databricks/cli/assets/87999496/7030ebd8-b0e2-4ad1-a9e3-5ff8454f1175">
<img width="747" alt="Screenshot 2023-08-28 at 7 47 01 PM"
src="https://github.com/databricks/cli/assets/87999496/fb9b54d7-54e2-43ce-9148-68fb620c809a">

Signed-off-by: Arpit Jasapara <arpit.jasapara@databricks.com>
2023-09-07 21:54:31 +00:00
Andrew Nester 5a14c7cb43
Generate unique name for a job in Python wheel test (#745)
## Changes
Generate unique name for a job in Python wheel test
2023-09-07 20:02:26 +00:00
shreyas-goenka 1a7bf4e4f1
Add schema and config validation to jsonschema package (#740)
## Changes

At a high level this PR adds new schema validation and moves
functionality that should be present in the jsonschema package, but
resides in the template package today, to the jsonschema package. This
includes, for example, schema validation, schema instance validation,
to/from string conversion methods, etc.

The list below outlines all the pieces that have been moved over, and
the new validation bits added.

This PR:
1. Adds casting of default values of schema properties to integers in the
jsonschema.Load method.
2. Adds validation for default value types for schema properties,
checking they are consistent with the type defined.
3. Introduces the LoadInstance and ValidateInstance methods to the json
schema package. These methods can be used to read and validate JSON
documents against the schema (a sketch follows this list).
4. Replaces validation done for template inputs to use the newly defined
JSON schema validation functions.
5. Moves to/from string and isInteger utility methods to the json schema
package.
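
A sketch of the instance-validation idea; the types and function names below are illustrative stand-ins, not the jsonschema package's actual API:

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// Property is a simplified stand-in for a schema property definition.
type Property struct {
	Type string `json:"type"`
}

// Schema is a simplified stand-in for the real jsonschema types.
type Schema struct {
	Properties map[string]Property `json:"properties"`
}

// loadInstance reads a JSON document from disk (illustrative only).
func loadInstance(path string) (map[string]any, error) {
	b, err := os.ReadFile(path)
	if err != nil {
		return nil, err
	}
	instance := map[string]any{}
	return instance, json.Unmarshal(b, &instance)
}

// validateInstance checks that string-typed properties in the instance
// actually hold strings (a tiny subset of real schema validation).
func validateInstance(s *Schema, instance map[string]any) error {
	for name, value := range instance {
		prop, ok := s.Properties[name]
		if !ok || prop.Type != "string" {
			continue
		}
		if _, isString := value.(string); !isString {
			return fmt.Errorf("property %q: expected a string, got %T", name, value)
		}
	}
	return nil
}

func main() {
	schema := &Schema{Properties: map[string]Property{"project_name": {Type: "string"}}}
	fmt.Println(validateInstance(schema, map[string]any{"project_name": 42}))
	_ = loadInstance // reading a document from disk is left to the real package
}
```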

## Tests
Existing and new unit tests.
2023-09-07 14:36:06 +00:00
Andrew Nester 10e0836749
Added end-to-end test for deploying and running Python wheel task (#741)
## Changes
Added end-to-end test for deploying and running Python wheel task

## Tests
The test passed successfully in all environments; it takes about 9-10 minutes
to run.

```
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRun1845899209/002/.databricks/bundle/default/sync-snapshots/1f7cc766ffe038d6.json
Successfully deleted files!
2023/09/06 17:50:50 INFO Releasing deployment lock mutator=destroy mutator=seq mutator=seq mutator=deferred mutator=lock:release
--- PASS: TestAccPythonWheelTaskDeployAndRun (508.16s)
PASS
coverage: 77.9% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       508.810s        coverage: 77.9% of statements in ./...
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-07 14:08:16 +00:00
Pieter Noordhuis c0ebfb8101
Fix conversion of job parameters (#744)
## Changes

Another example of singular/plural conversion.

The longer-term solution is to do a full sweep of the type using reflection
to make sure we cover all fields.

## Tests

Unit test passes.
2023-09-07 12:48:59 +00:00
Lennart Kats (databricks) 50b2c0b83b
Fix notebook showing up in template when not selected (#743)
## Changes
This fixes a typo that caused the notebook.ipynb file to show up even if
the user answered "no" to the question about including a notebook.

## Tests
We have matrix validation tests for all the yes/no combinations that check
whether they build and validate. There is no current test for the absence
of files.
2023-09-07 08:26:43 +00:00
Lennart Kats (databricks) 3c79181148
Remove unused file (#742)
defaults.json was originally used in tests. It's no longer used and
should be removed.
2023-09-06 18:18:15 +00:00
Pieter Noordhuis c8f5990f47
Release v0.204.0 (#738)
This release includes permission related commands for a subset of
workspace
services where they apply. These complement the `permissions` command
and
do not require specification of the object type to work with, as that is
implied by the command they are nested under.

CLI:
* Group permission related commands
([#730](https://github.com/databricks/cli/pull/730)).

Bundles:
* Fixed artifact file uploading on Windows and wheel execution on DBR
13.3 ([#722](https://github.com/databricks/cli/pull/722)).
* Make resource and artifact paths in bundle config relative to config
folder ([#708](https://github.com/databricks/cli/pull/708)).
* Add support for ordering of input prompts
([#662](https://github.com/databricks/cli/pull/662)).
* Fix IsServicePrincipal() only working for workspace admins
([#732](https://github.com/databricks/cli/pull/732)).
* databricks bundle init template v1
([#686](https://github.com/databricks/cli/pull/686)).
* databricks bundle init template v2: optional stubs, DLT support
([#700](https://github.com/databricks/cli/pull/700)).
* Show 'databricks bundle init' template in CLI prompt
([#725](https://github.com/databricks/cli/pull/725)).
* Include $PATH in set of environment variables to pass along.
([#736](https://github.com/databricks/cli/pull/736)).

Internal:
* Update Go SDK to v0.19.0
([#729](https://github.com/databricks/cli/pull/729)).
* Replace API call to test configuration with dummy authenticate call
([#728](https://github.com/databricks/cli/pull/728)).

API Changes:
* Changed `databricks account storage-credentials create` command to
return .
* Changed `databricks account storage-credentials get` command to return
.
* Changed `databricks account storage-credentials list` command to
return .
* Changed `databricks account storage-credentials update` command to
return .
* Changed `databricks connections create` command with new required
argument order.
* Changed `databricks connections update` command with new required
argument order.
* Changed `databricks volumes create` command with new required argument
order.
 * Added `databricks artifact-allowlists` command group.
 * Added `databricks model-versions` command group.
 * Added `databricks registered-models` command group.
 * Added `databricks cluster-policies get-permission-levels` command.
 * Added `databricks cluster-policies get-permissions` command.
 * Added `databricks cluster-policies set-permissions` command.
 * Added `databricks cluster-policies update-permissions` command.
 * Added `databricks clusters get-permission-levels` command.
 * Added `databricks clusters get-permissions` command.
 * Added `databricks clusters set-permissions` command.
 * Added `databricks clusters update-permissions` command.
 * Added `databricks instance-pools get-permission-levels` command.
 * Added `databricks instance-pools get-permissions` command.
 * Added `databricks instance-pools set-permissions` command.
 * Added `databricks instance-pools update-permissions` command.
 * Added `databricks files` command group.
 * Changed `databricks permissions set` command to start returning .
 * Changed `databricks permissions update` command to start returning .
 * Added `databricks users get-permission-levels` command.
 * Added `databricks users get-permissions` command.
 * Added `databricks users set-permissions` command.
 * Added `databricks users update-permissions` command.
 * Added `databricks jobs get-permission-levels` command.
 * Added `databricks jobs get-permissions` command.
 * Added `databricks jobs set-permissions` command.
 * Added `databricks jobs update-permissions` command.
 * Changed `databricks experiments get-by-name` command to return .
 * Changed `databricks experiments get-experiment` command to return .
 * Added `databricks experiments delete-runs` command.
 * Added `databricks experiments get-permission-levels` command.
 * Added `databricks experiments get-permissions` command.
 * Added `databricks experiments restore-runs` command.
 * Added `databricks experiments set-permissions` command.
 * Added `databricks experiments update-permissions` command.
 * Added `databricks model-registry get-permission-levels` command.
 * Added `databricks model-registry get-permissions` command.
 * Added `databricks model-registry set-permissions` command.
 * Added `databricks model-registry update-permissions` command.
 * Added `databricks pipelines get-permission-levels` command.
 * Added `databricks pipelines get-permissions` command.
 * Added `databricks pipelines set-permissions` command.
 * Added `databricks pipelines update-permissions` command.
 * Added `databricks serving-endpoints get-permission-levels` command.
 * Added `databricks serving-endpoints get-permissions` command.
 * Added `databricks serving-endpoints set-permissions` command.
 * Added `databricks serving-endpoints update-permissions` command.
 * Added `databricks token-management get-permission-levels` command.
 * Added `databricks token-management get-permissions` command.
 * Added `databricks token-management set-permissions` command.
 * Added `databricks token-management update-permissions` command.
* Changed `databricks dashboards create` command with new required
argument order.
 * Added `databricks warehouses get-permission-levels` command.
 * Added `databricks warehouses get-permissions` command.
 * Added `databricks warehouses set-permissions` command.
 * Added `databricks warehouses update-permissions` command.
 * Added `databricks dashboard-widgets` command group.
 * Added `databricks query-visualizations` command group.
 * Added `databricks repos get-permission-levels` command.
 * Added `databricks repos get-permissions` command.
 * Added `databricks repos set-permissions` command.
 * Added `databricks repos update-permissions` command.
 * Added `databricks secrets get-secret` command.
 * Added `databricks workspace get-permission-levels` command.
 * Added `databricks workspace get-permissions` command.
 * Added `databricks workspace set-permissions` command.
 * Added `databricks workspace update-permissions` command.

OpenAPI commit 09a7fa63d9ae243e5407941f200960ca14d48b07 (2023-09-04)
2023-09-06 11:46:21 +00:00
Lennart Kats (databricks) f9e521b43e
databricks bundle init template v2: optional stubs, DLT support (#700)
## Changes

This follows up on https://github.com/databricks/cli/pull/686. This PR
makes our stubs optional + it adds DLT stubs:

```
$ databricks bundle init
Template to use [default-python]: default-python
Unique name for this project [my_project]: my_project
Include a stub (sample) notebook in 'my_project/src' [yes]: yes
Include a stub (sample) DLT pipeline in 'my_project/src' [yes]: yes
Include a stub (sample) Python package 'my_project/src' [yes]: yes
 Successfully initialized template
```

## Tests
Manual testing, matrix tests.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-06 09:52:31 +00:00
Andrew Nester a41b9e8bf2
Added description for version command (#737)
## Changes
Added description for version command

## Tests
```
databricks help

...

Additional Commands:
  account              Databricks Account Commands
  api                  Perform Databricks API call
  auth                 Authentication related commands
  bundle               Databricks Asset Bundles
  completion           Generate the autocompletion script for the specified shell
  fs                   Filesystem related commands
  help                 Help about any command
  sync                 Synchronize a local directory to a workspace directory
  version              Retrieve information about the current version of CLI
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-06 08:41:47 +00:00
Pieter Noordhuis fabe8e88b8
Include $PATH in set of environment variables to pass along. (#736)
## Changes

This is necessary to ensure that our Terraform provider can use the same
auxiliary programs (e.g. `az`, or `gcloud`) as the CLI.
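
A hedged sketch of the idea (not the CLI's actual code; the variable allowlist is illustrative): the child-process environment is assembled from a fixed set of variables, and `PATH` now needs to be part of that set so programs like `az` or `gcloud` can be resolved.

```go
package main

import (
	"fmt"
	"os"
)

// passthroughEnv collects an allowlisted set of environment variables to
// pass to the Terraform child process. The variable names here are
// illustrative; the point is that PATH is part of the set.
func passthroughEnv() []string {
	keys := []string{"HOME", "TMPDIR", "PATH"}
	var env []string
	for _, k := range keys {
		if v, ok := os.LookupEnv(k); ok {
			env = append(env, fmt.Sprintf("%s=%s", k, v))
		}
	}
	return env
}

func main() {
	fmt.Println(passthroughEnv())
}
```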

## Tests

Unit test and manual verification.
2023-09-06 07:54:35 +00:00
shreyas-goenka 9194418ac1
Fix regex error check in mkdir integration test (#735)
## Changes
Fixes the test for all cloud providers after the Go SDK bump, which introduces
the `non retryable error` prefix to errors. The test passes now.
2023-09-05 14:25:26 +00:00
Lennart Kats (databricks) e533f9109a
Show 'databricks bundle init' template in CLI prompt (#725)
~(this should be changed to target `main`)~

This reveals the template from
https://github.com/databricks/cli/pull/686 in CLI prompts once #686
and #708 are merged.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-05 13:57:01 +00:00
Lennart Kats (databricks) 8c2cc07f7b
databricks bundle init template v1 (#686)
## Changes

This adds a built-in "default-python" template to the CLI. This is based
on the new default-template support of
https://github.com/databricks/cli/pull/685.

The goal here is to offer an experience where customers can simply type
`databricks bundle init` to get a default template:

```
$ databricks bundle init
Template to use [default-python]: default-python
Unique name for this project [my_project]: my_project
 Successfully initialized template
```

The present template:
- [x] Works well with VS Code
- [x] Works well with the workspace
- [x] Works well with DB Connect
- [x] Uses minimal stubs rather than boiler-plate-heavy examples

I'll have a followup with tests + DLT support.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-05 11:58:34 +00:00
Lennart Kats (databricks) 947d5b1e5c
Fix IsServicePrincipal() only working for workspace admins (#732)
## Changes

The latest rendition of isServicePrincipal no longer worked for
non-admin users as it used the "principals get" API.

This new version relies on the property that service principals always
have a UUID as their userName. This was tested with the eng-jaws
principal (8b948b2e-d2b5-4b9e-8274-11b596f3b652).
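
In other words, the check reduces to "does the userName parse as a UUID?". A sketch under that assumption, using github.com/google/uuid:

```go
package main

import (
	"fmt"

	"github.com/google/uuid"
)

// isServicePrincipal applies the heuristic described above: service
// principals have a UUID as their userName, regular users do not.
func isServicePrincipal(userName string) bool {
	_, err := uuid.Parse(userName)
	return err == nil
}

func main() {
	fmt.Println(isServicePrincipal("8b948b2e-d2b5-4b9e-8274-11b596f3b652")) // true
	fmt.Println(isServicePrincipal("lennart.kats@databricks.com"))          // false
}
```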
2023-09-05 11:20:55 +00:00
Pieter Noordhuis f62def3e77
Replace API call to test configuration with dummy authenticate call (#728)
## Changes

This reduces the latency of every workspace command by the duration of a
single API call to retrieve the current user (which can take up to a
full second).

Note: the better place to verify that a request can be authenticated is
the SDK itself.
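
A hedged sketch of the approach, assuming the Go SDK's `config.Config.Authenticate` method can be applied to an empty request to resolve credentials without any API round trip:

```go
package main

import (
	"fmt"
	"net/http"

	"github.com/databricks/databricks-sdk-go/config"
)

func main() {
	cfg := &config.Config{}

	// Construct an empty request; no API call is made. Resolving
	// credentials for this request is enough to verify that the
	// configuration can authenticate.
	r := &http.Request{Header: http.Header{}}
	if err := cfg.Authenticate(r); err != nil {
		fmt.Println("cannot authenticate:", err)
		return
	}
	fmt.Println("configuration resolved credentials successfully")
}
```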

## Tests

* Unit test to confirm that an empty `*http.Request` can be constructed
* Manually confirmed that the additional API call no longer happens
2023-09-05 11:10:37 +00:00
shreyas-goenka bbbeabf98c
Add support for ordering of input prompts (#662)
## Changes

JSON schema properties are a map and thus unordered.

This PR introduces a JSON schema extension field called `order` to allow
template authors to define the order in which template variables should
be resolved/prompted.
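
Conceptually, the prompting order is recovered by sorting property names on their numeric `order` value; a hedged sketch with illustrative names (not the template package's API):

```go
package main

import (
	"fmt"
	"sort"
)

// orderedProperties returns property names sorted by their "order"
// extension value; properties without an order keep their relative
// position after the ordered ones.
func orderedProperties(order map[string]int, names []string) []string {
	sort.SliceStable(names, func(i, j int) bool {
		oi, iok := order[names[i]]
		oj, jok := order[names[j]]
		switch {
		case iok && jok:
			return oi < oj
		case iok:
			return true // ordered properties come before unordered ones
		default:
			return false
		}
	})
	return names
}

func main() {
	order := map[string]int{"project_name": 1, "include_notebook": 2}
	fmt.Println(orderedProperties(order, []string{"include_notebook", "cloud", "project_name"}))
	// [project_name include_notebook cloud]
}
```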

## Tests

Unit tests.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-05 11:08:25 +00:00
Pieter Noordhuis 2f2386ef5a
Work on GitHub Action (#733)
## Changes

* Run the build workflow on push to main to properly use the build cache

Same as https://github.com/databricks/databricks-sdk-go/pull/601.

## Tests

n/a
2023-09-05 09:58:56 +00:00
Pieter Noordhuis 7a130a3e6e
Group permission related commands (#730)
## Changes

Before:
```
Usage:
  databricks instance-pools [command]

Available Commands:
  create                Create a new instance pool.
  delete                Delete an instance pool.
  edit                  Edit an existing instance pool.
  get                   Get instance pool information.
  get-permission-levels Get instance pool permission levels.
  get-permissions       Get instance pool permissions.
  list                  List instance pool info.
  set-permissions       Set instance pool permissions.
  update-permissions    Update instance pool permissions.
```

After:
```
Usage:
  databricks instance-pools [command]

Available Commands
  create                Create a new instance pool.
  delete                Delete an instance pool.
  edit                  Edit an existing instance pool.
  get                   Get instance pool information.
  list                  List instance pool info.

Permission Commands
  get-permission-levels Get instance pool permission levels.
  get-permissions       Get instance pool permissions.
  set-permissions       Set instance pool permissions.
  update-permissions    Update instance pool permissions.
```

## Tests

Manual.
2023-09-05 09:58:45 +00:00
Pieter Noordhuis 1752e29885
Update Go SDK to v0.19.0 (#729)
## Changes

* Update Go SDK to v0.19.0
* Update commands per OpenAPI spec from Go SDK
* Incorporate `client.Do()` signature change to include a (nil) header
map
* Update `workspace.WorkspaceService` mock with permissions methods
* Skip `files` service in codegen; already implemented under the `fs`
command

## Tests

Unit and integration tests pass.
2023-09-05 09:43:57 +00:00
Pieter Noordhuis 437263eb58
Upgrade to actions/checkout@v4 (#731)
## Changes

This should fix intermittent failures with v3 (see
https://github.com/actions/checkout/issues/1448)
2023-09-05 08:27:18 +00:00
Andrew Nester 83443bae8d
Make resource and artifact paths in bundle config relative to config folder (#708)
# Warning: breaking change

## Changes
Instead of having paths in bundle config files be relative to bundle
root even if the config file is nested, this PR makes such paths
relative to the folder where the config is located.

When the bundle is initialised, these paths are transformed to paths relative
to the bundle root. For example, given a file structure like this
```
- mybundle
| - bundle.yml
| - subfolder
| -- resource.yml
| -- my.whl
```

Previously, we had to reference `my.whl` in resource.yml like this,
which was confusing because resource.yml is in the same subfolder
```
sync:
  include:
    - ./subfolder/*.whl
...
tasks:
  - task_key: name
    libraries:
      - whl: ./subfolder/my.whl
...
```

After the change we can reference it like this (which is in line with
the current behaviour for notebooks)

```
sync:
  include:
    - ./*.whl
...
tasks:
  - task_key: name
    libraries:
      - whl: ./my.whl
...
```
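
The rewrite described above can be sketched as follows (`translatePath` is an assumed helper, not the bundle's actual mutator): a path written relative to the config file's folder is first made absolute and then expressed relative to the bundle root.

```go
package main

import (
	"fmt"
	"path/filepath"
)

// translatePath rewrites rel, which is relative to the folder containing
// configFile, into a path relative to bundleRoot.
func translatePath(bundleRoot, configFile, rel string) (string, error) {
	abs := filepath.Join(filepath.Dir(configFile), rel)
	return filepath.Rel(bundleRoot, abs)
}

func main() {
	p, _ := translatePath("mybundle", "mybundle/subfolder/resource.yml", "./my.whl")
	fmt.Println(p) // subfolder/my.whl
}
```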

## Tests
Existing `translate_path_tests` successfully passed after refactoring.

Added a couple of use cases for `Libraries` paths.

Added a bundle config test with include config and sync section

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-04 09:55:01 +00:00
Lennart Kats (databricks) e22fd73b7d
Cleanup after previous PR comments (#724)
## Changes

@pietern this addresses a comment from you on a recently merged PR. It
also updates settings.json based on the settings VS Code adds as soon as
you edit a notebook.
2023-09-04 07:07:17 +00:00
Andrew Nester 86c30dd328
Fixed artifact file uploading on Windows and wheel execution on DBR 13.3 (#722)
## Changes
Fixed artifact file uploading on Windows and wheel execution on DBR 13.3

Fixes #719, #720

## Tests
Added regression test for Windows
2023-08-31 14:10:32 +00:00
Pieter Noordhuis cc1038fbd5
Upgrade to actions/setup-go@v4 (#718)
## Changes

Version 4 enables caching by default so we no longer need to explicitly
enable it: https://github.com/actions/setup-go#v4.

The build cache only reuses a cache from a repo's default branch, which
for this repository is `main`. After enabling the merge queue, we no
longer run builds on the `main` branch after push, but on merge queue
branches. With no more builds on the `main` branch there is no longer a
cache to reuse.

This change fixes that by making the `release(-snapshot)?` workflows use
the same caching mechanism. These run off of the `main` branch, so the
cache they save can be reused by builds triggered on PRs or from the
merge queue.

## Tests

We have to merge this to see if it works.
2023-08-30 14:57:34 +00:00
Andrew Nester deebaa89f7
Release v0.203.3 (#716)
Bundles:
* Support cluster overrides with cluster_key and compute_key
([#696](https://github.com/databricks/cli/pull/696)).
* Allow referencing local Python wheels without artifacts section
defined ([#703](https://github.com/databricks/cli/pull/703)).
* Fixed --environment flag
([#705](https://github.com/databricks/cli/pull/705)).
* Correctly identify local paths in libraries section
([#702](https://github.com/databricks/cli/pull/702)).
* Fixed path joining in FindFilesWithSuffixInPath
([#704](https://github.com/databricks/cli/pull/704)).
* Added transformation mutator for Python wheel task for them to work on
DBR <13.1 ([#635](https://github.com/databricks/cli/pull/635)).

Internal:
* Add a foundation for built-in templates
([#685](https://github.com/databricks/cli/pull/685)).
* Test transform when no Python wheel tasks defined
([#714](https://github.com/databricks/cli/pull/714)).
* Pin Terraform binary version to 1.5.5
([#715](https://github.com/databricks/cli/pull/715)).
* Cleanup after "Add a foundation for built-in templates"
([#707](https://github.com/databricks/cli/pull/707)).
* Filter down to Python wheel tasks only for trampoline
([#712](https://github.com/databricks/cli/pull/712)).
* Update Terraform provider schema structs from 1.23.0
([#713](https://github.com/databricks/cli/pull/713)).
2023-08-30 14:31:36 +00:00
Andrew Nester a548eba492
Test transform when no Python wheel tasks defined (#714)
## Changes
Fixed panic from the Python transform when no Python wheel tasks are defined

## Tests
Added regression test
2023-08-30 14:09:15 +00:00
Pieter Noordhuis 46b999ed42
Pin Terraform binary version to 1.5.5 (#715)
## Changes

The installer doesn't respect the version constraints if they are
specified.

Source: [the vc argument is not
used](850464c601/releases/latest_version.go (L158-L177)).

## Tests

Confirmed manually.
2023-08-30 14:08:37 +00:00
Lennart Kats (databricks) 707fd6f617
Cleanup after "Add a foundation for built-in templates" (#707)
## Changes
Add some cleanup based on @pietern's comments on
https://github.com/databricks/cli/pull/685
2023-08-30 14:01:08 +00:00
Pieter Noordhuis aa9e1fc41c
Update Terraform provider schema structs from 1.23.0 (#713)
## Changes

The provider at version 1.24.0 includes a regression for the MLflow
model resource.

To fix this, we explicitly pin the provider version at the version we
generate bindings for.

## Tests

Confirmed that a deploy of said MLflow model resource works with 1.23.0.
2023-08-30 13:58:28 +00:00
Pieter Noordhuis ca2f1dc06c
Filter down to Python wheel tasks only for trampoline (#712)
## Changes

Fixes issue introduced in #635.

## Tests

Added new unit test to confirm correct behavior.

Manually deployed sample bundle.
2023-08-30 13:51:15 +00:00
Andrew Nester 12368e3382
Added transformation mutator for Python wheel task for them to work on DBR <13.1 (#635)
## Changes
***Note: this PR relies on sync.include functionality from here:
https://github.com/databricks/cli/pull/671***

Added a transformation mutator for Python wheel tasks so they work on
DBR < 13.1

Using wheels uploaded to the Workspace file system as cluster libraries is not
supported in DBR < 13.1

In order to make Python wheels work correctly on DBR < 13.1 we do the
following:
1. Build and upload the Python wheel as usual
2. Transform the Python wheel task into a special notebook task (illustrated
below) which does the following:
   a. Installs all necessary wheels with the %pip magic
   b. Executes the defined entry point with all provided parameters
3. Upload this notebook file to the workspace file system
4. Deploy the transformed job task

This is also beneficial for executing on existing clusters because this
notebook always reinstalls wheels, so if there are any changes to the
wheel package, they are correctly picked up
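
To make the transformation concrete, the generated notebook roughly has the shape below; this is an illustration of the idea, not the CLI's actual template, and the entry-point invocation is simplified.

```go
package main

import "fmt"

// trampolineNotebook is an illustrative template for the notebook that
// replaces the Python wheel task on DBR < 13.1: it force-reinstalls the
// uploaded wheel with %pip and then invokes the declared entry point.
const trampolineNotebook = `# Databricks notebook source
%pip install --force-reinstall {{.WheelPath}}

# COMMAND ----------
import sys
sys.argv = {{.Parameters}}

from my_test_code import run  # package / entry point from the task definition
run()
`

func main() {
	// The real mutator would render placeholders like {{.WheelPath}} from
	// the task definition before uploading the notebook.
	fmt.Println(trampolineNotebook)
}
```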

## Tests
bundle.yml

```yaml
bundle:
  name: wheel-task

workspace:
  host: ****

resources:
  jobs:
    test_job:
      name: "[${bundle.environment}] My Wheel Job"
      tasks:
        - task_key: TestTask
          existing_cluster_id: "***"
          python_wheel_task:
            package_name: "my_test_code"
            entry_point: "run"
            parameters: ["first argument","first value","second argument","second value"]
          libraries:
          - whl: ./dist/*.whl
```

Output
```
andrew.nester@HFW9Y94129 wheel % databricks bundle run test_job
Run URL: ***

2023-08-03 15:58:04 "[default] My Wheel Job" TERMINATED SUCCESS 
Output:
=======
Task TestTask:
Hello from my func
Got arguments v1:
['python', 'first argument', 'first value', 'second argument', 'second value']

```
2023-08-30 12:21:39 +00:00
Andrew Nester 3f2cf3c6b7
Fixed path joining in FindFilesWithSuffixInPath (#704)
## Changes
Fixes #693 

## Tests
Newly added tests failed before the fix:
https://github.com/databricks/cli/actions/runs/6000754026/job/16273507998?pr=704
2023-08-29 08:26:26 +00:00
Andrew Nester 842cd8b7ae
Correctly identify local paths in libraries section (#702)
## Changes
Fixes #699

## Tests
Added unit test
2023-08-29 08:26:09 +00:00
Andrew Nester 5477afe4f4
Fixed --environment flag (#705)
## Changes
Fixed --environment flag

Fixes https://github.com/databricks/setup-cli/issues/35

## Tests
Added regression test
2023-08-28 17:05:55 +00:00
Andrew Nester 5f6289e3a7
Allow referencing local Python wheels without artifacts section defined (#703)
## Changes
Now if the user references local Python wheel files and does not specify an
"artifacts" section, these files will be automatically uploaded by the CLI.

Fixes #693 

## Tests
Added unit tests

Ran bundle deploy for this configuration
```
resources:
  jobs:
    some_other_job:
      name: "[${bundle.environment}] My Wheel Job"
      tasks:
        - task_key: TestTask
          existing_cluster_id: ${var.job_existing_cluster}
          python_wheel_task:
            package_name: "my_test_code"
            entry_point: "run"
          libraries:
          - whl: ./dist/*.whl
 ```
 
 Result
 
 ```
andrew.nester@HFW9Y94129 wheel % databricks bundle deploy
artifacts.whl.AutoDetect: Detecting Python wheel project...
artifacts.whl.AutoDetect: No Python wheel project found at bundle root folder
Starting upload of bundle files
Uploaded bundle files at /Users/andrew.nester@databricks.com/.bundle/wheel-task/default/files!

artifacts.Upload(my_test_code-0.0.1-py3-none-any.whl): Uploading...
artifacts.Upload(my_test_code-0.0.1-py3-none-any.whl): Upload succeeded
 
 ```
2023-08-28 16:29:04 +00:00
Lennart Kats (databricks) 861f33d376
Support cluster overrides with cluster_key and compute_key (#696)
## Changes

Support `databricks bundle deploy --compute-id my_all_purpose_cluster`
in two missing cases.
2023-08-28 07:51:35 +00:00
Lennart Kats (databricks) a5b86093ec
Add a foundation for built-in templates (#685)
## Changes

This pull request extends the templating support in preparation for a
new, default template (WIP, https://github.com/databricks/cli/pull/686):
* builtin templates that can be initialized using e.g. `databricks
bundle init default-python`
* builtin templates are embedded into the executable using go's `embed`
functionality, making sure they're co-versioned with the CLI (see the
sketch after this list)
* new helpers to get the workspace name, current user name, etc. help
craft a complete template
* (not enabled yet) when the user types `databricks bundle init` they
can interactively select the `default-python` template
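
A minimal sketch of the `embed` approach (paths and variable names are illustrative, not the CLI's actual layout):

```go
package main

import (
	"embed"
	"fmt"
)

// Built-in templates are compiled into the binary, so they are always
// co-versioned with the CLI that ships them.
//
//go:embed templates/default-python
var builtinTemplates embed.FS

func main() {
	entries, err := builtinTemplates.ReadDir("templates/default-python")
	if err != nil {
		panic(err)
	}
	for _, e := range entries {
		fmt.Println(e.Name())
	}
}
```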

And makes two tangentially related changes:
* IsServicePrincipal now uses the "users" API rather than the
"principals" API, since the latter is too slow for our purposes.
* mode: prod no longer requires the 'target.prod.git' setting. It's hard
to set that from a template. (Pieter is planning an overhaul of warnings
support; this would be one of the first warnings we show.)

The actual `default-python` template is maintained in a separate PR:
https://github.com/databricks/cli/pull/686

## Tests
Unit tests, manual testing
2023-08-25 09:03:42 +00:00
Andrew Nester c5cd20de23
Release v0.203.2 (#694)
CLI:
* Added `databricks account o-auth-enrollment enable` command
([#687](https://github.com/databricks/cli/pull/687)).

Bundles:
* Do not try auto detect Python package if no Python wheel tasks defined
([#674](https://github.com/databricks/cli/pull/674)).
* Renamed `environments` to `targets` in bundle configuration
([#670](https://github.com/databricks/cli/pull/670)).
* Rename init project-dir flag to output-dir
([#676](https://github.com/databricks/cli/pull/676)).
* Added support for sync.include and sync.exclude sections
([#671](https://github.com/databricks/cli/pull/671)).
* Add template directory flag for bundle templates
([#675](https://github.com/databricks/cli/pull/675)).
* Never ignore root directory when enumerating files in a repository
([#683](https://github.com/databricks/cli/pull/683)).
* Improve 'mode' error message
([#681](https://github.com/databricks/cli/pull/681)).
* Added run_as section for bundle configuration
([#692](https://github.com/databricks/cli/pull/692)).
2023-08-24 11:04:20 +00:00
Andrew Nester 4ee926b885
Added run_as section for bundle configuration (#692)
## Changes
Added run_as section for bundle configuration.

This section allows defining a user name or service principal that will
be used as the execution identity for jobs and DLT pipelines. In the case
of DLT, the identity defined in `run_as` will be assigned the `IS_OWNER`
permission on the pipeline.

## Tests
Added unit tests for configuration.

Also ran deploy for the following bundle configuration

```
bundle:
  name: "run_as"

run_as:
  # service_principal_name: "f7263fcc-56d0-4981-8baf-c2a45296690b"
  user_name: "lennart.kats@databricks.com"

resources:
  pipelines:
    andrew_pipeline:
      name: "Andrew Nester pipeline"
      libraries:
        - notebook:
            path: ./test.py

  jobs:
    job_one:
      name: Job One
      tasks:
        - task_key: "task"
          new_cluster:
            num_workers: 1
            spark_version: 13.2.x-snapshot-scala2.12
            node_type_id: i3.xlarge
            runtime_engine: PHOTON
          notebook_task: 
            notebook_path: "./test.py"
```
2023-08-23 16:47:07 +00:00
Serge Smertin 5ed635a240
Added `databricks account o-auth-enrollment enable` command (#687)
This command takes the user through the interactive flow to set up OAuth
for a fresh account, where only Basic authentication works.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2023-08-21 16:17:02 +00:00