Commit Graph

8 Commits

Author SHA1 Message Date
Pieter Noordhuis d80c35f66a
Rename variable `bundle -> b` (#989)
## Changes

All calls to apply a mutator must go through `bundle.Apply`. This
conflicts with the existing use of the variable `bundle`. This change
un-aliases the variable from the package name by renaming all variables
to `b`.

## Tests

Pass.
2023-11-15 14:03:36 +00:00
Andrew Nester 0daa0022af
Make a notebook wrapper for Python wheel tasks optional (#797)
## Changes
Instead of always using the notebook wrapper for Python wheel tasks, let's
make it an opt-in option.

Now, by default, Python wheel tasks will be deployed as is to the Databricks
platform.
If the notebook wrapper is required (DBR < 13.1 or other configuration
differences), users can provide the following experimental setting:

```
experimental:
  python_wheel_wrapper: true
```
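
For reference, a minimal sketch of where this setting sits in a fuller bundle configuration (the bundle, job, and package names below are illustrative, not taken from this PR):

```yaml
bundle:
  name: my-wheel-bundle        # illustrative bundle name

experimental:
  # opt back in to the notebook wrapper, e.g. when targeting DBR < 13.1
  python_wheel_wrapper: true

resources:
  jobs:
    wheel_job:                 # illustrative job
      name: "My Wheel Job"
      tasks:
        - task_key: main
          python_wheel_task:
            package_name: my_test_code
            entry_point: run
          libraries:
            - whl: ./dist/*.whl
```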

Fixes #783,
https://github.com/databricks/databricks-asset-bundles-dais2023/issues/8

## Tests
Added unit tests.

Integration tests passed for both cases

```
    helpers.go:163: [databricks stdout]: Hello from my func
    helpers.go:163: [databricks stdout]: Got arguments:
    helpers.go:163: [databricks stdout]: ['my_test_code', 'one', 'two']
    ...
Bundle remote directory is ***/.bundle/ac05d5e8-ed4b-4e34-b3f2-afa73f62b021
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRunWithWrapper3733431114/001/.databricks/bundle/default/sync-snapshots/cac1e02f3941a97b.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRunWithWrapper (214.18s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       214.495s        coverage: 93.5% of statements in ./...

```

```
    helpers.go:163: [databricks stdout]: Hello from my func
    helpers.go:163: [databricks stdout]: Got arguments:
    helpers.go:163: [databricks stdout]: ['my_test_code', 'one', 'two']
    ...
Bundle remote directory is ***/.bundle/0ef67aaf-5960-4049-bf1d-dc9e29157421
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRunWithoutWrapper2340216760/001/.databricks/bundle/default/sync-snapshots/edf0b322cee93b13.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRunWithoutWrapper (192.36s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       195.130s        coverage: 93.5% of statements in ./...

```
2023-09-26 14:32:20 +00:00
Andrew Nester 18a5b05d82
Apply Python wheel trampoline if workspace library is used (#755)
## Changes
A workspace library will be detected by the trampoline in 2 cases:
- The user configured a local wheel file
- The user configured a remote wheel file from the Workspace file system

In both of these cases we should correctly apply the Python trampoline.
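
For illustration, the two library shapes that should both trigger the trampoline look roughly like this (task keys, package names, and paths are placeholders):

```yaml
tasks:
  - task_key: local_wheel
    python_wheel_task:
      package_name: my_test_code
      entry_point: run
    libraries:
      # a local wheel file, built and uploaded by the bundle
      - whl: ./dist/*.whl
  - task_key: workspace_wheel
    python_wheel_task:
      package_name: my_test_code
      entry_point: run
    libraries:
      # a wheel that already lives in the Workspace file system
      - whl: /Workspace/Shared/wheels/my_test_code-0.1.0-py3-none-any.whl
```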

## Tests
Added a regression test (also covered by Python e2e test)
2023-09-08 13:45:21 +00:00
Andrew Nester 67af171a68
Process only Python wheel tasks which have local libraries used (#751)
## Changes
Process only Python wheel tasks that use local libraries.
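
As a rough sketch of the distinction (task keys and library values are illustrative): a Python wheel task that references a local wheel is processed, while one that only references a remote PyPI library is left as is:

```yaml
tasks:
  - task_key: uses_local_wheel   # references a local wheel, so the mutator applies
    python_wheel_task:
      package_name: my_test_code
      entry_point: run
    libraries:
      - whl: ./dist/*.whl
  - task_key: no_local_wheel     # only a PyPI dependency, so it is left untouched
    python_wheel_task:
      package_name: some_published_pkg
      entry_point: run
    libraries:
      - pypi:
          package: some-published-pkg==1.0.0
```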

## Tests
Updated a unit test to catch the regression
2023-09-08 11:08:21 +00:00
Andrew Nester 83443bae8d
Make resource and artifact paths in bundle config relative to config folder (#708)
# Warning: breaking change

## Changes
Instead of having paths in bundle config files be relative to the bundle
root even if the config file is nested, this PR makes such paths
relative to the folder where the config file is located.

When the bundle is initialised, these paths will be transformed into paths
relative to the bundle root. For example, given a file structure like this:
```
- mybundle
| - bundle.yml
| - subfolder
| -- resource.yml
| -- my.whl
```

Previously, we had to reference `my.whl` in resource.yml like this,
which was confusing because resource.yml is in the same subfolder
```
sync:
  include:
    - ./subfolder/*.whl
...
tasks:
  - task_key: name
    libraries:
      - whl: ./subfolder/my.whl
...
```

After the change we can reference it like this (which is in line with
the current behaviour for notebooks)

```
sync:
  include:
    - ./*.whl
...
tasks:
  - task_key: name
    libraries:
      - whl: ./my.whl
...
```
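
Conceptually, once the bundle is initialised these config-file-relative paths are rewritten to be relative to the bundle root in the merged configuration, roughly like this (not literal CLI output):

```yaml
# conceptual merged view, with paths rewritten relative to the bundle root
sync:
  include:
    - subfolder/*.whl
...
tasks:
  - task_key: name
    libraries:
      - whl: subfolder/my.whl
...
```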

## Tests
Existing `translate_path_tests` successfully passed after refactoring.

Added a couple of use cases for `Libraries` paths.

Added bundle config tests with an include config and a sync section

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-04 09:55:01 +00:00
Andrew Nester a548eba492
Test transform when no Python wheel tasks defined (#714)
## Changes
Fixed a panic in the Python transform when no Python wheel tasks are defined

## Tests
Added regression test
2023-08-30 14:09:15 +00:00
Pieter Noordhuis ca2f1dc06c
Filter down to Python wheel tasks only for trampoline (#712)
## Changes

Fixes issue introduced in #635.

## Tests

Added new unit test to confirm correct behavior.

Manually deployed sample bundle.
2023-08-30 13:51:15 +00:00
Andrew Nester 12368e3382
Added transformation mutator for Python wheel task for them to work on DBR <13.1 (#635)
## Changes
***Note: this PR relies on sync.include functionality from here:
https://github.com/databricks/cli/pull/671***

Added a transformation mutator for Python wheel tasks so that they work on
DBR < 13.1.

Using wheels uploaded to the Workspace file system as cluster libraries is not
supported in DBR < 13.1.

In order to make Python wheel tasks work correctly on DBR < 13.1 we do the
following:
1. Build and upload the Python wheel as usual
2. Transform the Python wheel task into a special notebook task which does the
following:
   a. Installs all necessary wheels with the `%pip` magic
   b. Executes the defined entry point with all provided parameters
3. Upload this notebook file to the workspace file system
4. Deploy the transformed job task

This is also beneficial for executing on existing clusters because the
notebook always reinstalls the wheels, so if there are any changes to the
wheel package, they are correctly picked up.
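
Conceptually, the deployed task ends up as a notebook task pointing at the generated wrapper notebook instead of a `python_wheel_task`; a rough sketch of the result (the notebook path below is a placeholder, the real location and notebook contents are generated by the CLI):

```yaml
tasks:
  - task_key: TestTask
    existing_cluster_id: "***"
    notebook_task:
      # a generated wrapper notebook that %pip-installs the uploaded wheel(s)
      # and then invokes the configured entry point with the task parameters
      notebook_path: /Workspace/path/to/generated/wrapper_notebook
```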

## Tests
bundle.yml

```yaml
bundle:
  name: wheel-task

workspace:
  host: ****

resources:
  jobs:
    test_job:
      name: "[${bundle.environment}] My Wheel Job"
      tasks:
        - task_key: TestTask
          existing_cluster_id: "***"
          python_wheel_task:
            package_name: "my_test_code"
            entry_point: "run"
            parameters: ["first argument","first value","second argument","second value"]
          libraries:
          - whl: ./dist/*.whl
```

Output
```
andrew.nester@HFW9Y94129 wheel % databricks bundle run test_job
Run URL: ***

2023-08-03 15:58:04 "[default] My Wheel Job" TERMINATED SUCCESS 
Output:
=======
Task TestTask:
Hello from my func
Got arguments v1:
['python', 'first argument', 'first value', 'second argument', 'second value']

```
2023-08-30 12:21:39 +00:00