Commit Graph

12 Commits

Author SHA1 Message Date
Andrew Nester 1872aa12b3
Added support for job environments (#1379)
## Changes
The main changes are:
1. Stop linking artifacts to libraries; instead, iterate over all jobs and tasks when uploading artifacts and rewrite local paths to remote ones
2. Iterate over `jobs.environments` to find local libraries and check that they exist locally (see the sketch below)
3. Add tests to verify that environments are handled correctly
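
For illustration, here is a minimal sketch of the environment check in item 2, using simplified stand-in types (`Job`, `Environment`, `isLocalLibrary`) rather than the actual bundle config structures:

```
package main

import (
	"fmt"
	"os"
	"strings"
)

// Environment and Job are simplified stand-ins for the bundle config types.
type Environment struct {
	Dependencies []string
}

type Job struct {
	Name         string
	Environments []Environment
}

// isLocalLibrary treats relative paths as local files, as opposed to
// remote references such as PyPI package names.
func isLocalLibrary(dep string) bool {
	return strings.HasPrefix(dep, "./") || strings.HasPrefix(dep, "../")
}

// checkLocalLibraries verifies that every local dependency referenced by
// a job environment exists on disk.
func checkLocalLibraries(jobs []Job) error {
	for _, job := range jobs {
		for _, env := range job.Environments {
			for _, dep := range env.Dependencies {
				if isLocalLibrary(dep) {
					if _, err := os.Stat(dep); err != nil {
						return fmt.Errorf("job %s: local library %s not found: %w", job.Name, dep, err)
					}
				}
			}
		}
	}
	return nil
}

func main() {
	jobs := []Job{{
		Name:         "etl",
		Environments: []Environment{{Dependencies: []string{"./dist/my_lib.whl", "pandas"}}},
	}}
	fmt.Println(checkLocalLibraries(jobs))
}
```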

An end-to-end test will follow in a separate PR.

## Tests
Added regression test, existing tests (including integration one) pass
2024-04-22 11:44:34 +00:00
Pieter Noordhuis 00d76d5afa
Move path field to bundle type (#1316)
## Changes

The bundle path was previously stored on the `config.Root` type under the
assumption that the first configuration file to be loaded would set it.
This is slightly counterintuitive, since we already know the path when the
bundle is constructed. The new location for this property reflects this.
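
As a rough sketch of the direction (with hypothetical names, not the actual `bundle` package definitions), the property moves from the configuration root to the bundle itself:

```
package main

import "fmt"

// Root is a stand-in for config.Root; it no longer carries the bundle path.
type Root struct{}

// Bundle now owns the root path, which is known at construction time.
type Bundle struct {
	RootPath string
	Config   Root
}

// Load constructs a bundle rooted at the given path; no configuration
// file needs to be read for the path to be set.
func Load(path string) *Bundle {
	return &Bundle{RootPath: path}
}

func main() {
	b := Load("/home/user/my_bundle")
	fmt.Println(b.RootPath)
}
```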

## Tests

Unit tests pass.
2024-03-27 09:03:24 +00:00
Pieter Noordhuis ed194668db
Return `diag.Diagnostics` from mutators (#1305)
## Changes

This diagnostics type allows us to capture multiple warnings as well as
errors in the return value. This is preparation for returning additional
warnings from mutators when we detect non-fatal problems.

* All return statements that previously returned an error now return `diag.FromErr`
* All return statements that previously returned `fmt.Errorf` now return `diag.Errorf`
* All `err != nil` checks now use `diags.HasError()` or `diags.Error()` (see the sketch below)
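
A self-contained sketch of the pattern, with a minimal stand-in for the `diag` package (the real `diag.Diagnostics` type carries more detail):

```
package main

import "fmt"

// diagnostic and diagnostics are minimal stand-ins for the diag package:
// a slice that can carry multiple warnings and errors instead of a
// single error value.
type diagnostic struct {
	severity string // "error" or "warning"
	summary  string
}

type diagnostics []diagnostic

// fromErr mirrors diag.FromErr: wrap an error, or return nil for nil.
func fromErr(err error) diagnostics {
	if err == nil {
		return nil
	}
	return diagnostics{{severity: "error", summary: err.Error()}}
}

// errorf mirrors diag.Errorf: format a message into an error diagnostic.
func errorf(format string, args ...any) diagnostics {
	return diagnostics{{severity: "error", summary: fmt.Sprintf(format, args...)}}
}

// HasError mirrors diags.HasError.
func (ds diagnostics) HasError() bool {
	for _, d := range ds {
		if d.severity == "error" {
			return true
		}
	}
	return false
}

func validate(path string) error { return nil }

// applyMutator shows the new return shape: diagnostics instead of error.
func applyMutator(path string) diagnostics {
	if path == "" {
		// Previously: return fmt.Errorf("missing configuration path")
		return errorf("missing configuration path")
	}
	if err := validate(path); err != nil {
		// Previously: return err
		return fromErr(err)
	}
	return nil
}

func main() {
	diags := applyMutator("bundle.yml")
	// Previously: if err != nil { ... }
	if diags.HasError() {
		fmt.Println("mutator failed")
	}
}
```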

## Tests

* Existing tests pass.
* I confirmed no call site under `./bundle` or `./cmd/bundle` uses
`errors.Is` on the return value from mutators. This is relevant because,
unlike with `fmt.Errorf`, we cannot wrap errors with `%w` when calling
`diag.Errorf` (context in https://github.com/golang/go/issues/47641).
2024-03-25 14:18:47 +00:00
Andrew Nester ac37a592f1
Added exec.NewCommandExecutor to execute commands with correct interpreter (#1075)
## Changes
Instead of handling command chaining ourselves, we now execute the passed
commands as-is by storing them in a temporary file and passing it to the
correct interpreter (bash or cmd) based on the OS.
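
Roughly, the approach looks like this (a hedged sketch, not the actual `exec.NewCommandExecutor` implementation; `runCommand` is a hypothetical helper):

```
package main

import (
	"fmt"
	"os"
	"os/exec"
	"runtime"
)

// runCommand stores the command string in a temporary script file and
// executes it with the interpreter appropriate for the OS: cmd on
// Windows, bash elsewhere.
func runCommand(command string) ([]byte, error) {
	pattern, interpreter := "script-*.sh", "bash"
	if runtime.GOOS == "windows" {
		pattern, interpreter = "script-*.cmd", "cmd"
	}

	f, err := os.CreateTemp("", pattern)
	if err != nil {
		return nil, err
	}
	defer os.Remove(f.Name())

	if _, err := f.WriteString(command); err != nil {
		return nil, err
	}
	if err := f.Close(); err != nil {
		return nil, err
	}

	args := []string{f.Name()}
	if runtime.GOOS == "windows" {
		args = []string{"/C", f.Name()}
	}
	return exec.Command(interpreter, args...).CombinedOutput()
}

func main() {
	out, err := runCommand("echo one && echo two")
	fmt.Print(string(out))
	if err != nil {
		fmt.Println(err)
	}
}
```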

Fixes #1065 

## Tests
Added unit tests
2023-12-21 15:45:23 +00:00
Lennart Kats (databricks) 875c9d2db1
Tune output of bundle deploy command (#1047)
## Changes

Update the output of the `deploy` command to be more concise and
consistent:
```
$ databricks bundle deploy
Building my_project...
Uploading my_project-0.0.1+20231207.205106-py3-none-any.whl...
Uploading bundle files to /Users/lennart.kats@databricks.com/.bundle/my_project/dev/files...
Deploying resources...
Updating deployment state...
Deployment complete!
```

This does away with the intermediate success messages, makes consistent
use of `...`, and only prints the success message at the very end after
everything is completed.

Below is the original output for comparison:

```
$ databricks bundle deploy
Detecting Python wheel project...
Found Python wheel project at /tmp/output/my_project
Building my_project...
Build succeeded
Uploading my_project-0.0.1+20231207.205134-py3-none-any.whl...
Upload succeeded
Starting upload of bundle files
Uploaded bundle files at /Users/lennart.kats@databricks.com/.bundle/my_project/dev/files!

Starting resource deployment
Resource deployment completed!
```
2023-12-21 08:00:37 +00:00
Andrew Nester 5431174302
Do not add wheel content hash in uploaded Python wheel path (#1015)
## Changes
Removed the hash from the upload path since it's not useful anyway.

The main reason for including the hash was to make re-installation work on
all-purpose clusters. But for that to work, the wheel version needs to be
increased anyway, so having only a hash in the path is useless.

Note: using the --build-number (build tag) flag does not help with
re-installing libraries on all-purpose clusters. The reason is that
`pip` ignores the build tag when upgrading a library and only looks at
the wheel version.
The build tag is only used for sorting versions: the one with the higher
build tag takes priority at install time, and only when no version of the
library is installed yet.
See
a15dd75d98/src/pip/_internal/index/package_finder.py (L522-L556)
https://github.com/pypa/pip/issues/4781

Thus, the only way to reinstall the library on an all-purpose cluster is
to increase the wheel version manually or to use automatic version
generation, e.g.:
```
import datetime
from setuptools import setup

setup(
    # Generate a fresh version for every build, so pip treats each new
    # wheel as an upgrade.
    version=datetime.datetime.utcnow().strftime("%Y%m%d.%H%M%S"),
    # ...
)
```

## Tests
Integration tests passed.
2023-11-29 10:40:12 +00:00
Serge Smertin 7d0f170eee
Added `python.DetectInterpreters` and other utils (#805)
This PR adds a few utilities related to Python interpreter detection:

- `python.DetectInterpreters` to detect all Python versions available in
`$PATH` by executing every matched binary with the `--version` flag.
- `python.DetectVirtualEnvPath` to detect whether there is a child virtual
environment in the `src` directory
- `python.DetectExecutable` to detect whether python3 is installed, either
via the `which python3` command or by calling
`python.DetectInterpreters().AtLeast("v3.8")` (see the sketch below)
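
As a standalone illustration of the detection idea (simplified; the real `python.DetectInterpreters` in this repository has a different shape):

```
package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"strings"
)

// detectInterpreters scans every directory on $PATH for binaries whose
// name starts with "python" and asks each one for its version.
func detectInterpreters() map[string]string {
	found := map[string]string{} // binary path -> version string
	for _, dir := range filepath.SplitList(os.Getenv("PATH")) {
		entries, err := os.ReadDir(dir)
		if err != nil {
			continue
		}
		for _, e := range entries {
			if !strings.HasPrefix(e.Name(), "python") {
				continue
			}
			bin := filepath.Join(dir, e.Name())
			// Some interpreters print the version to stderr, hence CombinedOutput.
			out, err := exec.Command(bin, "--version").CombinedOutput()
			if err != nil {
				continue
			}
			found[bin] = strings.TrimSpace(string(out))
		}
	}
	return found
}

func main() {
	for bin, version := range detectInterpreters() {
		fmt.Printf("%s => %s\n", bin, version)
	}
}
```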

To be merged after https://github.com/databricks/cli/pull/804, as one of
the steps to get https://github.com/databricks/cli/pull/637 in, as
previously discussed.
2023-10-03 10:47:09 +00:00
Andrew Nester 67af171a68
Process only Python wheel tasks which have local libraries used (#751)
## Changes
Process only Python wheel tasks that use local libraries (see the sketch below).
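
A minimal sketch of the filter, assuming a simplified `Task` type and treating `dbfs:/` and `/Workspace/` prefixes as remote (the real config types differ):

```
package main

import (
	"fmt"
	"strings"
)

// Task is a simplified stand-in for a job task definition.
type Task struct {
	TaskKey         string
	PythonWheelTask bool     // whether the task is a python_wheel_task
	Libraries       []string // wheel paths, local or remote
}

// hasLocalLibrary reports whether any library path points to a local
// file rather than a remote location.
func hasLocalLibrary(t Task) bool {
	for _, lib := range t.Libraries {
		if !strings.HasPrefix(lib, "dbfs:/") && !strings.HasPrefix(lib, "/Workspace/") {
			return true
		}
	}
	return false
}

// wheelTasksToProcess keeps only Python wheel tasks that use local libraries.
func wheelTasksToProcess(tasks []Task) []Task {
	var out []Task
	for _, t := range tasks {
		if t.PythonWheelTask && hasLocalLibrary(t) {
			out = append(out, t)
		}
	}
	return out
}

func main() {
	tasks := []Task{
		{TaskKey: "local", PythonWheelTask: true, Libraries: []string{"./dist/app.whl"}},
		{TaskKey: "remote", PythonWheelTask: true, Libraries: []string{"dbfs:/libs/app.whl"}},
	}
	fmt.Println(len(wheelTasksToProcess(tasks))) // prints 1
}
```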

## Tests
Updated unit test to catch the regression
2023-09-08 11:08:21 +00:00
Andrew Nester 5f6289e3a7
Allow referencing local Python wheels without artifacts section defined (#703)
## Changes
Now, if the user references local Python wheel files without specifying an
"artifacts" section, these files are automatically uploaded by the CLI.

Fixes #693 

## Tests
Added unit tests

Ran `bundle deploy` with this configuration:
```
resources:
  jobs:
    some_other_job:
      name: "[${bundle.environment}] My Wheel Job"
      tasks:
        - task_key: TestTask
          existing_cluster_id: ${var.job_existing_cluster}
          python_wheel_task:
            package_name: "my_test_code"
            entry_point: "run"
          libraries:
          - whl: ./dist/*.whl
```

Result:

```
andrew.nester@HFW9Y94129 wheel % databricks bundle deploy
artifacts.whl.AutoDetect: Detecting Python wheel project...
artifacts.whl.AutoDetect: No Python wheel project found at bundle root folder
Starting upload of bundle files
Uploaded bundle files at /Users/andrew.nester@databricks.com/.bundle/wheel-task/default/files!

artifacts.Upload(my_test_code-0.0.1-py3-none-any.whl): Uploading...
artifacts.Upload(my_test_code-0.0.1-py3-none-any.whl): Upload succeeded
```
2023-08-28 16:29:04 +00:00
Andrew Nester 4694832534
Do not try auto detect Python package if no Python wheel tasks defined (#674)
## Changes
Fixes #673 

It also includes the `libraries` change from #635 to get the list of
wheel tasks.
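
In essence this is a guard clause; a sketch with a simplified `Task` stand-in:

```
package main

import "fmt"

// Task is a simplified stand-in for a job task definition.
type Task struct {
	IsPythonWheelTask bool
}

// hasWheelTasks reports whether any task is a Python wheel task. If none
// are, wheel auto-detection can be skipped entirely.
func hasWheelTasks(tasks []Task) bool {
	for _, t := range tasks {
		if t.IsPythonWheelTask {
			return true
		}
	}
	return false
}

func main() {
	tasks := []Task{{IsPythonWheelTask: false}}
	if !hasWheelTasks(tasks) {
		fmt.Println("no Python wheel tasks; skipping auto-detection")
		return
	}
	fmt.Println("auto-detecting Python wheel project")
}
```
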
2023-08-17 09:11:39 +00:00
Andrew Nester cfff140815
Auto detect Python wheel packages and infer build command (#603) 2023-07-26 10:07:26 +00:00
Andrew Nester 9a88fa602d
Added support for artifacts building for bundles (#583)
## Changes
Added support for building artifacts for bundles.

It is now possible to specify an `artifacts` block in bundle.yml and
define an artifact (at the moment, a Python wheel) to be built and
uploaded during `bundle deploy`.

The built artifact is automatically attached to the corresponding job task
or pipeline where it is used as a library (see the sketch below).
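
A rough sketch of the build step (hypothetical `Artifact` type and `buildArtifact` helper; the real config and build logic differ, and bash is assumed here, per the interpreter-selection change in #1075 above):

```
package main

import (
	"fmt"
	"os/exec"
)

// Artifact is a simplified stand-in for an entry in the artifacts block.
type Artifact struct {
	Name  string
	Type  string // e.g. "whl"
	Path  string // local project directory
	Build string // build command, e.g. "python3 setup.py bdist_wheel"
}

// buildArtifact runs the artifact's build command in its project directory.
func buildArtifact(a Artifact) error {
	cmd := exec.Command("bash", "-c", a.Build)
	cmd.Dir = a.Path
	out, err := cmd.CombinedOutput()
	if err != nil {
		return fmt.Errorf("building %s: %w\n%s", a.Name, err, out)
	}
	return nil
}

func main() {
	a := Artifact{
		Name:  "my_wheel",
		Type:  "whl",
		Path:  ".",
		Build: "python3 setup.py bdist_wheel",
	}
	if err := buildArtifact(a); err != nil {
		fmt.Println(err)
	}
}
```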

Follow-ups:
1. If an artifact is used in a job or pipeline but not found in the
config, try to infer and build it anyway
2. If no build command is provided for a Python wheel artifact, infer it
2023-07-25 13:35:08 +02:00