Commit Graph

233 Commits

Author SHA1 Message Date
Andrew Nester f3db42e622
Added support for top-level permissions (#928)
## Changes
Now it's possible to define a top-level `permissions` section in the bundle
configuration; permissions defined there are applied to all resources
defined in the bundle.

Supported top-level permission levels: CAN_MANAGE, CAN_VIEW, CAN_RUN.

Permissions are applied to: jobs, DLT pipelines, ML models, ML
experiments, and model serving endpoints.

```yaml
bundle:
  name: permissions

workspace:
  host: ***

permissions:
  - level: CAN_VIEW
    group_name: test-group
  - level: CAN_MANAGE
    user_name: user@company.com
  - level: CAN_RUN
    service_principal_name: 123456-abcdef
```

## Tests
Added corresponding unit tests + ran `bundle validate` and `bundle
deploy` manually
2023-11-13 11:29:40 +00:00
Pieter Noordhuis 7847388f95
Initialize variable definitions that are defined without properties (#966)
## Changes

We can debate whether or not variable definitions without properties are
valid, but in no case should this panic the CLI.

Fixes #934.
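
For illustration, a minimal definition of this shape (a sketch; `my_var` is a placeholder name) previously caused the panic:

```yaml
variables:
  # Declared without any properties (no default, no description).
  my_var:
```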

## Tests

Unit.
2023-11-08 11:01:14 +00:00
Michał Szafrański 10291b0e13
Bundle path rewrites for dbt and SQL file tasks (#962)
## Changes
Support path rewrites for dbt and SQL file job tasks.
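
A hedged sketch of what this enables (field names follow the Jobs API; the job and paths are placeholders):

```yaml
resources:
  jobs:
    my_job:
      tasks:
        - task_key: run_report
          sql_task:
            warehouse_id: "..."
            file:
              # A local path like this is rewritten to its deployed
              # workspace location when the bundle is deployed.
              path: ./queries/report.sql
```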

## Tests
* Added unit test
2023-11-07 20:00:09 +00:00
shreyas-goenka b6aa4631f1
Fix metadata computation for empty bundle (#939)
## Changes
This PR fixes metadata computation for an empty bundle. Before, we would
error because the `terraform.Load()` mutator errors on an empty or
missing state file.

## Tests
Failing integration tests now pass.
2023-11-02 11:00:30 +00:00
shreyas-goenka 5a8cd0c5bc
Persist deployment metadata in WSFS (#845)
## Changes

This PR introduces a metadata struct that stores a subset of bundle
configuration that we wish to expose to other Databricks services
integrating with bundles.

This metadata file is uploaded to a file
`${bundle.workspace.state_path}/metadata.json` in the WSFS destination
of the bundle deployment.

Documentation for emitted metadata fields:
* `version`: Version of the metadata file schema.
* `config.bundle.git.branch`: Name of the git branch the bundle was
deployed from.
* `config.bundle.git.origin_url`: URL of the git remote named "origin".
* `config.bundle.git.bundle_root_path`: Relative path of the bundle root
from the root of the git repository. Set to "." if they are the same.
* `config.bundle.git.commit`: SHA-1 hash of the exact commit this bundle
was deployed from. Note that the deployment might not exactly match this
commit if there were changes that had not been committed to git at deploy
time.
* `file_path`: Path in the workspace that we sync bundle files to.
* `resources.jobs.[job-ref].id`: ID of the job.
* `resources.jobs.[job-ref].relative_path`: Relative path from the bundle
root to the YAML config file in which this job was defined.

Example metadata object when bundle root and git root are the same:
```json
{
  "version": 1,
  "config": {
    "bundle": {
      "lock": {},
      "git": {
        "branch": "master",
        "origin_url": "www.host.com",
        "commit": "7af8e5d3f5dceffff9295d42d21606ccf056dce0",
        "bundle_root_path": "."
      }
    },
    "workspace": {
      "file_path": "/Users/shreyas.goenka@databricks.com/.bundle/pipeline-progress/default/files"
    },
    "resources": {
      "jobs": {
        "bar": {
          "id": "245921165354846",
          "relative_path": "databricks.yml"
        }
      }
    },
    "sync": {}
  }
}
```

Example metadata object when the git root is one level above the bundle root:
```json
{
  "version": 1,
  "config": {
    "bundle": {
      "lock": {},
      "git": {
        "branch": "dev-branch",
        "origin_url": "www.my-repo.com",
        "commit": "3db46ef750998952b00a2b3e7991e31787e4b98b",
        "bundle_root_path": "pipeline-progress"
      }
    },
    "workspace": {
      "file_path": "/Users/shreyas.goenka@databricks.com/.bundle/pipeline-progress/default/files"
    },
    "resources": {
      "jobs": {
        "bar": {
          "id": "245921165354846",
          "relative_path": "databricks.yml"
        }
      }
    },
    "sync": {}
  }
}
```


This unblocks integration with the jobs break-glass UI for bundles.

## Tests
Unit tests and integration tests.
2023-10-27 12:55:43 +00:00
shreyas-goenka bb662fadbb
Bump Terraform provider to v1.29.0 (#926)
This PR:
1. Regenerates Go structs using provider version 1.29.
2. Adds quality-of-life autogenerated diff labels for GitHub.
3. Adds a small SOP for doing the Terraform provider bump for the Go structs.
2023-10-27 09:16:41 +00:00
Andrew Nester 6f22ae8696
Use UserName instead of Id to check if identity used is a service principal (#924)
## Changes
Use `UserName` instead of `Id` to check whether the identity in use is a
service principal.
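
A minimal sketch of the idea (an assumption about the heuristic, not the exact CLI code): a service principal's `UserName` is a UUID, whereas a human user's `UserName` is an email address, so attempting to parse it as a UUID distinguishes the two.

```go
package main

import (
	"fmt"

	"github.com/google/uuid"
)

// isServicePrincipal: service principals have a UUID as their user name;
// human users have an email address.
func isServicePrincipal(userName string) bool {
	_, err := uuid.Parse(userName)
	return err == nil
}

func main() {
	fmt.Println(isServicePrincipal("9f0621ee-b52b-11ee-a506-0242ac120002")) // true
	fmt.Println(isServicePrincipal("user@company.com"))                     // false
}
```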
2023-10-26 14:58:16 +00:00
Andrew Nester 19e00d2d47
Upload terraform state even if apply fails (#923)
## Changes
Upload terraform state even if apply fails

Fixes #893 
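
A sketch of the pattern (an assumed shape, not the actual CLI code): run apply, then upload the state unconditionally, and still report the apply error.

```go
package main

import (
	"context"
	"errors"
	"fmt"
)

// Hypothetical stand-ins for the Terraform apply and state upload steps.
func apply(ctx context.Context) error       { return errors.New("permission denied") }
func uploadState(ctx context.Context) error { return nil }

// deploy uploads the state even when apply fails, so the local and remote
// state cannot drift apart after a partial deployment.
func deploy(ctx context.Context) error {
	applyErr := apply(ctx)
	if err := uploadState(ctx); err != nil && applyErr == nil {
		applyErr = err
	}
	return applyErr
}

func main() {
	fmt.Println(deploy(context.Background())) // permission denied (state was still uploaded)
}
```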

## Tests
Manually ran `databricks bundle deploy` with incorrect permissions in the
bundle config and observed that the state still gets uploaded correctly.
2023-10-26 14:38:01 +00:00
Pieter Noordhuis 6e21ced54a
Consolidate bundle configuration loader function (#918)
## Changes

There were two functions related to loading a bundle configuration file;
one as a package function and one as a member function on the
configuration type. Loading the same configuration object twice doesn't
make sense and therefore we can consolidate to only using the package
function.

The package function would scan for known file names if the specified
path was a directory. This functionality was not in use because the
top-level bundle loader figures out the filename itself as of #580.

## Tests

Pass.
2023-10-25 12:55:56 +00:00
Pieter Noordhuis 486bf59627
Move bundle configuration filename code (#917)
## Changes

This is unrelated to the config root so belongs in a separate file (this
was added in #580).

## Tests

n/a
2023-10-25 09:54:39 +00:00
Lennart Kats (databricks) 9049f11479
Fix wheel task not working with 13.x clusters (#898)
## Changes

This lets us recognize 13.x as "13.1 or higher," making it possible to
use wheel tasks on 13.x-snapshot clusters.
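
A hedged sketch of the version check (not the actual CLI code): treat a wildcard minor version such as `13.x` as satisfying the "13.1 or higher" requirement for wheel tasks.

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// supportsWheelTasks reports whether a Spark version string such as
// "13.x-snapshot-scala2.12" or "12.2.x-scala2.12" is at least DBR 13.1.
func supportsWheelTasks(sparkVersion string) bool {
	parts := strings.SplitN(sparkVersion, ".", 3)
	if len(parts) < 2 {
		return false
	}
	major, err := strconv.Atoi(parts[0])
	if err != nil {
		return false
	}
	// A wildcard minor ("13.x", "13.x-snapshot") counts as 13.1 or higher.
	if parts[1] == "x" || strings.HasPrefix(parts[1], "x-") {
		return major >= 13
	}
	minor, err := strconv.Atoi(parts[1])
	if err != nil {
		return false
	}
	return major > 13 || (major == 13 && minor >= 1)
}

func main() {
	fmt.Println(supportsWheelTasks("13.x-snapshot-scala2.12")) // true
	fmt.Println(supportsWheelTasks("12.2.x-scala2.12"))        // false
}
```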
2023-10-23 08:19:26 +00:00
Pieter Noordhuis d4be40520c
Resolve configuration before performing verification (#890)
## Changes

If a bundle configuration specifies a workspace host, and the user
specifies a profile to use, we perform a check to confirm that the
workspace host in the bundle configuration and the workspace host from
the profile are identical. If they are not, we return an error. The
check was introduced in #571.

Previously, the code included an assumption that the client
configuration was already loaded from the environment prior to
performing the check. This was not the case, and as such if the user
intended to use a non-default path to `.databrickscfg`, this path was
not used when performing the check.

The fix does the following:
* Resolve the configuration prior to performing the check.
* Don't treat the configuration file not existing as an error.
* Add unit tests.

Fixes #884.

## Tests

Unit tests and manual confirmation.
2023-10-20 13:10:31 +00:00
Andrew Nester 7b1d972b33
Do not emit wheel wrapper error when python_wheel_wrapper setting is true (#894)
## Changes
Do not emit wheel wrapper error when python_wheel_wrapper setting is
true

Fixes #892 

## Tests
Added a regression test
2023-10-20 12:32:04 +00:00
Andrew Nester 5273d0c51a
Support Python wheels larger than 10MB (#879)
## Changes
Previously we only supported uploading Python wheels smaller than 10 MB,
because we used the Workspace Import API and its `content` field:
https://docs.databricks.com/api/workspace/workspace/import

By switching to `WorkspaceFilesClient` we overcome the limit, because it
sends the file in the POST body of the request instead.
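
A hedged sketch of the new upload path (the `filer` API shape and the paths here are assumptions for illustration):

```go
package main

import (
	"context"
	"log"
	"os"

	"github.com/databricks/cli/libs/filer"
	"github.com/databricks/databricks-sdk-go"
)

func main() {
	ctx := context.Background()
	w := databricks.Must(databricks.NewWorkspaceClient())

	// The workspace files client streams the file in the request body,
	// so it is not subject to the 10 MB limit of the import API's
	// base64-encoded `content` field.
	f, err := filer.NewWorkspaceFilesClient(w, "/Users/someone@company.com/.bundle/artifacts")
	if err != nil {
		log.Fatal(err)
	}

	whl, err := os.Open("dist/my_test_code-0.0.1-py3-none-any.whl")
	if err != nil {
		log.Fatal(err)
	}
	defer whl.Close()

	err = f.Write(ctx, "my_test_code-0.0.1-py3-none-any.whl", whl,
		filer.OverwriteIfExists, filer.CreateParentDirectories)
	if err != nil {
		log.Fatal(err)
	}
}
```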

## Tests
`TestAccUploadArtifactFileToCorrectRemotePath` integration test passes

```
=== RUN   TestAccUploadArtifactFileToCorrectRemotePath
    artifacts_test.go:28: gcp
2023/10/17 15:24:04 INFO Using Google Credentials sdk=true
    helpers.go:356: Creating /Users/.../integration-test-wsfs-ekggbkcfdkid
artifacts.Upload(test.whl): Uploading...
2023/10/17 15:24:06 INFO Using Google Credentials mutator=artifacts.Upload(test) sdk=true
artifacts.Upload(test.whl): Upload succeeded
    helpers.go:362: Removing /Users/.../integration-test-wsfs-ekggbkcfdkid
--- PASS: TestAccUploadArtifactFileToCorrectRemotePath (5.66s)
PASS
coverage: 14.9% of statements in ./...
ok      github.com/databricks/cli/internal      6.109s  coverage: 14.9% of statements in ./...
```
2023-10-18 10:20:43 +00:00
Arpit Jasapara 24cc67563e
Support Unity Catalog Registered Models in bundles (#846)
## Changes
Add UC Registered Models support to Databricks Asset Bundles as a new
resource, `registered_model`. Also added UC permission support via the
new `grant` resource.
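
A hedged configuration sketch (names are placeholders; the exact fields mirror the Terraform provider resource):

```yaml
resources:
  registered_models:
    my_model:
      name: my_model
      catalog_name: main
      schema_name: default
      comment: Registered model managed by this bundle.
      grants:
        - privileges:
            - EXECUTE
          principal: account users
```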

## Tests
Tested via unit tests and manual testing with [example
PR](https://github.com/databricks/bundle-examples-internal/pull/80) and
[custom Terraform
provider](https://github.com/databricks/terraform-provider-databricks/pull/2771).
<img width="698" alt="Screenshot 2023-10-08 at 4 57 23 PM"
src="https://github.com/databricks/cli/assets/87999496/bcf605a9-7894-443b-865a-f7e240037815">
<img width="1109" alt="Screenshot 2023-10-08 at 4 56 47 PM"
src="https://github.com/databricks/cli/assets/87999496/e4d6e424-cd70-4809-8843-6939ed2e172f">
<img width="1091" alt="Screenshot 2023-10-08 at 4 56 57 PM"
src="https://github.com/databricks/cli/assets/87999496/88ebaabb-67db-4a11-88a5-df087e2e41c0">

---------

Signed-off-by: Arpit Jasapara <arpit.jasapara@databricks.com>
Co-authored-by: Andrew Nester <andrew.nester.dev@gmail.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-10-16 15:32:49 +00:00
Pieter Noordhuis 61cf4fbe8d
Propagate Terraform provider version into generated config (#874)
## Changes

The preparations for this change were in place (see #713) but it wasn't
actually used.

## Tests

n/a
2023-10-16 15:27:46 +00:00
Pieter Noordhuis b940c8631e
Bump Terraform provider to v1.28.0 (#871)
## Changes

Regenerate structs for Terraform provider v1.28.0
([release](https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.28.0)).

## Tests

n/a
2023-10-16 12:52:16 +00:00
Andrew Nester 30c4d2e8a7
Fixed merging task libraries from targets (#868)
## Changes
Previously we (erroneously) kept a reference and merged into the original
tasks rather than into the copies that we later used to replace the
existing tasks. Thus the merging of slices and references was incorrect.

Fixes #864 
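
A minimal Go illustration of the bug class (not the actual bundle code): merging into a retained reference to the original element means the copy, which is what ends up replacing the task, never sees the merged libraries.

```go
package main

import "fmt"

type task struct {
	key       string
	libraries []string
}

func main() {
	original := []task{{key: "main", libraries: []string{"base.whl"}}}

	// Copies intended to replace the original tasks after merging.
	copies := append([]task(nil), original...)

	// Bug class: merging into the original element via a retained
	// reference; append may reallocate, leaving the copy unaffected.
	orig := &original[0]
	orig.libraries = append(orig.libraries, "override.whl")

	fmt.Println(original[0].libraries) // [base.whl override.whl]
	fmt.Println(copies[0].libraries)   // [base.whl] -- the merge is lost
}
```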

## Tests
Added a regression test
2023-10-16 08:48:32 +00:00
hectorcast-db 36f30c8b47
Update Go SDK to 0.23.0 and use custom marshaller (#772)
## Changes
Update Go SDK to 0.23.0 and use custom marshaller.
## Tests
* Run unit tests

* Run nightly

* Manual test:
```
./cli jobs create --json @myjob.json
```
with 
```json
{
    "name": "my-job-marshal-test-go",
    "tasks": [{
        "task_key": "testgomarshaltask",
        "new_cluster": {
            "num_workers": 0,
            "spark_version": "10.4.x-scala2.12",
            "node_type_id": "Standard_DS3_v2"
        },
        "libraries": [
            {
                "jar": "dbfs:/max/jars/exampleJarTask.jar"
            }
        ],
        "spark_jar_task": {
            "main_class_name":  "com.databricks.quickstart.exampleTask"
        }
    }]
}
```
Main branch:
```
Error: Cluster validation error: Missing required field: settings.cluster_spec.new_cluster.size
```
This branch:
```
{
  "job_id":<jobid>
}
```

---------

Co-authored-by: Miles Yucht <miles@databricks.com>
2023-10-16 06:56:06 +00:00
Andrew Nester 943ea89728
Allow target overrides for sync section (#856)
## Changes
Allow target overrides for sync section
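
A configuration sketch of what this enables (paths are placeholders):

```yaml
sync:
  include:
    - ./src/**

targets:
  development:
    sync:
      # Target-specific override, merged with the top-level sync section.
      exclude:
        - ./notebooks/scratch/**
```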

## Tests
Added tests
2023-10-10 15:18:18 +00:00
Andrew Nester 8d8de3f509
Fixed using repo files as pipeline libraries (#847)
## Changes
Fixed using repo files as pipeline libraries

## Tests
Added regression test
2023-10-09 10:10:28 +00:00
Andrew Nester aa54a8665a
Added support for glob patterns in pipeline libraries section (#833)
## Changes
Now it's possible to specify a glob pattern in the pipeline libraries
section, and DABs will add all matched files as libraries:

```yaml
  pipelines:
    dummy:
      name: " DLT with Python files"
      target: "dlt_python_files"
      libraries:
        - file:
            path: ./*.py
```

## Tests
Added unit test
2023-10-04 13:23:13 +00:00
Andrew Nester 9b6a847178
Mark artifacts properties as optional (#834)
## Changes
Mark artifacts properties as optional

Fixes #816
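
A minimal sketch of what is now accepted (assuming, for illustration, that only the artifact type is given and the other properties are omitted):

```yaml
artifacts:
  my_wheel:
    # Properties such as path, files, and build may now be left out.
    type: whl
```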
2023-10-03 13:59:28 +00:00
Serge Smertin 7d0f170eee
Added `python.DetectInterpreters` and other utils (#805)
This PR adds a few utilities related to Python interpreter detection (see
the sketch after this list):

- `python.DetectInterpreters` to detect all Python versions available in
`$PATH` by executing every matched binary name with the `--version` flag.
- `python.DetectVirtualEnvPath` to detect if there's any child virtual
environment in the `src` directory.
- `python.DetectExecutable` to detect if python3 is installed, either via
the `which python3` command or by calling
`python.DetectInterpreters().AtLeast("v3.8")`.
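
A self-contained sketch of the detection idea (not the package itself): scan `$PATH` for `python*` binaries and run each with `--version`.

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"strings"
)

func main() {
	seen := map[string]string{}
	for _, dir := range filepath.SplitList(os.Getenv("PATH")) {
		matches, _ := filepath.Glob(filepath.Join(dir, "python*"))
		for _, bin := range matches {
			out, err := exec.Command(bin, "--version").CombinedOutput()
			if err != nil {
				continue
			}
			// Record the first binary seen for each reported version.
			version := strings.TrimSpace(string(out))
			if _, ok := seen[version]; !ok {
				seen[version] = bin
			}
		}
	}
	for version, bin := range seen {
		fmt.Printf("%s -> %s\n", version, bin)
	}
}
```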

To be merged after https://github.com/databricks/cli/pull/804, as one of
the steps to get https://github.com/databricks/cli/pull/637 in, as
previously discussed.
2023-10-03 10:47:09 +00:00
Pieter Noordhuis f1b068cefe
Use normalized short name for tag value in development mode (#821)
## Changes

The jobs backend propagates job tags to the underlying cloud provider's
resources. As such, they need to match the constraints a cloud provider
places on tag values. The display name can contain anything. With this
change, we modify the tag value to equal the short name as used in the
name prefix.

Additionally, we leverage tag normalization as introduced in #819 to
make sure characters that aren't accepted are removed before using the
value as a tag value.

This is a new stab at #810 and should completely eliminate this class of
problems.

## Tests

Tests pass.
2023-10-02 06:58:51 +00:00
Andrew Nester 775251d0dc
Emit an error when incompatible all purpose cluster used with Python wheel tasks (#823)
## Changes
Follow-up to https://github.com/databricks/cli/pull/807 to also
validate the configuration when an existing cluster ID is used.

## Tests
Added unit tests
2023-09-29 12:19:05 +00:00
Pieter Noordhuis 30b4b8ce58
Allow digits in the generated short name (#820)
## Changes

Digits were previously replaced by `_`.

## Tests

Additional test cases with uncommon variations of email addresses.
2023-09-29 06:58:40 +00:00
Serge Smertin 7171874db0
Added `process.Background()` and `process.Forwarded()` (#804)
## Changes
This PR adds higher-level wrappers for calling subprocesses. One of the
steps to get https://github.com/databricks/cli/pull/637 in, as
previously discussed.

The reason to add `process.Forwarded()` is to proxy Python's `input()`
calls from a child process seamlessly. Another use-case is plugging in
`less` as a pager for the list results.
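
A self-contained sketch of the two call styles using the standard library (not the `process` package's actual API): a background call captures output, while a forwarded call wires the child to the caller's stdin/stdout so interactive prompts reach the user.

```go
package main

import (
	"fmt"
	"log"
	"os"
	"os/exec"
)

func main() {
	// Background-style call: run the command and capture its output.
	out, err := exec.Command("echo", "hello").CombinedOutput()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(string(out))

	// Forwarded-style call: connect the child to our own streams so
	// Python's input() prompt reaches the terminal directly.
	cmd := exec.Command("python3", "-c", `print(input("name: "))`)
	cmd.Stdin = os.Stdin
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	if err := cmd.Run(); err != nil {
		log.Fatal(err)
	}
}
```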

## Tests
`make test`
2023-09-27 09:04:44 +00:00
Andrew Nester 3ee89c41da
Added a warning when Python wheel wrapper needs to be used (#807)
## Changes
Added a warning when Python wheel wrapper needs to be used

## Tests
Added unit tests + manual run with different bundle configurations
2023-09-27 08:26:59 +00:00
Andrew Nester 0daa0022af
Make a notebook wrapper for Python wheel tasks optional (#797)
## Changes
Instead of always using a notebook wrapper for Python wheel tasks, let's
make this an opt-in option.

Now, by default, Python wheel tasks are deployed as-is to the Databricks
platform.
If the notebook wrapper is required (DBR < 13.1 or other configuration
differences), users can provide the following experimental setting:

```yaml
experimental:
  python_wheel_wrapper: true
```

Fixes #783,
https://github.com/databricks/databricks-asset-bundles-dais2023/issues/8

## Tests
Added unit tests.

Integration tests passed for both cases

```
    helpers.go:163: [databricks stdout]: Hello from my func
    helpers.go:163: [databricks stdout]: Got arguments:
    helpers.go:163: [databricks stdout]: ['my_test_code', 'one', 'two']
    ...
Bundle remote directory is ***/.bundle/ac05d5e8-ed4b-4e34-b3f2-afa73f62b021
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRunWithWrapper3733431114/001/.databricks/bundle/default/sync-snapshots/cac1e02f3941a97b.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRunWithWrapper (214.18s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       214.495s        coverage: 93.5% of statements in ./...

```

```
    helpers.go:163: [databricks stdout]: Hello from my func
    helpers.go:163: [databricks stdout]: Got arguments:
    helpers.go:163: [databricks stdout]: ['my_test_code', 'one', 'two']
    ...
Bundle remote directory is ***/.bundle/0ef67aaf-5960-4049-bf1d-dc9e29157421
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRunWithoutWrapper2340216760/001/.databricks/bundle/default/sync-snapshots/edf0b322cee93b13.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRunWithoutWrapper (192.36s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       195.130s        coverage: 93.5% of statements in ./...

```
2023-09-26 14:32:20 +00:00
Pieter Noordhuis ee30277119
Enable target overrides for pipeline clusters (#792)
## Changes

This is a follow-up to #658 and #779 for jobs.

This change applies label normalization the same way the backend does.
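
A configuration sketch (a hypothetical pipeline) where the override is matched to the base cluster by its label:

```yaml
resources:
  pipelines:
    my_pipeline:
      clusters:
        - label: default
          num_workers: 2

targets:
  development:
    resources:
      pipelines:
        my_pipeline:
          clusters:
            # Matched with the base cluster by its (normalized) label.
            - label: default
              num_workers: 1
```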

## Tests

Unit and config loading tests.
2023-09-21 19:21:20 +00:00
Pieter Noordhuis 3a812a61e5
Increase timeout waiting for job run to 1 day (#786)
## Changes

It's not uncommon for job runs to take more than 2 hours. On the client
side, we should not stop waiting for a job to complete if it is
intentionally running for a long time. If a job isn't supposed to run
this long, the user can specify a run timeout in the job specification
itself.

## Tests

n/a
2023-09-19 19:54:24 +00:00
Andrew Nester 43e2eefc27
Enable environment overrides for job tasks (#779)
## Changes
Follow up for https://github.com/databricks/cli/pull/658

When a job definition has multiple job tasks using the same key, it's
considered invalid. Instead we should combine those definitions with the
same key into one. This is consistent with environment overrides. This
way, the override ends up in the original job tasks, and we've got a
clear way to put them all together.
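
A configuration sketch of the merge-by-key behavior (job, paths, and cluster ID are placeholders):

```yaml
resources:
  jobs:
    my_job:
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/main.py

targets:
  development:
    resources:
      jobs:
        my_job:
          tasks:
            # Merged into the base task with the same task_key.
            - task_key: main
              existing_cluster_id: "1234-567890-abcde123"
```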

## Tests
Added unit tests
2023-09-18 14:13:50 +00:00
Andrew Nester 953dcb4972
Added support for experimental scripts section (#632)
## Changes
Added support for experimental scripts section

It allows execution of arbitrary bash commands during certain bundle
lifecycle steps.

## Tests
Example of configuration

```yaml
bundle:
  name: wheel-task


workspace:
  host: ***

experimental:
  scripts:
    prebuild: |
      echo 'Prebuild 1'
      echo 'Prebuild 2'
    postbuild: "echo 'Postbuild 1' && echo 'Postbuild 2'" 
    predeploy: |
      echo 'Checking go version...'
      go version
    postdeploy: |
      echo 'Checking python version...'
      python --version

resources:
  jobs:
    test_job:
      name: "[${bundle.environment}] My Wheel Job"
      tasks:
        - task_key: TestTask
          existing_cluster_id: "***"
          python_wheel_task:
            package_name: "my_test_code"
            entry_point: "run"
          libraries:
          - whl: ./dist/*.whl
```

Output
```bash
andrew.nester@HFW9Y94129 wheel % databricks bundle deploy
artifacts.whl.AutoDetect: Detecting Python wheel project...
artifacts.whl.AutoDetect: Found Python wheel project at /Users/andrew.nester/dabs/wheel
'Prebuild 1'
'Prebuild 2'

artifacts.whl.Build(my_test_code): Building...
artifacts.whl.Build(my_test_code): Build succeeded
'Postbuild 1'
'Postbuild 2'

'Checking go version...'
go version go1.19.9 darwin/arm64

Starting upload of bundle files
Uploaded bundle files at /Users/andrew.nester@databricks.com/.bundle/wheel-task/default/files!

artifacts.Upload(my_test_code-0.0.0a0-py3-none-any.whl): Uploading...
artifacts.Upload(my_test_code-0.0.0a0-py3-none-any.whl): Upload succeeded
Starting resource deployment
Resource deployment completed!
'Checking python version...'
Python 2.7.18
```
2023-09-14 10:14:13 +00:00
shreyas-goenka fe32c46dc8
Make bundle deploy work if no resources are defined (#767)
## Changes

This PR sets "resource" to nil in the terraform representation if no
resources are defined in the bundle configuration. This solves two
problems:

1. Makes bundle deploy work without any resources specified. 
2. Previously, if a `resources` block was removed after a deployment, the
next deployment would fail with an error. Now those resources are
destroyed as expected.

Also removes `TerraformHasNoResources` which is no longer needed.

## Tests
New e2e tests.
2023-09-13 22:50:37 +00:00
Pieter Noordhuis a2775f836f
Use interactive prompt to select resource to run if not specified (#762)
## Changes

Display an interactive prompt with a list of resources to run if one
isn't specified and the command is run interactively.

## Tests

Manually confirmed:
* The new prompt works
* Shell completion still works
* Specifying a key argument still works
2023-09-11 18:03:12 +00:00
shreyas-goenka 373f441eb2
Use clearer error message when no interpolation value is found. (#764)
## Changes
This PR makes the error message clearer when interpolation fails.

## Tests
Existing unit test and manually
2023-09-11 15:23:25 +00:00
Pieter Noordhuis 4ccc70aeac
Consolidate environment variable interaction (#747)
## Changes

There are a couple of places throughout the code base where interaction
with environment variables takes place. Moreover, more than one of these
would try to read a value from more than one environment variable as a
fallback (for backwards compatibility). This change consolidates those
accesses.

The majority of diffs in this change are mechanical (i.e. add an
argument or replace a call).

This change:
* Moves common environment variable lookups for bundles to
`bundles/env`.
* Adds a `libs/env` package that wraps `os.LookupEnv` and `os.Getenv`
and allows overrides to take place in a `context.Context` (sketched
after this list). By scoping overrides to a `context.Context` we can
avoid `t.Setenv` in testing and unlock parallel test execution for
integration tests.
* Updates call sites to pass through a `context.Context` where needed.
* For bundles, introduces `DATABRICKS_BUNDLE_ROOT` as new primary
variable instead of `BUNDLE_ROOT`. This was the last environment
variable that did not use the `DATABRICKS_` prefix.
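
A hedged sketch of the `libs/env` idea (the real package's API may differ): overrides stored in a `context.Context` take precedence over the process environment.

```go
package main

import (
	"context"
	"fmt"
	"os"
)

type envKey string

// Set scopes an environment override to the context, so tests can run in
// parallel without mutating the process environment via t.Setenv.
func Set(ctx context.Context, key, value string) context.Context {
	return context.WithValue(ctx, envKey(key), value)
}

// Lookup consults context overrides first, then falls back to os.LookupEnv.
func Lookup(ctx context.Context, key string) (string, bool) {
	if v, ok := ctx.Value(envKey(key)).(string); ok {
		return v, true
	}
	return os.LookupEnv(key)
}

func main() {
	ctx := Set(context.Background(), "DATABRICKS_BUNDLE_ROOT", "/tmp/bundle")
	v, ok := Lookup(ctx, "DATABRICKS_BUNDLE_ROOT")
	fmt.Println(v, ok) // /tmp/bundle true
}
```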

## Tests

Unit tests pass.
2023-09-11 08:18:43 +00:00
shreyas-goenka 9a51f72f0b
Make bundle and sync fields optional (#757)
## Changes
This PR:
1. Makes the bundle and sync properties optional in the generated
schema.
2. Fixes schema generation that was broken due to a rogue "description"
field in the bundle docs.

## Tests
Tested manually. The generated schema no longer has "bundle" and "sync"
marked as required.
2023-09-11 08:16:22 +00:00
Andrew Nester b5d033d154
List available targets when incorrect target passed (#756)
## Changes
List available targets when incorrect target passed

## Tests
```
andrew.nester@HFW9Y94129 wheel % databricks bundle validate -t incorrect
Error: incorrect: no such target. Available targets: prod, development
```
2023-09-08 15:37:55 +00:00
Andrew Nester 18a5b05d82
Apply Python wheel trampoline if workspace library is used (#755)
## Changes
The trampoline detects a workspace library in two cases:
- the user chose a local wheel file
- the user chose a remote wheel file from the workspace file system

In both of these cases we should correctly apply the Python trampoline.

## Tests
Added a regression test (also covered by Python e2e test)
2023-09-08 13:45:21 +00:00
Andrew Nester 67af171a68
Process only Python wheel tasks which have local libraries used (#751)
## Changes
Process only Python wheel tasks that use local libraries.

## Tests
Updated unit test to catch the regression
2023-09-08 11:08:21 +00:00
Andrew Nester f7566b8264
Close local Terraform state file when pushing to remote (#752)
## Changes
Close local Terraform state file when pushing to remote

Should help fix E2E test cleanup
```
testing.go:1225: TempDir RemoveAll cleanup: remove 
C:\Users\RUNNER~1\AppData\Local\Temp\TestAccPythonWheelTaskDeployAndRun1395546390\001\.databricks\bundle\default\terraform\terraform.tfstate: 
The process cannot access the file because it is being used by another process.
```
2023-09-08 10:47:17 +00:00
Andrew Nester e64463ba47
Fixed marking libraries from DBFS as remote (#750)
## Changes
Fixed marking libraries from DBFS as remote

## Tests
Updated unit tests to catch the regression
2023-09-08 09:53:57 +00:00
Andrew Nester e08f419ef6
Do not include empty output in job run output (#749)
## Changes
Do not include empty output in job run output

## Tests
Running a job from CLI, the result:
```
andrew.nester@HFW9Y94129 wheel % databricks bundle run some_other_job --output json
Run URL: https://***/?o=6051921418418893#job/780620378804085/run/386695528477456

2023-09-08 11:33:24 "[default] My Wheel Job" TERMINATED SUCCESS 
{
  "task_outputs": [
    {
      "TaskKey": "TestTask",
      "Output": {
        "result": "Hello from my func\nGot arguments v2:\n['python']\n"
      },
      "EndTime": 1694165597474
    }
  ]
}
```
2023-09-08 09:52:45 +00:00
Arpit Jasapara 50eaf16307
Support Model Serving Endpoints in bundles (#682)
## Changes
Add Model Serving Endpoints to Databricks Bundles
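
A hedged configuration sketch (endpoint and model names are placeholders):

```yaml
resources:
  model_serving_endpoints:
    my_endpoint:
      name: my-serving-endpoint
      config:
        served_models:
          - model_name: my_model
            model_version: "1"
            workload_size: Small
            scale_to_zero_enabled: true
```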

## Tests
Unit tests and manual testing via
https://github.com/databricks/bundle-examples-internal/pull/76
<img width="1570" alt="Screenshot 2023-08-28 at 7 46 23 PM"
src="https://github.com/databricks/cli/assets/87999496/7030ebd8-b0e2-4ad1-a9e3-5ff8454f1175">
<img width="747" alt="Screenshot 2023-08-28 at 7 47 01 PM"
src="https://github.com/databricks/cli/assets/87999496/fb9b54d7-54e2-43ce-9148-68fb620c809a">

Signed-off-by: Arpit Jasapara <arpit.jasapara@databricks.com>
2023-09-07 21:54:31 +00:00
Andrew Nester 10e0836749
Added end-to-end test for deploying and running Python wheel task (#741)
## Changes
Added end-to-end test for deploying and running Python wheel task

## Tests
The test passed successfully in all environments; it takes about 9-10
minutes to run.

```
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRun1845899209/002/.databricks/bundle/default/sync-snapshots/1f7cc766ffe038d6.json
Successfully deleted files!
2023/09/06 17:50:50 INFO Releasing deployment lock mutator=destroy mutator=seq mutator=seq mutator=deferred mutator=lock:release
--- PASS: TestAccPythonWheelTaskDeployAndRun (508.16s)
PASS
coverage: 77.9% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       508.810s        coverage: 77.9% of statements in ./...
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-07 14:08:16 +00:00
Pieter Noordhuis c0ebfb8101
Fix conversion of job parameters (#744)
## Changes

Another example of singular/plural conversion.

The longer-term solution is a full sweep of the type using reflection
to make sure we cover all fields.

## Tests

Unit test passes.
2023-09-07 12:48:59 +00:00
Lennart Kats (databricks) f9e521b43e
databricks bundle init template v2: optional stubs, DLT support (#700)
## Changes

This follows up on https://github.com/databricks/cli/pull/686. This PR
makes our stubs optional and adds DLT stubs:

```
$ databricks bundle init
Template to use [default-python]: default-python
Unique name for this project [my_project]: my_project
Include a stub (sample) notebook in 'my_project/src' [yes]: yes
Include a stub (sample) DLT pipeline in 'my_project/src' [yes]: yes
Include a stub (sample) Python package 'my_project/src' [yes]: yes
 Successfully initialized template
```

## Tests
Manual testing, matrix tests.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-06 09:52:31 +00:00
Pieter Noordhuis fabe8e88b8
Include $PATH in set of environment variables to pass along. (#736)
## Changes

This is necessary to ensure that our Terraform provider can use the same
auxiliary programs (e.g. `az`, or `gcloud`) as the CLI.

## Tests

Unit test and manual verification.
2023-09-06 07:54:35 +00:00