Commit Graph

31 Commits

Author SHA1 Message Date
Shreyas Goenka 4f5f9eccb6
Merge remote-tracking branch 'origin' into feature/uc-volumes 2024-11-29 02:16:49 +01:00
shreyas-goenka 984c38e03e
Add unique ID to `root_path` for bundle integration test fixtures (#1917)
## Changes
Integration tests using these fixtures could be flaky when run in
parallel under the same user's identity. They could also pick up state
left over from previous runs.

This PR adds a UUID to the root_path to force independent bundle
deployments for every test run.

I have checked that all bundles in `internal/bundle/bundles` have
`root_path` namespaced to a UUID.
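
For illustration, a minimal sketch of what such a namespaced fixture could look like; the `unique_id` template variable and the exact path layout are assumptions for this example, not taken verbatim from the fixtures:
```
bundle:
  name: basic

workspace:
  # A fresh UUID per test run keeps deployments isolated from one another.
  root_path: "~/.bundle/{{.unique_id}}"
```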

## Tests
Self testing.
2024-11-20 16:30:10 +00:00
Andrew Nester fab3e8f168
Added integration test to deploy bundle to /Shared root path (#1914)
## Changes
Added integration test to deploy bundle to /Shared root path
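
For context, a minimal sketch of the kind of configuration this test exercises; the bundle name and exact path are illustrative only:
```
bundle:
  name: basic

workspace:
  # Deploy into the shared area instead of the current user's home folder.
  root_path: /Shared/.bundle/basic
```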

## Tests
```
--- PASS: TestAccDeployBasicToSharedWorkspace (24.58s)
PASS
coverage: 31.2% of statements in ./...
ok  	github.com/databricks/cli/internal/bundle	25.572s	coverage: 31.2% of statements in ./...
```

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2024-11-20 12:20:39 +00:00
Shreyas Goenka 68dc6c1ce4
Merge remote-tracking branch 'origin' into feature/uc-volumes 2024-11-18 16:56:37 +01:00
Andrew Nester 71cf426755
Added E2E test to run Python wheels on interactive cluster created in bundle (#1864)
## Changes
Added E2E test to run Python wheels on an interactive cluster created in
a bundle.

We had a gap in testing wheels on all-purpose clusters, so this PR
addresses that gap.
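
A rough sketch of the scenario this covers: a bundle-defined all-purpose cluster referenced from a Python wheel task via `existing_cluster_id`. The resource keys, package name, and paths are made up for illustration:
```
resources:
  clusters:
    interactive_cluster:
      cluster_name: "Wheel Test Cluster"
      spark_version: "13.3.x-scala2.12"
      node_type_id: "i3.xlarge"
      num_workers: 1

  jobs:
    wheel_job:
      name: "Wheel on interactive cluster"
      tasks:
        - task_key: run_wheel
          existing_cluster_id: ${resources.clusters.interactive_cluster.id}
          python_wheel_task:
            package_name: my_package
            entry_point: main
          libraries:
            - whl: ./dist/*.whl
```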
2024-11-01 14:22:47 +00:00
Shreyas Goenka 250d4265ce
rename to volume 2024-10-31 18:00:31 +01:00
Shreyas Goenka 8a2fe4969c
merge 2024-10-29 11:38:42 +01:00
Pieter Noordhuis 11f75fd320
Add support for AI/BI dashboards (#1743)
## Changes

This change adds support for modeling [AI/BI dashboards][docs] in DABs.


[Example bundle configuration][example] is located in the
`bundle-examples` repository.

[docs]: https://docs.databricks.com/en/dashboards/index.html#dashboards
[example]:
https://github.com/databricks/bundle-examples/tree/main/knowledge_base/dashboard_nyc_taxi
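
To give a quick impression of the new resource's shape, here is a minimal sketch; the field names follow the linked example but are assumptions on my part and not an exhaustive schema (a `warehouse_id` variable is assumed to be declared elsewhere):
```
resources:
  dashboards:
    nyc_taxi_trip_analysis:
      display_name: "NYC Taxi Trip Analysis"
      file_path: ./src/nyc_taxi_trip_analysis.lvdash.json
      warehouse_id: ${var.warehouse_id}
```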

## Tests

* Added unit tests for self-contained parts
* Integration test for e2e dashboard deployment and remote change
modification
2024-10-29 09:11:08 +00:00
Shreyas Goenka 1a961eb19c
- 2024-10-16 13:46:45 +02:00
Shreyas Goenka d241c2b39c
add integration test for grant on volume 2024-10-15 16:05:23 +02:00
Shreyas Goenka e43f566579
Merge remote-tracking branch 'origin' into feature/uc-volumes 2024-10-14 15:04:40 +02:00
Andrew Nester 56ed9bebf3
Added support for creating all-purpose clusters (#1698)
## Changes
Added support for creating all-purpose clusters

Example configuration:

```
bundle:
  name: clusters

resources:
  clusters:
    test_cluster:
      cluster_name: "Test Cluster"
      num_workers: 2
      node_type_id: "i3.xlarge"
      autoscale:
        min_workers: 2
        max_workers: 7
      spark_version: "13.3.x-scala2.12"
      spark_conf:
        "spark.executor.memory": "2g"

  jobs:
    test_job:
      name: "Test Job"
      tasks:
        - task_key: test_task
          existing_cluster_id: ${resources.clusters.test_cluster.id}
          notebook_task:
            notebook_path: "./src/test.py"

targets:
    development:
      mode: development
      compute_id: ${resources.clusters.test_cluster.id}

```

## Tests
Added unit, config and E2E tests
2024-09-23 10:42:34 +00:00
Shreyas Goenka 6f9817e194
add prompt and crud test for volumes 2024-09-10 12:52:31 +02:00
shreyas-goenka a27c24a397
Add prompt when a pipeline recreation happens (#1672)
## Changes
DLT pipeline recreations are destructive. They can lead to lost history
of previous updates, temporary unavailability of the tables, and are
potentially computationally expensive. Thus we make a breaking change
where a prompt is shown to the user if their configuration changes will
lead to a DLT pipeline recreation.

Users can skip the prompt by specifying the `--auto-approve` flag.
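
As an illustration (not taken from this PR), changing a field that forces recreation, such as a pipeline's `catalog`, is the kind of edit that now triggers the prompt; the resource key and values below are hypothetical:
```
resources:
  pipelines:
    foo:
      name: foo
      # Changing the catalog of an already-deployed pipeline typically forces
      # it to be recreated, which now surfaces the confirmation prompt.
      catalog: main
      target: foo_schema
```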

This PR also fixes an issue with our test runner where logs from the
cmdio.Logger would not get propagated to the reader returned by our
cobra test runner.

## Tests
Manually, and new unit and integration tests.

```
➜  bundle-playground-3 cli bundle deploy
Uploading bundle files to /Users/63ec021d-b0c6-49c0-93a0-5123953a1cb2/.bundle/test/development/files...
The following DLT pipelines will be recreated. Underlying tables will be unavailable for a transient period until the newly recreated pipelines are run once successfully. History of previous pipeline update runs will be lost because of recreation:
  recreate pipeline foo

Would you like to proceed? [y/n]: n
Deployment cancelled!
```
2024-09-04 11:11:47 +00:00
Pieter Noordhuis a240be0b5a
Run Spark JAR task test on multiple DBR versions (#1665)
## Changes

This explores error messages on older DBRs and UC vs non-UC.

## Tests

Integration tests pass.
2024-08-09 15:13:31 +00:00
Andrew Nester 9d1fbbb39c
Enable Spark JAR task test (#1658)
## Changes
Enable Spark JAR task test

## Tests
```

Updating deployment state...
Deleting files...
Destroy complete!
--- PASS: TestAccSparkJarTaskDeployAndRunOnVolumes (194.13s)
PASS
coverage: 51.9% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       194.586s        coverage: 51.9% of statements in ./...
```
2024-08-06 18:58:34 +00:00
shreyas-goenka a13d77f8eb
Fix python wheel task integration tests (#1648)
## Changes
A new Service Control Policy has removed the `ec2.RunInstances`
permission from our service principal for our AWS integration tests.
This PR switches over to using the instance pool which does not require
creating new clusters.

## Tests
The integration tests pass now.
2024-08-01 13:55:22 +00:00
shreyas-goenka 89c0af5bdc
Add resource for UC schemas to DABs (#1413)
## Changes
This PR adds support for UC Schemas to DABs. This allows users to define
schemas for tables and other assets their pipelines/workflows create as
part of the DAB, so that their life-cycle is managed as part of the DAB.

The first version has a couple of intentional limitations:
1. The owner of the schema will be the deployment user. Changing the
owner of the schema is not allowed (yet). `run_as` will not be
restricted for DABs containing UC schemas; we limit the scope of
`run_as` to the compute identity used, rather than to ownership of data
assets like UC schemas.
2. API fields that are present in the update API but not the create API
are not supported. For example, enabling predictive optimization is not
supported in the create schema API and thus is not available in DABs at
the moment.
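
A minimal sketch of what a schema definition could look like under this feature; the catalog, schema, and principal names are placeholders:
```
resources:
  schemas:
    my_schema:
      name: my_schema
      catalog_name: main
      comment: "Schema managed by this bundle"
      grants:
        - principal: account users
          privileges:
            - USE_SCHEMA
            - SELECT
```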

## Tests
Manually and with an integration test. Manually verified that the following work:
1. Development mode adds a "dev_" prefix.
2. Modified status is correctly computed in the `bundle summary`
command.
3. Grants for assigning privileges work as expected.
4. Variable interpolation works for the schema ID.
2024-07-31 12:16:28 +00:00
Andrew Nester 434bcbb018
Allow artifacts (JARs, wheels) to be uploaded to UC Volumes (#1591)
## Changes
This change allows specifying a UC Volumes path as the artifact path, so
that all artifacts (JARs, wheels) are uploaded to UC Volumes.

Example configuration is here:
```
bundle:
  name: jar-bundle

workspace:
  host: https://foo.com
  artifact_path: /Volumes/main/default/foobar

artifacts:
  my_java_code:
    path: ./sample-java
    build: "javac PrintArgs.java && jar cvfm PrintArgs.jar META-INF/MANIFEST.MF PrintArgs.class"
    files:
      - source: ./sample-java/PrintArgs.jar

resources:
  jobs:
    jar_job:
      name: "Test Spark Jar Job"
      tasks:
        - task_key: TestSparkJarTask
          new_cluster:
            num_workers: 1
            spark_version: "14.3.x-scala2.12"
            node_type_id: "i3.xlarge"
          spark_jar_task:
            main_class_name: PrintArgs
          libraries:
            - jar: ./sample-java/PrintArgs.jar
```
## Tests
Manually + added E2E test for Java jobs

E2E test is temporarily skipped until auth related issues for UC for
tests are resolved
2024-07-16 08:57:04 +00:00
Ilia Babanov 2035516fde
Don't merge-in remote resources during deployments (#1432)
## Changes
`check_running_resources` now pulls the remote state without modifying
the bundle state, similar to how it worked before. This avoids a
problem where we would fail to compute deployment metadata for a deleted
job (which we shouldn't attempt in the first place).

`deploy_then_remove_resources_test` now also deploys and deletes a job
(in addition to a pipeline), which catches the error that this PR fixes.

## Tests
Unit and integ tests
2024-05-15 12:41:44 +00:00
Andrew Nester 1872aa12b3
Added support for job environments (#1379)
## Changes
The main changes are:
1. Artifacts are no longer linked to libraries; instead, we iterate over
all jobs and tasks when uploading artifacts and update local paths to
remote ones.
2. We iterate over `jobs.environments` to find any local libraries and
check that they exist locally (see the sketch below).
3. Added tests to check that environments are handled correctly.

An end-to-end test will follow up.
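
A minimal sketch of a job using an environment with a local wheel dependency; the job name, package name, and paths are assumptions for illustration:
```
resources:
  jobs:
    my_job:
      name: "Job with environment"
      tasks:
        - task_key: main
          environment_key: default
          python_wheel_task:
            package_name: my_package
            entry_point: main
      environments:
        - environment_key: default
          spec:
            client: "1"
            dependencies:
              # Local wheel: verified to exist locally, then its path is
              # rewritten to the uploaded artifact location on deploy.
              - ./dist/my_package-0.1.0-py3-none-any.whl
```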

## Tests
Added regression test, existing tests (including integration one) pass
2024-04-22 11:44:34 +00:00
Andrew Nester 80670eceed
Added `bundle deployment bind` and `unbind` command (#1131)
## Changes
Added `bundle deployment bind` and `unbind` command.

This command allows binding bundle-defined resources to existing
resources in the Databricks workspace so that they become DABs-managed.

## Tests
Manually + added E2E test
2024-02-14 18:04:45 +00:00
Pieter Noordhuis b64e11304c
Fix integration test with invalid configuration (#1182)
## Changes

The indentation mistake on the `path` field under `notebook` meant the
pipeline had a single entry with a `nil` notebook field. This was
allowed but incorrect.

While working on the `dyn.Value` approach, this yielded a non-nil but
zeroed `notebook` field and a failure to translate an empty path.

## Tests

Correcting the indentation made the test fail because the file is not a
notebook. I changed it to a `file` reference and the test now passes.
2024-02-07 10:53:50 +00:00
Andrew Nester 70fe0e36ef
Added `databricks bundle generate job` command (#1043)
## Changes
Now it's possible to generate bundle configuration for an existing job.
For now it only supports jobs with notebook tasks.

It downloads the notebooks referenced in the job tasks and generates
bundle YAML config for the job, which can be included in a larger bundle.

## Tests
Running command manually

Example of generated config
```
resources:
  jobs:
    job_128737545467921:
      name: Notebook job
      format: MULTI_TASK
      tasks:
        - task_key: as_notebook
          existing_cluster_id: 0704-xxxxxx-yyyyyyy
          notebook_task:
            base_parameters:
              bundle_root: /Users/andrew.nester@databricks.com/.bundle/job_with_module_imports/development/files
            notebook_path: ./entry_notebook.py
            source: WORKSPACE
          run_if: ALL_SUCCESS
      max_concurrent_runs: 1
```

## Tests
Manual (on our last 100 jobs) + added end-to-end test

```
--- PASS: TestAccGenerateFromExistingJobAndDeploy (50.91s)
PASS
coverage: 61.5% of statements in ./...
ok github.com/databricks/cli/internal/bundle 51.209s coverage: 61.5% of statements in ./...
```
2024-01-17 14:26:33 +00:00
Pieter Noordhuis 6187803007
Correctly overwrite local state if remote state is newer (#1008)
## Changes

A bug in the code that pulls the remote state could cause the local
state to be empty instead of a copy of the remote state. This happened
only if the local state was present and stale when compared to the
remote version.

We correctly checked for the state serial to see if the local state had
to be replaced but didn't seek back on the remote state before writing
it out. Because the staleness check would read the remote state in full,
copying from the same reader would immediately yield an EOF.

## Tests

* Unit tests for state pull and push mutators that rely on a mocked
filer.
* An integration test that deploys the same bundle from multiple paths,
triggering the staleness logic.

Both failed prior to the fix and now pass.
2023-11-24 11:15:46 +00:00
shreyas-goenka 5a8cd0c5bc
Persist deployment metadata in WSFS (#845)
## Changes

This PR introduces a metadata struct that stores a subset of bundle
configuration that we wish to expose to other Databricks services that
wish to integrate with bundles.

This metadata file is uploaded to a file
`${bundle.workspace.state_path}/metadata.json` in the WSFS destination
of the bundle deployment.

Documentation for emitted metadata fields:
* `version`: Version for the metadata file schema
* `config.bundle.git.branch`: Name of the git branch the bundle was
deployed from.
* `config.bundle.git.origin_url`: URL for git remote "origin"
* `config.bundle.git.bundle_root_path`: Relative path of the bundle root
from the root of the git repository. Is set to "." if they are the same.
* `config.bundle.git.commit`: SHA-1 commit hash of the exact commit this
bundle was deployed from. Note: the deployment might not exactly match
this commit version if there are changes that have not been committed to
git at deploy time.
* `file_path`: Path in the workspace where we sync bundle files to.
* `resources.jobs.[job-ref].id`: ID of the job.
* `resources.jobs.[job-ref].relative_path`: Relative path of the YAML
config file, from the bundle root, where this job was defined.

Example metadata object when bundle root and git root are the same:
```json
{
  "version": 1,
  "config": {
    "bundle": {
      "lock": {},
      "git": {
        "branch": "master",
        "origin_url": "www.host.com",
        "commit": "7af8e5d3f5dceffff9295d42d21606ccf056dce0",
        "bundle_root_path": "."
      }
    },
    "workspace": {
      "file_path": "/Users/shreyas.goenka@databricks.com/.bundle/pipeline-progress/default/files"
    },
    "resources": {
      "jobs": {
        "bar": {
          "id": "245921165354846",
          "relative_path": "databricks.yml"
        }
      }
    },
    "sync": {}
  }
}
```

Example metadata when the git root is one level above the bundle root:
```json
{
  "version": 1,
  "config": {
    "bundle": {
      "lock": {},
      "git": {
        "branch": "dev-branch",
        "origin_url": "www.my-repo.com",
        "commit": "3db46ef750998952b00a2b3e7991e31787e4b98b",
        "bundle_root_path": "pipeline-progress"
      }
    },
    "workspace": {
      "file_path": "/Users/shreyas.goenka@databricks.com/.bundle/pipeline-progress/default/files"
    },
    "resources": {
      "jobs": {
        "bar": {
          "id": "245921165354846",
          "relative_path": "databricks.yml"
        }
      }
    },
    "sync": {}
  }
}
```


This unblocks integration to the jobs break glass UI for bundles.

## Tests
Unit tests and integration tests.
2023-10-27 12:55:43 +00:00
Andrew Nester 0daa0022af
Make a notebook wrapper for Python wheel tasks optional (#797)
## Changes
Instead of always using a notebook wrapper for Python wheel tasks, let's
make this an opt-in option.

Now, by default, Python wheel tasks are deployed as-is to the Databricks
platform.
If a notebook wrapper is required (DBR < 13.1 or other configuration
differences), users can provide the following experimental setting:

```
experimental:
  python_wheel_wrapper: true
```

Fixes #783,
https://github.com/databricks/databricks-asset-bundles-dais2023/issues/8

## Tests
Added unit tests.

Integration tests passed for both cases

```
    helpers.go:163: [databricks stdout]: Hello from my func
    helpers.go:163: [databricks stdout]: Got arguments:
    helpers.go:163: [databricks stdout]: ['my_test_code', 'one', 'two']
    ...
Bundle remote directory is ***/.bundle/ac05d5e8-ed4b-4e34-b3f2-afa73f62b021
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRunWithWrapper3733431114/001/.databricks/bundle/default/sync-snapshots/cac1e02f3941a97b.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRunWithWrapper (214.18s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       214.495s        coverage: 93.5% of statements in ./...

```

```
    helpers.go:163: [databricks stdout]: Hello from my func
    helpers.go:163: [databricks stdout]: Got arguments:
    helpers.go:163: [databricks stdout]: ['my_test_code', 'one', 'two']
    ...
Bundle remote directory is ***/.bundle/0ef67aaf-5960-4049-bf1d-dc9e29157421
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRunWithoutWrapper2340216760/001/.databricks/bundle/default/sync-snapshots/edf0b322cee93b13.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRunWithoutWrapper (192.36s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       195.130s        coverage: 93.5% of statements in ./...

```
2023-09-26 14:32:20 +00:00
shreyas-goenka fe32c46dc8
Make bundle deploy work if no resources are defined (#767)
## Changes

This PR sets "resource" to nil in the terraform representation if no
resources are defined in the bundle configuration. This solves two
problems:

1. Makes bundle deploy work without any resources specified. 
2. Previously, if a `resources` block was removed after a deployment,
the next deployment would fail with an error. Now the resources get
destroyed as expected.

Also removes `TerraformHasNoResources`, which is no longer needed.
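
For reference, the smallest kind of configuration this change is meant to support: a bundle with no `resources` block at all (the name and host below are placeholders):
```
bundle:
  name: empty-bundle

workspace:
  host: https://foo.com
```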

## Tests
New e2e tests.
2023-09-13 22:50:37 +00:00
Andrew Nester 17d9f7dd2a
Use unique bundle root path for Python E2E test (#748)
## Changes
It helps make sure jobs in the tests are deployed and executed
uniquely and in isolation.


```
Bundle remote directory is /Users/61b77d30-bc10-4214-9650-29cf5db0e941/.bundle/4b630810-5edc-4d8f-85d1-0eb5baf7bb28
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRun3933198431/001/.databricks/bundle/default/sync-snapshots/dd9db100465e3d91.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRun (346.28s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       346.976s        coverage: 93.5% of statements in ./...
```
2023-09-08 09:19:55 +00:00
Andrew Nester 5a14c7cb43
Generate unique name for a job in Python wheel test (#745)
## Changes
Generate unique name for a job in Python wheel test
2023-09-07 20:02:26 +00:00
Andrew Nester 10e0836749
Added end-to-end test for deploying and running Python wheel task (#741)
## Changes
Added end-to-end test for deploying and running Python wheel task

## Tests
The test passes successfully in all environments and takes about 9-10
minutes to run.

```
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRun1845899209/002/.databricks/bundle/default/sync-snapshots/1f7cc766ffe038d6.json
Successfully deleted files!
2023/09/06 17:50:50 INFO Releasing deployment lock mutator=destroy mutator=seq mutator=seq mutator=deferred mutator=lock:release
--- PASS: TestAccPythonWheelTaskDeployAndRun (508.16s)
PASS
coverage: 77.9% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       508.810s        coverage: 77.9% of statements in ./...
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-07 14:08:16 +00:00