Commit Graph

124 Commits

Author SHA1 Message Date
Pieter Noordhuis 87dd46a3f8
Use dynamic configuration model in bundles (#1098)
## Changes

This is a fundamental change to how we load and process bundle
configuration. We now depend on the configuration being represented as a
`dyn.Value`. This representation is functionally equivalent to Go's
`any` (it can hold a value of any type) and allows us to capture metadata associated with
a value, such as where it was defined (e.g. file, line, and column). It
also allows us to represent Go's zero values properly (e.g. empty
string, integer equal to 0, or boolean false).
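
To make the representation concrete, here is a minimal self-contained
sketch of the idea (illustrative only, not the actual `libs/dyn` API):

```go
package main

import "fmt"

// Location records where a value was defined in the source configuration.
type Location struct {
	File   string
	Line   int
	Column int
}

// Value pairs arbitrary configuration data with its location metadata.
// An explicit zero (e.g. "" or 0 or false) is distinguishable from an
// unset value, which a plain Go struct field cannot express.
type Value struct {
	val any
	loc Location
}

func main() {
	// num_workers: 0 set explicitly on line 12 of databricks.yml.
	v := Value{val: 0, loc: Location{File: "databricks.yml", Line: 12, Column: 3}}
	fmt.Printf("%v (defined at %s:%d:%d)\n", v.val, v.loc.File, v.loc.Line, v.loc.Column)
}
```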

Using this representation allows us to let the configuration model
deviate from the typed structure we have been relying on so far
(`config.Root`). We need to deviate from these types when using
variables for fields that are not a string themselves. For example,
using `${var.num_workers}` for an integer `workers` field was impossible
until now (though not implemented in this change).

The loader for a `dyn.Value` includes functionality to capture any and
all type mismatches between the user-defined configuration and the
expected types. These mismatches can be surfaced as validation errors in
future PRs.

Given that many mutators expect the typed struct to be the source of
truth, this change converts between the dynamic representation and the
typed representation on mutator entry and exit. Existing mutators can
continue to modify the typed representation and these modifications are
reflected in the dynamic representation (see `MarkMutatorEntry` and
`MarkMutatorExit` in `bundle/config/root.go`).

Required changes included in this change:
* The existing interpolation package is removed in favor of
`libs/dyn/dynvar`.
* Functionality to merge job clusters, job tasks, and pipeline clusters
is now broken out into dedicated mutators.

To be implemented later:
* Allow variable references for non-string types.
* Surface diagnostics about the configuration provided by the user in
the validation output.
* Some mutators use a resource's configuration file path to resolve
related relative paths. These depend on `bundle/config/paths.Path` being
set and populated through `ConfigureConfigFilePath`. Instead, they
should interact with the dynamically typed configuration directly. Doing
this also unlocks being able to differentiate different base paths used
within a job (e.g. a task override with a relative path defined in a
directory other than the base job).

## Tests

* Existing unit tests pass (some have been modified to accommodate the changes)
* Integration tests pass
2024-02-16 19:41:58 +00:00
Andrew Nester e474948a4b
Generate correct YAML if custom_tags or spark_conf is used for pipeline or job cluster configuration (#1210)
These fields' keys and values need to be double-quoted for the YAML
loader to read, parse, and unmarshal them into the Go struct correctly,
because these fields are of type `map[string]string`.
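
A small reproduction of the failure mode, sketched with
`gopkg.in/yaml.v3` (the loader the CLI uses may be configured
differently):

```go
package main

import (
	"fmt"

	"gopkg.in/yaml.v3"
)

type cluster struct {
	CustomTags map[string]string `yaml:"custom_tags"`
}

func main() {
	var c cluster

	// Unquoted, the scalar `true` is parsed as a YAML boolean, which
	// cannot be unmarshalled into a map[string]string value.
	bad := []byte("custom_tags:\n  enabled: true\n")
	fmt.Println(yaml.Unmarshal(bad, &c)) // unmarshal error

	// Double quoting forces the scalar to be read as a string.
	good := []byte("custom_tags:\n  enabled: \"true\"\n")
	fmt.Println(yaml.Unmarshal(good, &c), c.CustomTags) // <nil> map[enabled:true]
}
```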

## Tests
Added regression unit and E2E tests
2024-02-15 15:03:19 +00:00
Andrew Nester 80670eceed
Added `bundle deployment bind` and `unbind` command (#1131)
## Changes
Added `bundle deployment bind` and `unbind` command.

This command binds bundle-defined resources to existing resources in the
Databricks workspace so that they become DABs-managed.

## Tests
Manually + added E2E test
2024-02-14 18:04:45 +00:00
Pieter Noordhuis 4073e45d4b
Use mockery to generate mocks compatible with testify/mock (#1190)
## Changes

This is the same approach we use in the Go SDK.
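
For reference, the mocks mockery generates follow the standard
`testify/mock` shape. A hand-written equivalent for a hypothetical
interface looks roughly like this:

```go
package mocks

import "github.com/stretchr/testify/mock"

// Filer is a hypothetical interface we want to mock.
type Filer interface {
	Read(path string) (string, error)
}

// MockFiler resembles what mockery generates for Filer.
type MockFiler struct {
	mock.Mock
}

func (m *MockFiler) Read(path string) (string, error) {
	args := m.Called(path)
	return args.String(0), args.Error(1)
}
```

In a test, expectations are set with
`m.On("Read", "a.txt").Return("contents", nil)` and verified with
`m.AssertExpectations(t)`.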

## Tests

Tests pass.
2024-02-08 15:18:53 +00:00
Pieter Noordhuis f8b0f783ea
Use `acc.WorkspaceTest` helper from bundle integration tests (#1181)
## Changes

This helper:
* Constructs a context
* Constructs a `*databricks.WorkspaceClient`
* Ensures required environment variables are present to run an
integration test
* Enables debugging integration tests from VS Code

Debugging integration tests (from VS Code) is made possible by a prelude
in the helper that checks if the calling process is a debug binary, and
if so, sources environment variables from
`~/.databricks/debug-env.json` (if present).
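
Roughly, the shape of the helper (a sketch; the `CLOUD_ENV` check and
the exact signature are assumptions):

```go
package acc

import (
	"context"
	"os"
	"testing"

	"github.com/databricks/databricks-sdk-go"
)

// WorkspaceTest skips unless integration test credentials are configured,
// then returns a context and a workspace client for the test to use.
func WorkspaceTest(t *testing.T) (context.Context, *databricks.WorkspaceClient) {
	if os.Getenv("CLOUD_ENV") == "" {
		t.Skip("CLOUD_ENV not set; skipping integration test")
	}
	w, err := databricks.NewWorkspaceClient()
	if err != nil {
		t.Fatal(err)
	}
	return context.Background(), w
}
```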

## Tests

Integration tests still pass.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-07 11:18:56 +00:00
Pieter Noordhuis b64e11304c
Fix integration test with invalid configuration (#1182)
## Changes

The indentation mistake on the `path` field under `notebook` meant the
pipeline had a single entry with a `nil` notebook field. This was
allowed but incorrect.

While working on the `dyn.Value` approach, this yielded a non-nil but
zeroed `notebook` field and a failure to translate an empty path.
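
The difference in miniature (a sketch with `gopkg.in/yaml.v3`): with the
wrong indentation, `path` becomes a sibling of `notebook` instead of a
child, leaving the `notebook` field `nil`.

```go
package main

import (
	"fmt"

	"gopkg.in/yaml.v3"
)

type library struct {
	Notebook *struct {
		Path string `yaml:"path"`
	} `yaml:"notebook"`
}

func main() {
	var bad, good library

	// `path` at the same level as `notebook`: notebook is null.
	_ = yaml.Unmarshal([]byte("notebook:\npath: ./nb.py\n"), &bad)
	// `path` nested under `notebook`: notebook is populated.
	_ = yaml.Unmarshal([]byte("notebook:\n  path: ./nb.py\n"), &good)

	fmt.Println(bad.Notebook, good.Notebook) // <nil> &{./nb.py}
}
```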

## Tests

Correcting the indentation made the test fail because the file is not a
notebook. I changed it to a `file` reference and the test now passes.
2024-02-07 10:53:50 +00:00
Pieter Noordhuis 33c446dadd
Refactor library to artifact matching to not use pointers (#1172)
## Changes

The previous approach was to:
1. Iterate over all libraries in all job tasks
2. Find references to local libraries
3. Store a pointer to the `compute.Library` in the matching artifact file
to signal that it should be uploaded

This breaks down when introducing #1098 because we can no longer track
unexported state across mutators. The approach in this PR performs the
path matching twice: once in the matching mutator, where we check if each
referenced file has an artifacts section, and once during artifact
upload to rewrite the library path from a local file reference to an
absolute Databricks path.

## Tests

Integration tests pass.
2024-02-05 15:29:45 +00:00
shreyas-goenka cb3ad737f1
Add short_name helper function to bundle init templates (#1167)
## Changes
Adds the `short_name` helper function. `short_name` is useful when
templates do not want to print the current user's full userName
(typically an email address or a service principal application ID).
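
A sketch of how such a helper can be wired into Go's `text/template`;
the exact truncation rule is an assumption here (everything before the
`@`):

```go
package main

import (
	"os"
	"strings"
	"text/template"
)

func main() {
	funcs := template.FuncMap{
		// Assumed behavior: strip the domain part of an email-style userName.
		"short_name": func(userName string) string {
			return strings.SplitN(userName, "@", 2)[0]
		},
	}
	tmpl := template.Must(template.New("name").Funcs(funcs).Parse(
		`project_{{ short_name "first.last@example.com" }}`))
	_ = tmpl.Execute(os.Stdout, nil) // project_first.last
}
```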

## Tests
Integration test. Also adds integration tests for other helper functions
that interact with the Databricks API.
2024-02-01 16:46:07 +00:00
Andrew Nester b28432afed
Add `--key` flag for generate commands to specify resource key (#1165)
## Changes
Add the `--key` flag for generate commands to specify the resource key.

Also, generated resource config file names are no longer prefixed.

## Tests
Integration tests passed

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-01-31 10:23:35 +00:00
Andrew Nester f269f8015d
Added `bundle generate pipeline` command (#1139)
## Changes
Added the `bundle generate pipeline` command.

Usage is as follows:

```
databricks bundle generate pipeline --existing-pipeline-id f3b8c580-0a88-4b55-xxxx-yyyyyyyyyy
```

## Tests
Manually + added E2E test
2024-01-25 11:35:14 +00:00
Andrew Nester 7067782cf1
Fixed path matching for Windows in generate job test (#1132)
## Changes
Fixed path matching for Windows in generate job test
2024-01-19 08:05:59 +00:00
Andrew Nester 70fe0e36ef
Added `databricks bundle generate job` command (#1043)
## Changes
It is now possible to generate bundle configuration for an existing job.
For now, it only supports jobs with notebook tasks.

It downloads the notebooks referenced in the job tasks and generates
bundle YAML config for the job, which can be included in a larger bundle.

Example of generated config:
```
resources:
  jobs:
    job_128737545467921:
      name: Notebook job
      format: MULTI_TASK
      tasks:
        - task_key: as_notebook
          existing_cluster_id: 0704-xxxxxx-yyyyyyy
          notebook_task:
            base_parameters:
              bundle_root: /Users/andrew.nester@databricks.com/.bundle/job_with_module_imports/development/files
            notebook_path: ./entry_notebook.py
            source: WORKSPACE
          run_if: ALL_SUCCESS
      max_concurrent_runs: 1
```

## Tests
Manual (on our last 100 jobs) + added end-to-end test

```
--- PASS: TestAccGenerateFromExistingJobAndDeploy (50.91s)
PASS
coverage: 61.5% of statements in ./...
ok github.com/databricks/cli/internal/bundle 51.209s coverage: 61.5% of
statements in ./...
```
2024-01-17 14:26:33 +00:00
shreyas-goenka 2c0d06715c
Fix windows style file paths in fs cp command (#1118)
## Changes
Copying a local file on Windows to a remote directory in DBFS would fail
if the path was specified as a Windows-style path (as opposed to a
UNIX-style path). This PR fixes that.

Note that UNIX-style paths continue to work because `filepath.Base`
respects both `/` and `\` as file separators on Windows. See
`IsPathSeparator` in https://go.dev/src/os/path_windows.go.
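
A sketch of the relevant behavior: extract the base name with
`filepath.Base` (which on Windows handles both separators) and build the
remote path with the `path` package, which always uses `/`:

```go
package main

import (
	"fmt"
	"path"
	"path/filepath"
)

func main() {
	// On Windows, filepath.Base accepts both `\` and `/` separators.
	local := `.\Desktop\foo.txt`
	name := filepath.Base(local)

	// Remote DBFS paths always use `/`, so join with path, not filepath.
	remote := path.Join("dbfs:/Users/someone@example.com", name)
	fmt.Println(local, "->", remote)
}
```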

Fixes issue: https://github.com/databricks/cli/issues/1109.

## Tests
Integration test and manually
```
C:\Users\shreyas.goenka>Desktop\cli.exe fs cp .\Desktop\foo.txt dbfs:/Users/shreyas.goenka@databricks.com
.\Desktop\foo.txt -> dbfs:/Users/shreyas.goenka@databricks.com/foo.txt

C:\Users\shreyas.goenka>Desktop\cli.exe fs cat  dbfs:/Users/shreyas.goenka@databricks.com/foo.txt
hello, world
```
2024-01-11 18:49:42 +00:00
Andrew Nester 3e6e04831f
Fixed storage-credentials list command in text output (#1094)
## Changes
Fixes #1029 
Closes #910 

## Tests
Added regression test
2024-01-02 07:24:51 +00:00
Andrew Nester 83d50001fc
Pass parameters to task when run with `--python-params` and `python_wheel_wrapper` is true (#1037)
## Changes
This makes the behaviour consistent whether `python_wheel_wrapper` is on
or off when a job is run with the `--python-params` flag.

In `python_wheel_wrapper` mode, the dynamic `python_params` are converted
into a specially named dynamic `notebook_param`, which the wrapper reads
with `dbutils` and passes on to `sys.argv`.

Fixes #1000

## Tests
Added an integration test.

Integration tests pass.
2023-12-01 10:35:20 +00:00
Andrew Nester 5431174302
Do not add wheel content hash in uploaded Python wheel path (#1015)
## Changes
Removed the hash from the upload path since it is not useful anyway.

The main reason for the change was to make uploads work on all-purpose
clusters. But to make that work, the wheel version needs to be increased
anyway, so having only a hash in the path is useless.

Note: using the --build-number (build tag) flag does not help with
re-installing libraries on all-purpose clusters, because `pip` ignores
the build tag when upgrading a library and only looks at the wheel
version.
The build tag is only used for sorting versions, and the one with the
higher build tag takes priority when installing. This only matters if no
version of the library is installed yet.
See
a15dd75d98/src/pip/_internal/index/package_finder.py (L522-L556)
https://github.com/pypa/pip/issues/4781

Thus, the only way to reinstall the library on an all-purpose cluster is
to increase the wheel version manually or to use automatic version
generation, e.g.:
```
setup(
  version=datetime.datetime.utcnow().strftime("%Y%m%d.%H%M%S"),
  ...
)
```

## Tests
Integration tests passed.
2023-11-29 10:40:12 +00:00
Pieter Noordhuis 6187803007
Correctly overwrite local state if remote state is newer (#1008)
## Changes

A bug in the code that pulls the remote state could cause the local
state to be empty instead of a copy of the remote state. This happened
only if the local state was present and stale when compared to the
remote version.

We correctly checked for the state serial to see if the local state had
to be replaced but didn't seek back on the remote state before writing
it out. Because the staleness check would read the remote state in full,
copying from the same reader would immediately yield an EOF.
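
The bug in miniature: after the staleness check reads the reader to EOF,
a copy from the same reader produces zero bytes unless the reader is
rewound first.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"strings"
)

func main() {
	remote := strings.NewReader(`{"serial": 5}`)

	// Staleness check: read the remote state in full.
	raw, _ := io.ReadAll(remote)
	fmt.Println("staleness check read", len(raw), "bytes")

	// Bug: the reader is exhausted, so this writes an empty local state.
	var local bytes.Buffer
	n, _ := io.Copy(&local, remote)
	fmt.Println("copied without seeking:", n) // 0

	// Fix: seek back to the start before writing the state out.
	_, _ = remote.Seek(0, io.SeekStart)
	n, _ = io.Copy(&local, remote)
	fmt.Println("copied after seeking:", n) // 13
}
```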

## Tests

* Unit tests for state pull and push mutators that rely on a mocked
filer.
* An integration test that deploys the same bundle from multiple paths,
triggering the staleness logic.

Both failed prior to the fix and now pass.
2023-11-24 11:15:46 +00:00
shreyas-goenka 0c837e5772
Make `file_path` and `artifact_path` fields consistent with json tag (#987)
## Changes
This PR:
1. Renames `FilesPath` -> `FilePath` and `ArtifactsPath` ->
`ArtifactPath` in the bundle and metadata configuration to make them
consistent with the JSON tags.
2. Fixes development / production mode error messages to point to
`file_path` and `artifact_path`.

## Tests
Existing unit tests. This is a straightforward renaming of the fields.
2023-11-15 13:37:26 +00:00
shreyas-goenka f208853626
Fix integration test asserting errors on unknown template parameters (#977)
## Changes
Descriptions were recently made mandatory for input parameters, so this
test started failing.

## Tests
The test passes now.
2023-11-10 11:05:32 +00:00
Serge Smertin 56bcb6f833
Make Cobra runner compatible with testing interactive flows (#957)
## Changes
This PR enables testing commands with stdin
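
The pattern in miniature (an illustrative command, not the actual runner
code): the command reads from `cmd.InOrStdin()`, and the test swaps
stdin and stdout for in-memory buffers.

```go
package main

import (
	"bufio"
	"bytes"
	"fmt"
	"strings"

	"github.com/spf13/cobra"
)

func main() {
	cmd := &cobra.Command{
		Use: "greet",
		RunE: func(cmd *cobra.Command, args []string) error {
			// Read the interactive answer from the command's input stream.
			name, err := bufio.NewReader(cmd.InOrStdin()).ReadString('\n')
			if err != nil {
				return err
			}
			fmt.Fprintf(cmd.OutOrStdout(), "hello, %s", strings.TrimSpace(name))
			return nil
		},
	}

	// In a test, replace stdin/stdout with in-memory buffers.
	cmd.SetIn(strings.NewReader("world\n"))
	var out bytes.Buffer
	cmd.SetOut(&out)
	_ = cmd.Execute()
	fmt.Println(out.String()) // hello, world
}
```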

## Tests
https://github.com/databricks/cli/pull/914
2023-11-07 19:06:27 +00:00
shreyas-goenka d70d7445c4
Remove resolution of repo names against the Databricks Github account (#940)
## Changes
This functionality is not exercised (and will not be anytime soon).
Instead, we use a map with first-party aliases for supported
templates.


1e46b9f88a/cmd/bundle/init.go (L21)

## Tests
Existing tests and manually, bundle init still works.
2023-11-01 13:02:06 +00:00
shreyas-goenka 5a8cd0c5bc
Persist deployment metadata in WSFS (#845)
## Changes

This PR introduces a metadata struct that stores a subset of bundle
configuration that we wish to expose to other Databricks services that
wish to integrate with bundles.

This metadata file is uploaded to a file
`${bundle.workspace.state_path}/metadata.json` in the WSFS destination
of the bundle deployment.

Documentation for emitted metadata fields:
* `version`: Version for the metadata file schema
* `config.bundle.git.branch`: Name of the git branch the bundle was
deployed from.
* `config.bundle.git.origin_url`: URL for git remote "origin"
* `config.bundle.git.bundle_root_path`: Relative path of the bundle root
from the root of the git repository. Is set to "." if they are the same.
* `config.bundle.git.commit`: SHA-1 commit hash of the exact commit this
bundle was deployed from. Note that the deployment might not exactly
match this commit if there are changes that have not been committed to
git at deploy time.
* `file_path`: Path in the workspace that we sync bundle files to.
* `resources.jobs.[job-ref].id`: ID of the job.
* `resources.jobs.[job-ref].relative_path`: Relative path from the bundle
root to the YAML config file in which this job was defined.

Example metadata object when bundle root and git root are the same:
```json
{
  "version": 1,
  "config": {
    "bundle": {
      "lock": {},
      "git": {
        "branch": "master",
        "origin_url": "www.host.com",
        "commit": "7af8e5d3f5dceffff9295d42d21606ccf056dce0",
        "bundle_root_path": "."
      }
    },
    "workspace": {
      "file_path": "/Users/shreyas.goenka@databricks.com/.bundle/pipeline-progress/default/files"
    },
    "resources": {
      "jobs": {
        "bar": {
          "id": "245921165354846",
          "relative_path": "databricks.yml"
        }
      }
    },
    "sync": {}
  }
}
```

Example metadata when the git root is one level above the bundle repo:
```json
{
  "version": 1,
  "config": {
    "bundle": {
      "lock": {},
      "git": {
        "branch": "dev-branch",
        "origin_url": "www.my-repo.com",
        "commit": "3db46ef750998952b00a2b3e7991e31787e4b98b",
        "bundle_root_path": "pipeline-progress"
      }
    },
    "workspace": {
      "file_path": "/Users/shreyas.goenka@databricks.com/.bundle/pipeline-progress/default/files"
    },
    "resources": {
      "jobs": {
        "bar": {
          "id": "245921165354846",
          "relative_path": "databricks.yml"
        }
      }
    },
    "sync": {}
  }
}
```


This unblocks integration with the jobs break-glass UI for bundles.

## Tests
Unit tests and integration tests.
2023-10-27 12:55:43 +00:00
Andrew Nester 4ce279e386
Added test for tasks with python wheel wrapper on (#897)
## Changes
Added test for tasks with python wheel wrapper on

## Tests
```
2023/10/20 16:42:07 [INFO] Listing secrets from ...
=== RUN   TestAccPythonWheelTaskDeployAndRunWithWrapper
    python_wheel_test.go:13: aws
    helpers.go:43: Configuration for template:  {"node_type_id":"i3.xlarge","python_wheel_wrapper":true,"spark_version":"12.2.x-scala2.12","unique_id":"224a58a5-7ecb-4e7a-9c89-c7f5ea57924e"}
 ...
Resource deployment completed!
Run URL: ...

2023-10-20 16:42:33 "[default] Test Wheel Job 224a58a5-7ecb-4e7a-9c89-c7f5ea57924e" RUNNING 
2023-10-20 16:47:27 "[default] Test Wheel Job 224a58a5-7ecb-4e7a-9c89-c7f5ea57924e" TERMINATED SUCCESS 
    helpers.go:169: [databricks stdout]: Hello from my func
    helpers.go:169: [databricks stdout]: Got arguments:
    helpers.go:169: [databricks stdout]: ['my_test_code', 'one', 'two']
...
--- PASS: TestAccPythonWheelTaskDeployAndRunWithWrapper (321.61s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       322.307s        coverage: 93.5% of statements in ./...
```
2023-10-20 15:03:29 +00:00
shreyas-goenka 5712845329
Make default dev semver a const (#891)
2023-10-19 18:56:54 +00:00
shreyas-goenka 3700785dfa
Add support for validating CLI version when loading a jsonschema object (#883)
## Changes
Updates to bundle templates can require updated versions of the CLI.
This PR extends the JSON schema representation to allow template authors
to set a minimum CLI version they require for their templates.

This is required to make improvements/additions to the mlops-stacks repo.
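
A sketch of the check (assuming `golang.org/x/mod/semver` semantics; the
CLI's actual implementation may differ):

```go
package main

import (
	"fmt"

	"golang.org/x/mod/semver"
)

// validateCLIVersion is a hypothetical version gate for a template schema.
func validateCLIVersion(minVersion, current string) error {
	if semver.Compare(minVersion, current) > 0 {
		return fmt.Errorf("minimum CLI version %q is greater than current CLI version %q", minVersion, current)
	}
	return nil
}

func main() {
	fmt.Println(validateCLIVersion("v5000.1.1", "v0.207.2")) // error
	fmt.Println(validateCLIVersion("v0.1.1", "v0.207.2"))    // <nil>
}
```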

## Tests
Tested using unit tests and manually.

For manual testing, I created a custom build of the CLI using GoReleaser
and then tested it against a local instance of mlops-stacks.
When the mlops-stacks schema has:
```
  "min_databricks_cli_version": "v5000.1.1",
```

output (error as expected)
```
shreyas.goenka@THW32HFW6T bricks % ./dist/cli_darwin_arm64/databricks bundle init  ~/mlops-stack
Error: minimum CLI version "v5000.1.1" is greater than current CLI version "v0.207.2-dev+1b992c0". Please upgrade your current Databricks CLI
```

When the mlops-stacks schema has:
```
  "min_databricks_cli_version": "v0.1.1",
```

output (validation passes)
```
shreyas.goenka@THW32HFW6T bricks % ./dist/cli_darwin_arm64/databricks bundle init  ~/mlops-stack
Welcome to MLOps Stack. For detailed information on project generation, see the README at https://github.com/databricks/mlops-stack/blob/main/README.md.

Project Name [my-mlops-project]: ^C
```
2023-10-19 14:01:48 +00:00
Andrew Nester 996d6273c7
Escape workspace path string in regexp in artifacts integration test (#886)
## Changes
Escape workspace path string in regexp in artifacts integration test
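
The gist of the fix: workspace paths contain regexp metacharacters such
as `.` and `+`, so they must be escaped before being embedded in a
pattern. A sketch:

```go
package main

import (
	"fmt"
	"regexp"
)

func main() {
	dir := "/Users/first.last+tag@databricks.com/integration-test-wsfs-abc"

	// QuoteMeta escapes "." and "+" so the path is matched literally.
	re := regexp.MustCompile("^" + regexp.QuoteMeta(dir) + "/.*")
	fmt.Println(re.MatchString(dir + "/test.whl")) // true
}
```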

## Tests
```
Environment: aws-prod
=== RUN   TestAccUploadArtifactFileToCorrectRemotePath
    artifacts_test.go:29: aws
    helpers.go:356: Creating /Users/serge.smertin+deco@databricks.com/integration-test-wsfs-leakafecllkc
artifacts.Upload(test.whl): Uploading...
artifacts.Upload(test.whl): Upload succeeded
    helpers.go:362: Removing /Users/serge.smertin+deco@databricks.com/integration-test-wsfs-leakafecllkc
--- PASS: TestAccUploadArtifactFileToCorrectRemotePath (2.12s)
PASS
coverage: 0.0% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       2.788s  coverage: 0.0% of statements in ./...
```
2023-10-19 12:06:46 +00:00
Andrew Nester 5273d0c51a
Support Python wheels larger than 10MB (#879)
## Changes
Previously, we only supported uploading Python wheels smaller than 10 MB
due to using the Workspace Import API and its `content` field:
https://docs.databricks.com/api/workspace/workspace/import

By switching to `WorkspaceFilesClient` we overcome this limit, because it
sends the file in the POST body instead.

## Tests
`TestAccUploadArtifactFileToCorrectRemotePath` integration test passes

```
=== RUN   TestAccUploadArtifactFileToCorrectRemotePath
    artifacts_test.go:28: gcp
2023/10/17 15:24:04 INFO Using Google Credentials sdk=true
    helpers.go:356: Creating /Users/.../integration-test-wsfs-ekggbkcfdkid
artifacts.Upload(test.whl): Uploading...
2023/10/17 15:24:06 INFO Using Google Credentials mutator=artifacts.Upload(test) sdk=true
artifacts.Upload(test.whl): Upload succeeded
    helpers.go:362: Removing /Users/.../integration-test-wsfs-ekggbkcfdkid
--- PASS: TestAccUploadArtifactFileToCorrectRemotePath (5.66s)
PASS
coverage: 14.9% of statements in ./...
ok      github.com/databricks/cli/internal      6.109s  coverage: 14.9% of statements in ./...
```
2023-10-18 10:20:43 +00:00
hectorcast-db 36f30c8b47
Update Go SDK to 0.23.0 and use custom marshaller (#772)
## Changes
Update Go SDK to 0.23.0 and use custom marshaller.
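
Context for the custom marshaller, sketched: with plain `encoding/json`,
`omitempty` drops intentional zero values such as `num_workers: 0`; the
SDK's marshaller consults each struct's `ForceSendFields` list to keep
them.

```go
package main

import (
	"encoding/json"
	"fmt"
)

type newCluster struct {
	NumWorkers   int    `json:"num_workers,omitempty"`
	SparkVersion string `json:"spark_version,omitempty"`
}

func main() {
	c := newCluster{NumWorkers: 0, SparkVersion: "10.4.x-scala2.12"}
	b, _ := json.Marshal(c)

	// num_workers is silently dropped, which previously triggered the
	// backend validation error shown below.
	fmt.Println(string(b)) // {"spark_version":"10.4.x-scala2.12"}
}
```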
## Tests
* Run unit tests

* Run nightly

* Manual test:
```
./cli jobs create --json @myjob.json
```
with 
```
{
    "name": "my-job-marshal-test-go",
    "tasks": [{
        "task_key": "testgomarshaltask",
        "new_cluster": {
            "num_workers": 0,
            "spark_version": "10.4.x-scala2.12",
            "node_type_id": "Standard_DS3_v2"
        },
        "libraries": [
            {
                "jar": "dbfs:/max/jars/exampleJarTask.jar"
            }
        ],
        "spark_jar_task": {
            "main_class_name":  "com.databricks.quickstart.exampleTask"
        }
    }]
}
```
Main branch:
```
Error: Cluster validation error: Missing required field: settings.cluster_spec.new_cluster.size
```
This branch:
```
{
  "job_id":<jobid>
}
```

---------

Co-authored-by: Miles Yucht <miles@databricks.com>
2023-10-16 06:56:06 +00:00
shreyas-goenka 054df2b58b
Fix workspace import test (#844)
Windows and Unix use different newline characters. The string assertions
are separated here to make the test pass.
2023-10-06 18:09:56 +00:00
shreyas-goenka 847b6f4bc3
Fix import export integration tests on windows (#842)
We should be using the `path` package here because these are WSFS paths.
2023-10-06 10:28:18 +00:00
shreyas-goenka 1e9dbcfa2a
Add `--file` flag to workspace export command (#794)
This PR:
1. Adds the `--file` flag to the workspace export command. This allows
you to specify an output file to write to.
2. Adds e2e integration tests for the workspace export command.
2023-10-05 13:20:33 +00:00
shreyas-goenka caade735e3
Improve `workspace import` command by allowing references to local files for content (#793)
## Changes
This PR makes a few really important QOL improvements to the `workspace
import` command.

They are:
1. Adds the `--file` flag, which allows a user to specify a file to read
the content from.
2. Wraps the most common error that first-time users of this command
will run into with a helpful hint.
3. Minor change to the command's use string: `PATH` ->
`TARGET_PATH`.


## Tests
Integration tests. The newly added integration tests check that the
`--file` flag works as expected for both the `SOURCE` and `AUTO` formats.

Skipped the other formats because the API behaviour for them is
straightforward.
2023-10-05 12:48:59 +00:00
shreyas-goenka 40ae23bb33
Refactor change computation for sync (#785)
## Changes
This PR pays down some tech debt by refactoring the sync diff
computation into interfaces that are more robust.

Specifically, it:
1. Refactors the single diff computation function into a `SnapshotState`
struct that computes the target state based only on the current local
files, making it more robust by not carrying over state from previous
iterations.
2. Adds new validations for the sync state which make sure that the
invariants downstream code expects actually hold (see the sketch after
this section). This prevents a class of issues where these invariants
break and the synchroniser behaves unexpectedly.

Note that this does not change the existing schema for the snapshot,
only the way the diff is computed, and thus it is backwards compatible
(i.e. it does not require a schema version bump).
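
An illustrative sketch of the invariant being validated (field names are
assumptions, not the exact implementation):

```go
package sync

import "fmt"

// SnapshotState models the sync target state, derived purely from the
// current local files rather than from previous iterations.
type SnapshotState struct {
	LocalToRemoteNames map[string]string // local path -> remote name
	RemoteToLocalNames map[string]string // remote name -> local path
}

// validate enforces the invariant downstream code relies on: the two
// maps must be exact inverses of each other.
func (s *SnapshotState) validate() error {
	for local, remote := range s.LocalToRemoteNames {
		if s.RemoteToLocalNames[remote] != local {
			return fmt.Errorf("inconsistent state: %s -> %s is not bidirectional", local, remote)
		}
	}
	for remote, local := range s.RemoteToLocalNames {
		if s.LocalToRemoteNames[local] != remote {
			return fmt.Errorf("inconsistent state: %s -> %s is not bidirectional", remote, local)
		}
	}
	return nil
}
```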

2023-10-03 13:47:46 +00:00
Andrew Nester 79e271f859
Added test to submit and run various Python tasks on multiple DBR versions (#806)
## Changes
These tests allow us to get information about the execution context
(PYTHONPATH, CWD) for various Python tasks and different cluster setups.

Note: this test won't be executed automatically as part of nightly
builds since it requires the RUN_PYTHON_TASKS_TEST environment variable
to be set.

## Tests
Integration test run successfully.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-10-03 11:18:55 +00:00
Pieter Noordhuis 3685eb16f4
Run tests to verify backend tag validation behavior (#814)
## Changes

Validation rules on tags are different per cloud (they are passed
through to the underlying clusters and as such must comply with
cloud-specific validation rules). This change adds tests to confirm the
current behavior to ensure the normalization we can apply is in line
with how the backend behaves.

## Tests

The new integration tests pass (tested locally).
2023-09-29 08:38:06 +00:00
Andrew Nester 0daa0022af
Make a notebook wrapper for Python wheel tasks optional (#797)
## Changes
Instead of always using a notebook wrapper for Python wheel tasks, let's
make this an opt-in option.

Now, by default, Python wheel tasks are deployed as-is to the Databricks
platform.
If the notebook wrapper is required (DBR < 13.1 or other configuration
differences), users can provide the following experimental setting:

```
experimental:
  python_wheel_wrapper: true
```

Fixes #783,
https://github.com/databricks/databricks-asset-bundles-dais2023/issues/8

## Tests
Added unit tests.

Integration tests passed for both cases

```
    helpers.go:163: [databricks stdout]: Hello from my func
    helpers.go:163: [databricks stdout]: Got arguments:
    helpers.go:163: [databricks stdout]: ['my_test_code', 'one', 'two']
    ...
Bundle remote directory is ***/.bundle/ac05d5e8-ed4b-4e34-b3f2-afa73f62b021
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRunWithWrapper3733431114/001/.databricks/bundle/default/sync-snapshots/cac1e02f3941a97b.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRunWithWrapper (214.18s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       214.495s        coverage: 93.5% of statements in ./...

```

```
    helpers.go:163: [databricks stdout]: Hello from my func
    helpers.go:163: [databricks stdout]: Got arguments:
    helpers.go:163: [databricks stdout]: ['my_test_code', 'one', 'two']
    ...
Bundle remote directory is ***/.bundle/0ef67aaf-5960-4049-bf1d-dc9e29157421
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRunWithoutWrapper2340216760/001/.databricks/bundle/default/sync-snapshots/edf0b322cee93b13.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRunWithoutWrapper (192.36s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       195.130s        coverage: 93.5% of statements in ./...

```
2023-09-26 14:32:20 +00:00
shreyas-goenka 327ab0e598
Error when unknown keys are encountered during template execution (#766)
## Tests
New unit test and manually
2023-09-14 15:53:20 +00:00
shreyas-goenka fe32c46dc8
Make bundle deploy work if no resources are defined (#767)
## Changes

This PR sets "resource" to nil in the terraform representation if no
resources are defined in the bundle configuration. This solves two
problems:

1. It makes bundle deploy work without any resources specified.
2. Previously, if a `resources` block was removed after a deployment,
the next deployment would fail with an error. Now the resources are
destroyed as expected.

Also removes `TerraformHasNoResources` which is no longer needed.

## Tests
New e2e tests.
2023-09-13 22:50:37 +00:00
Pieter Noordhuis 3cb74e72a8
Run environment related tests in a pristine environment (#769)
## Changes

If the caller running the test has one or more environment variables
that are used in the test already set, they can interfere and make tests
fail.

## Tests

Ran tests in `./cmd/root` with Databricks related environment variables
set.
2023-09-12 13:28:53 +00:00
Pieter Noordhuis 4ccc70aeac
Consolidate environment variable interaction (#747)
## Changes

There are a couple places throughout the code base where interaction
with environment variables takes place. Moreover, more than one of these
would try to read a value from more than one environment variable as
fallback (for backwards compatibility). This change consolidates those
accesses.

The majority of diffs in this change are mechanical (i.e. add an
argument or replace a call).

This change:
* Moves common environment variable lookups for bundles to
`bundles/env`.
* Adds a `libs/env` package that wraps `os.LookupEnv` and `os.Getenv`
and allows overrides to take place in a `context.Context`. By
scoping overrides to a `context.Context` we can avoid `t.Setenv` in
testing and unlock parallel test execution for integration tests (see
the sketch after this list).
* Updates call sites to pass through a `context.Context` where needed.
* For bundles, introduces `DATABRICKS_BUNDLE_ROOT` as new primary
variable instead of `BUNDLE_ROOT`. This was the last environment
variable that did not use the `DATABRICKS_` prefix.
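
A sketch of the context-scoped override pattern (the idea behind
`libs/env`, not its exact code):

```go
package env

import (
	"context"
	"os"
)

type envKey string

// Set returns a context in which the given variable is overridden.
func Set(ctx context.Context, key, value string) context.Context {
	return context.WithValue(ctx, envKey(key), value)
}

// Lookup consults the context first and falls back to the process
// environment, so tests can inject values without t.Setenv and can
// therefore run in parallel.
func Lookup(ctx context.Context, key string) (string, bool) {
	if v, ok := ctx.Value(envKey(key)).(string); ok {
		return v, true
	}
	return os.LookupEnv(key)
}
```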

## Tests

Unit tests pass.
2023-09-11 08:18:43 +00:00
Andrew Nester 17d9f7dd2a
Use unique bundle root path for Python E2E test (#748)
## Changes
This helps make sure that jobs in the tests are deployed and executed
uniquely and in isolation.


```
Bundle remote directory is /Users/61b77d30-bc10-4214-9650-29cf5db0e941/.bundle/4b630810-5edc-4d8f-85d1-0eb5baf7bb28
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRun3933198431/001/.databricks/bundle/default/sync-snapshots/dd9db100465e3d91.json
Successfully deleted files!
--- PASS: TestAccPythonWheelTaskDeployAndRun (346.28s)
PASS
coverage: 93.5% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       346.976s        coverage: 93.5% of statements in ./...
```
2023-09-08 09:19:55 +00:00
Andrew Nester 5a14c7cb43
Generate unique name for a job in Python wheel test (#745)
## Changes
Generate unique name for a job in Python wheel test
2023-09-07 20:02:26 +00:00
Andrew Nester 10e0836749
Added end-to-end test for deploying and running Python wheel task (#741)
## Changes
Added end-to-end test for deploying and running Python wheel task

## Tests
Test successfully passed on all environments, takes about 9-10 minutes
to pass.

```
Deleted snapshot file at /var/folders/nt/xjv68qzs45319w4k36dhpylc0000gp/T/TestAccPythonWheelTaskDeployAndRun1845899209/002/.databricks/bundle/default/sync-snapshots/1f7cc766ffe038d6.json
Successfully deleted files!
2023/09/06 17:50:50 INFO Releasing deployment lock mutator=destroy mutator=seq mutator=seq mutator=deferred mutator=lock:release
--- PASS: TestAccPythonWheelTaskDeployAndRun (508.16s)
PASS
coverage: 77.9% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       508.810s        coverage: 77.9% of statements in ./...
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2023-09-07 14:08:16 +00:00
shreyas-goenka 9194418ac1
Fix regex error check in mkdir integration test (#735)
## Changes
Fixes the test for all cloud providers after the Go SDK bump, which
introduced the `non retryable error` prefix on errors. The test passes now.
2023-09-05 14:25:26 +00:00
Pieter Noordhuis 1752e29885
Update Go SDK to v0.19.0 (#729)
## Changes

* Update Go SDK to v0.19.0
* Update commands per OpenAPI spec from Go SDK
* Incorporate `client.Do()` signature change to include a (nil) header
map
* Update `workspace.WorkspaceService` mock with permissions methods
* Skip `files` service in codegen; already implemented under the `fs`
command

## Tests

Unit and integration tests pass.
2023-09-05 09:43:57 +00:00
shreyas-goenka 6a843f28ef
Correct name for force acquire deploy flag (#656)
## Changes
As discussed here, the name for this flag should be `force-lock`:
https://github.com/databricks/cli/pull/578#discussion_r1276233445

## Tests
Manually and existing tests
2023-08-15 19:03:43 +00:00
Andrew Nester 6e708da6fc
Upgraded Go version to 1.21 (#664)
## Changes
Upgraded Go version to 1.21

Upgraded to use `slices` and `slog` from core instead of experimental.

We still use `exp/maps`, as our code relies on `maps.Keys`, which is not
part of the core package; dropping it would require refactoring.
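
For illustration, the import mix after the upgrade:

```go
package main

import (
	"log/slog" // core as of Go 1.21 (previously golang.org/x/exp/slog)
	"slices"   // core as of Go 1.21 (previously golang.org/x/exp/slices)

	"golang.org/x/exp/maps" // maps.Keys has no core equivalent yet
)

func main() {
	m := map[string]int{"b": 2, "a": 1}
	keys := maps.Keys(m) // still from x/exp
	slices.Sort(keys)    // now from the standard library
	slog.Info("sorted", "keys", keys)
}
```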

### Tests

Integration tests passed

```
[DEBUG] Test execution command:  /opt/homebrew/opt/go@1.21/bin/go test ./... -json -timeout 1h -run ^TestAcc
[DEBUG] Test execution directory:  /Users/andrew.nester/cli
2023/08/15 13:20:51 [INFO]  TestAccAlertsCreateErrWhenNoArguments (2.150s)
2023/08/15 13:20:52 [INFO]  TestAccApiGet (0.580s)
2023/08/15 13:20:53 [INFO]  TestAccClustersList (0.900s)
2023/08/15 13:20:54 [INFO]  TestAccClustersGet (0.870s)
2023/08/15 13:21:06 [INFO]  TestAccFilerWorkspaceFilesReadWrite (11.980s)
2023/08/15 13:21:13 [INFO]  TestAccFilerWorkspaceFilesReadDir (7.060s)
2023/08/15 13:21:25 [INFO]  TestAccFilerDbfsReadWrite (12.810s)
2023/08/15 13:21:33 [INFO]  TestAccFilerDbfsReadDir (7.380s)
2023/08/15 13:21:41 [INFO]  TestAccFilerWorkspaceNotebookConflict (7.760s)
2023/08/15 13:21:49 [INFO]  TestAccFilerWorkspaceNotebookWithOverwriteFlag (8.660s)
2023/08/15 13:21:49 [INFO]  TestAccFilerLocalReadWrite (0.020s)
2023/08/15 13:21:49 [INFO]  TestAccFilerLocalReadDir (0.010s)
2023/08/15 13:21:52 [INFO]  TestAccFsCatForDbfs (3.190s)
2023/08/15 13:21:53 [INFO]  TestAccFsCatForDbfsOnNonExistentFile (0.890s)
2023/08/15 13:21:54 [INFO]  TestAccFsCatForDbfsInvalidScheme (0.600s)
2023/08/15 13:21:57 [INFO]  TestAccFsCatDoesNotSupportOutputModeJson (2.960s)
2023/08/15 13:22:28 [INFO]  TestAccFsCpDir (31.480s)
2023/08/15 13:22:43 [INFO]  TestAccFsCpFileToFile (14.530s)
2023/08/15 13:22:58 [INFO]  TestAccFsCpFileToDir (14.610s)
2023/08/15 13:23:29 [INFO]  TestAccFsCpDirToDirFileNotOverwritten (31.810s)
2023/08/15 13:23:47 [INFO]  TestAccFsCpFileToDirFileNotOverwritten (17.500s)
2023/08/15 13:24:04 [INFO]  TestAccFsCpFileToFileFileNotOverwritten (17.260s)
2023/08/15 13:24:37 [INFO]  TestAccFsCpDirToDirWithOverwriteFlag (32.690s)
2023/08/15 13:24:56 [INFO]  TestAccFsCpFileToFileWithOverwriteFlag (19.290s)
2023/08/15 13:25:15 [INFO]  TestAccFsCpFileToDirWithOverwriteFlag (19.230s)
2023/08/15 13:25:17 [INFO]  TestAccFsCpErrorsWhenSourceIsDirWithoutRecursiveFlag (2.010s)
2023/08/15 13:25:18 [INFO]  TestAccFsCpErrorsOnInvalidScheme (0.610s)
2023/08/15 13:25:33 [INFO]  TestAccFsCpSourceIsDirectoryButTargetIsFile (14.900s)
2023/08/15 13:25:37 [INFO]  TestAccFsLsForDbfs (3.770s)
2023/08/15 13:25:41 [INFO]  TestAccFsLsForDbfsWithAbsolutePaths (4.160s)
2023/08/15 13:25:44 [INFO]  TestAccFsLsForDbfsOnFile (2.990s)
2023/08/15 13:25:46 [INFO]  TestAccFsLsForDbfsOnEmptyDir (1.870s)
2023/08/15 13:25:46 [INFO]  TestAccFsLsForDbfsForNonexistingDir (0.850s)
2023/08/15 13:25:47 [INFO]  TestAccFsLsWithoutScheme (0.560s)
2023/08/15 13:25:49 [INFO]  TestAccFsMkdirCreatesDirectory (2.310s)
2023/08/15 13:25:52 [INFO]  TestAccFsMkdirCreatesMultipleDirectories (2.920s)
2023/08/15 13:25:55 [INFO]  TestAccFsMkdirWhenDirectoryAlreadyExists (2.320s)
2023/08/15 13:25:57 [INFO]  TestAccFsMkdirWhenFileExistsAtPath (2.820s)
2023/08/15 13:26:01 [INFO]  TestAccFsRmForFile (4.030s)
2023/08/15 13:26:05 [INFO]  TestAccFsRmForEmptyDirectory (3.530s)
2023/08/15 13:26:08 [INFO]  TestAccFsRmForNonEmptyDirectory (3.190s)
2023/08/15 13:26:09 [INFO]  TestAccFsRmForNonExistentFile (0.830s)
2023/08/15 13:26:13 [INFO]  TestAccFsRmForNonEmptyDirectoryWithRecursiveFlag (3.580s)
2023/08/15 13:26:13 [INFO]  TestAccGitClone (0.800s)
2023/08/15 13:26:14 [INFO]  TestAccGitCloneWithOnlyRepoNameOnAlternateBranch (0.790s)
2023/08/15 13:26:15 [INFO]  TestAccGitCloneErrorsWhenRepositoryDoesNotExist (0.540s)
2023/08/15 13:26:23 [INFO]  TestAccLock (8.630s)
2023/08/15 13:26:27 [INFO]  TestAccLockUnlockWithoutAllowsLockFileNotExist (3.490s)
2023/08/15 13:26:30 [INFO]  TestAccLockUnlockWithAllowsLockFileNotExist (3.130s)
2023/08/15 13:26:39 [INFO]  TestAccSyncFullFileSync (9.370s)
2023/08/15 13:26:50 [INFO]  TestAccSyncIncrementalFileSync (10.390s)
2023/08/15 13:27:00 [INFO]  TestAccSyncNestedFolderSync (10.680s)
2023/08/15 13:27:11 [INFO]  TestAccSyncNestedFolderDoesntFailOnNonEmptyDirectory (10.970s)
2023/08/15 13:27:22 [INFO]  TestAccSyncNestedSpacePlusAndHashAreEscapedSync (10.930s)
2023/08/15 13:27:29 [INFO]  TestAccSyncIncrementalFileOverwritesFolder (7.020s)
2023/08/15 13:27:37 [INFO]  TestAccSyncIncrementalSyncPythonNotebookToFile (7.380s)
2023/08/15 13:27:43 [INFO]  TestAccSyncIncrementalSyncFileToPythonNotebook (6.050s)
2023/08/15 13:27:48 [INFO]  TestAccSyncIncrementalSyncPythonNotebookDelete (5.390s)
2023/08/15 13:27:51 [INFO]  TestAccSyncEnsureRemotePathIsUsableIfRepoDoesntExist (2.570s)
2023/08/15 13:27:56 [INFO]  TestAccSyncEnsureRemotePathIsUsableIfRepoExists (5.540s)
2023/08/15 13:27:58 [INFO]  TestAccSyncEnsureRemotePathIsUsableInWorkspace (1.840s)
2023/08/15 13:27:59 [INFO]  TestAccWorkspaceList (0.790s)
2023/08/15 13:28:08 [INFO]  TestAccExportDir (8.860s)
2023/08/15 13:28:11 [INFO]  TestAccExportDirDoesNotOverwrite (3.090s)
2023/08/15 13:28:14 [INFO]  TestAccExportDirWithOverwriteFlag (3.500s)
2023/08/15 13:28:23 [INFO]  TestAccImportDir (8.330s)
2023/08/15 13:28:34 [INFO]  TestAccImportDirDoesNotOverwrite (10.970s)
2023/08/15 13:28:44 [INFO]  TestAccImportDirWithOverwriteFlag (10.130s)
2023/08/15 13:28:44 [INFO]  68/68 passed, 0 failed, 3 skipped
```
2023-08-15 13:50:40 +00:00
shreyas-goenka 5a6177127f
Fix failing fs mkdir test on azure (#627)
The regex check was missing a "." character; adding it fixes the test.
The test now passes on all three cloud providers.
2023-07-31 10:38:42 +00:00
Lennart Kats (databricks) d55652be07
Extend deployment mode support (#577)
## Changes

This adds the `mode: production` option. This mode doesn't do any
transformations but verifies that an environment is configured correctly
for production:

```
environments:
  prod:
    mode: production

    # paths should not be scoped to a user (unless a service principal is used)
    root_path: /Shared/non_user_path/...

    # run_as and permissions should be set at the resource level (or at the top level when that is implemented)
    run_as:
      user_name: Alice
    permissions:
    - level: CAN_MANAGE
      user_name: Alice
```

Additionally, this extends the existing `mode: development` option,
* now prefixing deployed assets with `[dev your.user]` instead of just
`[dev]`
* validating that development deployments _are_ scoped to a user

## Related

https://github.com/databricks/cli/pull/578/files (in draft)

## Tests

Manual testing to validate the experience, error messages, and
functionality with all resource types. Automated unit tests.

---------

Co-authored-by: Fabian Jakobs <fabian.jakobs@databricks.com>
2023-07-30 07:19:49 +00:00
shreyas-goenka 2f4bf844fc
Fix git clone integration test for non-existing repo (#610)
## Changes
This PR changes the integration test to just check that an error is
returned, rather than asserting that specific text is present in the
error. This is required because the error returned can differ based on
whether git SSH keys have been set up.
2023-07-27 13:51:57 +00:00