# Integration tests
This directory contains integration tests for the project.
The tree structure generally mirrors the source code tree structure.

Requirements for new files in this directory:

* Every package must be named after its directory with `_test` appended.
  * Requiring a different package name for integration tests avoids aliasing with the main package.
* Every integration test package must include a `main_test.go` file (a sketch follows below).

These requirements are enforced by a unit test in this directory.
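
For illustration, a `main_test.go` for a hypothetical `integration/bundle` directory might look like the sketch below. The package name, and the use of `CLOUD_ENV` as the gating variable, are assumptions for this example, not the file used in this repository; the variable itself is documented under "Running integration tests" below.

```go
// main_test.go -- a minimal sketch, assuming the package lives in
// a directory named "bundle". The package name mirrors the
// directory name with _test appended, per the requirements above.
package bundle_test

import (
	"os"
	"testing"
)

// TestMain gates the whole package. Treating CLOUD_ENV as the
// gating variable is an assumption made for this example.
func TestMain(m *testing.M) {
	if os.Getenv("CLOUD_ENV") == "" {
		// No cloud environment configured; skip the package.
		os.Exit(0)
	}
	os.Exit(m.Run())
}
```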
## Running integration tests

Integration tests require the following environment variables:

* `CLOUD_ENV` - set to the cloud environment to use (e.g. `aws`, `azure`, `gcp`)
* `DATABRICKS_HOST` - set to the Databricks workspace to use
* `DATABRICKS_TOKEN` - set to the Databricks token to use
Optional environment variables:

* `TEST_DEFAULT_WAREHOUSE_ID` - set to the default warehouse ID to use
* `TEST_METASTORE_ID` - set to the metastore ID to use
* `TEST_INSTANCE_POOL_ID` - set to the instance pool ID to use
* `TEST_BRICKS_CLUSTER_ID` - set to the cluster ID to use
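
Because these variables are optional, an individual test can skip itself when one is unset rather than failing. The sketch below shows that pattern; the `getOrSkip` helper and `TestWithWarehouse` test are hypothetical and not part of this repository.

```go
package example_test

import (
	"os"
	"testing"
)

// getOrSkip returns the value of an environment variable, or
// skips the calling test if the variable is unset.
func getOrSkip(t *testing.T, name string) string {
	t.Helper()
	v := os.Getenv(name)
	if v == "" {
		t.Skipf("environment variable %s is not set", name)
	}
	return v
}

// TestWithWarehouse is a hypothetical test that only runs when a
// default warehouse has been configured.
func TestWithWarehouse(t *testing.T) {
	warehouseID := getOrSkip(t, "TEST_DEFAULT_WAREHOUSE_ID")
	_ = warehouseID // use the warehouse ID to configure the test
}
```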
To run all integration tests, use the following command:

```
go test ./integration/...
```

Alternatively:

```
make integration
```