Commit e0952491c9

## Changes

Add a new type of test helper that runs a command and compares its full output against a golden file. For JSON output, there is also an option to ignore certain paths during the comparison. Add a test that takes different versions of Python through `bundle init default-python` / `validate` / `deploy` / `summary`.

## Tests

New integration tests.
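For illustration, a golden-file assertion in Go commonly follows the pattern sketched below. The helper name `assertGolden` and the `-update` flag are assumptions for this sketch, not necessarily the helpers this change adds; JSON output with ignored paths would be normalized before being passed in.

```go
package testutil_test

import (
	"flag"
	"os"
	"testing"
)

// With -update, golden files are rewritten with the actual output
// instead of being compared against it.
var update = flag.Bool("update", false, "update golden files")

// assertGolden compares the actual output of a command against the
// contents of a golden file checked into the repository.
func assertGolden(t *testing.T, path string, actual []byte) {
	t.Helper()
	if *update {
		if err := os.WriteFile(path, actual, 0o644); err != nil {
			t.Fatalf("failed to update golden file %s: %v", path, err)
		}
		return
	}
	expected, err := os.ReadFile(path)
	if err != nil {
		t.Fatalf("failed to read golden file %s: %v", path, err)
	}
	if string(expected) != string(actual) {
		t.Errorf("output does not match golden file %s:\nwant:\n%s\ngot:\n%s", path, expected, actual)
	}
}
```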
# Integration tests
This directory contains integration tests for the project.
The tree structure generally mirrors the source code tree structure.
Requirements for new files in this directory:
- Every package must be named after its directory with `_test` appended.
  - Requiring a different package name for integration tests avoids aliasing with the main package.
- Every integration test package must include a `main_test.go` file.
These requirements are enforced by a unit test in this directory (`enforce_convention_test.go`).
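For illustration, a minimal `main_test.go` that satisfies both requirements might look like the sketch below. The package name `bundle_test` is an example, and the body is an assumption (the actual files in this repository may do more); the `CLOUD_ENV` check refers to the variable described in the next section.

```go
// Hypothetical main_test.go for the `bundle` directory; the package is
// named after the directory with `_test` appended.
package bundle_test

import (
	"os"
	"testing"
)

// TestMain gates the whole package: when the environment is not
// configured for integration testing, no tests are run.
func TestMain(m *testing.M) {
	if os.Getenv("CLOUD_ENV") == "" {
		return
	}
	os.Exit(m.Run())
}
```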
## Running integration tests
Integration tests require the following environment variables:
- `CLOUD_ENV` - set to the cloud environment to use (e.g. `aws`, `azure`, `gcp`)
- `DATABRICKS_HOST` - set to the Databricks workspace to use
- `DATABRICKS_TOKEN` - set to the Databricks token to use
Optional environment variables:
- `TEST_DEFAULT_WAREHOUSE_ID` - set to the default warehouse ID to use
- `TEST_METASTORE_ID` - set to the metastore ID to use
- `TEST_INSTANCE_POOL_ID` - set to the instance pool ID to use
- `TEST_BRICKS_CLUSTER_ID` - set to the cluster ID to use
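A test typically resolves these variables up front. The helpers below are a sketch under assumed names (`getEnvOrSkip`, `getEnvOrDefault`), not code from this repository: a required variable skips the test when unset, while an optional one falls back to a default.

```go
package example_test

import (
	"os"
	"testing"
)

// getEnvOrSkip returns the value of a required environment variable,
// skipping the calling test when it is not set.
func getEnvOrSkip(t *testing.T, name string) string {
	t.Helper()
	value := os.Getenv(name)
	if value == "" {
		t.Skipf("environment variable %s is not set", name)
	}
	return value
}

// getEnvOrDefault returns the value of an optional environment variable,
// or the given fallback when it is not set.
func getEnvOrDefault(name, fallback string) string {
	if value := os.Getenv(name); value != "" {
		return value
	}
	return fallback
}
```

A test would then call, for example, `host := getEnvOrSkip(t, "DATABRICKS_HOST")` before talking to a workspace.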
To run all integration tests, use the following command:
```
go test ./integration/...
```
Alternatively:
```
make integration
```
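To run a subset of tests, the standard `go test` flags apply on top of the required environment variables; for example, something like `CLOUD_ENV=aws DATABRICKS_HOST=<workspace-url> DATABRICKS_TOKEN=<token> go test ./integration/... -run TestName -v` runs a single test verbosely (all values here are placeholders).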