# Integration tests
This directory contains integration tests for the project.
The tree structure generally mirrors the source code tree structure.
Requirements for new files in this directory:

- Every package must be named after its directory with `_test` appended.
  - Requiring a different package name for integration tests avoids aliasing with the main package.
- Every integration test package must include a `main_test.go` file (see the sketch after this list).
These requirements are enforced by a unit test in this directory.
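For illustration, a minimal `main_test.go` might look like the sketch below for a hypothetical package directory named `bundle`. The gating logic shown here is an assumption for the sake of the example, not necessarily what this repository's files contain:

```go
// main_test.go in a hypothetical integration/bundle directory (illustrative).
package bundle_test

import (
	"os"
	"testing"
)

// TestMain runs the package's tests only when an integration environment
// is configured; otherwise it exits cleanly without running anything.
func TestMain(m *testing.M) {
	if os.Getenv("CLOUD_ENV") == "" {
		os.Exit(0)
	}
	os.Exit(m.Run())
}
```

Note how the package is named `bundle_test`, after its directory with `_test` appended, per the first requirement above.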
## Running integration tests
Integration tests require the following environment variables:

- `CLOUD_ENV` - set to the cloud environment to use (e.g. `aws`, `azure`, `gcp`)
- `DATABRICKS_HOST` - set to the Databricks workspace to use
- `DATABRICKS_TOKEN` - set to the Databricks token to use
Optional environment variables:

- `TEST_DEFAULT_WAREHOUSE_ID` - set to the default warehouse ID to use
- `TEST_METASTORE_ID` - set to the metastore ID to use
- `TEST_INSTANCE_POOL_ID` - set to the instance pool ID to use
- `TEST_BRICKS_CLUSTER_ID` - set to the cluster ID to use
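As a sketch of how a test might consume these variables, here is a minimal, hypothetical helper. The name `requireEnv` and the skip-on-missing behavior are assumptions for illustration, not the repository's actual helpers:

```go
package bundle_test

import (
	"os"
	"testing"
)

// requireEnv is a hypothetical helper: it skips the calling test when any
// of the named environment variables is unset.
func requireEnv(t *testing.T, names ...string) {
	t.Helper()
	for _, name := range names {
		if os.Getenv(name) == "" {
			t.Skipf("skipping: %s is not set", name)
		}
	}
}

func TestExample(t *testing.T) {
	// Required for every integration test run.
	requireEnv(t, "CLOUD_ENV", "DATABRICKS_HOST", "DATABRICKS_TOKEN")

	// Optional: only present when a default warehouse is configured.
	warehouseID := os.Getenv("TEST_DEFAULT_WAREHOUSE_ID")
	_ = warehouseID // would be used in the test body
}
```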
To run all integration tests, use the following command:

```
go test ./integration/...
```

Alternatively:

```
make integration
```
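To run a subset, the standard `go test` selection flags apply. A hypothetical invocation follows; the package path, test name pattern, and credential values are placeholders:

```
# Illustrative only: run one package with a name filter and verbose output.
CLOUD_ENV=aws \
DATABRICKS_HOST=https://example.cloud.databricks.com \
DATABRICKS_TOKEN=<your-token> \
go test ./integration/bundle -run TestBundle -v
```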