# Integration tests
This directory contains integration tests for the project.
The tree structure generally mirrors the source code tree structure.
Requirements for new files in this directory:
- Every package must be named after its directory with `_test` appended.
  Requiring a different package name for integration tests avoids aliasing with the main package.
- Every integration test package must include a `main_test.go` file (see the sketch below).
These requirements are enforced by a unit test in this directory.
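
For illustration, a minimal `main_test.go` might look like the sketch below. The package name `bundle_test` and the `CLOUD_ENV` guard are assumptions for the example; the actual entry points in this directory may perform additional setup.

```go
package bundle_test // hypothetical: named after its `bundle` directory with `_test` appended

import (
	"os"
	"testing"
)

// TestMain is the entry point for the package's integration tests.
// This sketch exits cleanly when CLOUD_ENV is unset, making the package
// a no-op outside an integration environment.
func TestMain(m *testing.M) {
	if os.Getenv("CLOUD_ENV") == "" {
		os.Exit(0)
	}
	os.Exit(m.Run())
}
```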
## Running integration tests
Integration tests require the following environment variables:
- `CLOUD_ENV` - set to the cloud environment to use (e.g. `aws`, `azure`, `gcp`)
- `DATABRICKS_HOST` - set to the Databricks workspace to use
- `DATABRICKS_TOKEN` - set to the Databricks token to use
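
For reference, the Databricks Go SDK reads `DATABRICKS_HOST` and `DATABRICKS_TOKEN` from the environment by default, so a test can construct a workspace client without explicit configuration. The helper below is a hypothetical sketch, not this repository's actual test utility:

```go
package bundle_test

import (
	"testing"

	"github.com/databricks/databricks-sdk-go"
)

// newWorkspaceClient is a hypothetical helper. NewWorkspaceClient picks up
// DATABRICKS_HOST and DATABRICKS_TOKEN from the environment on its own.
func newWorkspaceClient(t *testing.T) *databricks.WorkspaceClient {
	w, err := databricks.NewWorkspaceClient()
	if err != nil {
		t.Fatalf("cannot create workspace client: %v", err)
	}
	return w
}
```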
Optional environment variables:
- `TEST_DEFAULT_WAREHOUSE_ID` - set to the default warehouse ID to use
- `TEST_METASTORE_ID` - set to the metastore ID to use
- `TEST_INSTANCE_POOL_ID` - set to the instance pool ID to use
- `TEST_BRICKS_CLUSTER_ID` - set to the cluster ID to use
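
A test that depends on one of these optional resources would typically skip itself when the variable is unset. A minimal sketch of that pattern, with a hypothetical test name:

```go
package bundle_test

import (
	"os"
	"testing"
)

// TestWarehouseExample is hypothetical; it illustrates skipping when an
// optional resource ID is not provided.
func TestWarehouseExample(t *testing.T) {
	warehouseID := os.Getenv("TEST_DEFAULT_WAREHOUSE_ID")
	if warehouseID == "" {
		t.Skip("TEST_DEFAULT_WAREHOUSE_ID is not set")
	}
	_ = warehouseID // the actual test body would use the warehouse ID here
}
```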
To run all integration tests, use the following command:
```
go test ./integration/...
```
Alternatively:

```
make integration
```
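
To run a subset of the tests, pass a narrower package path to `go test`, for example `go test ./integration/bundle/...` (using the `bundle` subdirectory as an example).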