# databricks-cli/integration

Latest commit: 3aef065c5c by Anton Nekipelov, 2025-02-28

Refactor TestSparkJarTask* tests to support test environments without Java 8 (#2385)
## Changes
1. Refactored `TestSparkJarTaskDeployAndRunOnVolumes` and
`TestSparkJarTaskDeployAndRunOnWorkspace` to use a table-driven approach,
which better organizes these closely related tests
2. Implemented `testutil.HasJDK()` to replace `testutil.RequireJDK`, so that
individual test cases can be skipped when the required JDK is unavailable
(see the sketch below)
3. Ensured the test suite still fails outright if no compatible Java version
is found
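
The table-driven layout pairs each test case with the JDK it needs, skips the cases whose JDK is unavailable, and fails only if no case could run at all. Below is a minimal sketch of that pattern; the `testutil.HasJDK` helper comes from the description above, but its exact signature, the runtime/JDK version pairs, and the test body are illustrative assumptions:

```go
package bundle_test

import (
	"context"
	"testing"

	"github.com/databricks/cli/internal/testutil"
)

func TestSparkJarTaskDeployAndRunOnVolumes(t *testing.T) {
	testCases := []struct {
		name         string
		sparkVersion string // Databricks Runtime to run the jar on (hypothetical values)
		javaVersion  string // JDK required to build the test jar
	}{
		{name: "jdk8", sparkVersion: "14.3.x-scala2.12", javaVersion: "1.8"},
		{name: "jdk11", sparkVersion: "15.4.x-scala2.12", javaVersion: "11"},
	}

	ran := false
	for _, tc := range testCases {
		t.Run(tc.name, func(t *testing.T) {
			// HasJDK reports availability instead of aborting, so a
			// missing JDK skips only this case (assumed signature).
			if !testutil.HasJDK(t, context.Background(), tc.javaVersion) {
				t.Skipf("JDK %s not found", tc.javaVersion)
			}
			ran = true
			// ... deploy the bundle and run the Spark JAR task on tc.sparkVersion ...
		})
	}

	// The suite must not silently pass on a machine with no compatible JDK.
	if !ran {
		t.Fatal("no compatible Java version found to run any test case")
	}
}
```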

## Why
Java 8 can be tricky to install on modern development environments
(e.g. a Mac with an Apple M3 chip), and its absence previously caused the
Spark JAR task tests to fail when run locally. This refactoring allows
such environments to run the "SparkJar" tests against a newer
Databricks Runtime.

## Tests
1. Ran `TestSparkJarTaskDeployAndRunOnVolumes` and
`TestSparkJarTaskDeployAndRunOnWorkspace` locally on a Mac with Java 11
installed.
2. Checked that tests against older runtimes still run and pass in
CI/CD environments.
| Path | Latest commit | Date |
| --- | --- | --- |
| `assumptions` | Clean up TestMain from integration tests to fix caching (#2090) | 2025-01-08 |
| `bundle` | Refactor TestSparkJarTask* tests to support test environments without Java 8 (#2385) | 2025-02-28 |
| `cmd` | Bump github.com/databricks/databricks-sdk-go from 0.55.0 to 0.56.1 (#2238) | 2025-01-27 |
| `internal/acc` | Clean up TestMain from integration tests to fix caching (#2090) | 2025-01-08 |
| `libs` | Add integration test for the /telemetry-ext endpoint (#2259) | 2025-01-29 |
| `python` | Clean up TestMain from integration tests to fix caching (#2090) | 2025-01-08 |
| `README.md` | Move integration tests to `integration` package (#2009) | 2024-12-13 |

## Integration tests

This directory contains integration tests for the project.

The tree structure generally mirrors the source code tree structure.

Requirements for new files in this directory:

- Every package must be named after its directory with `_test` appended
  - Requiring a different package name for integration tests avoids aliasing with the main package.
- Every integration test package must include a `main_test.go` file.

These requirements are enforced by a unit test in this directory.
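
As an illustration, a `main_test.go` for the `bundle` directory might look like the following; the shared `internal.Main` entry point is an assumption for this sketch, but the package-name convention is the one required above:

```go
// integration/bundle/main_test.go
package bundle_test // directory name "bundle" + the required _test suffix

import (
	"testing"

	"github.com/databricks/cli/integration/internal"
)

// TestMain defers to a shared entry point so that every integration
// test package is configured and gated the same way (assumed helper).
func TestMain(m *testing.M) {
	internal.Main(m)
}
```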

### Running integration tests

Integration tests require the following environment variables:

- `CLOUD_ENV` - set to the cloud environment to use (e.g. `aws`, `azure`, `gcp`)
- `DATABRICKS_HOST` - set to the URL of the Databricks workspace to use
- `DATABRICKS_TOKEN` - set to the Databricks access token to use

Optional environment variables:

- `TEST_DEFAULT_WAREHOUSE_ID` - set to the default warehouse ID to use
- `TEST_METASTORE_ID` - set to the metastore ID to use
- `TEST_INSTANCE_POOL_ID` - set to the instance pool ID to use
- `TEST_BRICKS_CLUSTER_ID` - set to the cluster ID to use
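
For example, a minimal configuration for a run against an Azure workspace could look like this (all values are placeholders):

```sh
export CLOUD_ENV=azure
export DATABRICKS_HOST=https://example.azuredatabricks.net
export DATABRICKS_TOKEN=<your-personal-access-token>
```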

To run all integration tests, use the following command:

```sh
go test ./integration/...
```

Alternatively:

```sh
make integration
```
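
To run a subset, pass the standard `go test` flags; for example, to run only the Spark JAR tests from the commit above:

```sh
go test ./integration/bundle -run TestSparkJarTask -v
```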