mirror of https://github.com/databricks/cli.git
Convert integration/bundle/python_wheel_test.go to acceptance tests. I plan to expand these tests to check patchwheel functionality. Inside each test there were two runs, with params and without; I've split each run into a separate test to reduce total time, since the runs can execute in parallel. Also add a new env var DEFAULT_SPARK_VERSION that matches the one in integration tests. The tests are currently enabled on every PR (`CloudLong=true` is commented out); this can be changed after landing.
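For context, here is a minimal sketch of the kind of bundle config one of the split-out tests might use. The bundle, job, and wheel names are hypothetical, and the envsubst-style substitution of `$DEFAULT_SPARK_VERSION` into a `*.tmpl` file is an assumption about how the acceptance harness consumes the new env var:

```yaml
# databricks.yml.tmpl (hypothetical) -- assumes the acceptance harness
# substitutes $DEFAULT_SPARK_VERSION before the test runs.
bundle:
  name: python-wheel-no-params   # hypothetical name for the "without params" variant

resources:
  jobs:
    test_job:
      name: wheel-task
      tasks:
        - task_key: main
          new_cluster:
            # Picks up the new env var shared with the integration tests.
            spark_version: $DEFAULT_SPARK_VERSION
            num_workers: 1
          python_wheel_task:
            package_name: my_test_code   # hypothetical package
            entry_point: run
          libraries:
            - whl: ./dist/*.whl
```

The "with params" variant would be the same bundle with `parameters` added to the `python_wheel_task`, which is what makes the two runs independent enough to execute in parallel as separate tests.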
Directories:

- apps
- artifact_path_with_volume
- basic
- basic_with_variables
- clusters
- dashboards
- deploy_then_remove_resources
- job_metadata
- python_wheel_task_with_environments
- recreate_pipeline
- spark_jar_task
- uc_schema
- uc_schema_only
- volume
- with_includes