Mirror of https://github.com/databricks/cli.git (commit 3d9decdda9)
## Changes

Add the JobTaskClusterSpec validate mutator. It catches the case where a task does not specify which cluster to use. For example, we can trigger this error with minor modifications to the `default-python` template:

```yaml
tasks:
  - task_key: python_file_task
    spark_python_task:
      python_file: ../src/my_project_10/main.py
```

```
% databricks bundle validate
Error: Missing required cluster or environment settings
  at resources.jobs.my_project_10_job.tasks[0]
  in resources/my_project_10_job.yml:17:11

Task "print_github_stars" requires a cluster or an environment to run.
Specify one of the following fields: job_cluster_key, environment_key, existing_cluster_id, new_cluster.
```

We implicitly rely on "one of" validation, which does not exist. Many bundle fields cannot co-exist; for instance: `JobTask.{existing_cluster_id,job_cluster_key}`, `Library.{whl,pypi}`, `JobTask.{notebook_task,python_wheel_task}`, etc.

## Tests

Unit tests

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
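To illustrate the kind of check this mutator performs, below is a minimal, self-contained Go sketch. The `Task` struct and `validateTaskClusterSpec` function are simplified stand-ins for illustration only; they are not the actual types or the implementation in `job_task_cluster_spec.go`.

```go
package main

import "fmt"

// Task mirrors only the job task fields relevant to cluster selection.
// Field names follow the bundle YAML keys mentioned above; the struct itself
// is a hypothetical simplification, not the CLI's real job task type.
type Task struct {
	TaskKey           string
	JobClusterKey     string
	EnvironmentKey    string
	ExistingClusterID string
	NewCluster        *struct{} // non-nil when a new_cluster block is present
}

// validateTaskClusterSpec reports an error when a task specifies none of the
// fields that determine where it runs, mirroring the validation error shown above.
func validateTaskClusterSpec(t Task) error {
	hasCluster := t.JobClusterKey != "" ||
		t.EnvironmentKey != "" ||
		t.ExistingClusterID != "" ||
		t.NewCluster != nil
	if !hasCluster {
		return fmt.Errorf(
			"task %q requires a cluster or an environment to run; "+
				"specify one of the following fields: job_cluster_key, environment_key, existing_cluster_id, new_cluster",
			t.TaskKey)
	}
	return nil
}

func main() {
	// A task with no cluster settings at all fails validation.
	if err := validateTaskClusterSpec(Task{TaskKey: "print_github_stars"}); err != nil {
		fmt.Println(err)
	}
}
```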
Directory contents:

- all_resources_have_values.go
- files_to_sync.go
- job_cluster_key_defined.go
- job_cluster_key_defined_test.go
- job_task_cluster_spec.go
- job_task_cluster_spec_test.go
- unique_resource_keys.go
- validate.go
- validate_sync_patterns.go