## Changes
Previously, using Python wheel tasks with compute referring to an
interactive cluster defined in the same bundle would produce a warning
like the one below:
```
GET /api/2.1/clusters/get?cluster_id=${resources.clusters.development_cluster.id}
< HTTP/2.0 400 Bad Request
< {
< "error_code": "INVALID_PARAMETER_VALUE",
< "message": "Cluster ${resources.clusters.development_cluster.id} does not exist"
< } pid=14465 mutator=seq mutator=initialize mutator=seq mutator=PythonWrapperWarning sdk=true
```
This PR fixes it by checking the Spark version for such clusters based
on their bundle configuration instead of making API calls.
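As a rough illustration (not the actual CLI code), the fix amounts to resolving the Spark version from the bundle's own cluster definition whenever the task's `cluster_id` is a `${resources.clusters.<key>.id}` reference, and only falling back to the API for plain cluster IDs. The types and function names below are hypothetical:
```go
package main

import (
	"fmt"
	"regexp"
)

// Matches cluster_id values that reference a cluster defined in the same
// bundle, e.g. "${resources.clusters.development_cluster.id}".
var clusterRef = regexp.MustCompile(`^\$\{resources\.clusters\.([A-Za-z0-9_-]+)\.id\}$`)

// bundleCluster is a simplified stand-in for a cluster resource declared
// in the bundle configuration.
type bundleCluster struct {
	SparkVersion string
}

// resolveSparkVersion reads the Spark version from the bundle configuration
// when cluster_id references a bundle-defined cluster, avoiding the failing
// GET /api/2.1/clusters/get call seen in the log above.
func resolveSparkVersion(clusterID string, clusters map[string]bundleCluster, apiLookup func(string) (string, error)) (string, error) {
	if m := clusterRef.FindStringSubmatch(clusterID); m != nil {
		c, ok := clusters[m[1]]
		if !ok {
			return "", fmt.Errorf("cluster %q is not defined in this bundle", m[1])
		}
		return c.SparkVersion, nil
	}
	// Plain cluster IDs still go through the Clusters API.
	return apiLookup(clusterID)
}

func main() {
	clusters := map[string]bundleCluster{
		"development_cluster": {SparkVersion: "15.4.x-scala2.12"},
	}
	v, _ := resolveSparkVersion("${resources.clusters.development_cluster.id}", clusters,
		func(id string) (string, error) { return "", fmt.Errorf("unexpected API call for %s", id) })
	fmt.Println(v) // 15.4.x-scala2.12
}
```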
## Tests
Added acceptance test
## Changes
Enable gofumpt and goimports in golangci-lint and apply autofix.
This makes 'make fmt' redundant; it will be cleaned up in a follow-up diff.
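For context, here is a hypothetical before/after (not taken from the actual diff) showing the kind of autofixes these linters apply: goimports keeps standard-library imports in their own group, and gofumpt removes blank lines at the start of a function body and rewrites octal literals with the 0o prefix.
```go
package example

// Before autofix, goimports/gofumpt would flag code like this:
//
//	import (
//		"github.com/spf13/cobra"
//		"fmt"
//		"os"
//	)
//
//	func run(cmd *cobra.Command) error {
//
//		perm := os.FileMode(0755)
//		fmt.Println(cmd.Name(), perm)
//		return nil
//	}

// After autofix: standard-library imports sit in their own group, the
// leading blank line inside the function body is gone, and the octal
// literal uses the 0o prefix.
import (
	"fmt"
	"os"

	"github.com/spf13/cobra"
)

func run(cmd *cobra.Command) error {
	perm := os.FileMode(0o755)
	fmt.Println(cmd.Name(), perm)
	return nil
}
```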
## Tests
Existing tests.
## Changes
This change adds a preset for source-linked deployments. It is enabled
by default for targets in `development` mode **if** the Databricks CLI
is running from the `/Workspace` directory on DBR. It does not have an
effect when running the CLI anywhere else.
Key highlights:
1. Files in this mode won't be uploaded to the workspace
2. Created resources will use references to source files instead of
their workspace copies
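
A minimal sketch of the conditional default described above, assuming the preset is keyed off the target mode, the CLI's working directory, and the `DATABRICKS_RUNTIME_VERSION` environment variable as a proxy for "running on DBR"; the function names and the exact detection logic are illustrative, not the CLI's actual implementation:
```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// sourceLinkedDefault reports whether source-linked deployments should be
// enabled by default: only for development-mode targets, and only when the
// CLI itself runs from the workspace filesystem on Databricks Runtime.
func sourceLinkedDefault(mode, workingDir string) bool {
	if mode != "development" {
		return false
	}
	if !strings.HasPrefix(workingDir, "/Workspace/") {
		return false
	}
	// DATABRICKS_RUNTIME_VERSION is set on DBR clusters; used here as a
	// simple way to detect that the CLI is running on DBR.
	return os.Getenv("DATABRICKS_RUNTIME_VERSION") != ""
}

// sourceLinked resolves the effective preset value: an explicit user setting
// always wins over the computed default.
func sourceLinked(userSetting *bool, mode, workingDir string) bool {
	if userSetting != nil {
		return *userSetting
	}
	return sourceLinkedDefault(mode, workingDir)
}

func main() {
	wd, _ := os.Getwd()
	fmt.Println("source-linked:", sourceLinked(nil, "development", wd))
}
```
An explicit setting in the bundle configuration overrides the computed default, so the preset stays opt-in/opt-out outside the narrow development-on-DBR case.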
## Tests
1. Apply preset unit test covering conditional logic
2. High-level process target mode unit test for testing integration
between mutators
---------
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>