This PR adds console rendering of errors for job runs.
Here is the bundle used for the logs below:
```
bundle:
  name: deco-438

workspace:
  host: https://adb-309687753508875.15.azuredatabricks.net

resources:
  jobs:
    foo:
      name: "[${bundle.name}][${bundle.environment}] a test notebook"
      tasks:
        - task_key: alpha
          existing_cluster_id: 1109-115254-ox7poobk
          notebook_task:
            notebook_path: "/Users/shreyas.goenka@databricks.com/[deco-438] invalid notebook"
        - task_key: beta
          existing_cluster_id: 1109-115254-ox7poobk
          notebook_task:
            notebook_path: "/does-not-exist"
        - task_key: gamma
          existing_cluster_id: 1109-115254-ox7poobk
          notebook_task:
            notebook_path: "/Users/shreyas.goenka@databricks.com/[deco-438] valid notebook"
```
And this is a screenshot of the logs from the console:
<img width="1057" alt="Screenshot 2023-02-17 at 7 12 29 PM"
src="https://user-images.githubusercontent.com/88374338/219744768-ab7f1e79-db8f-466a-ad6d-f2b6f85ed17c.png">
Here are the logs when only task gamma is executed (successfully):
<img width="1059" alt="Screenshot 2023-02-17 at 7 13 04 PM"
src="https://user-images.githubusercontent.com/88374338/219744992-011d8b91-ec1d-44f0-a849-83c81816dd9f.png">
TODO: Investigate more possible job errors and make sure their state is
handled robustly here.
1. Perform file synchronization on deploy
2. Update notebook file path translation logic to point to the
synchronization target rather than treating the notebook as an artifact
and uploading it separately.
Invoke with `bricks sync SRC DST`.
In the bundle context, the `SRC` and `DST` arguments are taken from the bundle configuration.
This PR adds `bricks bundle sync` to disambiguate between the two.
Once the VS Code extension is bundle-aware, they can be consolidated again.
Consolidating them today would regress the VS Code experience if a
`bundle.yml` file is present in the file tree.
The workspace root path is a base path for bundle storage. If not
specified, it defaults to `~/.bundle/name/environment`. This default,
like any other path starting with `~`, is expanded to the current user's
home directory. The configuration also includes fields for the files
path, artifacts path, and state path. By default, these are nested under
the root path, but they can be overridden if needed.
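For illustration only, an explicit override of these defaults could look
like the following (the exact field names aren't shown in this PR, so
treat them as assumptions):
```
workspace:
  root: /Users/someone@example.com/my-bundle
  file_path: /Users/someone@example.com/my-bundle/files
  artifact_path: /Users/someone@example.com/my-bundle/artifacts
  state_path: /Users/someone@example.com/my-bundle/state
```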
This PR contains a struct for generating JSON schemas from Go types and
a struct for injecting documentation into the generated schema. This
will support autocomplete for DABs.
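A minimal sketch of the approach, assuming reflection over `json` tags
and a separate docs map; the PR's actual types and names will differ:
```
package main

import (
	"encoding/json"
	"fmt"
	"reflect"
	"strings"
)

// Schema is a minimal JSON schema node; the PR's real type is richer.
type Schema struct {
	Type        string             `json:"type,omitempty"`
	Description string             `json:"description,omitempty"`
	Properties  map[string]*Schema `json:"properties,omitempty"`
	Items       *Schema            `json:"items,omitempty"`
}

// Docs maps a JSON field name to documentation injected into the schema.
type Docs map[string]string

// schemaFor derives a schema from a Go type using reflection.
func schemaFor(t reflect.Type, docs Docs) *Schema {
	switch t.Kind() {
	case reflect.Pointer:
		return schemaFor(t.Elem(), docs)
	case reflect.Bool:
		return &Schema{Type: "boolean"}
	case reflect.String:
		return &Schema{Type: "string"}
	case reflect.Int, reflect.Int32, reflect.Int64:
		return &Schema{Type: "integer"}
	case reflect.Slice:
		return &Schema{Type: "array", Items: schemaFor(t.Elem(), docs)}
	case reflect.Struct:
		s := &Schema{Type: "object", Properties: map[string]*Schema{}}
		for i := 0; i < t.NumField(); i++ {
			name := strings.Split(t.Field(i).Tag.Get("json"), ",")[0]
			if name == "" || name == "-" {
				continue
			}
			child := schemaFor(t.Field(i).Type, docs)
			child.Description = docs[name] // documentation injection
			s.Properties[name] = child
		}
		return s
	}
	return &Schema{}
}

func main() {
	type Bundle struct {
		Name string `json:"name"`
	}
	schema := schemaFor(reflect.TypeOf(Bundle{}), Docs{"name": "The name of the bundle."})
	out, _ := json.MarshalIndent(schema, "", "  ")
	fmt.Println(string(out))
}
```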
If the environment is not set through a command line argument or
environment variable, the bundle loads either 1) the only environment
defined, or 2) the only environment with the default flag set.
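A sketch of these fallback rules under assumed type names (the real
configuration type carries more fields):
```
package main

import (
	"errors"
	"fmt"
)

// Environment stands in for the bundle's environment configuration;
// only the default flag matters for selection.
type Environment struct {
	Default bool
}

// selectEnvironment applies the two fallback rules described above.
func selectEnvironment(envs map[string]*Environment) (string, error) {
	// Rule 1: a single defined environment is selected unconditionally.
	if len(envs) == 1 {
		for name := range envs {
			return name, nil
		}
	}
	// Rule 2: otherwise exactly one environment must have default set.
	match := ""
	for name, env := range envs {
		if env.Default {
			if match != "" {
				return "", errors.New("multiple environments have default set")
			}
			match = name
		}
	}
	if match == "" {
		return "", errors.New("please specify environment to use")
	}
	return match, nil
}

func main() {
	envs := map[string]*Environment{
		"development": {Default: true},
		"production":  {},
	}
	fmt.Println(selectEnvironment(envs)) // development <nil>
}
```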
Users can opt out and use the system-installed version with the
following configuration:
```
bundle:
  terraform:
    exec_path: terraform
```
This locates the binary in `$PATH` and replaces the configured value
with the resolved absolute path. If this is not set, the initialize
phase will install Terraform in the bundle's cache directory.
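The `$PATH` lookup corresponds to Go's standard `exec.LookPath`; a
minimal, self-contained sketch of just the resolution step:
```
package main

import (
	"fmt"
	"log"
	"os/exec"
)

func main() {
	// Value of bundle.terraform.exec_path from the configuration above.
	execPath := "terraform"

	// Resolve the bare name against $PATH; the resolved absolute path
	// replaces the configured value.
	resolved, err := exec.LookPath(execPath)
	if err != nil {
		log.Fatalf("cannot find %s in $PATH: %v", execPath, err)
	}
	fmt.Println(resolved) // e.g. /usr/local/bin/terraform
}
```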
Summary:
* All remote path arguments for deployer and locker are now relative to
root specified at initialization
* The workspace client is now a struct field so it doesn't have to be
passed around
This includes 3 mutators:
* Interpolate resource references to a Terraform-compatible format
* Convert the resources struct to Terraform JSON format and write it to disk (sketched below)
* Run `terraform apply`
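As a rough illustration of the second mutator's output, the job from the
bundle above maps onto Terraform's JSON configuration syntax roughly
like this (resolved values are assumed; the real mutator derives the
payload from the typed resources struct):
```
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Terraform JSON configuration syntax: resource type -> name -> body.
	// The bundle job "foo" becomes a databricks_job resource.
	tf := map[string]any{
		"resource": map[string]any{
			"databricks_job": map[string]any{
				"foo": map[string]any{
					// Interpolated value assumed for illustration.
					"name": "[deco-438][development] a test notebook",
				},
			},
		},
	}
	out, _ := json.MarshalIndent(tf, "", "  ")
	fmt.Println(string(out)) // this payload is what gets written to disk
}
```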
By specifying a function typed `LookupFunction` the caller can customize
which path expressions to interpolate and which ones to skip. When we
express dependencies between resources, their values are only known to
Terraform at deploy time. Therefore, we have to skip interpolation for
`${resources.jobs.my_job.id}` and instead rewrite it to
`${databricks_job.my_job.id}` before passing it along to Terraform.
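A minimal sketch of such a lookup function, with illustrative names and
only the jobs case handled:
```
package main

import (
	"fmt"
	"strings"
)

// terraformLookup sketches a LookupFunction-style hook: references under
// `resources` are rewritten to their Terraform addresses instead of
// being resolved, so Terraform fills them in at deploy time.
func terraformLookup(path string) (value string, rewritten bool) {
	parts := strings.Split(path, ".")
	if len(parts) == 4 && parts[0] == "resources" && parts[1] == "jobs" {
		return fmt.Sprintf("${databricks_job.%s.%s}", parts[2], parts[3]), true
	}
	// Anything else falls through to normal interpolation.
	return "", false
}

func main() {
	v, ok := terraformLookup("resources.jobs.my_job.id")
	fmt.Println(v, ok) // ${databricks_job.my_job.id} true
}
```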
Performs interpolation on string fields.
It looks for patterns `${foo.bar}` where `foo.bar` points to a string
field in the configuration data model.
It does not support traversal (e.g. `${foo}` with `foo` equal
to `${bar}`), hence "rudimentary".
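A self-contained sketch of this single-pass behavior, using a flat value
map instead of the real configuration tree:
```
package main

import (
	"fmt"
	"regexp"
)

// Matches ${foo.bar} patterns; the capture group is the dotted path.
var re = regexp.MustCompile(`\$\{([\w.]+)\}`)

// interpolate replaces each pattern with its value from a flat map.
// It makes a single pass, so a substituted value that itself contains
// ${...} is left alone: no traversal, hence "rudimentary".
func interpolate(s string, values map[string]string) string {
	return re.ReplaceAllStringFunc(s, func(m string) string {
		key := re.FindStringSubmatch(m)[1]
		if v, ok := values[key]; ok {
			return v
		}
		return m // unknown references are left untouched
	})
}

func main() {
	values := map[string]string{"bundle.name": "deco-438"}
	fmt.Println(interpolate("[${bundle.name}] a test notebook", values))
	// Output: [deco-438] a test notebook
}
```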
This adds:
* Top level "artifacts" configuration key
* Support for notebooks (does language detection and upload)
* Merge of per-environment artifacts (or artifact overrides) into the top level (see the sketch after this list)
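For illustration only (the exact schema isn't reproduced in this
description, so treat the key shapes as assumptions), a notebook
artifact with a per-environment override might look like:
```
artifacts:
  my-notebook:
    notebook:
      path: ./my-notebook.py

environments:
  production:
    artifacts:
      my-notebook:
        notebook:
          path: ./my-notebook-prod.py
```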
Unit tests now run on all three major operating systems.
Some of the changes make the tests green on Windows, while some other
tests are skipped on Windows/macOS for now to keep the suite passing.
This is a temporary measure and we will incrementally migrate these
tests over so there is parity in unit testing across all three
environments!
While working on artifact upload and workspace interrogation, I realized
this mutator interface needs to:
1. Operate at the whole bundle level so it can apply to both
configuration and internal state
2. Include a `context.Context` parameter for a) long-running operations
and b) progress reporting
Previous interface:
```
Apply(*config.Root) ([]Mutator, error)
```
New interface:
```
Apply(context.Context, *Bundle) ([]Mutator, error)
```
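A minimal sketch of a mutator written against the new interface; the
`Bundle` stub and the mutator name are placeholders:
```
package bundle

import "context"

// Bundle is a stub for the real type, which carries both the loaded
// configuration and internal state.
type Bundle struct{}

// Mutator matches the new interface from this PR.
type Mutator interface {
	Apply(context.Context, *Bundle) ([]Mutator, error)
}

// noOp is an illustrative mutator. The context lets long-running work
// honor cancellation and gives progress reporting a place to hang off.
type noOp struct{}

func (m *noOp) Apply(ctx context.Context, b *Bundle) ([]Mutator, error) {
	if err := ctx.Err(); err != nil {
		return nil, err // stop early if the operation was cancelled
	}
	return nil, nil // no follow-up mutators to apply
}
```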
Used to inspect the bundle configuration after loading and merging all
files.
Once we add variable interpolation, this command could show the result
after interpolation as well.
Each of the mutations to this configuration is observable, so we could
add a mode that writes each of the intermediate versions to disk for
even more fine-grained introspection.