Commit Graph

14 Commits

Author SHA1 Message Date
Andrew Nester f3db42e622
Added support for top-level permissions (#928)
## Changes
It is now possible to define a top-level `permissions` section in the bundle
configuration; permissions defined there are applied to all resources defined
in the bundle.

Supported top-level permission levels: `CAN_MANAGE`, `CAN_VIEW`, and `CAN_RUN`.

Permissions are applied to: Jobs, DLT Pipelines, ML Models, ML Experiments,
and Model Serving Endpoints.

```yaml
bundle:
  name: permissions

workspace:
  host: ***

permissions:
  - level: CAN_VIEW
    group_name: test-group
  - level: CAN_MANAGE
    user_name: user@company.com
  - level: CAN_RUN
    service_principal_name: 123456-abcdef
```

## Tests
Added corresponding unit tests + ran `bundle validate` and `bundle
deploy` manually
2023-11-13 11:29:40 +00:00
Andrew Nester aa54a8665a
Added support for glob patterns in pipeline libraries section (#833)
## Changes
It is now possible to specify a glob pattern in the pipeline `libraries`
section, and DABs will add all matched files as libraries.

```yaml
  pipelines:
    dummy:
      name: " DLT with Python files"
      target: "dlt_python_files"
      libraries:
        - file:
            path: ./*.py
```

## Tests
Added unit test
2023-10-04 13:23:13 +00:00
Andrew Nester 3ee89c41da
Added a warning when Python wheel wrapper needs to be used (#807)
## Changes
Added a warning that is shown when the Python wheel wrapper needs to be used.
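
For reference, a minimal sketch of enabling the wrapper, assuming the
`experimental.python_wheel_wrapper` setting is what the warning points to
(the bundle name is hypothetical):

```yaml
bundle:
  name: wheel-wrapper-example  # hypothetical name

experimental:
  # Assumption: this setting wraps Python wheel tasks so they also run
  # on clusters that cannot install the wheel as a library natively
  # (e.g. older DBR versions).
  python_wheel_wrapper: true
```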

## Tests
Added unit tests + manual run with different bundle configurations
2023-09-27 08:26:59 +00:00
Andrew Nester 953dcb4972
Added support for experimental scripts section (#632)
## Changes
Added support for an experimental `scripts` section.

It allows executing arbitrary bash commands at certain bundle lifecycle
steps (`prebuild`, `postbuild`, `predeploy`, and `postdeploy`).

## Tests
Example of configuration

```yaml
bundle:
  name: wheel-task


workspace:
  host: ***

experimental:
  scripts:
    prebuild: |
      echo 'Prebuild 1'
      echo 'Prebuild 2'
    postbuild: "echo 'Postbuild 1' && echo 'Postbuild 2'" 
    predeploy: |
      echo 'Checking go version...'
      go version
    postdeploy: |
      echo 'Checking python version...'
      python --version

resources:
  jobs:
    test_job:
      name: "[${bundle.environment}] My Wheel Job"
      tasks:
        - task_key: TestTask
          existing_cluster_id: "***"
          python_wheel_task:
            package_name: "my_test_code"
            entry_point: "run"
          libraries:
          - whl: ./dist/*.whl
```

Output
```bash
andrew.nester@HFW9Y94129 wheel % databricks bundle deploy
artifacts.whl.AutoDetect: Detecting Python wheel project...
artifacts.whl.AutoDetect: Found Python wheel project at /Users/andrew.nester/dabs/wheel
'Prebuild 1'
'Prebuild 2'

artifacts.whl.Build(my_test_code): Building...
artifacts.whl.Build(my_test_code): Build succeeded
'Postbuild 1'
'Postbuild 2'

'Checking go version...'
go version go1.19.9 darwin/arm64

Starting upload of bundle files
Uploaded bundle files at /Users/andrew.nester@databricks.com/.bundle/wheel-task/default/files!

artifacts.Upload(my_test_code-0.0.0a0-py3-none-any.whl): Uploading...
artifacts.Upload(my_test_code-0.0.0a0-py3-none-any.whl): Upload succeeded
Starting resource deployment
Resource deployment completed!
'Checking python version...'
Python 2.7.18
```
2023-09-14 10:14:13 +00:00
Andrew Nester 4ee926b885
Added run_as section for bundle configuration (#692)
## Changes
Added a `run_as` section for bundle configuration.

This section allows defining a user name or service principal to be used as
the execution identity for jobs and DLT pipelines. For DLT pipelines, the
identity defined in `run_as` is assigned the `IS_OWNER` permission on the
pipeline.

## Tests
Added unit tests for configuration.

Also ran deploy for the following bundle configuration

```yaml
bundle:
  name: "run_as"

run_as:
  # service_principal_name: "f7263fcc-56d0-4981-8baf-c2a45296690b"
  user_name: "lennart.kats@databricks.com"

resources:
  pipelines:
    andrew_pipeline:
      name: "Andrew Nester pipeline"
      libraries:
        - notebook:
            path: ./test.py

  jobs:
    job_one:
      name: Job One
      tasks:
        - task_key: "task"
          new_cluster:
            num_workers: 1
            spark_version: 13.2.x-snapshot-scala2.12
            node_type_id: i3.xlarge
            runtime_engine: PHOTON
          notebook_task: 
            notebook_path: "./test.py"
```
2023-08-23 16:47:07 +00:00
Andrew Nester 56dcd3f0a7
Renamed `environments` to `targets` in bundle configuration (#670)
## Changes
Renamed `environments` to `targets` in bundle.yml.

The change is backward compatible: customers can continue to use
`environments` for the time being.
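
For illustration, a sketch of the equivalent spellings (the `dev` target
name is hypothetical):

```yaml
# Before: the old section name
environments:
  dev:
    default: true

# After: the new, equivalent section name
targets:
  dev:
    default: true
```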

## Tests
Added tests which check that both the `environments` and `targets` sections
in bundle.yml work correctly.
2023-08-17 15:22:32 +00:00
Lennart Kats (databricks) 57e75d3e22
Add development runs (#522)
This implements the "development run" functionality that we desire for DABs in the workspace / IDE.

## bundle.yml changes

In bundle.yml, there should be a "dev" environment that is marked as
`mode: development`:
```yaml
environments:
  dev:
    default: true
    mode: development # future accepted values might include pull_request, production
```

Setting `mode` to `development` indicates that this environment is used
just for running things for development. This results in several changes
to deployed assets:
* All assets will get '[dev]' in their name and will get a 'dev' tag
* All assets will be hidden from the list of assets (future work; e.g.
for jobs we would have a special job_type that hides it from the list)
* All deployed assets will be ephemeral (future work, we need some form
of garbage collection)
* Pipelines will be marked as 'development: true'
* Jobs can run on development compute through the `--compute` parameter
in the CLI
* Jobs get their schedule / triggers paused
* Jobs get concurrent runs (it's really annoying if your runs get
skipped because the last run was still in progress)

Other accepted values for `mode` are `default` (which does nothing) and
`pull-request` (which is reserved for future use).

## CLI changes

To run a single job called "shark_sightings" on existing compute, use the
following commands:
```
$ databricks bundle deploy --compute 0617-201942-9yd9g8ix
$ databricks bundle run shark_sightings
```

which would deploy and run a job called "[dev] shark_sightings" on the
compute provided. Note that `--compute` is not accepted in production
environments, so we show an error if `mode: development` is not used.

The `run --deploy` command offers a convenient shorthand for the common
combination of deploying & running:
```
$ export DATABRICKS_COMPUTE=0617-201942-9yd9g8ix
$ bundle run --deploy shark_sightings
```
The `--deploy` addition isn't really essential and I welcome feedback 🤔
I played with the idea of a "debug" or "dev" command but that seemed to
only make the option space even broader for users. The above could work
well with an IDE or workspace that automatically sets the target
compute.

One more thing I added: `run --no-wait` can now be used to run
something without waiting for it to be completed (useful for IDE-like
environments that can display progress themselves).
```
$ bundle run --deploy shark_sightings --no-wait
```
2023-07-12 08:51:54 +02:00
Pieter Noordhuis 98ebb78c9b
Rename bricks -> databricks (#389)
## Changes

Rename all instances of "bricks" to "databricks".

## Tests

* Confirmed the goreleaser build works, uses the correct new binary
name, and produces the right archives.
* Help output is confirmed to be correct.
* Output of `git grep -w bricks` is minimal, with a couple of changes
remaining for after the repository rename.
2023-05-16 18:35:39 +02:00
shreyas-goenka c5e940f664
Add support for variables in bundle config (#359)
## Changes
This PR allows you to define variables in the bundle config and set
them in three ways (see the sketch after this list):
1. command-line arguments
2. process environment variables
3. defaults in the bundle config itself
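
A minimal sketch of a variable definition; the `cluster_id` name and its
default are hypothetical, and the override mechanisms are noted in comments:

```yaml
# Way 3: define a variable with a description and a default in the config.
variables:
  cluster_id:
    description: ID of the cluster to run on
    default: "0000-000000-abcdefgh"  # hypothetical value

# Reference it elsewhere in the config as ${var.cluster_id}.
# Ways 1 and 2 override the default at deploy time, e.g. via a --var
# command-line flag or a BUNDLE_VAR_cluster_id environment variable
# (assuming the flag and variable naming used by the current CLI).
```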

## Tests
Manual, unit, and black-box tests.

---------

Co-authored-by: Miles Yucht <miles@databricks.com>
2023-05-15 11:34:05 +02:00
Pieter Noordhuis 4e4c0658db
Interpolate paths for job tasks that reference files (#306)
## Changes

This change also swaps the order of mutators such that interpolation
happens before path translation. This means that it is possible to use
variables (e.g. `${bundle.environment}`) in notebook or file paths.
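
For example, a sketch of what this enables (the job and path names are
hypothetical):

```yaml
resources:
  jobs:
    example_job:
      tasks:
        - task_key: main
          notebook_task:
            # Interpolation now runs before path translation, so the
            # variable is resolved first and the resulting local path
            # is then translated to its workspace location.
            notebook_path: "./notebooks/${bundle.environment}/main.py"
```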

## Tests

New tests pass and verified manually.
2023-04-05 16:02:17 +02:00
Pieter Noordhuis 35c3d9fa4e
Add workspace paths (#179)
The workspace root path is a base path for bundle storage. If not
specified, it defaults to `~/.bundle/name/environment`. This default, like
any other path starting with `~`, is expanded to the current user's home
directory. The configuration also includes fields for the files path,
artifacts path, and state path. By default, these are nested under the
root path, but they can be overridden if needed.
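
A sketch of what overriding these paths can look like, using field names
from the current bundle schema (they may differ from the names introduced
in this commit; the bundle name is hypothetical):

```yaml
workspace:
  # Base path for bundle storage; defaults to ~/.bundle/<name>/<environment>.
  root_path: ~/.bundle/my-bundle/dev
  # These default to subpaths of the root path but can be overridden.
  file_path: ${workspace.root_path}/files
  artifact_path: ${workspace.root_path}/artifacts
  state_path: ${workspace.root_path}/state
```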
2023-01-26 19:55:38 +01:00
Pieter Noordhuis 4026b2cda2
Mutator to convert paths to local notebooks files into artifacts (#144)
This lets you write:
```yaml
libraries:
  - notebook:
      path: ./events.sql
```

Instead of:
```yaml
artifacts:
  events_sql:
    notebook:
      path: ./events.sql

libraries:
  - notebook:
      path: "${artifacts.events_sql.notebook.remote_path}"
```
2022-12-16 14:49:23 +01:00
Pieter Noordhuis 35243db33c
Automatically install Terraform if needed (#141)
Users can opt out and use the system-installed version with the
following configuration:

```yaml
bundle:
  terraform:
    exec_path: terraform
```

This will look up the binary on `$PATH` and replace the configured value
with the resolved path.

If this is not set, the initialize phase will install Terraform in the
bundle's cache directory.
2022-12-15 17:30:33 +01:00
Pieter Noordhuis c255bd686a
Define deploy command as sequence of build phases (#129) 2022-12-12 12:49:25 +01:00