databricks-cli/libs/template/templates/experimental-jobs-as-code/template/{{.project_name}}

Commit 31c10c1b82 by Gleb Kanterov (2025-01-20): Add experimental-jobs-as-code template (#2177)
## Changes

Add an experimental-jobs-as-code template that allows defining jobs in Python
instead of YAML, using the `databricks-bundles` PyPI package.
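
As a rough illustration of the idea, a job defined in Python with `databricks-bundles` is declared as a module-level resource built from Jobs API fields. This is a sketch, not the template's generated code; the job name, task key, and notebook path below are made up:

```python
# Sketch only: a job declared in Python via databricks-bundles.
# The job name, task key, and notebook path are illustrative, not from the template.
from databricks.bundles.jobs import Job

sample_job = Job.from_dict(
    {
        "name": "sample_job",
        "tasks": [
            {
                "task_key": "notebook_task",
                # Illustrative notebook path.
                "notebook_task": {"notebook_path": "src/notebook.ipynb"},
            },
        ],
    }
)
```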

## Tests

Manual and acceptance tests.
Template contents:

- fixtures
- resources
- scratch
- src
- tests
- .gitignore
- README.md.tmpl
- databricks.yml.tmpl
- pyproject.toml.tmpl
- setup.py.tmpl

Contents of README.md.tmpl:

# {{.project_name}}

The '{{.project_name}}' project was generated using the "Jobs as code" template.

## Prerequisites

1. Install the Databricks CLI, version 0.238 or later.
   See [Install or update the Databricks CLI](https://docs.databricks.com/en/dev-tools/cli/install.html).

2. Install uv. See [Installing uv](https://docs.astral.sh/uv/getting-started/installation/).
   We use uv to create a virtual environment and install the required dependencies.

3. Authenticate to your Databricks workspace if you have not done so already:
    ```
    $ databricks configure
    ```

4. Optionally, install developer tools such as the Databricks extension for Visual Studio Code from
   https://docs.databricks.com/dev-tools/vscode-ext.html.
   {{- if (eq .include_python "yes") }} Or read the "getting started" documentation for
   **Databricks Connect** for instructions on running the included Python code from a different IDE.
   {{- end}}

5. For documentation on the Databricks Asset Bundles format used
   for this project, and for CI/CD configuration, see
   https://docs.databricks.com/dev-tools/bundles/index.html.

## Deploy and run jobs

1. Create a new virtual environment and install the required dependencies:
    ```
    $ uv sync
    ```

2. To deploy the bundle to the development target:
    ```
    $ databricks bundle deploy --target dev
    ```

   *(Note that "dev" is the default target, so the `--target` parameter is optional here.)*

   This deploys everything that's defined for this project.
   For example, the default template would deploy a job called
   `[dev yourname] {{.project_name}}_job` to your workspace.
   You can find that job by opening your workspace and clicking on **Workflows**.

3. Similarly, to deploy a production copy, type:
   ```
   $ databricks bundle deploy --target prod
   ```

   Note that the default job from the template has a schedule that runs every day
   (defined in resources/{{.project_name}}_job.py); see the sketch after this list
   for how such a schedule is expressed in Python. The schedule is paused when
   deploying in development mode (see [Databricks Asset Bundle deployment modes](
   https://docs.databricks.com/dev-tools/bundles/deployment-modes.html)).

4. To run a job:
   ```
   $ databricks bundle run
   ```
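
As referenced in step 3, a daily schedule on a Python-defined job is expressed as a `trigger` on the job resource. The following is a minimal sketch with illustrative names and paths; the generated resources/{{.project_name}}_job.py may differ in its details:

```python
# Sketch: a Python-defined job with a daily periodic trigger.
# Deploying with --target dev (development mode) pauses this trigger.
from databricks.bundles.jobs import Job

scheduled_job = Job.from_dict(
    {
        "name": "scheduled_job",
        "trigger": {
            # Run once every day.
            "periodic": {"interval": 1, "unit": "DAYS"},
        },
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": "src/notebook.ipynb"},
            },
        ],
    }
)
```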