databricks-cli/libs/template/templates/default-python/template/{{.project_name}}

Fabian Jakobs · e61f0e1eb9 · 2024-03-05 14:31:27 +00:00

Fix DBConnect support in VS Code (#1253)

## Changes

With the current template, the Python file and the jobs notebook cannot be
executed with DBConnect from VS Code because they use `from pyspark.sql
import SparkSession`, which doesn't support Databricks unified auth.
This PR fixes that by passing `spark` into the library code and by
explicitly instantiating a Spark session where the `spark` global is not
available.

Other changes:

* add auto-reload to notebooks
* add DLT typings for code completion
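The pattern described above can be sketched roughly as follows. This is an illustrative sketch, not the template's exact code: `get_spark` and `get_taxis` are hypothetical names, and the idea is to prefer Databricks Connect (which supports unified auth) and fall back to a classic `SparkSession` when running inside a Databricks runtime:

```python
def get_spark():
    """Return a Spark session, preferring Databricks Connect.

    Sketch only: tries DatabricksSession (which supports unified auth)
    first, and falls back to the plain SparkSession builder where the
    `spark` global or Databricks Connect is not available.
    """
    try:
        from databricks.connect import DatabricksSession
        return DatabricksSession.builder.getOrCreate()
    except ImportError:
        from pyspark.sql import SparkSession
        return SparkSession.builder.getOrCreate()


def get_taxis(spark):
    # Library code takes `spark` as a parameter instead of importing it,
    # so callers control which session implementation is used.
    return spark.read.table("samples.nyctaxi.trips")
```

Passing `spark` into library functions keeps the library agnostic to whether it runs under Databricks Connect in an IDE or with the `spark` global in a notebook.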
| Name | Last commit | Date |
|------|-------------|------|
| .vscode | Make the default `databricks bundle init` template more self-explanatory (#796) | 2023-09-26 09:12:34 +00:00 |
| fixtures | Minor default template tweaks (#758) | 2023-09-11 07:36:44 +00:00 |
| resources | Improve default template (#1046) | 2023-12-11 19:13:14 +00:00 |
| scratch | Fix DBConnect support in VS Code (#1253) | 2024-03-05 14:31:27 +00:00 |
| src | Fix DBConnect support in VS Code (#1253) | 2024-03-05 14:31:27 +00:00 |
| tests | Fix DBConnect support in VS Code (#1253) | 2024-03-05 14:31:27 +00:00 |
| .gitignore | Improve default template (#1046) | 2023-12-11 19:13:14 +00:00 |
| README.md.tmpl | Add an experimental dbt-sql template (#1059) | 2024-02-19 09:15:17 +00:00 |
| databricks.yml.tmpl | Add an experimental default-sql template (#1051) | 2024-02-19 12:01:11 +00:00 |
| pytest.ini | databricks bundle init template v1 (#686) | 2023-09-05 11:58:34 +00:00 |
| requirements-dev.txt.tmpl | Fix DBConnect support in VS Code (#1253) | 2024-03-05 14:31:27 +00:00 |
| setup.py.tmpl | Change default_python template to auto-update version on each wheel build (#1034) | 2023-12-01 13:24:55 +00:00 |

README.md.tmpl

# {{.project_name}}

The '{{.project_name}}' project was generated using the default-python template.

## Getting started

1. Install the Databricks CLI from https://docs.databricks.com/dev-tools/cli/databricks-cli.html

2. Authenticate to your Databricks workspace, if you have not done so already:
    ```
    $ databricks configure
    ```

3. To deploy a development copy of this project, type:
    ```
    $ databricks bundle deploy --target dev
    ```
    (Note that "dev" is the default target, so the `--target` parameter
    is optional here.)

    This deploys everything that's defined for this project.
    For example, the default template would deploy a job called
    `[dev yourname] {{.project_name}}_job` to your workspace.
    You can find that job by opening your workspace and clicking on **Workflows**.

4. Similarly, to deploy a production copy, type:
   ```
   $ databricks bundle deploy --target prod
   ```

   Note that the default job from the template has a schedule that runs every day
   (defined in resources/{{.project_name}}_job.yml). The schedule
   is paused when deploying in development mode (see
   https://docs.databricks.com/dev-tools/bundles/deployment-modes.html).
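
   As a rough illustration (not the template's actual file, and the cron
   expression and timezone are placeholders), a daily schedule in
   resources/{{.project_name}}_job.yml might look like:
   ```yaml
   # Sketch only; field names follow the Databricks Jobs API.
   resources:
     jobs:
       {{.project_name}}_job:
         name: {{.project_name}}_job
         schedule:
           # Placeholder: run once a day at 06:00 UTC.
           quartz_cron_expression: "0 0 6 * * ?"
           timezone_id: UTC
   ```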

5. To run a job or pipeline, use the "run" command:
   ```
   $ databricks bundle run
   ```

6. Optionally, install developer tools such as the Databricks extension for Visual Studio Code from
   https://docs.databricks.com/dev-tools/vscode-ext.html.
{{- if (eq .include_python "yes") }} Or read the "getting started" documentation for
   **Databricks Connect** for instructions on running the included Python code from a different IDE.
{{- end}}

7. For documentation on the Databricks Asset Bundles format used
   for this project, and for CI/CD configuration, see
   https://docs.databricks.com/dev-tools/bundles/index.html.
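
As an illustration of that format, a minimal databricks.yml could be sketched as follows. This is a hedged sketch using documented bundle settings (`bundle.name`, `targets`, `mode`), not the template's actual databricks.yml.tmpl, and the workspace host is a placeholder:

```yaml
# Minimal bundle configuration sketch (not the generated file).
bundle:
  name: {{.project_name}}

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://your-workspace.cloud.databricks.com  # placeholder
  prod:
    mode: production
    workspace:
      host: https://your-workspace.cloud.databricks.com  # placeholder
```

The `mode: development` setting is what causes schedules to be paused and resource names to be prefixed with `[dev yourname]` on deploy.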