Regenerate artifacts

Ilya Kuznetsov 2025-03-04 12:57:10 +01:00
parent 00d2963b93
commit 175d760f18
3 changed files with 631 additions and 196 deletions


@@ -1,12 +1,14 @@
 ---
 description: "Configuration reference for databricks.yml"
+last_update:
+  date: 2025-02-14
 ---
-<!-- DO NOT EDIT. This file is autogenerated with https://github.com/databricks/cli -->
+<!--DO NOT EDIT. This file is autogenerated with https://github.com/databricks/cli-->
 # Configuration reference
-This article provides reference for keys supported by Databricks Asset Bundles configuration (YAML). See [\_](/dev-tools/bundles/index.md).
+This article provides reference for keys supported by :re[DABS] configuration (YAML). See [\_](/dev-tools/bundles/index.md).
 For complete bundle examples, see [\_](/dev-tools/bundles/resource-examples.md) and the [bundle-examples GitHub repository](https://github.com/databricks/bundle-examples).
@@ -34,7 +36,7 @@ artifacts:
 - - `build`
   - String
-  - An optional set of non-default build commands to run locally before deployment.
+  - An optional set of build commands to run locally before deployment.
 - - `executable`
   - String
@@ -42,15 +44,15 @@ artifacts:
 - - `files`
   - Sequence
-  - The source files for the artifact. See [\_](#artifactsnamefiles).
+  - The relative or absolute path to the built artifact files. See [\_](#artifactsnamefiles).
 - - `path`
   - String
-  - The location where the built artifact will be saved.
+  - The local path of the directory for the artifact.
 - - `type`
   - String
-  - Required. The type of the artifact. Valid values are `whl`.
+  - Required if the artifact is a Python wheel. The type of the artifact. Valid values are `whl` and `jar`.
 :::
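As an illustration of these keys, a minimal `artifacts` mapping might look like the following sketch (the artifact name, build command, and paths are placeholders, not values from the generated reference):

```yaml
artifacts:
  my_wheel:
    # Required for Python wheels; `whl` and `jar` are the valid values.
    type: whl
    # Optional build command run locally before deployment (placeholder).
    build: python -m build --wheel
    # Local path of the directory for the artifact (placeholder).
    path: ./my_package
    files:
      # Relative or absolute path to the built artifact files (placeholder).
      - source: ./my_package/dist/*.whl
```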
@@ -69,7 +71,7 @@ artifacts:
 **`Type: Sequence`**
-The source files for the artifact.
+The relative or absolute path to the built artifact files.
@@ -81,7 +83,7 @@ The source files for the artifact.
 - - `source`
   - String
-  - Required. The path of the files used to build the artifact.
+  - Required. The artifact source file.
 :::
@@ -106,7 +108,7 @@ The bundle attributes when deploying to this target,
 - - `compute_id`
   - String
-  -
+  - Deprecated. The ID of the compute to use to run the bundle.
 - - `databricks_cli_version`
   - String
@@ -305,7 +307,7 @@ Configures loading of Python code defined with 'databricks-bundles' package.
 **`Type: Sequence`**
-Specifies a list of path globs that contain configuration files to include within the bundle. See [_](/dev-tools/bundles/settings.md#include)
+Specifies a list of path globs that contain configuration files to include within the bundle. See [_](/dev-tools/bundles/settings.md#include).
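For example, an `include` list of path globs might look like this sketch (the globs are placeholders):

```yaml
include:
  # Pull in resource and target definitions split across files (placeholders).
  - resources/*.yml
  - targets/*.yml
```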
 ## permissions
@@ -419,39 +421,39 @@ resources:
 - - `apps`
   - Map
-  -
+  - The app resource defines a [Databricks app](/api/workspace/apps/create). For information about Databricks Apps, see [_](/dev-tools/databricks-apps/index.md).
 - - `clusters`
   - Map
-  - The cluster definitions for the bundle, where each key is the name of a cluster. See [_](/dev-tools/bundles/resources.md#clusters)
+  - The cluster definitions for the bundle, where each key is the name of a cluster. See [_](/dev-tools/bundles/resources.md#clusters).
 - - `dashboards`
   - Map
-  - The dashboard definitions for the bundle, where each key is the name of the dashboard. See [_](/dev-tools/bundles/resources.md#dashboards)
+  - The dashboard definitions for the bundle, where each key is the name of the dashboard. See [_](/dev-tools/bundles/resources.md#dashboards).
 - - `experiments`
   - Map
-  - The experiment definitions for the bundle, where each key is the name of the experiment. See [_](/dev-tools/bundles/resources.md#experiments)
+  - The experiment definitions for the bundle, where each key is the name of the experiment. See [_](/dev-tools/bundles/resources.md#experiments).
 - - `jobs`
   - Map
-  - The job definitions for the bundle, where each key is the name of the job. See [_](/dev-tools/bundles/resources.md#jobs)
+  - The job definitions for the bundle, where each key is the name of the job. See [_](/dev-tools/bundles/resources.md#jobs).
 - - `model_serving_endpoints`
   - Map
-  - The model serving endpoint definitions for the bundle, where each key is the name of the model serving endpoint. See [_](/dev-tools/bundles/resources.md#model_serving_endpoints)
+  - The model serving endpoint definitions for the bundle, where each key is the name of the model serving endpoint. See [_](/dev-tools/bundles/resources.md#model_serving_endpoints).
 - - `models`
   - Map
-  - The model definitions for the bundle, where each key is the name of the model. See [_](/dev-tools/bundles/resources.md#models)
+  - The model definitions for the bundle, where each key is the name of the model. See [_](/dev-tools/bundles/resources.md#models).
 - - `pipelines`
   - Map
-  - The pipeline definitions for the bundle, where each key is the name of the pipeline. See [_](/dev-tools/bundles/resources.md#pipelines)
+  - The pipeline definitions for the bundle, where each key is the name of the pipeline. See [_](/dev-tools/bundles/resources.md#pipelines).
 - - `quality_monitors`
   - Map
-  - The quality monitor definitions for the bundle, where each key is the name of the quality monitor. See [_](/dev-tools/bundles/resources.md#quality_monitors)
+  - The quality monitor definitions for the bundle, where each key is the name of the quality monitor. See [_](/dev-tools/bundles/resources.md#quality_monitors).
 - - `registered_models`
   - Map
@@ -459,11 +461,11 @@ resources:
 - - `schemas`
   - Map
-  - The schema definitions for the bundle, where each key is the name of the schema. See [_](/dev-tools/bundles/resources.md#schemas)
+  - The schema definitions for the bundle, where each key is the name of the schema. See [_](/dev-tools/bundles/resources.md#schemas).
 - - `volumes`
   - Map
-  - The volume definitions for the bundle, where each key is the name of the volume. See [_](/dev-tools/bundles/resources.md#volumes)
+  - The volume definitions for the bundle, where each key is the name of the volume. See [_](/dev-tools/bundles/resources.md#volumes).
 :::
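As a sketch of how these keys are used, a `resources` mapping with a single job definition might look like this (all names and paths are placeholders):

```yaml
resources:
  jobs:
    # Each key under `jobs` is the name of a job definition (placeholder).
    hello_job:
      name: hello-job
      tasks:
        - task_key: hello-task
          notebook_task:
            notebook_path: ./hello.ipynb
```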
@@ -621,7 +623,7 @@ artifacts:
 - - `build`
   - String
-  - An optional set of non-default build commands to run locally before deployment.
+  - An optional set of build commands to run locally before deployment.
 - - `executable`
   - String
@@ -629,15 +631,15 @@ artifacts:
 - - `files`
   - Sequence
-  - The source files for the artifact. See [\_](#targetsnameartifactsnamefiles).
+  - The relative or absolute path to the built artifact files. See [\_](#targetsnameartifactsnamefiles).
 - - `path`
   - String
-  - The location where the built artifact will be saved.
+  - The local path of the directory for the artifact.
 - - `type`
   - String
-  - Required. The type of the artifact. Valid values are `whl`.
+  - Required if the artifact is a Python wheel. The type of the artifact. Valid values are `whl` and `jar`.
 :::
@@ -646,7 +648,7 @@ artifacts:
 **`Type: Sequence`**
-The source files for the artifact.
+The relative or absolute path to the built artifact files.
@@ -658,7 +660,7 @@ The source files for the artifact.
 - - `source`
   - String
-  - Required. The path of the files used to build the artifact.
+  - Required. The artifact source file.
 :::
@@ -683,7 +685,7 @@ The bundle attributes when deploying to this target.
 - - `compute_id`
   - String
-  -
+  - Deprecated. The ID of the compute to use to run the bundle.
 - - `databricks_cli_version`
   - String
@@ -898,39 +900,39 @@ The resource definitions for the target.
 - - `apps`
   - Map
-  -
+  - The app resource defines a [Databricks app](/api/workspace/apps/create). For information about Databricks Apps, see [_](/dev-tools/databricks-apps/index.md).
 - - `clusters`
   - Map
-  - The cluster definitions for the bundle, where each key is the name of a cluster. See [_](/dev-tools/bundles/resources.md#clusters)
+  - The cluster definitions for the bundle, where each key is the name of a cluster. See [_](/dev-tools/bundles/resources.md#clusters).
 - - `dashboards`
   - Map
-  - The dashboard definitions for the bundle, where each key is the name of the dashboard. See [_](/dev-tools/bundles/resources.md#dashboards)
+  - The dashboard definitions for the bundle, where each key is the name of the dashboard. See [_](/dev-tools/bundles/resources.md#dashboards).
 - - `experiments`
   - Map
-  - The experiment definitions for the bundle, where each key is the name of the experiment. See [_](/dev-tools/bundles/resources.md#experiments)
+  - The experiment definitions for the bundle, where each key is the name of the experiment. See [_](/dev-tools/bundles/resources.md#experiments).
 - - `jobs`
   - Map
-  - The job definitions for the bundle, where each key is the name of the job. See [_](/dev-tools/bundles/resources.md#jobs)
+  - The job definitions for the bundle, where each key is the name of the job. See [_](/dev-tools/bundles/resources.md#jobs).
 - - `model_serving_endpoints`
   - Map
-  - The model serving endpoint definitions for the bundle, where each key is the name of the model serving endpoint. See [_](/dev-tools/bundles/resources.md#model_serving_endpoints)
+  - The model serving endpoint definitions for the bundle, where each key is the name of the model serving endpoint. See [_](/dev-tools/bundles/resources.md#model_serving_endpoints).
 - - `models`
   - Map
-  - The model definitions for the bundle, where each key is the name of the model. See [_](/dev-tools/bundles/resources.md#models)
+  - The model definitions for the bundle, where each key is the name of the model. See [_](/dev-tools/bundles/resources.md#models).
 - - `pipelines`
   - Map
-  - The pipeline definitions for the bundle, where each key is the name of the pipeline. See [_](/dev-tools/bundles/resources.md#pipelines)
+  - The pipeline definitions for the bundle, where each key is the name of the pipeline. See [_](/dev-tools/bundles/resources.md#pipelines).
 - - `quality_monitors`
   - Map
-  - The quality monitor definitions for the bundle, where each key is the name of the quality monitor. See [_](/dev-tools/bundles/resources.md#quality_monitors)
+  - The quality monitor definitions for the bundle, where each key is the name of the quality monitor. See [_](/dev-tools/bundles/resources.md#quality_monitors).
 - - `registered_models`
   - Map
@@ -938,11 +940,11 @@ The resource definitions for the target.
 - - `schemas`
   - Map
-  - The schema definitions for the bundle, where each key is the name of the schema. See [_](/dev-tools/bundles/resources.md#schemas)
+  - The schema definitions for the bundle, where each key is the name of the schema. See [_](/dev-tools/bundles/resources.md#schemas).
 - - `volumes`
   - Map
-  - The volume definitions for the bundle, where each key is the name of the volume. See [_](/dev-tools/bundles/resources.md#volumes)
+  - The volume definitions for the bundle, where each key is the name of the volume. See [_](/dev-tools/bundles/resources.md#volumes).
 :::
@@ -1022,7 +1024,7 @@ variables:
 - - `default`
   - Any
-  -
+  - The default value for the variable.
 - - `description`
   - String
@@ -1055,51 +1057,51 @@ The name of the alert, cluster_policy, cluster, dashboard, instance_pool, job, m
 - - `alert`
   - String
-  -
+  - The name of the alert for which to retrieve an ID.
 - - `cluster`
   - String
-  -
+  - The name of the cluster for which to retrieve an ID.
 - - `cluster_policy`
   - String
-  -
+  - The name of the cluster_policy for which to retrieve an ID.
 - - `dashboard`
   - String
-  -
+  - The name of the dashboard for which to retrieve an ID.
 - - `instance_pool`
   - String
-  -
+  - The name of the instance_pool for which to retrieve an ID.
 - - `job`
   - String
-  -
+  - The name of the job for which to retrieve an ID.
 - - `metastore`
   - String
-  -
+  - The name of the metastore for which to retrieve an ID.
 - - `notification_destination`
   - String
-  -
+  - The name of the notification_destination for which to retrieve an ID.
 - - `pipeline`
   - String
-  -
+  - The name of the pipeline for which to retrieve an ID.
 - - `query`
   - String
-  -
+  - The name of the query for which to retrieve an ID.
 - - `service_principal`
   - String
-  -
+  - The name of the service_principal for which to retrieve an ID.
 - - `warehouse`
   - String
-  -
+  - The name of the warehouse for which to retrieve an ID.
 :::
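For illustration, a lookup variable that resolves the ID of an existing cluster by name might look like this sketch (the variable name and cluster name are placeholders):

```yaml
variables:
  my_cluster_id:
    description: The ID of an existing interactive cluster.
    lookup:
      # Resolves to the ID of the named cluster at deploy time (placeholder name).
      cluster: "Shared analytics cluster"
```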
@@ -1206,7 +1208,7 @@ variables:
 - - `default`
   - Any
-  -
+  - The default value for the variable.
 - - `description`
   - String
@@ -1239,51 +1241,51 @@ The name of the `alert`, `cluster_policy`, `cluster`, `dashboard`, `instance_poo
 - - `alert`
   - String
-  -
+  - The name of the alert for which to retrieve an ID.
 - - `cluster`
   - String
-  -
+  - The name of the cluster for which to retrieve an ID.
 - - `cluster_policy`
   - String
-  -
+  - The name of the cluster_policy for which to retrieve an ID.
 - - `dashboard`
   - String
-  -
+  - The name of the dashboard for which to retrieve an ID.
 - - `instance_pool`
   - String
-  -
+  - The name of the instance_pool for which to retrieve an ID.
 - - `job`
   - String
-  -
+  - The name of the job for which to retrieve an ID.
 - - `metastore`
   - String
-  -
+  - The name of the metastore for which to retrieve an ID.
 - - `notification_destination`
   - String
-  -
+  - The name of the notification_destination for which to retrieve an ID.
 - - `pipeline`
   - String
-  -
+  - The name of the pipeline for which to retrieve an ID.
 - - `query`
   - String
-  -
+  - The name of the query for which to retrieve an ID.
 - - `service_principal`
   - String
-  -
+  - The name of the service_principal for which to retrieve an ID.
 - - `warehouse`
   - String
-  -
+  - The name of the warehouse for which to retrieve an ID.
 :::


@@ -1,14 +1,22 @@
 ---
 description: "Learn about resources supported by Databricks Asset Bundles and how to configure them."
+last_update:
+  date: 2025-02-14
 ---
 <!-- DO NOT EDIT. This file is autogenerated with https://github.com/databricks/cli -->
 # :re[DABS] resources
-:re[DABS] allows you to specify information about the :re[Databricks] resources used by the bundle in the `resources` mapping in the bundle configuration. See [resources mapping](settings.md#resources) and [resources key reference](reference.md#resources).
+:re[DABS] allows you to specify information about the :re[Databricks] resources used by the bundle in the `resources` mapping in the bundle configuration. See [resources mapping](/dev-tools/bundles/settings.md#resources) and [resources key reference](/dev-tools/bundles/reference.md#resources).
-This article outlines supported resource types for bundles and provides details and an example for each supported type. For additional examples, see [\_](resource-examples.md).
+This article outlines supported resource types for bundles and provides details and an example for each supported type. For additional examples, see [\_](/dev-tools/bundles/resource-examples.md).
+:::tip
+To generate YAML for any existing resource, use the `databricks bundle generate` command. See [\_](/dev-tools/cli/bundle-commands.md#generate).
+:::
 ## <a id="resource-types"></a>Supported resources
@@ -22,64 +30,105 @@ The `databricks bundle validate` command returns warnings if unknown resource pr
 :::
+::::aws-azure
 :::list-table
 - - Resource
   - Create support
   - Corresponding REST API object
+- - [app](#apps)
+  - ✓
+  - [App object](https://docs.databricks.com/api/workspace/apps/create)
 - - [cluster](#clusters)
   - ✓
   - [Cluster object](https://docs.databricks.com/api/workspace/clusters/create)
 - - [dashboard](#dashboards)
   -
   - [Dashboard object](https://docs.databricks.com/api/workspace/lakeview/create)
 - - [experiment](#experiments)
   - ✓
   - [Experiment object](https://docs.databricks.com/api/workspace/experiments/createexperiment)
-- - [job](#job)
+- - [job](#jobs)
   - ✓
   - [Job object](https://docs.databricks.com/api/workspace/jobs/create)
 - - [model (legacy)](#models)
   - ✓
   - [Model (legacy) object](https://docs.databricks.com/api/workspace/modelregistry/createmodel)
 - - [model_serving_endpoint](#model_serving_endpoints)
   - ✓
   - [Model serving endpoint object](https://docs.databricks.com/api/workspace/servingendpoints/create)
-- - [pipeline](#pipeline)
+- - [pipeline](#pipelines)
   - ✓
   - [Pipeline object](https://docs.databricks.com/api/workspace/pipelines/create)
 - - [quality_monitor](#quality_monitors)
   - ✓
   - [Quality monitor object](https://docs.databricks.com/api/workspace/qualitymonitors/create)
-- - [registered_model](#registered_models) (:re[UC])
+- - [registered_model](#registered_models) (Unity Catalog)
   - ✓
   - [Registered model object](https://docs.databricks.com/api/workspace/registeredmodels/create)
-- - [schema](#schemas) (:re[UC])
+- - [schema](#schemas) (Unity Catalog)
   - ✓
   - [Schema object](https://docs.databricks.com/api/workspace/schemas/create)
-- - [volume](#volumes) (:re[UC])
+- - [volume](#volumes) (Unity Catalog)
   - ✓
   - [Volume object](https://docs.databricks.com/api/workspace/volumes/create)
 :::
+::::
+::::gcp
+:::list-table
+- - Resource
+  - Create support
+  - Corresponding REST API object
+- - [cluster](#clusters)
+  - ✓
+  - [Cluster object](https://docs.databricks.com/api/workspace/clusters/create)
+- - [dashboard](#dashboards)
+  -
+  - [Dashboard object](https://docs.databricks.com/api/workspace/lakeview/create)
+- - [experiment](#experiments)
+  - ✓
+  - [Experiment object](https://docs.databricks.com/api/workspace/experiments/createexperiment)
+- - [job](#jobs)
+  - ✓
+  - [Job object](https://docs.databricks.com/api/workspace/jobs/create)
+- - [model (legacy)](#models)
+  - ✓
+  - [Model (legacy) object](https://docs.databricks.com/api/workspace/modelregistry/createmodel)
+- - [model_serving_endpoint](#model_serving_endpoints)
+  - ✓
+  - [Model serving endpoint object](https://docs.databricks.com/api/workspace/servingendpoints/create)
+- - [pipeline](#pipelines)
+  - ✓
+  - [Pipeline object](https://docs.databricks.com/api/workspace/pipelines/create)
+- - [quality_monitor](#quality_monitors)
+  - ✓
+  - [Quality monitor object](https://docs.databricks.com/api/workspace/qualitymonitors/create)
+- - [registered_model](#registered_models) (:re[UC])
+  - ✓
+  - [Registered model object](https://docs.databricks.com/api/workspace/registeredmodels/create)
+- - [schema](#schemas) (:re[UC])
+  - ✓
+  - [Schema object](https://docs.databricks.com/api/workspace/schemas/create)
+- - [volume](#volumes) (:re[UC])
+  - ✓
+  - [Volume object](https://docs.databricks.com/api/workspace/volumes/create)
+:::
+::::
 ## apps
 **`Type: Map`**
+The app resource defines a [Databricks app](/api/workspace/apps/create). For information about Databricks Apps, see [_](/dev-tools/databricks-apps/index.md).
 ```yaml
 apps:
@@ -126,6 +175,10 @@ apps:
   - String
   -
+- - `id`
+  - String
+  - The unique identifier of the app.
 - - `name`
   - String
   -
@@ -632,7 +685,7 @@ clusters:
 - - `cluster_log_conf`
   - Map
-  - The configuration for delivering spark logs to a long-term storage destination. Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified for one cluster. If the conf is given, the logs will be delivered to the destination every `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while the destination of executor logs is `$destination/$clusterId/executor`. See [\_](#clustersnamecluster_log_conf).
+  - The configuration for delivering spark logs to a long-term storage destination. Three kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified for one cluster. If the conf is given, the logs will be delivered to the destination every `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while the destination of executor logs is `$destination/$clusterId/executor`. See [\_](#clustersnamecluster_log_conf).
 - - `cluster_name`
   - String
@@ -913,7 +966,7 @@ Defines values necessary to configure and run Azure Log Analytics agent
 **`Type: Map`**
 The configuration for delivering spark logs to a long-term storage destination.
-Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified
+Three kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified
 for one cluster. If the conf is given, the logs will be delivered to the destination every
 `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while
 the destination of executor logs is `$destination/$clusterId/executor`.
@@ -934,6 +987,10 @@ the destination of executor logs is `$destination/$clusterId/executor`.
   - Map
   - destination and either the region or endpoint need to be provided. e.g. `{ "s3": { "destination" : "s3://cluster_log_bucket/prefix", "region" : "us-west-2" } }` Cluster iam role is used to access s3, please make sure the cluster iam role in `instance_profile_arn` has permission to write data to the s3 destination. See [\_](#clustersnamecluster_log_confs3).
+- - `volumes`
+  - Map
+  - destination needs to be provided. e.g. `{ "volumes" : { "destination" : "/Volumes/catalog/schema/volume/cluster_log" } }`. See [\_](#clustersnamecluster_log_confvolumes).
 :::
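A minimal sketch of the new `volumes` log destination on a bundle-defined cluster (the cluster name is a placeholder; the destination path follows the example in the reference above):

```yaml
resources:
  clusters:
    my_cluster:
      cluster_name: my-cluster
      # Deliver driver and executor logs to a Unity Catalog volume.
      cluster_log_conf:
        volumes:
          destination: /Volumes/catalog/schema/volume/cluster_log
```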
@@ -1007,6 +1064,28 @@ Cluster iam role is used to access s3, please make sure the cluster iam role in
 :::
+### clusters._name_.cluster_log_conf.volumes
+**`Type: Map`**
+destination needs to be provided. e.g.
+`{ "volumes" : { "destination" : "/Volumes/catalog/schema/volume/cluster_log" } }`
+:::list-table
+- - Key
+  - Type
+  - Description
+- - `destination`
+  - String
+  - Unity Catalog volumes file destination, e.g. `/Volumes/catalog/schema/volume/dir/file`
+:::
 ### clusters._name_.docker_image
 **`Type: Map`**
@@ -1296,7 +1375,7 @@ destination needs to be provided. e.g.
 - - `destination`
   - String
-  - Unity Catalog Volumes file destination, e.g. `/Volumes/my-init.sh`
+  - Unity Catalog volumes file destination, e.g. `/Volumes/catalog/schema/volume/dir/file`
 :::
@@ -2143,7 +2222,7 @@ If new_cluster, a description of a cluster that is created for each task.
 - - `cluster_log_conf`
   - Map
-  - The configuration for delivering spark logs to a long-term storage destination. Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified for one cluster. If the conf is given, the logs will be delivered to the destination every `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while the destination of executor logs is `$destination/$clusterId/executor`. See [\_](#jobsnamejob_clustersnew_clustercluster_log_conf).
+  - The configuration for delivering spark logs to a long-term storage destination. Three kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified for one cluster. If the conf is given, the logs will be delivered to the destination every `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while the destination of executor logs is `$destination/$clusterId/executor`. See [\_](#jobsnamejob_clustersnew_clustercluster_log_conf).
 - - `cluster_name`
   - String
@@ -2392,7 +2471,7 @@ Defines values necessary to configure and run Azure Log Analytics agent
 **`Type: Map`**
 The configuration for delivering spark logs to a long-term storage destination.
-Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified
+Three kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified
 for one cluster. If the conf is given, the logs will be delivered to the destination every
 `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while
 the destination of executor logs is `$destination/$clusterId/executor`.
@@ -2413,6 +2492,10 @@ the destination of executor logs is `$destination/$clusterId/executor`.
   - Map
   - destination and either the region or endpoint need to be provided. e.g. `{ "s3": { "destination" : "s3://cluster_log_bucket/prefix", "region" : "us-west-2" } }` Cluster iam role is used to access s3, please make sure the cluster iam role in `instance_profile_arn` has permission to write data to the s3 destination. See [\_](#jobsnamejob_clustersnew_clustercluster_log_confs3).
+- - `volumes`
+  - Map
+  - destination needs to be provided. e.g. `{ "volumes" : { "destination" : "/Volumes/catalog/schema/volume/cluster_log" } }`. See [\_](#jobsnamejob_clustersnew_clustercluster_log_confvolumes).
 :::
@@ -2486,6 +2569,28 @@ Cluster iam role is used to access s3, please make sure the cluster iam role in
 :::
+### jobs._name_.job_clusters.new_cluster.cluster_log_conf.volumes
+**`Type: Map`**
+destination needs to be provided. e.g.
+`{ "volumes" : { "destination" : "/Volumes/catalog/schema/volume/cluster_log" } }`
+:::list-table
+- - Key
+  - Type
+  - Description
+- - `destination`
+  - String
+  - Unity Catalog volumes file destination, e.g. `/Volumes/catalog/schema/volume/dir/file`
+:::
 ### jobs._name_.job_clusters.new_cluster.docker_image
 **`Type: Map`**
@@ -2775,7 +2880,7 @@ destination needs to be provided. e.g.
 - - `destination`
   - String
-  - Unity Catalog Volumes file destination, e.g. `/Volumes/my-init.sh`
+  - Unity Catalog volumes file destination, e.g. `/Volumes/catalog/schema/volume/dir/file`
 :::
@@ -3564,7 +3669,7 @@ If new_cluster, a description of a new cluster that is created for each run.
 - - `cluster_log_conf`
   - Map
-  - The configuration for delivering spark logs to a long-term storage destination. Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified for one cluster. If the conf is given, the logs will be delivered to the destination every `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while the destination of executor logs is `$destination/$clusterId/executor`. See [\_](#jobsnametasksnew_clustercluster_log_conf).
+  - The configuration for delivering spark logs to a long-term storage destination. Three kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified for one cluster. If the conf is given, the logs will be delivered to the destination every `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while the destination of executor logs is `$destination/$clusterId/executor`. See [\_](#jobsnametasksnew_clustercluster_log_conf).
 - - `cluster_name`
   - String
@@ -3813,7 +3918,7 @@ Defines values necessary to configure and run Azure Log Analytics agent
 **`Type: Map`**
 The configuration for delivering spark logs to a long-term storage destination.
-Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified
+Three kinds of destinations (DBFS, S3 and Unity Catalog volumes) are supported. Only one destination can be specified
 for one cluster. If the conf is given, the logs will be delivered to the destination every
 `5 mins`. The destination of driver logs is `$destination/$clusterId/driver`, while
 the destination of executor logs is `$destination/$clusterId/executor`.
@@ -3834,6 +3939,10 @@ the destination of executor logs is `$destination/$clusterId/executor`.
   - Map
   - destination and either the region or endpoint need to be provided. e.g. `{ "s3": { "destination" : "s3://cluster_log_bucket/prefix", "region" : "us-west-2" } }` Cluster iam role is used to access s3, please make sure the cluster iam role in `instance_profile_arn` has permission to write data to the s3 destination. See [\_](#jobsnametasksnew_clustercluster_log_confs3).
+- - `volumes`
+  - Map
+  - destination needs to be provided. e.g. `{ "volumes" : { "destination" : "/Volumes/catalog/schema/volume/cluster_log" } }`. See [\_](#jobsnametasksnew_clustercluster_log_confvolumes).
 :::
@@ -3907,6 +4016,28 @@ Cluster iam role is used to access s3, please make sure the cluster iam role in
 :::
+### jobs._name_.tasks.new_cluster.cluster_log_conf.volumes
+**`Type: Map`**
+destination needs to be provided. e.g.
+`{ "volumes" : { "destination" : "/Volumes/catalog/schema/volume/cluster_log" } }`
+:::list-table
+- - Key
+  - Type
+  - Description
+- - `destination`
+  - String
+  - Unity Catalog volumes file destination, e.g. `/Volumes/catalog/schema/volume/dir/file`
+:::
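The same `volumes` destination applies per task cluster; a sketch for a job task's `new_cluster` (the job, task, node type, and Spark version values are placeholder assumptions):

```yaml
resources:
  jobs:
    my_job:
      tasks:
        - task_key: my_task
          new_cluster:
            num_workers: 1
            spark_version: 15.4.x-scala2.12
            node_type_id: i3.xlarge
            cluster_log_conf:
              volumes:
                destination: /Volumes/catalog/schema/volume/cluster_log
```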
 ### jobs._name_.tasks.new_cluster.docker_image
 **`Type: Map`**
@@ -4196,7 +4327,7 @@ destination needs to be provided. e.g.
 - - `destination`
   - String
-  - Unity Catalog Volumes file destination, e.g. `/Volumes/my-init.sh`
+  - Unity Catalog volumes file destination, e.g. `/Volumes/catalog/schema/volume/dir/file`
 :::
@@ -5711,7 +5842,7 @@ The external model to be served. NOTE: Only one of external_model and (entity_na
 - - `provider`
   - String
-  - The name of the provider for the external model. Currently, the supported providers are 'ai21labs', 'anthropic', 'amazon-bedrock', 'cohere', 'databricks-model-serving', 'google-cloud-vertex-ai', 'openai', and 'palm'.
+  - The name of the provider for the external model. Currently, the supported providers are 'ai21labs', 'anthropic', 'amazon-bedrock', 'cohere', 'databricks-model-serving', 'google-cloud-vertex-ai', 'openai', 'palm', and 'custom'.
 - - `task`
   - String
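For context, a served entity that uses an external model provider might be configured like this sketch (the endpoint name, model name, and secret scope are placeholder assumptions):

```yaml
resources:
  model_serving_endpoints:
    my_external_endpoint:
      name: my-external-endpoint
      config:
        served_entities:
          - name: chat
            external_model:
              name: gpt-4o
              provider: openai
              task: llm/v1/chat
              openai_config:
                # Reference a Databricks secret; scope and key are placeholders.
                openai_api_key: "{{secrets/my_scope/openai_api_key}}"
```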
@@ -6804,6 +6935,10 @@ the destination of executor logs is `$destination/$clusterId/executor`.
   - Map
   - destination and either the region or endpoint need to be provided. e.g. `{ "s3": { "destination" : "s3://cluster_log_bucket/prefix", "region" : "us-west-2" } }` Cluster iam role is used to access s3, please make sure the cluster iam role in `instance_profile_arn` has permission to write data to the s3 destination. See [\_](#pipelinesnameclusterscluster_log_confs3).
+- - `volumes`
+  - Map
+  - destination needs to be provided. e.g. `{ "volumes" : { "destination" : "/Volumes/catalog/schema/volume/cluster_log" } }`. See [\_](#pipelinesnameclusterscluster_log_confvolumes).
 :::
@@ -6877,6 +7012,28 @@ Cluster iam role is used to access s3, please make sure the cluster iam role in
 :::
+### pipelines._name_.clusters.cluster_log_conf.volumes
+**`Type: Map`**
+destination needs to be provided. e.g.
+`{ "volumes" : { "destination" : "/Volumes/catalog/schema/volume/cluster_log" } }`
+:::list-table
+- - Key
+  - Type
+  - Description
+- - `destination`
+  - String
+  - Unity Catalog volumes file destination, e.g. `/Volumes/catalog/schema/volume/dir/file`
+:::
 ### pipelines._name_.clusters.gcp_attributes
 **`Type: Map`**
@@ -7116,7 +7273,7 @@ destination needs to be provided. e.g.
 - - `destination`
   - String
-  - Unity Catalog Volumes file destination, e.g. `/Volumes/my-init.sh`
+  - Unity Catalog volumes file destination, e.g. `/Volumes/catalog/schema/volume/dir/file`
 :::

File diff suppressed because it is too large