Compare commits

...

6 Commits

Author SHA1 Message Date
Shreyas Goenka e55df7b207
Merge remote-tracking branch 'origin' into improve/json-schema 2024-09-05 14:16:14 +02:00
Shreyas Goenka 4379d7f09b
allow artifacts referencing 2024-09-05 14:15:35 +02:00
Shreyas Goenka d0aa493281
- 2024-09-05 14:15:08 +02:00
Pieter Noordhuis ceefa80d72
Pass copy of `dyn.Path` to callback function (#1747)
## Changes

Some call sites hold on to the `dyn.Path` provided to their callback. It
must therefore never be mutated after the callback returns, or these
mutations leak out into unknown scope.

This change passes a copy of the path to the callback, so this failure
mode can no longer happen.
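
For illustration, here is a minimal sketch of the failure mode, using plain string slices in place of `dyn.Path` (hypothetical stand-in code, not from this repository): a callback that stores the path it receives sees that path change once the walker reuses the same backing array for a sibling.

```
package main

import "fmt"

func main() {
	// The "walker" builds paths by appending to a shared prefix.
	prefix := make([]string, 0, 4)
	prefix = append(prefix, "resources")

	// The callback holds on to the path it is given.
	var saved [][]string
	callback := func(p []string) { saved = append(saved, p) }

	// Both appends reuse prefix's backing array, so the second call
	// overwrites the element stored by the first.
	callback(append(prefix, "jobs"))
	callback(append(prefix, "pipelines"))

	// Prints "[resources pipelines] [resources pipelines]":
	// the first saved path was silently mutated.
	fmt.Println(saved[0], saved[1])
}
```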

## Tests

Unit test.
2024-09-05 11:05:16 +00:00
Andrew Nester f71d9e7649
[Release] Release v0.228.0 (#1752)
CLI:
* Do not error if we cannot prompt for a profile in `auth login`
([#1745](https://github.com/databricks/cli/pull/1745)).

Bundles:

As of this release, the CLI will show a prompt if there are configuration
changes that will lead to a DLT pipeline recreation.
Users can skip the prompt by specifying the `--auto-approve` flag.

* Pass along $AZURE_CONFIG_FILE to Terraform process
([#1734](https://github.com/databricks/cli/pull/1734)).
* Add prompt when a pipeline recreation happens
([#1672](https://github.com/databricks/cli/pull/1672)).
* Use materialized views in the default-sql template
([#1709](https://github.com/databricks/cli/pull/1709)).
* Update templates to latest LTS DBR
([#1715](https://github.com/databricks/cli/pull/1715)).
* Make lock optional in the JSON schema
([#1738](https://github.com/databricks/cli/pull/1738)).
* Do not suppress normalisation diagnostics for resolving variables
([#1740](https://github.com/databricks/cli/pull/1740)).
* Include a permissions section in all templates
([#1713](https://github.com/databricks/cli/pull/1713)).
* Fixed complex variables not being correctly merged from include
files ([#1746](https://github.com/databricks/cli/pull/1746)).
* Fixed variable override in target with full variable syntax
([#1749](https://github.com/databricks/cli/pull/1749)).

Internal:
* Consider serverless clusters as compatible for Python wheel tasks
([#1733](https://github.com/databricks/cli/pull/1733)).
* PythonMutator: explain missing package error
([#1736](https://github.com/databricks/cli/pull/1736)).
* Add `dyn.Time` to box a timestamp with its original string value
([#1732](https://github.com/databricks/cli/pull/1732)).
* Fix streaming of stdout, stdin, stderr in cobra test runner
([#1742](https://github.com/databricks/cli/pull/1742)).

Dependency updates:
* Bump github.com/Masterminds/semver/v3 from 3.2.1 to 3.3.0
([#1741](https://github.com/databricks/cli/pull/1741)).

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-09-05 08:56:52 +00:00
Andrew Nester 72030844c5
Fixed variable override in target with full variable syntax (#1749)
## Changes
This PR makes sure that both of these variable override syntaxes work
correctly: the shorthand form (`cluster1`, where the override map is the
value itself) and the full form (`cluster2`, where the value is nested
under `default`):
```
targets:
  dev:
    variables:
      cluster1:
        spark_version: "14.2.x-scala2.11"
        node_type_id: "Standard_DS3_v2"
        num_workers: 4
        spark_conf:
          spark.speculation: false
          spark.databricks.delta.retentionDurationCheck.enabled: false
      cluster2:
        default:
          spark_version: "14.2.x-scala2.11"
          node_type_id: "Standard_DS3_v2"
          num_workers: 4
          spark_conf:
            spark.speculation: false
            spark.databricks.delta.retentionDurationCheck.enabled: false
```
## Tests
Added regression test

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-09-04 17:16:40 +00:00
14 changed files with 177 additions and 41 deletions

View File

@@ -1,5 +1,34 @@
# Version changelog
## [Release] Release v0.228.0
CLI:
* Do not error if we cannot prompt for a profile in `auth login` ([#1745](https://github.com/databricks/cli/pull/1745)).
Bundles:
As of this release, the CLI will show a prompt if there are configuration changes that lead to DLT pipeline recreation.
Users can skip the prompt by specifying the `--auto-approve` flag.
* Pass along $AZURE_CONFIG_FILE to Terraform process ([#1734](https://github.com/databricks/cli/pull/1734)).
* Add prompt when a pipeline recreation happens ([#1672](https://github.com/databricks/cli/pull/1672)).
* Use materialized views in the default-sql template ([#1709](https://github.com/databricks/cli/pull/1709)).
* Update templates to latest LTS DBR ([#1715](https://github.com/databricks/cli/pull/1715)).
* Make lock optional in the JSON schema ([#1738](https://github.com/databricks/cli/pull/1738)).
* Do not suppress normalisation diagnostics for resolving variables ([#1740](https://github.com/databricks/cli/pull/1740)).
* Include a permissions section in all templates ([#1713](https://github.com/databricks/cli/pull/1713)).
* Fixed complex variables not being correctly merged from include files ([#1746](https://github.com/databricks/cli/pull/1746)).
* Fixed variable override in target with full variable syntax ([#1749](https://github.com/databricks/cli/pull/1749)).
Internal:
* Consider serverless clusters as compatible for Python wheel tasks ([#1733](https://github.com/databricks/cli/pull/1733)).
* PythonMutator: explain missing package error ([#1736](https://github.com/databricks/cli/pull/1736)).
* Add `dyn.Time` to box a timestamp with its original string value ([#1732](https://github.com/databricks/cli/pull/1732)).
* Fix streaming of stdout, stdin, stderr in cobra test runner ([#1742](https://github.com/databricks/cli/pull/1742)).
Dependency updates:
* Bump github.com/Masterminds/semver/v3 from 3.2.1 to 3.3.0 ([#1741](https://github.com/databricks/cli/pull/1741)).
## [Release] Release v0.227.1
CLI:

View File

@@ -33,12 +33,7 @@ func createGlobError(v dyn.Value, p dyn.Path, message string) diag.Diagnostic {
Severity: diag.Error,
Summary: fmt.Sprintf("%s: %s", source, message),
Locations: []dyn.Location{v.Location()},
Paths: []dyn.Path{
// Hack to clone the path. This path copy is mutable.
// To be addressed in a later PR.
p.Append(),
},
Paths: []dyn.Path{p},
}
}

View File

@@ -406,6 +406,30 @@ func (r *Root) MergeTargetOverrides(name string) error {
return r.updateWithDynamicValue(root)
}
var variableKeywords = []string{"default", "lookup"}
// isFullVariableOverrideDef checks if the given value is a full syntax variable override.
// A full syntax variable override is a map with exactly one key, which must be
// one of: "default", "lookup".
func isFullVariableOverrideDef(v dyn.Value) bool {
mv, ok := v.AsMap()
if !ok {
return false
}
if mv.Len() != 1 {
return false
}
for _, keyword := range variableKeywords {
if _, ok := mv.GetByString(keyword); ok {
return true
}
}
return false
}
// rewriteShorthands performs lightweight rewriting of the configuration
// tree where we allow users to write a shorthand and must rewrite to the full form.
func rewriteShorthands(v dyn.Value) (dyn.Value, error) {
@@ -433,30 +457,27 @@ func rewriteShorthands(v dyn.Value) (dyn.Value, error) {
}, variable.Locations()), nil
case dyn.KindMap, dyn.KindSequence:
lookup, err := dyn.Get(variable, "lookup")
// If lookup is set, we don't want to rewrite the variable and return it as is.
if err == nil && lookup.Kind() != dyn.KindInvalid {
// If it's a full variable definition, leave it as is.
if isFullVariableOverrideDef(variable) {
return variable, nil
}
// Check if the original definition of variable has a type field.
// If it has a type field, it means the shorthand is a value of a complex type.
// Type might not be found if the variable is overridden in a separate file
// and the configuration is not merged yet.
typeV, err := dyn.GetByPath(v, p.Append(dyn.Key("type")))
if err != nil {
return dyn.NewValue(map[string]dyn.Value{
"default": variable,
}, variable.Locations()), nil
}
if typeV.MustString() == "complex" {
if err == nil && typeV.MustString() == "complex" {
return dyn.NewValue(map[string]dyn.Value{
"type": typeV,
"default": variable,
}, variable.Locations()), nil
}
return variable, nil
// If it's a shorthand, rewrite it to a full variable definition.
return dyn.NewValue(map[string]dyn.Value{
"default": variable,
}, variable.Locations()), nil
default:
return variable, nil
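
To make the rule in this hunk concrete, here is a standalone sketch using plain Go maps (hypothetical helper names, not the `dyn` API): an override map counts as a full definition only if it has exactly one key and that key is "default" or "lookup"; anything else is shorthand and is wrapped under "default".

```
package main

import "fmt"

func isFullOverride(m map[string]any) bool {
	if len(m) != 1 {
		return false
	}
	for _, k := range []string{"default", "lookup"} {
		if _, ok := m[k]; ok {
			return true
		}
	}
	return false
}

func rewrite(m map[string]any) map[string]any {
	if isFullOverride(m) {
		return m // full definition: leave as is
	}
	return map[string]any{"default": m} // shorthand: wrap under "default"
}

func main() {
	shorthand := map[string]any{"spark_version": "14.2.x-scala2.11", "num_workers": 4}
	full := map[string]any{"default": shorthand}
	fmt.Println(rewrite(shorthand)) // wrapped under "default"
	fmt.Println(rewrite(full))      // unchanged
}
```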

View File

@@ -3,7 +3,6 @@ package validate
import (
"context"
"fmt"
"slices"
"sort"
"github.com/databricks/cli/bundle"
@@ -66,10 +65,7 @@ func (m *uniqueResourceKeys) Apply(ctx context.Context, b *bundle.Bundle) diag.D
}
}
// dyn.Path under the hood is a slice. The code that walks the configuration
// tree uses the same underlying slice to track the path as it walks
// the tree. So, we need to clone it here.
m.paths = append(m.paths, slices.Clone(p))
m.paths = append(m.paths, p)
m.locations = append(m.locations, v.Locations()...)
resourceMetadata[k] = m

View File

@@ -17,6 +17,10 @@
"type": "string",
"pattern": "\\$\\{(workspace(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"
},
{
"type": "string",
"pattern": "\\$\\{(artifacts(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"
},
{
"type": "string",
"pattern": "\\$\\{(var(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"
@@ -40,6 +44,10 @@
"type": "string",
"pattern": "\\$\\{(workspace(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"
},
{
"type": "string",
"pattern": "\\$\\{(artifacts(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"
},
{
"type": "string",
"pattern": "\\$\\{(var(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"
@@ -4822,6 +4830,10 @@
"type": "string",
"pattern": "\\$\\{(workspace(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"
},
{
"type": "string",
"pattern": "\\$\\{(artifacts(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"
},
{
"type": "string",
"pattern": "\\$\\{(var(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"
@@ -4845,6 +4857,10 @@
"type": "string",
"pattern": "\\$\\{(workspace(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"
},
{
"type": "string",
"pattern": "\\$\\{(artifacts(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"
},
{
"type": "string",
"pattern": "\\$\\{(var(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\[[0-9]+\\])*)*(\\[[0-9]+\\])*)\\}"

View File

@@ -38,6 +38,7 @@ func addInterpolationPatterns(typ reflect.Type, s jsonschema.Schema) jsonschema.Schema {
{Type: jsonschema.StringType, Pattern: interpolationPattern("resources")},
{Type: jsonschema.StringType, Pattern: interpolationPattern("bundle")},
{Type: jsonschema.StringType, Pattern: interpolationPattern("workspace")},
{Type: jsonschema.StringType, Pattern: interpolationPattern("artifacts")},
{Type: jsonschema.StringType, Pattern: interpolationPattern("var")},
},
}
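
As a sanity check on the `artifacts` pattern this change adds to the schema, a small sketch (standard library only; the pattern string is copied from the JSON diff above):

```
package main

import (
	"fmt"
	"regexp"
)

func main() {
	pattern := regexp.MustCompile(`\$\{(artifacts(\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\[[0-9]+\])*)*(\[[0-9]+\])*)\}`)
	for _, s := range []string{
		"${artifacts.my_wheel.path}", // true
		"${artifact.my_wheel.path}",  // false: wrong root
		"${var.foo}",                 // false: covered by a separate pattern
	} {
		fmt.Printf("%-30s %v\n", s, pattern.MatchString(s))
	}
}
```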

View File

@@ -18,10 +18,8 @@ func matchError(p dyn.Path, l []dyn.Location, message string) diag.Diagnostic {
return diag.Diagnostic{
Severity: diag.Error,
Summary: message,
Paths: []dyn.Path{
p.Append(),
},
Locations: l,
Paths: []dyn.Path{p},
}
}

View File

@@ -76,7 +76,7 @@ func collectLocalLibraries(b *bundle.Bundle) (map[string][]configLocation, error
source = filepath.Join(b.RootPath, source)
libs[source] = append(libs[source], configLocation{
configPath: p.Append(), // Hack to get the copy of path
configPath: p,
location: v.Location(),
})

View File

@@ -81,9 +81,10 @@ func TestComplexVariablesOverrideWithMultipleFiles(t *testing.T) {
),
))
require.NoError(t, diags.Error())
require.Equal(t, "14.2.x-scala2.11", b.Config.Resources.Jobs["my_job"].JobClusters[0].NewCluster.SparkVersion)
require.Equal(t, "Standard_DS3_v2", b.Config.Resources.Jobs["my_job"].JobClusters[0].NewCluster.NodeTypeId)
require.Equal(t, 4, b.Config.Resources.Jobs["my_job"].JobClusters[0].NewCluster.NumWorkers)
require.Equal(t, "false", b.Config.Resources.Jobs["my_job"].JobClusters[0].NewCluster.SparkConf["spark.speculation"])
for _, cluster := range b.Config.Resources.Jobs["my_job"].JobClusters {
require.Equalf(t, "14.2.x-scala2.11", cluster.NewCluster.SparkVersion, "cluster: %v", cluster.JobClusterKey)
require.Equalf(t, "Standard_DS3_v2", cluster.NewCluster.NodeTypeId, "cluster: %v", cluster.JobClusterKey)
require.Equalf(t, 4, cluster.NewCluster.NumWorkers, "cluster: %v", cluster.JobClusterKey)
require.Equalf(t, "false", cluster.NewCluster.SparkConf["spark.speculation"], "cluster: %v", cluster.JobClusterKey)
}
}

View File

@@ -5,13 +5,48 @@ resources:
jobs:
my_job:
job_clusters:
- job_cluster_key: key
new_cluster: ${var.cluster}
- job_cluster_key: key1
new_cluster: ${var.cluster1}
- job_cluster_key: key2
new_cluster: ${var.cluster2}
- job_cluster_key: key3
new_cluster: ${var.cluster3}
- job_cluster_key: key4
new_cluster: ${var.cluster4}
variables:
cluster:
cluster1:
type: complex
description: "A cluster definition"
cluster2:
type: complex
description: "A cluster definition"
cluster3:
type: complex
description: "A cluster definition"
cluster4:
type: complex
description: "A cluster definition"
include:
- ./variables/*.yml
targets:
default:
dev:
variables:
cluster3:
spark_version: "14.2.x-scala2.11"
node_type_id: "Standard_DS3_v2"
num_workers: 4
spark_conf:
spark.speculation: false
spark.databricks.delta.retentionDurationCheck.enabled: false
cluster4:
default:
spark_version: "14.2.x-scala2.11"
node_type_id: "Standard_DS3_v2"
num_workers: 4
spark_conf:
spark.speculation: false
spark.databricks.delta.retentionDurationCheck.enabled: false

View File

@@ -2,7 +2,15 @@ targets:
default:
dev:
variables:
cluster:
cluster1:
spark_version: "14.2.x-scala2.11"
node_type_id: "Standard_DS3_v2"
num_workers: 4
spark_conf:
spark.speculation: false
spark.databricks.delta.retentionDurationCheck.enabled: false
cluster2:
default:
spark_version: "14.2.x-scala2.11"
node_type_id: "Standard_DS3_v2"
num_workers: 4

View File

@@ -70,7 +70,7 @@ type visitOptions struct {
func visit(v Value, prefix Path, suffix Pattern, opts visitOptions) (Value, error) {
if len(suffix) == 0 {
return opts.fn(prefix, v)
return opts.fn(slices.Clone(prefix), v)
}
// Initialize prefix if it is empty.
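
The `slices.Clone` above is the heart of the fix; a quick standalone illustration (standard library only) of why the clone decouples the callback's copy from the walker's buffer:

```
package main

import (
	"fmt"
	"slices"
)

func main() {
	p := make([]string, 0, 4)
	p = append(p, "resources")

	alias := append(p, "jobs")   // shares p's backing array
	clone := slices.Clone(alias) // independent backing array

	_ = append(p, "pipelines") // overwrites index 1 of the shared array

	fmt.Println(alias) // [resources pipelines] - mutated through the alias
	fmt.Println(clone) // [resources jobs]      - unaffected
}
```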

View File

@@ -21,7 +21,7 @@ func Foreach(fn MapFunc) MapFunc {
for _, pair := range m.Pairs() {
pk := pair.Key
pv := pair.Value
nv, err := fn(append(p, Key(pk.MustString())), pv)
nv, err := fn(p.Append(Key(pk.MustString())), pv)
if err != nil {
return InvalidValue, err
}
@@ -32,7 +32,7 @@ func Foreach(fn MapFunc) MapFunc {
s := slices.Clone(v.MustSequence())
for i, value := range s {
var err error
s[i], err = fn(append(p, Index(i)), value)
s[i], err = fn(p.Append(Index(i)), value)
if err != nil {
return InvalidValue, err
}

libs/dyn/visit_test.go (new file, 36 lines added)
View File

@@ -0,0 +1,36 @@
package dyn_test
import (
"testing"
"github.com/databricks/cli/libs/dyn"
assert "github.com/databricks/cli/libs/dyn/dynassert"
)
func TestVisitCallbackPathCopy(t *testing.T) {
vin := dyn.V(map[string]dyn.Value{
"foo": dyn.V(42),
"bar": dyn.V(43),
})
var paths []dyn.Path
// The callback should receive a copy of the path.
// If the same underlying value is used, all collected paths will be the same.
// This test uses `MapByPattern` to collect all paths in the map.
// `visit` itself isn't exported; we exclusively use black-box testing for this package.
_, _ = dyn.MapByPattern(vin, dyn.NewPattern(dyn.AnyKey()), func(p dyn.Path, v dyn.Value) (dyn.Value, error) {
paths = append(paths, p)
return v, nil
})
// Verify that the paths retained their original values.
var strings []string
for _, p := range paths {
strings = append(strings, p.String())
}
assert.ElementsMatch(t, strings, []string{
"foo",
"bar",
})
}