Compare commits


4 Commits

Author SHA1 Message Date
Pieter Noordhuis 62bc59a3a6
Fail filer integration test if error is nil (#1967)
## Changes

I found a race where this error can be nil; the subsequent assertion then
panics when it calls `err.Error()` on the nil error. This change makes the
test robust against this by failing immediately if the error is different
from the one we expect.
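
For context, this is the standard testify distinction: `assert` records a failure and lets the test continue, while `require` stops the test via `t.FailNow()`. A minimal self-contained sketch of the pattern (hypothetical test, not the actual integration test):

```go
package example

import (
	"errors"
	"testing"

	"github.com/stretchr/testify/assert"
	"github.com/stretchr/testify/require"
)

var errExist = errors.New("file already exists")

// write stands in for the racy operation; it sometimes returns nil.
func write() error { return nil }

func TestFailsFast(t *testing.T) {
	err := write()
	// assert.ErrorIs would record the mismatch but keep going, so the
	// err.Error() call below would panic on the nil error.
	// require.ErrorIs stops the test right here instead.
	require.ErrorIs(t, err, errExist)
	assert.Contains(t, err.Error(), "already exists")
}
```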

## Tests

n/a
2024-12-05 18:20:46 +00:00
Pieter Noordhuis 6e754d4f34
Rewrite 'interface{} -> any' (#1959)
## Changes

The `any` alias for `interface{}` has been around since Go 1.18.

Now that we're using golangci-lint (#1953), we can lint on it.

Existing commits can be updated with:
```
gofmt -w -r 'interface{} -> any' .
```
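
For illustration, the rewrite rule mechanically replaces every occurrence of the type literal:

```go
package example

import "encoding/json"

// Before the rewrite:
func dumpOld(vals map[string]interface{}) ([]byte, error) { return json.Marshal(vals) }

// After running `gofmt -w -r 'interface{} -> any' .`:
func dumpNew(vals map[string]any) ([]byte, error) { return json.Marshal(vals) }
```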

## Tests

n/a
2024-12-05 15:37:24 +00:00
Pieter Noordhuis 7ffe93e4d0
[Release] Release v0.236.0 (#1966)
**New features for Databricks Asset Bundles:**

This release adds support for managing Unity Catalog volumes as part of
your bundle configuration.
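
For readers new to the feature, a Unity Catalog volume is declared under `resources.volumes` in the bundle configuration. A minimal illustrative sketch (field names per the DABs volume schema; see #1762 and #1952 for the authoritative definition):

```yaml
resources:
  volumes:
    my_volume:
      catalog_name: main
      schema_name: default
      name: landing_zone
      # volume_type is optional; it now defaults to MANAGED (#1952)
```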

Bundles:
* Add DABs support for Unity Catalog volumes
([#1762](https://github.com/databricks/cli/pull/1762)).
* Support lookup by name of notification destinations
([#1922](https://github.com/databricks/cli/pull/1922)).
* Extend "notebook not found" error to warn about missing extension
([#1920](https://github.com/databricks/cli/pull/1920)).
* Skip sync warning if no sync paths are defined
([#1926](https://github.com/databricks/cli/pull/1926)).
* Add validation for single node clusters
([#1909](https://github.com/databricks/cli/pull/1909)).
* Fix segfault in bundle summary command
([#1937](https://github.com/databricks/cli/pull/1937)).
* Add the `bundle_uuid` helper function for templates
([#1947](https://github.com/databricks/cli/pull/1947)).
* Add default value for `volume_type` for DABs
([#1952](https://github.com/databricks/cli/pull/1952)).
* Properly read Git metadata when running inside workspace
([#1945](https://github.com/databricks/cli/pull/1945)).
* Upgrade TF provider to 1.59.0
([#1960](https://github.com/databricks/cli/pull/1960)).

Internal:
* Breakout variable lookup into separate files and tests
([#1921](https://github.com/databricks/cli/pull/1921)).
* Add golangci-lint v1.62.2
([#1953](https://github.com/databricks/cli/pull/1953)).

Dependency updates:
* Bump golang.org/x/term from 0.25.0 to 0.26.0
([#1907](https://github.com/databricks/cli/pull/1907)).
* Bump github.com/Masterminds/semver/v3 from 3.3.0 to 3.3.1
([#1930](https://github.com/databricks/cli/pull/1930)).
* Bump github.com/stretchr/testify from 1.9.0 to 1.10.0
([#1932](https://github.com/databricks/cli/pull/1932)).
* Bump github.com/databricks/databricks-sdk-go from 0.51.0 to 0.52.0
([#1931](https://github.com/databricks/cli/pull/1931)).
2024-12-05 14:39:26 +00:00
Pieter Noordhuis 647b09e6e2
Upgrade TF provider to 1.59.0 (#1960)
## Changes

Notable changes:
* Fixes dashboard deployment if it was trashed out-of-band.
* Removes client-side validation for single-node cluster configuration
(also see #1546).

Beware: for the same reason as in #1900, this excludes the changes to the
quality monitor resource.

## Tests

Integration tests pass.
2024-12-05 12:09:45 +01:00
29 changed files with 419 additions and 86 deletions

View File

@@ -15,5 +15,7 @@ linters-settings:
   rewrite-rules:
     - pattern: 'a[b:len(a)]'
       replacement: 'a[b:]'
+    - pattern: 'interface{}'
+      replacement: 'any'
 issues:
   exclude-dirs-use-default: false # recommended by docs https://golangci-lint.run/usage/false-positives/

View File

@@ -1,5 +1,32 @@
 # Version changelog

+## [Release] Release v0.236.0
+
+**New features for Databricks Asset Bundles:**
+
+This release adds support for managing Unity Catalog volumes as part of your bundle configuration.
+
+Bundles:
+* Add DABs support for Unity Catalog volumes ([#1762](https://github.com/databricks/cli/pull/1762)).
+* Support lookup by name of notification destinations ([#1922](https://github.com/databricks/cli/pull/1922)).
+* Extend "notebook not found" error to warn about missing extension ([#1920](https://github.com/databricks/cli/pull/1920)).
+* Skip sync warning if no sync paths are defined ([#1926](https://github.com/databricks/cli/pull/1926)).
+* Add validation for single node clusters ([#1909](https://github.com/databricks/cli/pull/1909)).
+* Fix segfault in bundle summary command ([#1937](https://github.com/databricks/cli/pull/1937)).
+* Add the `bundle_uuid` helper function for templates ([#1947](https://github.com/databricks/cli/pull/1947)).
+* Add default value for `volume_type` for DABs ([#1952](https://github.com/databricks/cli/pull/1952)).
+* Properly read Git metadata when running inside workspace ([#1945](https://github.com/databricks/cli/pull/1945)).
+* Upgrade TF provider to 1.59.0 ([#1960](https://github.com/databricks/cli/pull/1960)).
+
+Internal:
+* Breakout variable lookup into separate files and tests ([#1921](https://github.com/databricks/cli/pull/1921)).
+* Add golangci-lint v1.62.2 ([#1953](https://github.com/databricks/cli/pull/1953)).
+
+Dependency updates:
+* Bump golang.org/x/term from 0.25.0 to 0.26.0 ([#1907](https://github.com/databricks/cli/pull/1907)).
+* Bump github.com/Masterminds/semver/v3 from 3.3.0 to 3.3.1 ([#1930](https://github.com/databricks/cli/pull/1930)).
+* Bump github.com/stretchr/testify from 1.9.0 to 1.10.0 ([#1932](https://github.com/databricks/cli/pull/1932)).
+* Bump github.com/databricks/databricks-sdk-go from 0.51.0 to 0.52.0 ([#1931](https://github.com/databricks/cli/pull/1931)).
+
 ## [Release] Release v0.235.0

 **Note:** the `bundle generate` command now uses the `.<resource-type>.yml`

View File

@@ -133,7 +133,7 @@ func TestRootMergeTargetOverridesWithVariables(t *testing.T) {
 			"complex": {
 				Type:        variable.VariableTypeComplex,
 				Description: "complex var",
-				Default: map[string]interface{}{
+				Default: map[string]any{
 					"key": "value",
 				},
 			},
@@ -148,7 +148,7 @@ func TestRootMergeTargetOverridesWithVariables(t *testing.T) {
 			"complex": {
 				Type:        "wrong",
 				Description: "wrong",
-				Default: map[string]interface{}{
+				Default: map[string]any{
 					"key1": "value1",
 				},
 			},
@@ -164,7 +164,7 @@ func TestRootMergeTargetOverridesWithVariables(t *testing.T) {
 	assert.Equal(t, "foo2", root.Variables["foo2"].Default)
 	assert.Equal(t, "foo2 var", root.Variables["foo2"].Description)
-	assert.Equal(t, map[string]interface{}{
+	assert.Equal(t, map[string]any{
 		"key1": "value1",
 	}, root.Variables["complex"].Default)
 	assert.Equal(t, "complex var", root.Variables["complex"].Description)

View File

@@ -15,10 +15,10 @@ import (
 )

 func (s *Schema) writeTerraformBlock(_ context.Context) error {
-	var body = map[string]interface{}{
-		"terraform": map[string]interface{}{
-			"required_providers": map[string]interface{}{
-				"databricks": map[string]interface{}{
+	var body = map[string]any{
+		"terraform": map[string]any{
+			"required_providers": map[string]any{
+				"databricks": map[string]any{
 					"source":  "databricks/databricks",
 					"version": ProviderVersion,
 				},
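
Marshaled to JSON, this body yields the provider requirement block that sits at the root of the generated Terraform configuration, roughly:

```json
{
  "terraform": {
    "required_providers": {
      "databricks": {
        "source": "databricks/databricks",
        "version": "1.59.0"
      }
    }
  }
}
```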

View File

@@ -1,3 +1,3 @@
 package schema

-const ProviderVersion = "1.58.0"
+const ProviderVersion = "1.59.0"

View File

@@ -3,6 +3,7 @@
 package schema

 type DataSourceAwsAssumeRolePolicy struct {
+	AwsPartition string `json:"aws_partition,omitempty"`
 	DatabricksAccountId string `json:"databricks_account_id,omitempty"`
 	ExternalId string `json:"external_id"`
 	ForLogDelivery bool `json:"for_log_delivery,omitempty"`

View File

@@ -3,6 +3,7 @@
 package schema

 type DataSourceAwsBucketPolicy struct {
+	AwsPartition string `json:"aws_partition,omitempty"`
 	Bucket string `json:"bucket"`
 	DatabricksAccountId string `json:"databricks_account_id,omitempty"`
 	DatabricksE2AccountId string `json:"databricks_e2_account_id,omitempty"`

View File

@@ -4,6 +4,7 @@ package schema

 type DataSourceAwsCrossaccountPolicy struct {
 	AwsAccountId string `json:"aws_account_id,omitempty"`
+	AwsPartition string `json:"aws_partition,omitempty"`
 	Id string `json:"id,omitempty"`
 	Json string `json:"json,omitempty"`
 	PassRoles []string `json:"pass_roles,omitempty"`

View File

@@ -4,6 +4,7 @@ package schema

 type DataSourceAwsUnityCatalogAssumeRolePolicy struct {
 	AwsAccountId string `json:"aws_account_id"`
+	AwsPartition string `json:"aws_partition,omitempty"`
 	ExternalId string `json:"external_id"`
 	Id string `json:"id,omitempty"`
 	Json string `json:"json,omitempty"`

View File

@@ -4,6 +4,7 @@ package schema

 type DataSourceAwsUnityCatalogPolicy struct {
 	AwsAccountId string `json:"aws_account_id"`
+	AwsPartition string `json:"aws_partition,omitempty"`
 	BucketName string `json:"bucket_name"`
 	Id string `json:"id,omitempty"`
 	Json string `json:"json,omitempty"`

View File

@@ -0,0 +1,51 @@
// Generated from Databricks Terraform provider schema. DO NOT EDIT.
package schema
type DataSourceMwsNetworkConnectivityConfigEgressConfigDefaultRulesAwsStableIpRule struct {
CidrBlocks []string `json:"cidr_blocks,omitempty"`
}
type DataSourceMwsNetworkConnectivityConfigEgressConfigDefaultRulesAzureServiceEndpointRule struct {
Subnets []string `json:"subnets,omitempty"`
TargetRegion string `json:"target_region,omitempty"`
TargetServices []string `json:"target_services,omitempty"`
}
type DataSourceMwsNetworkConnectivityConfigEgressConfigDefaultRules struct {
AwsStableIpRule *DataSourceMwsNetworkConnectivityConfigEgressConfigDefaultRulesAwsStableIpRule `json:"aws_stable_ip_rule,omitempty"`
AzureServiceEndpointRule *DataSourceMwsNetworkConnectivityConfigEgressConfigDefaultRulesAzureServiceEndpointRule `json:"azure_service_endpoint_rule,omitempty"`
}
type DataSourceMwsNetworkConnectivityConfigEgressConfigTargetRulesAzurePrivateEndpointRules struct {
ConnectionState string `json:"connection_state,omitempty"`
CreationTime int `json:"creation_time,omitempty"`
Deactivated bool `json:"deactivated,omitempty"`
DeactivatedAt int `json:"deactivated_at,omitempty"`
EndpointName string `json:"endpoint_name,omitempty"`
GroupId string `json:"group_id,omitempty"`
NetworkConnectivityConfigId string `json:"network_connectivity_config_id,omitempty"`
ResourceId string `json:"resource_id,omitempty"`
RuleId string `json:"rule_id,omitempty"`
UpdatedTime int `json:"updated_time,omitempty"`
}
type DataSourceMwsNetworkConnectivityConfigEgressConfigTargetRules struct {
AzurePrivateEndpointRules []DataSourceMwsNetworkConnectivityConfigEgressConfigTargetRulesAzurePrivateEndpointRules `json:"azure_private_endpoint_rules,omitempty"`
}
type DataSourceMwsNetworkConnectivityConfigEgressConfig struct {
DefaultRules *DataSourceMwsNetworkConnectivityConfigEgressConfigDefaultRules `json:"default_rules,omitempty"`
TargetRules *DataSourceMwsNetworkConnectivityConfigEgressConfigTargetRules `json:"target_rules,omitempty"`
}
type DataSourceMwsNetworkConnectivityConfig struct {
AccountId string `json:"account_id,omitempty"`
CreationTime int `json:"creation_time,omitempty"`
Id string `json:"id,omitempty"`
Name string `json:"name"`
NetworkConnectivityConfigId string `json:"network_connectivity_config_id,omitempty"`
Region string `json:"region,omitempty"`
UpdatedTime int `json:"updated_time,omitempty"`
EgressConfig *DataSourceMwsNetworkConnectivityConfigEgressConfig `json:"egress_config,omitempty"`
}

View File

@@ -0,0 +1,9 @@
// Generated from Databricks Terraform provider schema. DO NOT EDIT.
package schema
type DataSourceMwsNetworkConnectivityConfigs struct {
Id string `json:"id,omitempty"`
Names []string `json:"names,omitempty"`
Region string `json:"region,omitempty"`
}

View File

@@ -0,0 +1,52 @@
// Generated from Databricks Terraform provider schema. DO NOT EDIT.
package schema
type DataSourceRegisteredModelVersionsModelVersionsAliases struct {
AliasName string `json:"alias_name,omitempty"`
VersionNum int `json:"version_num,omitempty"`
}
type DataSourceRegisteredModelVersionsModelVersionsModelVersionDependenciesDependenciesFunction struct {
FunctionFullName string `json:"function_full_name"`
}
type DataSourceRegisteredModelVersionsModelVersionsModelVersionDependenciesDependenciesTable struct {
TableFullName string `json:"table_full_name"`
}
type DataSourceRegisteredModelVersionsModelVersionsModelVersionDependenciesDependencies struct {
Function []DataSourceRegisteredModelVersionsModelVersionsModelVersionDependenciesDependenciesFunction `json:"function,omitempty"`
Table []DataSourceRegisteredModelVersionsModelVersionsModelVersionDependenciesDependenciesTable `json:"table,omitempty"`
}
type DataSourceRegisteredModelVersionsModelVersionsModelVersionDependencies struct {
Dependencies []DataSourceRegisteredModelVersionsModelVersionsModelVersionDependenciesDependencies `json:"dependencies,omitempty"`
}
type DataSourceRegisteredModelVersionsModelVersions struct {
BrowseOnly bool `json:"browse_only,omitempty"`
CatalogName string `json:"catalog_name,omitempty"`
Comment string `json:"comment,omitempty"`
CreatedAt int `json:"created_at,omitempty"`
CreatedBy string `json:"created_by,omitempty"`
Id string `json:"id,omitempty"`
MetastoreId string `json:"metastore_id,omitempty"`
ModelName string `json:"model_name,omitempty"`
RunId string `json:"run_id,omitempty"`
RunWorkspaceId int `json:"run_workspace_id,omitempty"`
SchemaName string `json:"schema_name,omitempty"`
Source string `json:"source,omitempty"`
Status string `json:"status,omitempty"`
StorageLocation string `json:"storage_location,omitempty"`
UpdatedAt int `json:"updated_at,omitempty"`
UpdatedBy string `json:"updated_by,omitempty"`
Version int `json:"version,omitempty"`
Aliases []DataSourceRegisteredModelVersionsModelVersionsAliases `json:"aliases,omitempty"`
ModelVersionDependencies []DataSourceRegisteredModelVersionsModelVersionsModelVersionDependencies `json:"model_version_dependencies,omitempty"`
}
type DataSourceRegisteredModelVersions struct {
FullName string `json:"full_name"`
ModelVersions []DataSourceRegisteredModelVersionsModelVersions `json:"model_versions,omitempty"`
}

View File

@@ -0,0 +1,178 @@
// Generated from Databricks Terraform provider schema. DO NOT EDIT.
package schema
type DataSourceServingEndpointsEndpointsAiGatewayGuardrailsInputPii struct {
Behavior string `json:"behavior"`
}
type DataSourceServingEndpointsEndpointsAiGatewayGuardrailsInput struct {
InvalidKeywords []string `json:"invalid_keywords,omitempty"`
Safety bool `json:"safety,omitempty"`
ValidTopics []string `json:"valid_topics,omitempty"`
Pii []DataSourceServingEndpointsEndpointsAiGatewayGuardrailsInputPii `json:"pii,omitempty"`
}
type DataSourceServingEndpointsEndpointsAiGatewayGuardrailsOutputPii struct {
Behavior string `json:"behavior"`
}
type DataSourceServingEndpointsEndpointsAiGatewayGuardrailsOutput struct {
InvalidKeywords []string `json:"invalid_keywords,omitempty"`
Safety bool `json:"safety,omitempty"`
ValidTopics []string `json:"valid_topics,omitempty"`
Pii []DataSourceServingEndpointsEndpointsAiGatewayGuardrailsOutputPii `json:"pii,omitempty"`
}
type DataSourceServingEndpointsEndpointsAiGatewayGuardrails struct {
Input []DataSourceServingEndpointsEndpointsAiGatewayGuardrailsInput `json:"input,omitempty"`
Output []DataSourceServingEndpointsEndpointsAiGatewayGuardrailsOutput `json:"output,omitempty"`
}
type DataSourceServingEndpointsEndpointsAiGatewayInferenceTableConfig struct {
CatalogName string `json:"catalog_name,omitempty"`
Enabled bool `json:"enabled,omitempty"`
SchemaName string `json:"schema_name,omitempty"`
TableNamePrefix string `json:"table_name_prefix,omitempty"`
}
type DataSourceServingEndpointsEndpointsAiGatewayRateLimits struct {
Calls int `json:"calls"`
Key string `json:"key,omitempty"`
RenewalPeriod string `json:"renewal_period"`
}
type DataSourceServingEndpointsEndpointsAiGatewayUsageTrackingConfig struct {
Enabled bool `json:"enabled,omitempty"`
}
type DataSourceServingEndpointsEndpointsAiGateway struct {
Guardrails []DataSourceServingEndpointsEndpointsAiGatewayGuardrails `json:"guardrails,omitempty"`
InferenceTableConfig []DataSourceServingEndpointsEndpointsAiGatewayInferenceTableConfig `json:"inference_table_config,omitempty"`
RateLimits []DataSourceServingEndpointsEndpointsAiGatewayRateLimits `json:"rate_limits,omitempty"`
UsageTrackingConfig []DataSourceServingEndpointsEndpointsAiGatewayUsageTrackingConfig `json:"usage_tracking_config,omitempty"`
}
type DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelAi21LabsConfig struct {
Ai21LabsApiKey string `json:"ai21labs_api_key,omitempty"`
Ai21LabsApiKeyPlaintext string `json:"ai21labs_api_key_plaintext,omitempty"`
}
type DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelAmazonBedrockConfig struct {
AwsAccessKeyId string `json:"aws_access_key_id,omitempty"`
AwsAccessKeyIdPlaintext string `json:"aws_access_key_id_plaintext,omitempty"`
AwsRegion string `json:"aws_region"`
AwsSecretAccessKey string `json:"aws_secret_access_key,omitempty"`
AwsSecretAccessKeyPlaintext string `json:"aws_secret_access_key_plaintext,omitempty"`
BedrockProvider string `json:"bedrock_provider"`
}
type DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelAnthropicConfig struct {
AnthropicApiKey string `json:"anthropic_api_key,omitempty"`
AnthropicApiKeyPlaintext string `json:"anthropic_api_key_plaintext,omitempty"`
}
type DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelCohereConfig struct {
CohereApiBase string `json:"cohere_api_base,omitempty"`
CohereApiKey string `json:"cohere_api_key,omitempty"`
CohereApiKeyPlaintext string `json:"cohere_api_key_plaintext,omitempty"`
}
type DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelDatabricksModelServingConfig struct {
DatabricksApiToken string `json:"databricks_api_token,omitempty"`
DatabricksApiTokenPlaintext string `json:"databricks_api_token_plaintext,omitempty"`
DatabricksWorkspaceUrl string `json:"databricks_workspace_url"`
}
type DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelGoogleCloudVertexAiConfig struct {
PrivateKey string `json:"private_key,omitempty"`
PrivateKeyPlaintext string `json:"private_key_plaintext,omitempty"`
ProjectId string `json:"project_id,omitempty"`
Region string `json:"region,omitempty"`
}
type DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelOpenaiConfig struct {
MicrosoftEntraClientId string `json:"microsoft_entra_client_id,omitempty"`
MicrosoftEntraClientSecret string `json:"microsoft_entra_client_secret,omitempty"`
MicrosoftEntraClientSecretPlaintext string `json:"microsoft_entra_client_secret_plaintext,omitempty"`
MicrosoftEntraTenantId string `json:"microsoft_entra_tenant_id,omitempty"`
OpenaiApiBase string `json:"openai_api_base,omitempty"`
OpenaiApiKey string `json:"openai_api_key,omitempty"`
OpenaiApiKeyPlaintext string `json:"openai_api_key_plaintext,omitempty"`
OpenaiApiType string `json:"openai_api_type,omitempty"`
OpenaiApiVersion string `json:"openai_api_version,omitempty"`
OpenaiDeploymentName string `json:"openai_deployment_name,omitempty"`
OpenaiOrganization string `json:"openai_organization,omitempty"`
}
type DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelPalmConfig struct {
PalmApiKey string `json:"palm_api_key,omitempty"`
PalmApiKeyPlaintext string `json:"palm_api_key_plaintext,omitempty"`
}
type DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModel struct {
Name string `json:"name"`
Provider string `json:"provider"`
Task string `json:"task"`
Ai21LabsConfig []DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelAi21LabsConfig `json:"ai21labs_config,omitempty"`
AmazonBedrockConfig []DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelAmazonBedrockConfig `json:"amazon_bedrock_config,omitempty"`
AnthropicConfig []DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelAnthropicConfig `json:"anthropic_config,omitempty"`
CohereConfig []DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelCohereConfig `json:"cohere_config,omitempty"`
DatabricksModelServingConfig []DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelDatabricksModelServingConfig `json:"databricks_model_serving_config,omitempty"`
GoogleCloudVertexAiConfig []DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelGoogleCloudVertexAiConfig `json:"google_cloud_vertex_ai_config,omitempty"`
OpenaiConfig []DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelOpenaiConfig `json:"openai_config,omitempty"`
PalmConfig []DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModelPalmConfig `json:"palm_config,omitempty"`
}
type DataSourceServingEndpointsEndpointsConfigServedEntitiesFoundationModel struct {
Description string `json:"description,omitempty"`
DisplayName string `json:"display_name,omitempty"`
Docs string `json:"docs,omitempty"`
Name string `json:"name,omitempty"`
}
type DataSourceServingEndpointsEndpointsConfigServedEntities struct {
EntityName string `json:"entity_name,omitempty"`
EntityVersion string `json:"entity_version,omitempty"`
Name string `json:"name,omitempty"`
ExternalModel []DataSourceServingEndpointsEndpointsConfigServedEntitiesExternalModel `json:"external_model,omitempty"`
FoundationModel []DataSourceServingEndpointsEndpointsConfigServedEntitiesFoundationModel `json:"foundation_model,omitempty"`
}
type DataSourceServingEndpointsEndpointsConfigServedModels struct {
ModelName string `json:"model_name,omitempty"`
ModelVersion string `json:"model_version,omitempty"`
Name string `json:"name,omitempty"`
}
type DataSourceServingEndpointsEndpointsConfig struct {
ServedEntities []DataSourceServingEndpointsEndpointsConfigServedEntities `json:"served_entities,omitempty"`
ServedModels []DataSourceServingEndpointsEndpointsConfigServedModels `json:"served_models,omitempty"`
}
type DataSourceServingEndpointsEndpointsState struct {
ConfigUpdate string `json:"config_update,omitempty"`
Ready string `json:"ready,omitempty"`
}
type DataSourceServingEndpointsEndpointsTags struct {
Key string `json:"key"`
Value string `json:"value,omitempty"`
}
type DataSourceServingEndpointsEndpoints struct {
CreationTimestamp int `json:"creation_timestamp,omitempty"`
Creator string `json:"creator,omitempty"`
Id string `json:"id,omitempty"`
LastUpdatedTimestamp int `json:"last_updated_timestamp,omitempty"`
Name string `json:"name,omitempty"`
Task string `json:"task,omitempty"`
AiGateway []DataSourceServingEndpointsEndpointsAiGateway `json:"ai_gateway,omitempty"`
Config []DataSourceServingEndpointsEndpointsConfig `json:"config,omitempty"`
State []DataSourceServingEndpointsEndpointsState `json:"state,omitempty"`
Tags []DataSourceServingEndpointsEndpointsTags `json:"tags,omitempty"`
}
type DataSourceServingEndpoints struct {
Endpoints []DataSourceServingEndpointsEndpoints `json:"endpoints,omitempty"`
}

View File

@@ -33,6 +33,8 @@ type DataSources struct {
 	MlflowModel map[string]any `json:"databricks_mlflow_model,omitempty"`
 	MlflowModels map[string]any `json:"databricks_mlflow_models,omitempty"`
 	MwsCredentials map[string]any `json:"databricks_mws_credentials,omitempty"`
+	MwsNetworkConnectivityConfig map[string]any `json:"databricks_mws_network_connectivity_config,omitempty"`
+	MwsNetworkConnectivityConfigs map[string]any `json:"databricks_mws_network_connectivity_configs,omitempty"`
 	MwsWorkspaces map[string]any `json:"databricks_mws_workspaces,omitempty"`
 	NodeType map[string]any `json:"databricks_node_type,omitempty"`
 	Notebook map[string]any `json:"databricks_notebook,omitempty"`
@@ -40,10 +42,12 @@ type DataSources struct {
 	NotificationDestinations map[string]any `json:"databricks_notification_destinations,omitempty"`
 	Pipelines map[string]any `json:"databricks_pipelines,omitempty"`
 	RegisteredModel map[string]any `json:"databricks_registered_model,omitempty"`
+	RegisteredModelVersions map[string]any `json:"databricks_registered_model_versions,omitempty"`
 	Schema map[string]any `json:"databricks_schema,omitempty"`
 	Schemas map[string]any `json:"databricks_schemas,omitempty"`
 	ServicePrincipal map[string]any `json:"databricks_service_principal,omitempty"`
 	ServicePrincipals map[string]any `json:"databricks_service_principals,omitempty"`
+	ServingEndpoints map[string]any `json:"databricks_serving_endpoints,omitempty"`
 	Share map[string]any `json:"databricks_share,omitempty"`
 	Shares map[string]any `json:"databricks_shares,omitempty"`
 	SparkVersion map[string]any `json:"databricks_spark_version,omitempty"`
@@ -92,6 +96,8 @@ func NewDataSources() *DataSources {
 		MlflowModel: make(map[string]any),
 		MlflowModels: make(map[string]any),
 		MwsCredentials: make(map[string]any),
+		MwsNetworkConnectivityConfig: make(map[string]any),
+		MwsNetworkConnectivityConfigs: make(map[string]any),
 		MwsWorkspaces: make(map[string]any),
 		NodeType: make(map[string]any),
 		Notebook: make(map[string]any),
@@ -99,10 +105,12 @@ func NewDataSources() *DataSources {
 		NotificationDestinations: make(map[string]any),
 		Pipelines: make(map[string]any),
 		RegisteredModel: make(map[string]any),
+		RegisteredModelVersions: make(map[string]any),
 		Schema: make(map[string]any),
 		Schemas: make(map[string]any),
 		ServicePrincipal: make(map[string]any),
 		ServicePrincipals: make(map[string]any),
+		ServingEndpoints: make(map[string]any),
 		Share: make(map[string]any),
 		Shares: make(map[string]any),
 		SparkVersion: make(map[string]any),

View File

@@ -10,29 +10,30 @@ type ResourcePermissionsAccessControl struct {
 }

 type ResourcePermissions struct {
 	Authorization string `json:"authorization,omitempty"`
 	ClusterId string `json:"cluster_id,omitempty"`
 	ClusterPolicyId string `json:"cluster_policy_id,omitempty"`
 	DashboardId string `json:"dashboard_id,omitempty"`
 	DirectoryId string `json:"directory_id,omitempty"`
 	DirectoryPath string `json:"directory_path,omitempty"`
 	ExperimentId string `json:"experiment_id,omitempty"`
 	Id string `json:"id,omitempty"`
 	InstancePoolId string `json:"instance_pool_id,omitempty"`
 	JobId string `json:"job_id,omitempty"`
 	NotebookId string `json:"notebook_id,omitempty"`
 	NotebookPath string `json:"notebook_path,omitempty"`
 	ObjectType string `json:"object_type,omitempty"`
 	PipelineId string `json:"pipeline_id,omitempty"`
 	RegisteredModelId string `json:"registered_model_id,omitempty"`
 	RepoId string `json:"repo_id,omitempty"`
 	RepoPath string `json:"repo_path,omitempty"`
 	ServingEndpointId string `json:"serving_endpoint_id,omitempty"`
 	SqlAlertId string `json:"sql_alert_id,omitempty"`
 	SqlDashboardId string `json:"sql_dashboard_id,omitempty"`
 	SqlEndpointId string `json:"sql_endpoint_id,omitempty"`
 	SqlQueryId string `json:"sql_query_id,omitempty"`
+	VectorSearchEndpointId string `json:"vector_search_endpoint_id,omitempty"`
 	WorkspaceFileId string `json:"workspace_file_id,omitempty"`
 	WorkspaceFilePath string `json:"workspace_file_path,omitempty"`
 	AccessControl []ResourcePermissionsAccessControl `json:"access_control,omitempty"`
 }

View File

@@ -21,13 +21,13 @@ type Root struct {
 const ProviderHost = "registry.terraform.io"
 const ProviderSource = "databricks/databricks"
-const ProviderVersion = "1.58.0"
+const ProviderVersion = "1.59.0"

 func NewRoot() *Root {
 	return &Root{
-		Terraform: map[string]interface{}{
-			"required_providers": map[string]interface{}{
-				"databricks": map[string]interface{}{
+		Terraform: map[string]any{
+			"required_providers": map[string]any{
+				"databricks": map[string]any{
 					"source":  ProviderSource,
 					"version": ProviderVersion,
 				},

View File

@@ -23,10 +23,10 @@ var renderFuncMap = template.FuncMap{
 	"yellow":  color.YellowString,
 	"magenta": color.MagentaString,
 	"cyan":    color.CyanString,
-	"bold": func(format string, a ...interface{}) string {
+	"bold": func(format string, a ...any) string {
 		return color.New(color.Bold).Sprintf(format, a...)
 	},
-	"italic": func(format string, a ...interface{}) string {
+	"italic": func(format string, a ...any) string {
 		return color.New(color.Italic).Sprintf(format, a...)
 	},
 }

View File

@@ -15,7 +15,7 @@ type LogsOutput struct {
 	LogsTruncated bool `json:"logs_truncated"`
 }

-func structToString(val interface{}) (string, error) {
+func structToString(val any) (string, error) {
 	b, err := json.MarshalIndent(val, "", " ")
 	if err != nil {
 		return "", err

View File

@@ -104,5 +104,5 @@ func TestComplexVariablesOverrideWithFullSyntax(t *testing.T) {
 	require.Empty(t, diags)

 	complexvar := b.Config.Variables["complexvar"].Value
-	require.Equal(t, map[string]interface{}{"key1": "1", "key2": "2", "key3": "3"}, complexvar)
+	require.Equal(t, map[string]any{"key1": "1", "key2": "2", "key3": "3"}, complexvar)
 }

View File

@@ -457,7 +457,7 @@ func TestAccFilerWorkspaceNotebook(t *testing.T) {
 			// Assert uploading a second time fails due to overwrite mode missing
 			err = f.Write(ctx, tc.name, strings.NewReader(tc.content2))
-			assert.ErrorIs(t, err, fs.ErrExist)
+			require.ErrorIs(t, err, fs.ErrExist)
 			assert.Regexp(t, regexp.MustCompile(`file already exists: .*/`+tc.nameWithoutExt+`$`), err.Error())

 			// Try uploading the notebook again with overwrite flag. This time it should succeed.

View File

@@ -276,7 +276,7 @@ func (t *cobraTestRunner) Run() (bytes.Buffer, bytes.Buffer, error) {
 }

 // Like [require.Eventually] but errors if the underlying command has failed.
-func (c *cobraTestRunner) Eventually(condition func() bool, waitFor time.Duration, tick time.Duration, msgAndArgs ...interface{}) {
+func (c *cobraTestRunner) Eventually(condition func() bool, waitFor time.Duration, tick time.Duration, msgAndArgs ...any) {
 	ch := make(chan bool, 1)
 	timer := time.NewTimer(waitFor)
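
The diff cuts off before the polling loop. For context, a self-contained sketch of how such a helper is commonly structured; the error channel and all names here are assumptions for illustration, not the CLI's actual implementation:

```go
package example

import (
	"testing"
	"time"
)

// eventually polls condition until it holds, the timeout elapses, or the
// command under test reports a failure on cmdErr (hypothetical channel).
func eventually(t *testing.T, condition func() bool, waitFor, tick time.Duration, cmdErr <-chan error) {
	timer := time.NewTimer(waitFor)
	defer timer.Stop()
	ticker := time.NewTicker(tick)
	defer ticker.Stop()
	for {
		select {
		case err := <-cmdErr:
			// Fail fast instead of waiting out the timeout.
			t.Fatalf("command failed while waiting: %v", err)
		case <-timer.C:
			t.Fatal("condition not satisfied within timeout")
		case <-ticker.C:
			if condition() {
				return
			}
		}
	}
}
```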

View File

@@ -28,11 +28,11 @@ func (_m *MockFiler) EXPECT() *MockFiler_Expecter {

 // Delete provides a mock function with given fields: ctx, path, mode
 func (_m *MockFiler) Delete(ctx context.Context, path string, mode ...filer.DeleteMode) error {
-	_va := make([]interface{}, len(mode))
+	_va := make([]any, len(mode))
 	for _i := range mode {
 		_va[_i] = mode[_i]
 	}
-	var _ca []interface{}
+	var _ca []any
 	_ca = append(_ca, ctx, path)
 	_ca = append(_ca, _va...)
 	ret := _m.Called(_ca...)
@@ -60,9 +60,9 @@ type MockFiler_Delete_Call struct {
 //   - ctx context.Context
 //   - path string
 //   - mode ...filer.DeleteMode
-func (_e *MockFiler_Expecter) Delete(ctx interface{}, path interface{}, mode ...interface{}) *MockFiler_Delete_Call {
+func (_e *MockFiler_Expecter) Delete(ctx any, path any, mode ...any) *MockFiler_Delete_Call {
 	return &MockFiler_Delete_Call{Call: _e.mock.On("Delete",
-		append([]interface{}{ctx, path}, mode...)...)}
+		append([]any{ctx, path}, mode...)...)}
 }

 func (_c *MockFiler_Delete_Call) Run(run func(ctx context.Context, path string, mode ...filer.DeleteMode)) *MockFiler_Delete_Call {
@@ -114,7 +114,7 @@ type MockFiler_Mkdir_Call struct {
 // Mkdir is a helper method to define mock.On call
 //   - ctx context.Context
 //   - path string
-func (_e *MockFiler_Expecter) Mkdir(ctx interface{}, path interface{}) *MockFiler_Mkdir_Call {
+func (_e *MockFiler_Expecter) Mkdir(ctx any, path any) *MockFiler_Mkdir_Call {
 	return &MockFiler_Mkdir_Call{Call: _e.mock.On("Mkdir", ctx, path)}
 }
@@ -173,7 +173,7 @@ type MockFiler_Read_Call struct {
 // Read is a helper method to define mock.On call
 //   - ctx context.Context
 //   - path string
-func (_e *MockFiler_Expecter) Read(ctx interface{}, path interface{}) *MockFiler_Read_Call {
+func (_e *MockFiler_Expecter) Read(ctx any, path any) *MockFiler_Read_Call {
 	return &MockFiler_Read_Call{Call: _e.mock.On("Read", ctx, path)}
 }
@@ -232,7 +232,7 @@ type MockFiler_ReadDir_Call struct {
 // ReadDir is a helper method to define mock.On call
 //   - ctx context.Context
 //   - path string
-func (_e *MockFiler_Expecter) ReadDir(ctx interface{}, path interface{}) *MockFiler_ReadDir_Call {
+func (_e *MockFiler_Expecter) ReadDir(ctx any, path any) *MockFiler_ReadDir_Call {
 	return &MockFiler_ReadDir_Call{Call: _e.mock.On("ReadDir", ctx, path)}
 }
@@ -291,7 +291,7 @@ type MockFiler_Stat_Call struct {
 // Stat is a helper method to define mock.On call
 //   - ctx context.Context
 //   - name string
-func (_e *MockFiler_Expecter) Stat(ctx interface{}, name interface{}) *MockFiler_Stat_Call {
+func (_e *MockFiler_Expecter) Stat(ctx any, name any) *MockFiler_Stat_Call {
 	return &MockFiler_Stat_Call{Call: _e.mock.On("Stat", ctx, name)}
 }
@@ -314,11 +314,11 @@ func (_c *MockFiler_Stat_Call) RunAndReturn(run func(context.Context, string) (f
 // Write provides a mock function with given fields: ctx, path, reader, mode
 func (_m *MockFiler) Write(ctx context.Context, path string, reader io.Reader, mode ...filer.WriteMode) error {
-	_va := make([]interface{}, len(mode))
+	_va := make([]any, len(mode))
 	for _i := range mode {
 		_va[_i] = mode[_i]
 	}
-	var _ca []interface{}
+	var _ca []any
 	_ca = append(_ca, ctx, path, reader)
 	_ca = append(_ca, _va...)
 	ret := _m.Called(_ca...)
@@ -347,9 +347,9 @@ type MockFiler_Write_Call struct {
 //   - path string
 //   - reader io.Reader
 //   - mode ...filer.WriteMode
-func (_e *MockFiler_Expecter) Write(ctx interface{}, path interface{}, reader interface{}, mode ...interface{}) *MockFiler_Write_Call {
+func (_e *MockFiler_Expecter) Write(ctx any, path any, reader any, mode ...any) *MockFiler_Write_Call {
 	return &MockFiler_Write_Call{Call: _e.mock.On("Write",
-		append([]interface{}{ctx, path, reader}, mode...)...)}
+		append([]any{ctx, path, reader}, mode...)...)}
 }

 func (_c *MockFiler_Write_Call) Run(run func(ctx context.Context, path string, reader io.Reader, mode ...filer.WriteMode)) *MockFiler_Write_Call {
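
For illustration, the generated expecter methods are typically used in a test like this (a hedged sketch; `mock.Anything` comes from testify's mock package):

```go
package example

import (
	"context"
	"testing"

	"github.com/stretchr/testify/mock"
)

func TestCreatesDirectory(t *testing.T) {
	f := &MockFiler{}
	// The any-typed parameters on EXPECT().Mkdir are what allow matchers
	// like mock.Anything to stand in for concrete argument values.
	f.EXPECT().Mkdir(mock.Anything, "artifacts").Return(nil)

	if err := f.Mkdir(context.Background(), "artifacts"); err != nil {
		t.Fatal(err)
	}
	f.AssertExpectations(t)
}
```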

View File

@@ -100,7 +100,7 @@ func trimRightSpace(s string) string {
 }

 // tmpl executes the given template text on data, writing the result to w.
-func tmpl(w io.Writer, text string, data interface{}) error {
+func tmpl(w io.Writer, text string, data any) error {
 	t := template.New("top")
 	t.Funcs(templateFuncs)
 	template.Must(t.Parse(text))

View File

@@ -297,10 +297,10 @@ var renderFuncMap = template.FuncMap{
 	"yellow":  color.YellowString,
 	"magenta": color.MagentaString,
 	"cyan":    color.CyanString,
-	"bold": func(format string, a ...interface{}) string {
+	"bold": func(format string, a ...any) string {
 		return color.New(color.Bold).Sprintf(format, a...)
 	},
-	"italic": func(format string, a ...interface{}) string {
+	"italic": func(format string, a ...any) string {
 		return color.New(color.Italic).Sprintf(format, a...)
 	},
 	"replace": strings.ReplaceAll,

View File

@@ -5,7 +5,7 @@ import (
 	"github.com/stretchr/testify/assert"
 )

-func Equal(t assert.TestingT, expected interface{}, actual interface{}, msgAndArgs ...interface{}) bool {
+func Equal(t assert.TestingT, expected any, actual any, msgAndArgs ...any) bool {
 	ev, eok := expected.(dyn.Value)
 	av, aok := actual.(dyn.Value)
 	if eok && aok && ev.IsValid() && av.IsValid() {
@@ -32,86 +32,86 @@ func Equal(t assert.TestingT, expected interface{}, actual interface{}, msgAndAr
 	return assert.Equal(t, expected, actual, msgAndArgs...)
 }
-func EqualValues(t assert.TestingT, expected, actual interface{}, msgAndArgs ...interface{}) bool {
+func EqualValues(t assert.TestingT, expected, actual any, msgAndArgs ...any) bool {
 	return assert.EqualValues(t, expected, actual, msgAndArgs...)
 }
-func NotEqual(t assert.TestingT, expected interface{}, actual interface{}, msgAndArgs ...interface{}) bool {
+func NotEqual(t assert.TestingT, expected any, actual any, msgAndArgs ...any) bool {
 	return assert.NotEqual(t, expected, actual, msgAndArgs...)
 }
-func Len(t assert.TestingT, object interface{}, length int, msgAndArgs ...interface{}) bool {
+func Len(t assert.TestingT, object any, length int, msgAndArgs ...any) bool {
 	return assert.Len(t, object, length, msgAndArgs...)
 }
-func Empty(t assert.TestingT, object interface{}, msgAndArgs ...interface{}) bool {
+func Empty(t assert.TestingT, object any, msgAndArgs ...any) bool {
 	return assert.Empty(t, object, msgAndArgs...)
 }
-func Nil(t assert.TestingT, object interface{}, msgAndArgs ...interface{}) bool {
+func Nil(t assert.TestingT, object any, msgAndArgs ...any) bool {
 	return assert.Nil(t, object, msgAndArgs...)
 }
-func NotNil(t assert.TestingT, object interface{}, msgAndArgs ...interface{}) bool {
+func NotNil(t assert.TestingT, object any, msgAndArgs ...any) bool {
 	return assert.NotNil(t, object, msgAndArgs...)
 }
-func NoError(t assert.TestingT, err error, msgAndArgs ...interface{}) bool {
+func NoError(t assert.TestingT, err error, msgAndArgs ...any) bool {
 	return assert.NoError(t, err, msgAndArgs...)
 }
-func Error(t assert.TestingT, err error, msgAndArgs ...interface{}) bool {
+func Error(t assert.TestingT, err error, msgAndArgs ...any) bool {
 	return assert.Error(t, err, msgAndArgs...)
 }
-func EqualError(t assert.TestingT, theError error, errString string, msgAndArgs ...interface{}) bool {
+func EqualError(t assert.TestingT, theError error, errString string, msgAndArgs ...any) bool {
 	return assert.EqualError(t, theError, errString, msgAndArgs...)
 }
-func ErrorContains(t assert.TestingT, theError error, contains string, msgAndArgs ...interface{}) bool {
+func ErrorContains(t assert.TestingT, theError error, contains string, msgAndArgs ...any) bool {
 	return assert.ErrorContains(t, theError, contains, msgAndArgs...)
 }
-func ErrorIs(t assert.TestingT, theError, target error, msgAndArgs ...interface{}) bool {
+func ErrorIs(t assert.TestingT, theError, target error, msgAndArgs ...any) bool {
 	return assert.ErrorIs(t, theError, target, msgAndArgs...)
 }
-func True(t assert.TestingT, value bool, msgAndArgs ...interface{}) bool {
+func True(t assert.TestingT, value bool, msgAndArgs ...any) bool {
 	return assert.True(t, value, msgAndArgs...)
 }
-func False(t assert.TestingT, value bool, msgAndArgs ...interface{}) bool {
+func False(t assert.TestingT, value bool, msgAndArgs ...any) bool {
 	return assert.False(t, value, msgAndArgs...)
 }
-func Contains(t assert.TestingT, list interface{}, element interface{}, msgAndArgs ...interface{}) bool {
+func Contains(t assert.TestingT, list any, element any, msgAndArgs ...any) bool {
 	return assert.Contains(t, list, element, msgAndArgs...)
 }
-func NotContains(t assert.TestingT, list interface{}, element interface{}, msgAndArgs ...interface{}) bool {
+func NotContains(t assert.TestingT, list any, element any, msgAndArgs ...any) bool {
 	return assert.NotContains(t, list, element, msgAndArgs...)
 }
-func ElementsMatch(t assert.TestingT, listA, listB interface{}, msgAndArgs ...interface{}) bool {
+func ElementsMatch(t assert.TestingT, listA, listB any, msgAndArgs ...any) bool {
 	return assert.ElementsMatch(t, listA, listB, msgAndArgs...)
 }
-func Panics(t assert.TestingT, f func(), msgAndArgs ...interface{}) bool {
+func Panics(t assert.TestingT, f func(), msgAndArgs ...any) bool {
 	return assert.Panics(t, f, msgAndArgs...)
 }
-func PanicsWithValue(t assert.TestingT, expected interface{}, f func(), msgAndArgs ...interface{}) bool {
+func PanicsWithValue(t assert.TestingT, expected any, f func(), msgAndArgs ...any) bool {
 	return assert.PanicsWithValue(t, expected, f, msgAndArgs...)
 }
-func PanicsWithError(t assert.TestingT, errString string, f func(), msgAndArgs ...interface{}) bool {
+func PanicsWithError(t assert.TestingT, errString string, f func(), msgAndArgs ...any) bool {
 	return assert.PanicsWithError(t, errString, f, msgAndArgs...)
 }
-func NotPanics(t assert.TestingT, f func(), msgAndArgs ...interface{}) bool {
+func NotPanics(t assert.TestingT, f func(), msgAndArgs ...any) bool {
 	return assert.NotPanics(t, f, msgAndArgs...)
 }
-func JSONEq(t assert.TestingT, expected string, actual string, msgAndArgs ...interface{}) bool {
+func JSONEq(t assert.TestingT, expected string, actual string, msgAndArgs ...any) bool {
 	return assert.JSONEq(t, expected, actual, msgAndArgs...)
 }
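
These wrappers exist so that two `dyn.Value` arguments are compared by their underlying values rather than by plain struct equality. A hedged usage sketch, assuming the `dyn.V` constructor and the package's import path:

```go
package example

import (
	"testing"

	"github.com/databricks/cli/libs/dyn"
	assert "github.com/databricks/cli/libs/dyn/dynassert"
)

func TestDynValuesCompareByContent(t *testing.T) {
	// Both wrap the string "hello"; the wrapper unwraps dyn.Value on
	// both sides before delegating to testify's assert.Equal.
	assert.Equal(t, dyn.V("hello"), dyn.V("hello"))
}
```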

View File

@@ -14,7 +14,7 @@ type loader struct {
 	path string
 }

-func errorf(loc dyn.Location, format string, args ...interface{}) error {
+func errorf(loc dyn.Location, format string, args ...any) error {
 	return fmt.Errorf("yaml (%s): %s", loc, fmt.Sprintf(format, args...))
 }

View File

@@ -59,7 +59,7 @@ func (ec errorChain) Unwrap() error {
 	return ec[1:]
 }

-func (ec errorChain) As(target interface{}) bool {
+func (ec errorChain) As(target any) bool {
 	return errors.As(ec[0], target)
 }
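
For context, `errors.As` walks wrapped errors on its own, but a container that holds several errors can implement `As` itself to direct the search at its head, as above. A self-contained sketch of the pattern with generic names (not the CLI's actual type):

```go
package main

import (
	"errors"
	"fmt"
)

type pathError struct{ path string }

func (e *pathError) Error() string { return "bad path: " + e.path }

// chain is a minimal multi-error: index 0 is the current error and the
// remainder is exposed through Unwrap as the rest of the chain.
type chain []error

func (c chain) Error() string { return c[0].Error() }

func (c chain) Unwrap() error {
	if len(c) == 1 {
		return nil
	}
	return c[1:]
}

// As forwards to the head so errors.As matches it before recursing.
func (c chain) As(target any) bool { return errors.As(c[0], target) }

func main() {
	err := chain{&pathError{"x"}, errors.New("root cause")}
	var pe *pathError
	ok := errors.As(err, &pe)
	fmt.Println(ok, pe.path) // true x
}
```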

View File

@@ -11,10 +11,10 @@ import (
 func TestFromTypeBasic(t *testing.T) {
 	type myStruct struct {
 		S string `json:"s"`
 		I *int `json:"i,omitempty"`
-		V interface{} `json:"v,omitempty"`
+		V any `json:"v,omitempty"`
 		TriplePointer ***int `json:"triple_pointer,omitempty"`

 		// These fields should be ignored in the resulting schema.
 		NotAnnotated string