spark version in dbx #854
## Expected Behavior
We use dbx to deploy and launch Databricks jobs. We are currently on the 11.3 LTS runtime and are planning to migrate to 13.3 LTS. The runtime version is configured twice: in dbx's deployment.yml for every job, and in the cluster policy. To simplify maintenance, we would like to remove the spark_version parameter from deployment.yml and have the cluster policy enforce the runtime version. We expect a deployment without spark_version to succeed, with the policy supplying the value. Instead, we hit the errors below.
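For context, the relevant part of our deployment.yml currently looks roughly like this (a minimal sketch; the cluster key, policy ID, node type, and worker count are placeholders, not our real values):

```yaml
environments:
  default:
    workflows:
      - name: "core_deployment"
        job_clusters:
          - job_cluster_key: "main"
            new_cluster:
              spark_version: "11.3.x-scala2.12"  # the line we want to drop
              policy_id: "0123456789ABCDEF"      # the policy also pins the runtime
              node_type_id: "i3.xlarge"
              num_workers: 2
```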
## Current Behavior
Scenario 1: spark_version in deployment.yml = 11.x, policy version = 13.x. The job fails with the following error:
```
{'error_code': 'INVALID_PARAMETER_VALUE',
 'message': 'Cluster validation error: Validation failed for spark_version, '
            'must be 13.3.x-scala2.12 (is an element in '
            '"List(11.3.x-aarch64-scala2.12, 11.3.x-scala2.12)")'}
```
Scenario 2: spark_version removed from deployment.yml, policy version = 13.x. The deployment fails with the following error:
```
ValidationError: 2 validation errors for Deployment
workflows -> 0 -> Workflow -> job_clusters -> 0 -> new_cluster -> spark_version
  field required (type=value_error.missing)
workflows -> 1 -> Workflow -> job_clusters -> 0 -> new_cluster -> spark_version
  field required (type=value_error.missing)
ERROR during core_deployment workflow deployment (1)!
```
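This error comes from dbx's own client-side schema validation (pydantic), which treats spark_version as a required field of new_cluster, so the deployment is rejected before the workspace, and therefore the policy, is ever consulted. Until that requirement is relaxed, the closest thing to a single source of truth we have found on the deployment.yml side is a YAML anchor, so the version is at least declared only once per file (a sketch; the custom block and all names are illustrative):

```yaml
custom:
  runtime: &runtime "13.3.x-scala2.12"  # declared once, reused below

environments:
  default:
    workflows:
      - name: "core_deployment"
        job_clusters:
          - job_cluster_key: "main"
            new_cluster:
              spark_version: *runtime   # dbx still sends the field, but it lives in one place
              policy_id: "0123456789ABCDEF"
```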
## Context
As above: we want the cluster policy to be the single source of truth for the runtime version as we migrate from 11.3 LTS to 13.3 LTS, instead of maintaining it both in the policy and in deployment.yml for every job.