Describe the bug
It seems unable to work with a Secret that contains files in Kubernetes.
Steps to reproduce the issue:
1. In a GCP Kubernetes cluster, create a Secret that contains the Google Cloud service-account file (https://kubernetes.io/docs/concepts/configuration/secret/#using-secrets-as-files-from-a-pod).
2. Install the Helm chart https://artifacthub.io/packages/helm/banzaicloud-stable/spark-hs with sparkEventLogStorage.secretName pointing to the newly created Secret (see the sketch after these steps).
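A minimal sketch of these two steps, assuming the Secret is named spark-hs-gcs-secret, the key file inside it is google.json, and the banzaicloud-stable repo is already added (these names, and any omitted sparkEventLogStorage.* values such as the GCS bucket, are illustrative, not from the chart):

    # Step 1: create a Secret holding the GCP service-account key file as google.json
    kubectl create secret generic spark-hs-gcs-secret \
      --from-file=google.json=./service-account.json

    # Step 2: install the chart, pointing sparkEventLogStorage.secretName at that Secret
    helm install spark-hs banzaicloud-stable/spark-hs \
      --set sparkEventLogStorage.secretName=spark-hs-gcs-secret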
Expected behavior
The new spark-hs service fails to read /opt/spark/conf/secret/google.json.
Additional context
If I manually change the deployment template from

    - name: secret-volume
      mountPath: /opt/spark/conf/secret/google.json
      subPath: google.json

to

    - name: secret-volume
      mountPath: /opt/spark/conf/secret/
      readOnly: true

the files from the Secret are created.
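For reference, a sketch of the corrected stanza in the deployment template, assuming the chart wires secret-volume to the Secret named by sparkEventLogStorage.secretName (the Secret name below is illustrative):

    volumeMounts:
      - name: secret-volume
        mountPath: /opt/spark/conf/secret/
        readOnly: true
    volumes:
      - name: secret-volume
        secret:
          secretName: spark-hs-gcs-secret  # value of sparkEventLogStorage.secretName

Mounting the whole directory instead of a single subPath also means the projected file keeps receiving updates when the Secret changes, which subPath mounts do not.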