I am able to run SparkPi on Kubernetes (deployed on GKE).
However, when I try to broadcast the pi value to my microservice at toys-broadcast-svc.toys.svc.cluster.local,
DNS resolution fails with an UnknownHostException. Can anyone help? Am I missing something?
For reference:
I installed the operator with Helm: helm install sparkoperator incubator/sparkoperator --namespace toys-spark-operator --set sparkJobNamespace=toys-spark,enableWebhook=true
I am using the Spark Operator (the microservice lives in the namespace toys,
and Spark runs in the namespace toys-spark).
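Since the failure is a Java UnknownHostException, one way to narrow it down is to check DNS resolution from inside the driver pod before attempting the broadcast. The sketch below is illustrative only: the class and method names are made up, and the service FQDN is simply the one from the question.

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class DnsCheck {
    // Try to resolve a hostname; return its address, or null if resolution fails.
    static String resolve(String host) {
        try {
            return InetAddress.getByName(host).getHostAddress();
        } catch (UnknownHostException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        // Default to the cross-namespace service FQDN from the question.
        String host = args.length > 0 ? args[0]
                : "toys-broadcast-svc.toys.svc.cluster.local";
        String addr = resolve(host);
        System.out.println(addr == null
                ? "UnknownHostException for " + host
                : host + " -> " + addr);
    }
}
```

If this prints the UnknownHostException branch from inside the driver pod, the problem is cluster DNS (or a NetworkPolicy), not the application code.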
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: toys-spark # apps namespace
spec:
  type: Java
  mode: cluster
  image: toysindia/spark:3.0.1
  imagePullPolicy: Always
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.0.1.jar
  sparkVersion: 3.0.1
  restartPolicy:
    type: Never
  volumes:
    - name: "toys-spark-test-volume-driver"
      hostPath:
        path: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/driver"
        type: Directory
    - name: "toys-spark-test-volume-executor"
      hostPath:
        path: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/executor"
        type: Directory
  driver:
    cores: 1
    coreLimit: "1200m"
    memory: "512m"
    labels:
      version: 3.0.1
    serviceAccount: spark
    volumeMounts:
      - name: "toys-spark-test-volume-driver"
        mountPath: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/driver"
  executor:
    cores: 1
    instances: 1
    memory: "512m"
    labels:
      version: 3.0.1
    volumeMounts:
      - name: "toys-spark-test-volume-executor"
        mountPath: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/executor"
  sparkConf:
    spark.eventLog.dir:
    spark.eventLog.enabled: "true"
---
apiVersion: v1
kind: Namespace
metadata:
  name: toys-spark-operator
---
apiVersion: v1
kind: Namespace
metadata:
  name: toys-spark # apps namespace
---
apiVersion: v1
kind: ServiceAccount
metadata:
  name: spark
  namespace: toys-spark # apps namespace
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: spark-operator-role # ClusterRoleBinding is cluster-scoped, so no namespace here
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: edit
subjects:
  - kind: ServiceAccount
    name: spark
    namespace: toys-spark # apps namespace