Unable to configure the Spark History Server log directory to use Azure Blob Storage when deploying on AKS

Asked by x7yiwoj4 on 2021-05-24, tagged Spark

I am trying to deploy the Spark History Server on AKS and want to point its log directory at an Azure Storage blob. To achieve this, I put the following configuration in my values.yaml file:

wasbs:
  enableWASBS: true
  secret: azure-secrets
  sasKeyMode: false
  storageAccountKeyName: azure-storage-account-key
  storageAccountNameKeyName: azure-storage-account-name
  containerKeyName: azure-blob-container-name
  logDirectory: wasbs:///test/piyush/spark-history

pvc:
  enablePVC: false
nfs:
  enableExampleNFS: false
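
For reference, a fully qualified wasbs URI usually takes the form wasbs://<container>@<account>.blob.core.windows.net/<path>. If I understand the chart correctly, the values above should be combined with the contents of the secret into Spark properties roughly like the following (the angle-bracket names are placeholders, not my real values):

spark.history.fs.logDirectory=wasbs://<container>@<account>.blob.core.windows.net/test/piyush/spark-history
spark.hadoop.fs.azure.account.key.<account>.blob.core.windows.net=<storage-account-key>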

First, I created the azure-secrets secret using the command below:

kubectl create secret generic azure-secrets --from-file=azure-storage-account-name --from-file=azure-blob-container-name --from-file=azure-storage-account-key
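
Each --from-file flag uses the given file's name as the secret key and its contents as the value, so the command above assumes files like these exist in the working directory (the contents shown are placeholders):

echo -n "<storage-account-name>" > azure-storage-account-name
echo -n "<container-name>" > azure-blob-container-name
echo -n "<storage-account-key>" > azure-storage-account-key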

After that, I ran the following commands:

helm repo add stable https://kubernetes-charts.storage.googleapis.com
helm install stable/spark-history-server --values values.yaml --generate-name
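
The error output below was taken from the history server pod's log, fetched with commands along these lines (the pod name is the one that appears in the log itself):

kubectl get pods
kubectl logs spark-history-server-1601925447-57c5476fb-5wh6q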

But on doing so, I get the following error:

2020-10-05 19:17:56 INFO  HistoryServer:2566 - Started daemon with process name: 12@spark-history-server-1601925447-57c5476fb-5wh6q
2020-10-05 19:17:56 INFO  SignalUtils:54 - Registered signal handler for TERM
2020-10-05 19:17:56 INFO  SignalUtils:54 - Registered signal handler for HUP
2020-10-05 19:17:56 INFO  SignalUtils:54 - Registered signal handler for INT
2020-10-05 19:17:56 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-10-05 19:17:56 INFO  SecurityManager:54 - Changing view acls to: root
2020-10-05 19:17:56 INFO  SecurityManager:54 - Changing modify acls to: root
2020-10-05 19:17:56 INFO  SecurityManager:54 - Changing view acls groups to:
2020-10-05 19:17:56 INFO  SecurityManager:54 - Changing modify acls groups to:
2020-10-05 19:17:56 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
2020-10-05 19:17:56 INFO  FsHistoryProvider:54 - History server ui acls disabled; users with admin permissions: ; groups with admin permissions
Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:280)
        at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.azure.NativeAzureFileSystem not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2654)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2667)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:364)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
        at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:117)
        at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:86)
        ... 6 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.azure.NativeAzureFileSystem not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
        ... 16 more

From what I can tell, the missing class org.apache.hadoop.fs.azure.NativeAzureFileSystem comes from Hadoop's hadoop-azure module, so this looks like a classpath problem with the image the chart deploys, but I am not sure how to confirm or fix that.
Any help or suggestions would be greatly appreciated. Thanks in advance!
