Databricks dbutils throws NullPointerException

mdfafbf1 posted on 2021-05-24 in Spark

I'm trying to read a secret from Azure Key Vault using Databricks dbutils, but I'm running into the following exception:

  OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
  Warning: Ignoring non-Spark config property: eventLog.rolloverIntervalSeconds
  Exception in thread "main" java.lang.NullPointerException
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:498)
      at com.databricks.dbutils_v1.DBUtilsHolder$$anon$1.invoke(DBUtilsHolder.scala:17)

Here is the code snippet:

  Object secret = DBUtilsHolder.dbutils().secrets().get("<scope>", "<key>");
  System.out.println("secret " + secret.toString());

Here is the pom:

  <dependencies>
      <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
      <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_2.11</artifactId>
          <version>2.4.5</version>
          <scope>provided</scope>
      </dependency>
      <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
      <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-sql_2.11</artifactId>
          <version>2.4.5</version>
          <scope>provided</scope>
      </dependency>
      <dependency>
          <groupId>io.airlift</groupId>
          <artifactId>airline</artifactId>
          <version>0.8</version>
      </dependency>
      <dependency>
          <groupId>org.projectlombok</groupId>
          <artifactId>lombok</artifactId>
          <version>1.18.12</version>
          <scope>provided</scope>
      </dependency>
      <!-- https://mvnrepository.com/artifact/com.databricks/dbutils-api -->
      <dependency>
          <groupId>com.databricks</groupId>
          <artifactId>dbutils-api_2.11</artifactId>
          <version>0.0.4</version>
          <scope>provided</scope>
      </dependency>
  </dependencies>

It fails when run from Databricks as a spark-submit job, but it works when the job is created with the Set JAR option.
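For context, the dbutils-api dependency only ships API stubs; the real implementation is injected into DBUtilsHolder by the Databricks runtime. The sketch below (all names illustrative, not the actual Databricks internals) shows how a holder built on a dynamic proxy over an unset reference produces exactly this reflection-heavy NullPointerException when no implementation was ever registered:

  import java.lang.reflect.Proxy;
  import java.util.concurrent.atomic.AtomicReference;

  // Illustrative stand-in for the dbutils API surface.
  interface FakeDBUtils {
      String getSecret(String scope, String key);
  }

  public class HolderSketch {
      // The runtime is expected to register the real implementation here.
      // Under spark-submit nothing ever does, so it stays null.
      private static final AtomicReference<FakeDBUtils> impl = new AtomicReference<>();

      // A dynamic proxy that forwards every call to the registered implementation.
      private static final FakeDBUtils proxy = (FakeDBUtils) Proxy.newProxyInstance(
              FakeDBUtils.class.getClassLoader(),
              new Class<?>[] { FakeDBUtils.class },
              (p, method, args) -> method.invoke(impl.get(), args));

      public static void main(String[] args) {
          // impl.get() is null, so Method.invoke throws a NullPointerException
          // through the reflection frames -- the same shape as the trace above.
          proxy.getSecret("<scope>", "<key>");
      }
  }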

zlhcx6iw #1

According to the documentation, Databricks Utilities are not available for spark-submit jobs. If you want to use Databricks Utilities, use a JAR job instead.
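If the same jar must also survive a spark-submit launch, one possible workaround (not from the original thread) is to fall back to a different secret source when dbutils is unavailable. A minimal sketch, assuming the secret can alternatively be supplied through an environment variable; SecretReader and SECRET_FALLBACK are hypothetical names:

  import com.databricks.dbutils_v1.DBUtilsHolder;

  public class SecretReader {
      public static String readSecret(String scope, String key) {
          try {
              // Works when the Databricks runtime has registered the real
              // dbutils implementation (notebook and JAR jobs).
              return DBUtilsHolder.dbutils().secrets().get(scope, key);
          } catch (NullPointerException e) {
              // Under spark-submit no implementation is ever registered, so
              // fall back to an environment variable (hypothetical name).
              return System.getenv("SECRET_FALLBACK");
          }
      }
  }

Catching NullPointerException here is deliberately coarse: the only signal available is the proxy failing because no implementation was registered.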
