Spark HBase - GCP template (1/3) - how to package the Hortonworks connector locally?

vecaoik1 · posted 2021-06-07 in HBase

I am trying to test the Spark HBase connector in a GCP context, following [1], which requires packaging the connector for Spark 2.4 locally with Maven [2] (I tried Maven 3.6.3). The build fails with the following error.

Error on branch-2.4:

[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project shc-core: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: NullPointerException -> [Help 1]

References:
[1] https://github.com/googlecloudplatform/cloud-bigtable-examples/tree/master/scala/bigtable-shc
[2] https://github.com/hortonworks-spark/shc/tree/branch-2.4


Answer 1 (ztyzrc3y):

As suggested in the comments (thanks @ismail!), the NullPointerException most likely comes from running the build on a newer JDK that the old scala-maven-plugin 3.2.2 cannot handle, so build the connector with Java 8 instead:

sdk use java 8.0.275-zulu
mvn clean package -DskipTests

The resulting jar can then be imported into the GCP template's Dependencies.scala, which is defined as follows.

...
// Load the locally built shc-core jar from an explicit file URL
// (<path_to_jar> is a placeholder for the actual build output directory).
val shcCore = "com.hortonworks" % "shc-core" % "1.1.3-2.4-s_2.11" from "file:///<path_to_jar>/shc-core-1.1.3-2.4-s_2.11.jar"
...
// original definition, superseded by the local jar reference above:
// shcCore % (shcVersionPrefix + scalaBinaryVersion) excludeAll(
shcCore excludeAll(
...
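
For orientation, here is a minimal, self-contained sketch of how the pieces above might fit together in a complete Dependencies.scala. Only the shcCore line is taken from the answer; the object layout, the sparkVersion value, and the exclusion rules are illustrative assumptions, not the template's actual contents.

import sbt._

object Dependencies {
  // Assumed version for a Spark 2.4 / Scala 2.11 build.
  val sparkVersion = "2.4.8"

  // The locally built connector, loaded from an explicit file URL.
  // <path_to_jar> is a placeholder for the actual build output directory.
  val shcCore = "com.hortonworks" % "shc-core" % "1.1.3-2.4-s_2.11" from "file:///<path_to_jar>/shc-core-1.1.3-2.4-s_2.11.jar"

  // Spark itself is provided by the cluster at runtime.
  val sparkSql = "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"

  // The exclusions here are assumptions: drop transitive artifacts that
  // would clash with the versions already present on the cluster.
  val core: Seq[ModuleID] = Seq(
    sparkSql,
    shcCore excludeAll (
      ExclusionRule(organization = "org.apache.spark"),
      ExclusionRule(organization = "org.scala-lang")
    )
  )
}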

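Once the template builds, a quick way to confirm that the locally packaged jar is actually usable is a small read smoke test. The sketch below follows the usage documented in the shc README; the table name, column family, and column mapping are hypothetical and would need to match a real table in your instance.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog

object ShcSmokeTest {
  // Hypothetical catalog: maps HBase table "default:mytable" with one
  // string column "value" in column family "cf1" onto a DataFrame.
  val catalog: String =
    s"""{
       |"table":{"namespace":"default", "name":"mytable"},
       |"rowkey":"key",
       |"columns":{
       |"rowkey":{"cf":"rowkey", "col":"key", "type":"string"},
       |"value":{"cf":"cf1", "col":"value", "type":"string"}
       |}
       |}""".stripMargin

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("shc-smoke-test").getOrCreate()

    // Read through the shc-core data source; this fails fast if the
    // locally packaged jar is missing from the classpath.
    val df = spark.read
      .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
      .format("org.apache.spark.sql.execution.datasources.hbase")
      .load()

    df.show(5)
    spark.stop()
  }
}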