I am trying to test the Spark HBase connector in a GCP context, following [1], which requires packaging the connector [2] locally for Spark 2.4 with Maven (I tried Maven 3.6.3). This fails with the following problem.
Error on "branch-2.4":
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project shc-core: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: NullPointerException -> [Help 1]
References
[1] https://github.com/googlecloudplatform/cloud-bigtable-examples/tree/master/scala/bigtable-shc
[2] https://github.com/hortonworks-spark/shc/tree/branch-2.4
1 Answer
As suggested in the comments (thanks @ismail!), build the connector with Java 8:
sdk use java 8.0.275-zulu
mvn clean package -DskipTests
The resulting jar can then be imported in Dependencies.scala, as the GCP template defines.
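For illustration only, a minimal sketch of what such a Dependencies.scala entry could look like; the group ID, artifact ID, and version below are assumptions and should be adjusted to match the jar actually produced under core/target/ by the Maven build, not taken from the template itself:

import sbt._

object Dependencies {
  // Hypothetical coordinates for the locally built SHC core jar
  // (adjust to match the artifact produced by `mvn clean package`/`install`).
  lazy val shcCore: ModuleID = "com.hortonworks" % "shc-core" % "1.1.3-2.4-s_2.11-SNAPSHOT"

  // Dependencies pulled into the Spark job that talks to Bigtable via the HBase API.
  lazy val connectorDeps: Seq[ModuleID] = Seq(shcCore)
}

If the connector is built with `mvn clean install` instead of just `package`, the jar lands in the local Maven repository and sbt can resolve it after adding `resolvers += Resolver.mavenLocal` to build.sbt; alternatively, the jar can simply be dropped into the project's lib/ directory as an unmanaged dependency.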