HBase in spark-shell: SparkContext type mismatch

ds97pgxw · posted 2021-07-13 in Spark

I am trying to use HBase from inside spark-shell. There are two clusters, one Cloudera and one Hortonworks. The Cloudera cluster runs the commands perfectly; the Hortonworks one fails on the exact same commands with a strange error:

scala> new HBaseContext(sc, conf)
<console>:30: error: type mismatch;
 found   : org.apache.spark.SparkContext
 required: org.apache.spark.SparkContext
Error occurred in an application involving default arguments.
        new HBaseContext(sc, conf)
                         ^
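
For reference, the sequence being attempted is just the standard HBase-Spark connector usage. A minimal sketch of the intent, assuming the hbase-spark connector jar is already on the spark-shell classpath (as it is on the Cloudera cluster):

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.spark.HBaseContext

val conf = HBaseConfiguration.create()
val hbaseContext = new HBaseContext(sc, conf) // sc is the SparkContext the shell provides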

Here is more detailed output from spark-shell on HDP:

scala> import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.HBaseConfiguration
warning: Class org.apache.yetus.audience.InterfaceAudience not found - continuing with a stub

scala> import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.hadoop.hbase.spark.HBaseContext
warning: Class org.apache.yetus.audience.InterfaceAudience not found - continuing with a stub
error: missing or invalid dependency detected while loading class file 'HBaseContext.class'.
Could not access term yetus in  package org.apache,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HBaseContext.class' was compiled against an incompatible version of org.apache.
error: missing or invalid dependency detected while loading class file 'HBaseContext.class'.
Could not access term audience in value org.apache.yetus,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HBaseContext.class' was compiled against an incompatible version of org.apache.yetus.
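
The yetus warnings point at a missing org.apache.yetus:audience-annotations jar, a dependency the HBase connector classes were compiled against. As a quick probe (a sketch only; the class name is taken from the warning above), one can check from the same shell whether that class is resolvable at runtime:

try {
  // Probe the runtime classpath for the class the compiler could not find.
  Class.forName("org.apache.yetus.audience.InterfaceAudience")
  println("audience-annotations is on the runtime classpath")
} catch {
  case _: ClassNotFoundException =>
    println("missing: pass the audience-annotations jar to spark-shell via --jars")
}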

scala> val conf = HBaseConfiguration.create()
conf: org.apache.hadoop.conf.Configuration = Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml, hbase-default.xml, hbase-site.xml

scala> conf.addResource(new Path("/etc/hbase/3.0.1.0-187/0/hbase-site.xml"))

scala> new HBaseContext(sc, conf)
<console>:30: error: type mismatch;
 found   : org.apache.spark.SparkContext
 required: org.apache.spark.SparkContext
Error occurred in an application involving default arguments.
        new HBaseContext(sc, conf)
                         ^
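
When scalac reports a mismatch in which found and required print as the same name, the usual cause is that two different classloaders each supply a copy of the class: here, HBaseContext was presumably compiled against one Spark build while the shell runs another, or the Spark jars appear twice on the classpath. A small diagnostic sketch to see which jar actually supplies SparkContext in this session:

// Which jar does the running SparkContext instance come from?
println(sc.getClass.getProtectionDomain.getCodeSource.getLocation)
// And which classloader resolves the name the compiler sees?
println(classOf[org.apache.spark.SparkContext].getClassLoader)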

No answers yet.
