I want to load data from HBase through Spark SQL using the hbase-spark connector. I followed the official example, but I get a NullPointerException.
My build.sbt file is:
name := "proj_1"
version := "0.1"
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.3.1",
  "org.apache.spark" % "spark-sql_2.11" % "2.3.1",
  "org.apache.spark" % "spark-mllib_2.11" % "2.3.1",
  "org.apache.spark" % "spark-streaming_2.11" % "2.3.1",
  "org.apache.spark" % "spark-hive_2.11" % "2.3.1",
  "org.elasticsearch" % "elasticsearch-hadoop" % "6.4.0",
  "org.apache.hadoop" % "hadoop-core" % "2.6.0-mr1-cdh5.15.1",
  "org.apache.hbase" % "hbase" % "2.1.0",
  "org.apache.hbase" % "hbase-server" % "2.1.0",
  "org.apache.hbase" % "hbase-common" % "2.1.0",
  "org.apache.hbase" % "hbase-client" % "2.1.0",
  "org.apache.hbase" % "hbase-spark" % "2.1.0-cdh6.x-SNAPSHOT"
)
resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"
resolvers += "clojars" at "https://clojars.org/repo"
resolvers += "conjars" at "http://conjars.org/repo"
resolvers += "Apache HBase" at "https://repository.apache.org/content/repositories/releases"
The code that fails is:
import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog

def withCatalog(cat: String): DataFrame = {
  sqlContext
    .read
    .options(Map(HBaseTableCatalog.tableCatalog -> cat))
    .format("org.apache.hadoop.hbase.spark")
    .option("zkUrl", "127.0.0.1:2181/chen_test")
    .load()
}
val df = withCatalog(catalog)
The exception is:
Exception in thread "main" java.lang.NullPointerException
at org.apache.hadoop.hbase.spark.HBaseRelation.<init>(DefaultSource.scala:139)
at org.apache.hadoop.hbase.spark.DefaultSource.createRelation(DefaultSource.scala:70)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:340)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
at hbase_test$.withCatalog$1(hbase_test.scala:57)
at hbase_test$.main(hbase_test.scala:59)
at hbase_test.main(hbase_test.scala)
How can I fix this? Can you help me?
1 Answer
I ran into this problem recently. Try creating an HBaseContext before calling load(): that constructor call introduces a stable value into the environment which the connector looks up when it builds the relation, and without it HBaseRelation dereferences a null (DefaultSource.scala:139). I stumbled on this by accident while scanning the hbase-spark codebase. Hope it helps.
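A minimal sketch of that fix, assuming a local HBase reachable at 127.0.0.1:2181 under the znode /chen_test; the object name, table name, and column mapping below are placeholders, so adjust them to your own schema:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog
import org.apache.spark.sql.{DataFrame, SparkSession}

object HBaseLoadExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hbase_test")
      .master("local[*]")
      .getOrCreate()

    // Point the HBase client at the same ZooKeeper quorum and znode as zkUrl.
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set("hbase.zookeeper.quorum", "127.0.0.1")
    hbaseConf.set("hbase.zookeeper.property.clientPort", "2181")
    hbaseConf.set("zookeeper.znode.parent", "/chen_test")

    // The crucial line: constructing an HBaseContext caches it for the connector,
    // and HBaseRelation picks that cached instance up when the DataFrame is built.
    new HBaseContext(spark.sparkContext, hbaseConf)

    // Placeholder catalog; replace the table and columns with your own schema.
    val catalog =
      """{
        |  "table": {"namespace": "default", "name": "my_table"},
        |  "rowkey": "key",
        |  "columns": {
        |    "col0": {"cf": "rowkey", "col": "key", "type": "string"},
        |    "col1": {"cf": "cf1", "col": "q1", "type": "string"}
        |  }
        |}""".stripMargin

    val df: DataFrame = spark.read
      .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
      .format("org.apache.hadoop.hbase.spark")
      .load()

    df.show()
  }
}

With the HBaseContext created up front, the same withCatalog call from the question should return a DataFrame instead of throwing.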