I am trying to read an HBase table using the following code:
JavaPairRDD<ImmutableBytesWritable, Result> pairRdd = ctx
        .newAPIHadoopRDD(conf, TableInputFormat.class,
                ImmutableBytesWritable.class,
                org.apache.hadoop.hbase.client.Result.class).cache();
System.out.println(pairRdd.count());
But I am getting the exception java.lang.IllegalStateException: unread block data.
Please find the code below:
SparkConf sparkConf = new SparkConf().setAppName("JavaSparkSQL");
sparkConf.set("spark.master","spark://192.168.50.247:7077");
//String[] stJars = {"/home/breakdown/sparkdemo2/target/sparkdemo2-0.0.1-SNAPSHOT.jar"};
//sparkConf.setJars(stJars);
JavaSparkContext ctx = new JavaSparkContext(sparkConf);
JavaSQLContext sqlCtx = new JavaSQLContext(ctx);
Configuration conf = HBaseConfiguration.create();
conf.set("hbase.master","192.168.50.73:60000");
conf.set("hbase.zookeeper.quorum","192.168.50.73");
conf.set("hbase.zookeeper.property.clientPort","2181");
conf.set("zookeeper.session.timeout","6000");
conf.set("zookeeper.recovery.retry","1");
conf.set("hbase.mapreduce.inputtable","employee11");
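As an aside, the input-table key can also be set through the constant that `org.apache.hadoop.hbase.mapreduce.TableInputFormat` defines, which avoids typos in the raw string key (equivalent to the line above; shown only as a sketch):

```java
// Equivalent to conf.set("hbase.mapreduce.inputtable", "employee11");
// TableInputFormat.INPUT_TABLE is the constant for that configuration key.
conf.set(TableInputFormat.INPUT_TABLE, "employee11");
```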
Any pointers would be of great help.
Versions: Spark 1.1.1 for Hadoop 2, Hadoop 2.2.0, HBase 0.98.8-hadoop2.
Please find the stack trace below:
14/12/17 21:18:45 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/12/17 21:18:46 INFO AppClient$ClientActor: Connecting to master spark://192.168.50.247:7077...
14/12/17 21:18:46 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
14/12/17 21:18:46 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20141217211846-0035
14/12/17 21:18:47 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.50.253, ANY, 1256 bytes)
14/12/17 21:18:47 INFO BlockManagerMasterActor: Registering block manager 192.168.50.253:41717 with 265.4 MB RAM, BlockManagerId(0, 192.168.50.253, 41717, 0)
14/12/17 21:18:48 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 192.168.50.253): java.lang.IllegalStateException: unread block data
    java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2420)
    java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1380)
    java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
    java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
    java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
    java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
    java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
    org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
    org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:160)
    java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    java.lang.Thread.run(Thread.java:724)