I have downloaded Twitter data into HDFS using Flume, but when I try to query it with Pig I get a ClassCastException: Utf8 cannot be cast to String.
grunt> A= LOAD '/apps/hive/warehouse/twtr_uk.db/twitterdata_09062015/' USING AvroStorage ('{
>> "type" : "record",
>> "name" : "Doc",
>> "doc" : "adoc",
>> "fields" : [
>> {
>> "name" : "id",
>> "type" : "string"
>> },
>> {
>> "name" : "user_friends_count",
>> "type" : [ "int", "null" ]
>> },
>> {
>> "name" : "user_location",
>> "type" : [ "string", "null" ]
>> },
>> {
>> "name" : "user_description",
>> "type" : [ "string", "null" ]
>> }, {
>> "name" : "user_statuses_count",
>> "type" : [ "int", "null" ]
>> }, {
>> "name" : "user_followers_count",
>> "type" : [ "int", "null" ]
>> }, {
>> "name" : "user_name",
>> "type" : [ "string", "null" ]
>> }, {
>> "name" : "user_screen_name",
>> "type" : [ "string", "null" ]
>> }, {
>> "name" : "created_at",
>> "type" : [ "string", "null" ]
>> }, {
>> "name" : "text",
>> "type" : [ "string", "null" ]
>> }, {
>> "name" : "retweet_count",
>> "type" : [ "long", "null" ]
>> }, {
>> "name" : "retweeted",
>> "type" : [ "boolean", "null" ]
>> }, {
>> "name" : "in_reply_to_user_id",
>> "type" : [ "long", "null" ]
>> }, {
>> "name" : "source",
>> "type" : [ "string", "null" ]
>> }, {
>> "name" : "in_reply_to_status_id",
>> "type" : [ "long", "null" ]
>> }, {
>> "name" : "media_url_https",
>> "type" : [ "string", "null" ]
>> }, {
>> "name" : "expanded_url",
>> "type" : [ "string", "null" ]
>> } ]
>> }');
grunt> illustrate A;
2015-06-11 10:07:05,361 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://sandbox.hortonworks.com:8020
2015-06-11 10:07:05,382 [main] WARN org.apache.pig.data.SchemaTupleBackend - SchemaTupleBackend has already been initialized
2015-06-11 10:07:05,382 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[ConstantCalculator, LoadTypeCastInserter, PredicatePushdownOptimizer, StreamTypeCastInserter], RULES_DISABLED=[AddForEach, ColumnMapKeyPrune, GroupByConstParallelSetter, LimitOptimizer, MergeFilter, MergeForEach, PartitionFilterOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter]}
2015-06-11 10:07:05,383 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2015-06-11 10:07:05,384 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2015-06-11 10:07:05,384 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2015-06-11 10:07:05,385 [main] INFO org.apache.pig.tools.pigstats.mapreduce.MRScriptState - Pig script settings are added to the job
2015-06-11 10:07:05,385 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2015-06-11 10:07:05,426 [main] WARN org.apache.pig.data.SchemaTupleBackend - SchemaTupleBackend has already been initialized
2015-06-11 10:07:05,426 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map - Aliases being processed per job phase (AliasName[line,offset]): M: A[123,3] C: R:
2015-06-11 10:07:05,436 [main] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 6
2015-06-11 10:07:05,436 [main] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 6
java.lang.ClassCastException: org.apache.avro.util.Utf8 cannot be cast to java.lang.String
at org.apache.pig.impl.util.avro.AvroTupleWrapper.getMemorySize(AvroTupleWrapper.java:201)
at org.apache.pig.impl.util.avro.AvroTupleWrapper.getMemorySize(AvroTupleWrapper.java:178)
at org.apache.pig.pen.util.ExampleTuple.getMemorySize(ExampleTuple.java:97)
at org.apache.pig.data.DefaultAbstractBag.sampleContents(DefaultAbstractBag.java:101)
ERROR 2997: Encountered IOException. Exception
1 Answer
oipij1gg1:
If you already have Avro data in HDFS, you don't need to specify the Avro schema explicitly; Avro files carry their schema in the file header, so AvroStorage can read it from there. Try running it like this:
A = LOAD '/apps/hive/warehouse/twtr_uk.db/twitterdata_09062015/' USING AvroStorage();
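To confirm that the inferred schema matches what Flume wrote, you can describe the relation and look at a few records. A minimal sketch, assuming the same HDFS path as above:

A = LOAD '/apps/hive/warehouse/twtr_uk.db/twitterdata_09062015/' USING AvroStorage();
DESCRIBE A;     -- prints the schema AvroStorage read from the Avro file headers
B = LIMIT A 5;  -- take a small sample
DUMP B;         -- inspect a handful of records

Also note that the stack trace above is thrown from ILLUSTRATE's sampling path (ExampleTuple / AvroTupleWrapper.getMemorySize), so even once the schema is right it may be safer to inspect the data with DESCRIBE plus LIMIT/DUMP rather than ILLUSTRATE.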