How do I run Hive queries on Hive 2.1.1 with Spark 2.2.0 as the execution engine?

vfwfrxfs · posted 2021-06-26 in Hive

Simple queries, e.g. a plain select, work fine, but as soon as I use an aggregate function such as count, I get an error.
I connect to Hive 2.1.1 with beeline, running Spark 2.2.0 and Hadoop 2.8. My hive-site.xml is as follows:

<property>
    <name>hive.execution.engine</name>
    <value>spark</value>
    <description>
      Expects one of [mr, tez, spark].
      Chooses execution engine. Options are: mr (Map reduce, default), tez, spark. While MR
      remains the default engine for historical reasons, it is itself a historical engine
      and is deprecated in Hive 2 line. It may be removed without further warning.
    </description>
  </property>
  <property>
    <name>spark.master</name>
    <value>spark://master:7077</value>
    <description>Spark Master URL</description>
  </property>
  <property>
    <name>spark.eventLog.enabled</name>
    <value>true</value>
    <description>Spark Event Log</description>
  </property>
  <property>
    <name>spark.eventLog.dir</name>
    <value>hdfs://master:8020/user/spark/eventLogging</value>
    <description>Spark event log folder</description>
  </property>
  <property>
    <name>spark.executor.memory</name>
    <value>512m</value>
    <description>Spark executor memory</description>
  </property>
  <property>
    <name>spark.serializer</name>
    <value>org.apache.spark.serializer.KryoSerializer</value>
    <description>Spark serializer</description>
  </property>
  <property>
    <name>spark.yarn.jars</name>
    <value>hdfs://master:9000:/user/spark/spark-jars/*</value>
  </property>
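
For context, the spark.yarn.jars property above points to a directory of Spark jars in HDFS; in a typical Hive-on-Spark setup that directory is populated beforehand by uploading the jars shipped with the Spark installation. A minimal sketch of that step, assuming $SPARK_HOME points at the Spark 2.2.0 install and the NameNode is reachable at hdfs://master:9000 (both are assumptions, not taken from the question):

# upload the Spark runtime jars to the HDFS directory referenced by spark.yarn.jars
hdfs dfs -mkdir -p /user/spark/spark-jars
hdfs dfs -put $SPARK_HOME/jars/*.jar /user/spark/spark-jars/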

When I run select count(*) from table in Hive, I get the following error:

WARN thrift.ThriftCLIService: Error executing statement:
org.apache.hive.service.cli.HiveSQLException: Error running query: java.lang.NoClassDefFoundError: scala/collection/Iterable
        at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:225) ~[hive-service-2.1.1.jar:2.1.1]
        at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:276) ~[hive-service-2.1.1.jar:2.1.1]
        at org.apache.hive.service.cli.operation.Operation.run(Operation.java:324) ~[hive-service-2.1.1.jar:2.1.1]
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:499) ~[hive-service-2.1.1.jar:2.1.1]
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:486) ~[hive-service-2.1.1.jar:2.1.1]
        at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:295) ~[hive-service-2.1.1.jar:2.1.1]
        at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:506) [hive-service-2.1.1.jar:2.1.1]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_121]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_121]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_121]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_121]
        at org.apache.hive.jdbc.HiveConnection$SynchronizedHandler.invoke(HiveConnection.java:1412) [hive-jdbc-2.1.1.jar:2.1.1]
        at com.sun.proxy.$Proxy35.ExecuteStatement(Unknown Source) [?:?]
        at org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:308) [hive-jdbc-2.1.1.jar:2.1.1]
        at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:250) [hive-jdbc-2.1.1.jar:2.1.1]
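
For reference, the behaviour described above corresponds to a beeline session along the following lines. The HiveServer2 host, the default port 10000, the user name, and the table name my_table are placeholders for illustration, not values taken from the question:

# works, per the question: a plain select returns rows
beeline -u jdbc:hive2://master:10000 -n hive -e "select * from my_table limit 10"
# fails with the NoClassDefFoundError shown above once an aggregate is used
beeline -u jdbc:hive2://master:10000 -n hive -e "select count(*) from my_table"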

No answers yet.
