Errors when compiling and building Hive from source

m1m5dgzv · posted 2021-06-02 in Hadoop

I have been trying to build Hive from source with Maven, but I keep hitting errors when I run the build command from the instructions. I am building Hive on the master branch. I have searched online for almost a week and have not found a solution.
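For reference, the invocation is along these lines (an assumption on my part, since the exact command from the linked build instructions is not reproduced here; `-DskipTests` is the usual way to skip the long test run):

```shell
# Build the full Hive source tree from the repository root.
# Building the whole reactor (rather than a single module) matters here,
# because hive-serde compiles against classes generated in other modules.
mvn clean install -DskipTests
```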
Here is the error I get every time I try to build Hive from source on master:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-serde: Compilation failure: Compilation failure:
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ThriftJDBCBinarySerDe.java:[42,42] package org.apache.hive.service.rpc.thrift does not exist
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ColumnBuffer.java:[29,42] package org.apache.hive.service.rpc.thrift does not exist
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ColumnBuffer.java:[30,42] package org.apache.hive.service.rpc.thrift does not exist
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ColumnBuffer.java:[31,42] package org.apache.hive.service.rpc.thrift does not exist
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ColumnBuffer.java:[32,42] package org.apache.hive.service.rpc.thrift does not exist
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ColumnBuffer.java:[33,42] package org.apache.hive.service.rpc.thrift does not exist
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ColumnBuffer.java:[34,42] package org.apache.hive.service.rpc.thrift does not exist
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ColumnBuffer.java:[35,42] package org.apache.hive.service.rpc.thrift does not exist
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ColumnBuffer.java:[36,42] package org.apache.hive.service.rpc.thrift does not exist
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ColumnBuffer.java:[37,42] package org.apache.hive.service.rpc.thrift does not exist
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/Type.java:[28,42] package org.apache.hive.service.rpc.thrift does not exist
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ColumnBuffer.java:[135,23] cannot find symbol
[ERROR] symbol:   class TColumn
[ERROR] location: class org.apache.hadoop.hive.serde2.thrift.ColumnBuffer
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/ColumnBuffer.java:[315,10] cannot find symbol
[ERROR] symbol:   class TColumn
[ERROR] location: class org.apache.hadoop.hive.serde2.thrift.ColumnBuffer
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/Type.java:[113,17] cannot find symbol
[ERROR] symbol:   class TTypeId
[ERROR] location: class org.apache.hadoop.hive.serde2.thrift.Type
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/Type.java:[119,38] cannot find symbol
[ERROR] symbol:   class TTypeId
[ERROR] location: class org.apache.hadoop.hive.serde2.thrift.Type
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/Type.java:[129,38] cannot find symbol
[ERROR] symbol:   class TTypeId
[ERROR] location: class org.apache.hadoop.hive.serde2.thrift.Type
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/Type.java:[133,38] cannot find symbol
[ERROR] symbol:   class TTypeId
[ERROR] location: class org.apache.hadoop.hive.serde2.thrift.Type
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/Type.java:[153,30] cannot find symbol
[ERROR] symbol:   class TTypeId
[ERROR] location: class org.apache.hadoop.hive.serde2.thrift.Type
[ERROR] /home/elbasir/hive/serde/src/java/org/apache/hadoop/hive/serde2/thrift/Type.java:[435,10] cannot find symbol
[ERROR] symbol:   class TTypeId
[ERROR] location: class org.apache.hadoop.hive.serde2.thrift.Type
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-serde
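The log's suggested resume command, with `<goals>` filled in with the goals I am using (the `clean install -DskipTests` part is my own invocation, not something the log states), would look like this:

```shell
# Resume the failed reactor build from the hive-serde module, as the
# Maven output above suggests. The goals shown are assumed, not from the log.
mvn clean install -DskipTests -rf :hive-serde
```

Note that the missing package, `org.apache.hive.service.rpc.thrift`, appears to come from another Hive module, so resuming from `hive-serde` alone may not help unless that module was already installed into the local repository.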

Here is the relevant part of my pom file:

<groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>${maven.compiler.plugin.version}</version>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-antrun-plugin</artifactId>
      <version>${maven.antrun.plugin.version}</version>
      <dependencies>
        <dependency>
          <groupId>ant-contrib</groupId>
          <artifactId>ant-contrib</artifactId>
          <version>${ant.contrib.version}</version>
          <exclusions>
            <exclusion>
              <groupId>ant</groupId>
              <artifactId>ant</artifactId>
            </exclusion>
          </exclusions>
        </dependency>
      </dependencies>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-assembly-plugin</artifactId>
      <version>${maven.assembly.plugin.version}</version>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-eclipse-plugin</artifactId>
      <version>${maven.eclipse.plugin.version}</version>
      <configuration>
        <downloadJavadocs>true</downloadJavadocs>
        <downloadSources>true</downloadSources>
        <workspaceActiveCodeStyleProfileName>Hive</workspaceActiveCodeStyleProfileName>
        <workspaceCodeStylesURL>${basedir}/dev-support/eclipse-styles.xml</workspaceCodeStylesURL>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-checkstyle-plugin</artifactId>
      <version>${maven.checkstyle.plugin.version}</version>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-enforcer-plugin</artifactId>
      <version>${maven.enforcer.plugin.version}</version>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-install-plugin</artifactId>
      <version>${maven.install.plugin.version}</version>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>${maven.shade.plugin.version}</version>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>2.18.1</version>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <version>${maven.jar.plugin.version}</version>
      <configuration>
        <archive>
          <manifest>
            <addDefaultImplementationEntries>true</addDefaultImplementationEntries>
          </manifest>
        </archive>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-war-plugin</artifactId>
      <version>${maven.war.plugin.version}</version>
    </plugin>
    <plugin>
      <groupId>org.datanucleus</groupId>
      <artifactId>datanucleus-maven-plugin</artifactId>
      <version>${datanucleus.maven.plugin.version}</version>
      <dependencies>
        <dependency>
          <groupId>org.datanucleus</groupId>
          <artifactId>datanucleus-core</artifactId>
          <version>${datanucleus-core.version}</version>
        </dependency>
      </dependencies>
    </plugin>
    <plugin>
      <groupId>org.apache.felix</groupId>
      <artifactId>maven-bundle-plugin</artifactId>
      <version>${felix.version}</version>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <version>${maven.dependency.plugin.version}</version>
    </plugin>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>build-helper-maven-plugin</artifactId>
      <version>${maven.build-helper.plugin.version}</version>
    </plugin>
  </plugins>
</pluginManagement>

<plugins>
  <!-- plugins are always listed in sorted order by groupId, artifactId -->
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <executions>
      <execution>
        <id>define-classpath</id>
        <phase>process-resources</phase>
        <goals>
          <goal>run</goal>
        </goals>
        <configuration>
          <exportAntProperties>true</exportAntProperties>
          <target>
            <property name="maven.test.classpath" refid="maven.test.classpath"/>
          </target>
        </configuration>
      </execution>
      <execution>
        <id>setup-test-dirs</id>
        <phase>process-test-resources</phase>
        <goals>
          <goal>run</goal>
        </goals>
        <configuration>
          <target>
            <delete dir="${test.tmp.dir}" />
            <delete dir="${test.conf.dir}" />
            <delete dir="${test.warehouse.dir}" />
            <mkdir dir="${test.tmp.dir}" />
            <mkdir dir="${test.warehouse.dir}" />
            <mkdir dir="${test.conf.dir}" />
            <!-- copies hive-site.xml so it can be modified -->
            <copy todir="${test.conf.dir}">
              <fileset dir="${basedir}/${hive.path.to.root}/data/conf/"/>
            </copy>
          </target>
        </configuration>
      </execution>
    </executions>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-clean-plugin</artifactId>
    <version>2.5</version>
    <configuration>
      <filesets>
        <fileset>
          <directory>./</directory>
          <includes>
            <include>datanucleus.log</include>
            <include>derby.log</include>
          </includes>
          <followSymlinks>false</followSymlinks>
        </fileset>
        <fileset>
          <directory>build</directory>
          <followSymlinks>false</followSymlinks>
        </fileset>
      </filesets>
    </configuration>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-checkstyle-plugin</artifactId>
    <configuration>
      <configLocation>${checkstyle.conf.dir}/checkstyle.xml</configLocation>
      <propertyExpansion>basedir=${checkstyle.conf.dir}</propertyExpansion>
    </configuration>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <executions>
      <execution>
        <id>enforce-no-snapshots</id>
        <goals>
          <goal>enforce</goal>
        </goals>
        <configuration>
          <rules>
            <requireReleaseDeps>
              <message>Release builds are not allowed to have SNAPSHOT dependencies</message>
              <searchTransitive>true</searchTransitive>
              <onlyWhenRelease>true</onlyWhenRelease>
            </requireReleaseDeps>
          </rules>
          <fail>true</fail>
        </configuration>
      </execution>
      <execution>
        <id>enforce-banned-dependencies</id>
        <goals>
          <goal>enforce</goal>
        </goals>
        <configuration>
          <rules>
            <bannedDependencies>
              <excludes>
                <!--LGPL licenced library-->
                <exclude>com.google.code.findbugs:annotations</exclude>
              </excludes>
            </bannedDependencies>
          </rules>
          <fail>true</fail>
        </configuration>
      </execution>
    </executions>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
      <excludes>
        <exclude>**/TestSerDe.java</exclude>
        <exclude>**/TestHiveMetaStore.java</exclude>
        <exclude>**/ql/exec/vector/util/*.java</exclude>
        <exclude>**/ql/exec/vector/udf/legacy/*.java</exclude>
        <exclude>**/ql/exec/vector/udf/generic/*.java</exclude>
        <exclude>**/TestHiveServer2Concurrency.java</exclude>
        <exclude>${test.excludes.additional}</exclude>
        <exclude>${skip.spark.files}</exclude>
      </excludes>
      <redirectTestOutputToFile>true</redirectTestOutputToFile>
      <reuseForks>false</reuseForks>
      <failIfNoTests>false</failIfNoTests>
      <argLine>${maven.test.jvm.args}</argLine>
      <trimStackTrace>false</trimStackTrace>
      <additionalClasspathElements>
        <additionalClasspathElement>${test.conf.dir}</additionalClasspathElement>
        <additionalClasspathElement>${basedir}/${hive.path.to.root}/conf</additionalClasspathElement>
      </additionalClasspathElements>
      <environmentVariables>
        <TZ>US/Pacific</TZ>
        <LANG>en_US.UTF-8</LANG>
        <HADOOP_CLASSPATH>${test.conf.dir}:${basedir}/${hive.path.to.root}/conf</HADOOP_CLASSPATH>
        <HIVE_HADOOP_TEST_CLASSPATH>${test.hive.hadoop.classpath}</HIVE_HADOOP_TEST_CLASSPATH>
        <SPARK_SUBMIT_CLASSPATH>${spark.home}/lib/spark-assembly-${spark.version}-hadoop2.4.0.jar:${test.hive.hadoop.classpath}</SPARK_SUBMIT_CLASSPATH>
        <SPARK_OSX_TEST_OPTS>-Dorg.xerial.snappy.tempdir=/tmp -Dorg.xerial.snappy.lib.name=libsnappyjava.jnilib</SPARK_OSX_TEST_OPTS>
        <PATH>${env.PATH}${test.extra.path}</PATH>
      </environmentVariables>
      <systemPropertyVariables>
        <build.dir>${project.build.directory}</build.dir>
        <!-- required by zk test ClientBase -->
        <build.test.dir>${test.tmp.dir}</build.test.dir>
        <!-- required by a few tests to find the derby jar -->
        <derby.version>${derby.version}</derby.version>
        <derby.stream.error.file>${test.tmp.dir}/derby.log</derby.stream.error.file>
        <hadoop.bin.path>${hadoop.bin.path}</hadoop.bin.path>
        <!-- required by Hadoop's JobHistory -->
        <hadoop.log.dir>${test.tmp.dir}</hadoop.log.dir>
        <hive.root>${basedir}/${hive.path.to.root}/</hive.root>
        <hive.version>${project.version}</hive.version>
        <!-- required for hive-exec jar path and tests which reference a jar -->
        <maven.local.repository>${maven.repo.local}</maven.local.repository>
        <mapred.job.tracker>local</mapred.job.tracker>
        <log4j.configurationFile>${test.log4j.scheme}${test.conf.dir}/hive-log4j2.properties</log4j.configurationFile>
        <hive.test.console.log.level>${test.console.log.level}</hive.test.console.log.level>
        <log4j.debug>true</log4j.debug>
        <!-- don't dirty up /tmp -->
        <java.io.tmpdir>${test.tmp.dir}</java.io.tmpdir>
        <spark.home>${spark.home}</spark.home>
        <!-- Hadoop's minidfs class uses this -->
        <test.build.data>${test.tmp.dir}</test.build.data>
        <!-- required by QTestUtil -->
        <test.data.files>${basedir}/${hive.path.to.root}/data/files</test.data.files>
        <test.data.dir>${basedir}/${hive.path.to.root}/data/files</test.data.dir>
        <test.tmp.dir>${test.tmp.dir}</test.tmp.dir>
        <test.tmp.dir.uri>${test.tmp.dir.uri}</test.tmp.dir.uri>
        <test.dfs.mkdir>${test.dfs.mkdir}</test.dfs.mkdir>
        <test.output.overwrite>${test.output.overwrite}</test.output.overwrite>
        <test.warehouse.dir>${test.warehouse.scheme}${test.warehouse.dir}</test.warehouse.dir>
        <java.net.preferIPv4Stack>true</java.net.preferIPv4Stack>
        <!-- EnforceReadOnlyTables hook and QTestUtil -->
        <test.src.tables>src,src1,srcbucket,srcbucket2,src_json,src_thrift,src_sequencefile,srcpart,alltypesorc,src_hbase,cbo_t1,cbo_t2,cbo_t3,src_cbo,part,lineitem</test.src.tables>
        <java.security.krb5.conf>${test.conf.dir}/krb5.conf</java.security.krb5.conf>
        <!-- Required by spark to work around SPARK-14958 -->
        <antlr.version>${antlr.version}</antlr.version>
        <qfile>${qfile}</qfile>
        <initScript>${initScript}</initScript>
        <clustermode>${clustermode}</clustermode>
        <qfile_regex>${qfile_regex}</qfile_regex>
        <run_disabled>${run_disabled}</run_disabled>
      </systemPropertyVariables>
    </configuration>
  </plugin>
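In case it helps narrow things down, one way to check whether the serde module actually resolves the artifact that should provide the missing package (the module path `serde` and the artifact coordinates are assumptions based on the error paths above, not something I have confirmed):

```shell
# Show where (or whether) hive-service-rpc appears in the serde
# module's dependency tree. dependency:tree and -Dincludes are
# standard maven-dependency-plugin options.
mvn dependency:tree -pl serde -Dincludes=org.apache.hive:hive-service-rpc
```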
