Unit testing code with winutils and an HBase mini-cluster

ddrv8njm posted on 2021-06-08 in HBase

I am trying to bring up an HBase mini-cluster by following this tutorial:
http://blog.cloudera.com/blog/2013/09/how-to-test-hbase-applications-using-popular-tools/
The only difference is that I am doing it in Scala.
I have also set up winutils (because my error was the same as the well-known "Hadoop on Windows: YARN fails to start with java.lang.UnsatisfiedLinkError" problem).
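For reference, this is roughly how the winutils wiring looks on my side; a minimal sketch, where D:\hadoop is a placeholder directory containing bin\winutils.exe and bin\hadoop.dll built for the same Hadoop version as the test classpath (2.6.0-cdh5.9.1 here):

// Minimal sketch, run before the mini-cluster starts.
// D:\hadoop is a placeholder; it must contain bin\winutils.exe and
// bin\hadoop.dll matching the Hadoop version on the test classpath.
object WindowsHadoopSetup {
  def configure(): Unit = {
    // Hadoop resolves winutils.exe through hadoop.home.dir (or HADOOP_HOME)
    if (System.getProperty("hadoop.home.dir") == null) {
      System.setProperty("hadoop.home.dir", "D:\\hadoop")
    }
    // NOTE: hadoop.dll is located via java.library.path, which the JVM reads
    // only once at startup, so D:\hadoop\bin must already be on PATH (or be
    // passed with -Djava.library.path) before `mvn test` launches the test JVM.
  }
}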
Below is my pom file:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.vikas</groupId>
  <artifactId>TestHBase</artifactId>
  <version>1.0</version>
  <name>${project.artifactId}</name>
  <description>My wonderful scala app</description>
  <inceptionYear>2015</inceptionYear>
  <licenses>
    <license>
      <name>My License</name>
      <url>http://....</url>
      <distribution>repo</distribution>
    </license>
  </licenses>

  <repositories>
    <repository>
      <id>cloudera</id>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
    <repository>
      <id>apache-repo</id>
      <name>Apache Repository</name>
      <url>https://repository.apache.org/content/repositories/releases</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
    <repository>
      <id>cloudera-repo-releases</id>
      <url>https://repository.cloudera.com/artifactory/repo/</url>
    </repository>
  </repositories>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.10.5</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.scalatest</groupId>
      <artifactId>scalatest_2.10</artifactId>
      <version>2.2.6</version>
      <scope>test</scope>
      <exclusions>
        <exclusion>
          <!-- make sure wrong scala version is not pulled in -->
          <groupId>org.scala-lang</groupId>
          <artifactId>scala-library</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.specs</groupId>
      <artifactId>specs</artifactId>
      <version>1.2.5</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-testing-util</artifactId>
      <version>1.2.0-cdh5.9.1</version>
      <scope>test</scope>
    </dependency>

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.6.0-cdh5.9.1</version>
      <scope>provided</scope>
    </dependency>

    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-common</artifactId>
      <version>1.2.0-cdh5.9.1</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-client</artifactId>
      <version>1.2.0-cdh5.9.1</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.6.0-cdh5.9.1</version>
      <scope>provided</scope>
    </dependency>
    <!--
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
      <version>1.7.5</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-api</artifactId>
      <version>2.1</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-core</artifactId>
      <version>2.1</version>
      <scope>provided</scope>
    </dependency>
    -->
  </dependencies>

  <build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <version>2.15.2</version>
        <executions>
          <execution>
            <goals> 
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>2.5.1</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.18.1</version>
        <configuration>
          <useFile>false</useFile>
          <disableXmlReport>true</disableXmlReport>
          <!-- If you have classpath issue like NoDefClassError,... -->
          <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
          <includes>
            <include>**/Test*.*</include>
            <include>**/*Suite.*</include>
          </includes>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-report-plugin</artifactId>
        <version>2.18.1</version>
        <configuration>
          <outputDirectory>target/surefire-report</outputDirectory>
        </configuration>
        <executions>
          <execution>
            <id>during-tests</id>
            <phase>test</phase>
            <goals>
              <goal>report</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

Below is my test case:

import com.vikas.TestHBaseGetSpark
import org.junit.Assert._
import org.apache.hadoop.hbase.HBaseTestingUtility
import org.apache.hadoop.hbase.client.{Get, Put, Result}
import org.apache.hadoop.hbase.util.Bytes
import org.junit.{After, Before}
import org.junit.Test

object TestMyHBaseIntegration{
  val utility:HBaseTestingUtility=new HBaseTestingUtility
}

class TestMyHBaseIntegration {

  val test: Array[Byte] = "TestTable".getBytes
  val CF: Array[Byte] = "CF".getBytes
  val CQ1: Array[Byte] = "CQ-1".getBytes
  val CQ2: Array[Byte] = "CQ-2".getBytes

  @Before
  def setup(): Unit = {
    TestMyHBaseIntegration.utility.startMiniCluster()
  }

  @After
  def teardown(): Unit = {
    // stop the mini-cluster after each test so repeated runs do not leak it
    TestMyHBaseIntegration.utility.shutdownMiniCluster()
  }

  @Test
  def testFetch(): Unit = {
    val table = TestMyHBaseIntegration.utility.createTable(test, CF)
    val objPut:Put = new Put(Bytes.toBytes("123"))
    objPut.addColumn(CF, CQ1, Bytes.toBytes("abc"))
    objPut.addColumn(CF, CQ2, Bytes.toBytes("xyz"))
    table.put(objPut)
    // getTableName returns a byte[], so convert it before comparing
    assertEquals("TestTable", Bytes.toString(table.getTableName))
  }
}
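To mirror the tutorial's Put-then-Get pattern, I would also read the row back once the cluster boots; a minimal sketch of such a method (it goes inside the class above and reuses its Get and Result imports; the table name TestGetTable is a placeholder):

  // Sketch only: add inside TestMyHBaseIntegration. "TestGetTable" is a
  // placeholder name so it does not clash with the table used in testFetch.
  @Test
  def testPutThenGet(): Unit = {
    val table = TestMyHBaseIntegration.utility.createTable("TestGetTable".getBytes, CF)

    // write one row with two cells
    val put = new Put(Bytes.toBytes("123"))
    put.addColumn(CF, CQ1, Bytes.toBytes("abc"))
    put.addColumn(CF, CQ2, Bytes.toBytes("xyz"))
    table.put(put)

    // read the row back and assert on both cell values
    val result: Result = table.get(new Get(Bytes.toBytes("123")))
    assertEquals("abc", Bytes.toString(result.getValue(CF, CQ1)))
    assertEquals("xyz", Bytes.toString(result.getValue(CF, CQ2)))
  }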

When I run mvn test from the CLI, I get the following error:

Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 13.094 sec <<< FAILURE! - in TestMyHBaseIntegration
testFetch(TestMyHBaseIntegration)  Time elapsed: 13.093 sec  <<< ERROR!
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
        at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:999)
        at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:540)
        at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:500)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:212)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:1097)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:779)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:614)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:676)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:844)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:823)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1547)
        at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1114)
        at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:985)
        at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:814)
        at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:745)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:585)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:987)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:868)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:862)
        at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:806)
        at TestMyHBaseIntegration.setup(TestMyHBaseIntegration.scala:25)

Has anyone here run into a similar problem? Could you help me figure out what I am doing wrong?
PS: I am using Java 8.
The following messages are also printed when running mvn test (note the final NativeCodeLoader warning, which suggests the native Hadoop library is still not being picked up):

17/09/19 18:31:40 INFO hbase.HBaseCommonTestingUtility: Starting up minicluster with 1 master(s) and 1 regionserver(s) and 1 datanode(s)
17/09/19 18:31:40 INFO hbase.HBaseCommonTestingUtility: Created new mini-cluster data directory: D:\scala_dev\TestHBase\target\test-data\642abce9-f187-4f95-a9c3-50e3913228e6\dfscluster_2b336f9c-ba28-4fcd-bd93-bdb954b52fa5, deleteOnExit=true
17/09/19 18:31:40 INFO hbase.HBaseCommonTestingUtility: Setting test.cache.data to D:/scala_dev/TestHBase/target/test-data/642abce9-f187-4f95-a9c3-50e3913228e6/cache_data in system properties and HBase conf
17/09/19 18:31:40 INFO hbase.HBaseCommonTestingUtility: Setting hadoop.tmp.dir to D:/scala_dev/TestHBase/target/test-data/642abce9-f187-4f95-a9c3-50e3913228e6/hadoop_tmp in system properties and HBase conf
17/09/19 18:31:40 INFO hbase.HBaseCommonTestingUtility: Setting hadoop.log.dir to D:/scala_dev/TestHBase/target/test-data/642abce9-f187-4f95-a9c3-50e3913228e6/hadoop_logs in system properties and HBase conf
17/09/19 18:31:40 INFO hbase.HBaseCommonTestingUtility: Setting mapreduce.cluster.local.dir to D:/scala_dev/TestHBase/target/test-data/642abce9-f187-4f95-a9c3-50e3913228e6/mapred_local in system properties and HBase conf
17/09/19 18:31:40 INFO hbase.HBaseCommonTestingUtility: Setting mapreduce.cluster.temp.dir to D:/scala_dev/TestHBase/target/test-data/642abce9-f187-4f95-a9c3-50e3913228e6/mapred_temp in system properties and HBase conf
17/09/19 18:31:40 INFO hbase.HBaseCommonTestingUtility: read short circuit is OFF
17/09/19 18:31:40 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Thanks.
