I need to read images from HBase and convert them to an OpenCV Mat for face detection.
My code is as follows:
public static class FaceCountMapper extends TableMapper<Text, Text> {

    private CascadeClassifier faceDetector;

    public void setup(Context context) throws IOException, InterruptedException {
        if (context.getCacheFiles() != null && context.getCacheFiles().length > 0) {
            URI mappingFileUri = context.getCacheFiles()[0];
            if (mappingFileUri != null) {
                System.out.println(mappingFileUri);
                faceDetector = new CascadeClassifier(mappingFileUri.toString());
            }
        }
        super.setup(context);
    } // setup()

    public ArrayList<Object> detectFaces(Mat image, String file_name) {
        ArrayList<Object> facemap = new ArrayList<Object>();
        MatOfRect faceDetections = new MatOfRect();
        faceDetector.detectMultiScale(image, faceDetections);
        System.out.println(String.format("Detected %s faces", faceDetections.toArray().length));
        facemap.add(faceDetections.toArray().length);
        return facemap;
    }
    public void map(ImmutableBytesWritable row, Result result, Context context)
            throws InterruptedException, IOException {
        String file_name = Bytes.toString(result.getValue(Bytes.toBytes("Filename"), Bytes.toBytes("data")));
        String mimetype = Bytes.toString(result.getValue(Bytes.toBytes("mime"), Bytes.toBytes("data")));
        byte[] image_data = result.getValue(Bytes.toBytes("Data"), Bytes.toBytes("data"));

        BufferedImage bi = ImageIO.read(new ByteArrayInputStream(image_data));
        Mat mat = new Mat(bi.getHeight(), bi.getWidth(), CvType.CV_8UC3);
        mat.put(0, 0, image_data);

        detectFaces(mat, file_name);
    }
} // FaceCountMapper
The job configuration is as follows:
Configuration conf = this.getConf();
conf.set("hbase.master", "101.192.0.122:16000");
conf.set("hbase.zookeeper.quorum", "101.192.0.122");
conf.setInt("hbase.zookeeper.property.clientPort", 2181);
conf.set("zookeeper.znode.parent", "/hbase-unsecure");
// Initialize and configure MapReduce job
Job job = Job.getInstance(conf);
job.setJarByClass(FaceCount3.class);
job.setMapperClass(FaceCountMapper.class);
job.getConfiguration().set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
job.getConfiguration().set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
Scan scan = new Scan();
scan.setCaching(500); // 1 is the default in Scan, which will be bad for
// MapReduce jobs
scan.setCacheBlocks(false); // don't set to true for MR jobs
TableMapReduceUtil.initTableMapperJob("Image", // input HBase table name
scan, // Scan instance to control CF and attribute selection
FaceCountMapper.class, // mapper
null, // mapper output key
null, // mapper output value
job);
job.setOutputFormatClass(NullOutputFormat.class); // because we aren't
// emitting anything
// from mapper
job.addCacheFile(new URI("/user/hduser/haarcascade_frontalface_alt.xml"));
job.addFileToClassPath(new Path("/user/hduser/hipi-2.1.0.jar"));
job.addFileToClassPath(new Path("/user/hduser/javacpp.jar"));
DistributedCache.addFileToClassPath(new Path("/user/hduser/haarcascade_frontalface_alt.xml"), conf);
conf.set("mapred.job.tracker", "local");
// Execute the MapReduce job and block until it completes
boolean success = job.waitForCompletion(true);
// Return success or failure
return success ? 0 : 1;
When I run it, I get this error:

java.lang.Exception: java.lang.UnsatisfiedLinkError: org.opencv.objdetect.CascadeClassifier.CascadeClassifier_1(Ljava/lang/String;)J

However, opencv.jar is provided on the HADOOP_CLASSPATH.
1 Answer
An UnsatisfiedLinkError is thrown when an application tries to load a native library (a .so on Linux, a .dll on Windows, or a .dylib on macOS) and that library does not exist. Specifically, in order to find the required native library, the JVM looks in both the PATH environment variable and the java.library.path system property. In addition, if the library has already been loaded and the application tries to load it again, the JVM also throws UnsatisfiedLinkError. You must verify that the native library is present on the application's java.library.path or on PATH. If the library still cannot be found, try passing the library's absolute path to System.load (System.loadLibrary takes only the bare library name). In your case, run the method below from the caller and see what the classpath elements are.
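A minimal sketch of such a check, printing both the JVM classpath and the native-library search path (the class name ClasspathDebug is made up for illustration):

```java
public class ClasspathDebug {
    public static void main(String[] args) {
        // Entries the JVM searches for .class files and jars
        String classPath = System.getProperty("java.class.path");
        for (String entry : classPath.split(java.io.File.pathSeparator)) {
            System.out.println(entry);
        }
        // Directories System.loadLibrary() searches for native libraries
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
    }
}
```

If the OpenCV jar appears on the classpath but the directory holding its native library is absent from java.library.path, the CascadeClassifier constructor will fail exactly as shown in the question.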
Based on that output, you can adjust the classpath entries (in this case the OpenCV jar and its native library) and see whether it works.
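As a self-contained illustration of the failure mode, loading a library name that is not on java.library.path raises UnsatisfiedLinkError (the library name below is deliberately fictitious):

```java
public class NativeLoadDemo {
    public static void main(String[] args) {
        try {
            // A deliberately non-existent library name: the JVM searches
            // java.library.path, finds nothing, and throws
            System.loadLibrary("no_such_native_lib_demo");
            System.out.println("loaded");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("UnsatisfiedLinkError: " + e.getMessage());
        }
    }
}
```

The same exception surfaces in the mapper when new CascadeClassifier(...) invokes OpenCV's native binding before the library has been loaded; calling System.loadLibrary(Core.NATIVE_LIBRARY_NAME), or System.load with an absolute path to the .so, in setup() before constructing the classifier is the usual remedy.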