I want to combine MongoDB and Hadoop. What I found is the mongo-hadoop connector, but I cannot find complete documentation for this example.
The mongo-hadoop/examples/sensors directory contains four entries: build, run_job.sh, src, and testdata_generator.js. I generated the test data with testdata_generator.js; the database is demo.
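I invoked the generator through the mongo shell; roughly the following, assuming a local mongod on the default port (what the script actually creates depends on its contents):

# Run the data generator script against the demo database
mongo demo testdata_generator.js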
When I try to run run_job.sh, I get this exception:
MongoDB shell version: 2.6.1
connecting to: demo
false
Exception in thread "main" java.lang.ClassNotFoundException: -D
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:249)
at org.apache.hadoop.util.RunJar.main(RunJar.java:205)
run_job.sh:
#!/bin/sh
mongo demo --eval "db.logs_aggregate.drop()"
# Set your HADOOP_HOME directory here.
# export HADOOP_HOME="/Users/mike/hadoop/hadoop-2.0.0-cdh4.3.0"
export HADOOP_HOME="/home/hduser/hadoop"
# FIRST PASS - map all the devices into an output collection
declare -a job1_args
job1_args=("jar" "`pwd`/build/libs/sensors-1.2.1-SNAPSHOT-hadoop_2.2.jar")
# job1_args=(${job1_args[@]} "com.mongodb.hadoop.examples.sensors.Devices")
job1_args=(${job1_args[@]} "-D" "mongo.job.input.format=com.mongodb.hadoop.MongoInputFormat")
job1_args=(${job1_args[@]} "-D" "mongo.input.uri=mongodb://localhost:27017/demo.devices")
job1_args=(${job1_args[@]} "-D" "mongo.job.mapper=com.mongodb.hadoop.examples.sensors.DeviceMapper")
job1_args=(${job1_args[@]} "-D" "mongo.job.reducer=com.mongodb.hadoop.examples.sensors.DeviceReducer")
job1_args=(${job1_args[@]} "-D" "mongo.job.output.key=org.apache.hadoop.io.Text")
job1_args=(${job1_args[@]} "-D" "mongo.job.output.value=org.apache.hadoop.io.Text")
job1_args=(${job1_args[@]} "-D" "mongo.output.uri=mongodb://localhost:27017/demo.logs_aggregate")
job1_args=(${job1_args[@]} "-D" "mongo.job.output.format=com.mongodb.hadoop.MongoOutputFormat")
$HADOOP_HOME/bin/hadoop "${job1_args[@]}" "$1"
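A note on the stack trace above: when a jar's manifest declares no Main-Class, hadoop jar treats the first argument after the jar path as the class to run, so with the Devices line commented out, Hadoop tries to load the literal string -D as a class. Assuming that is what happens here, restoring the class-name argument should at least let the arguments parse as intended; a minimal sketch:

job1_args=("jar" "`pwd`/build/libs/sensors-1.2.1-SNAPSHOT-hadoop_2.2.jar")
job1_args=(${job1_args[@]} "com.mongodb.hadoop.examples.sensors.Devices")  # main class before any -D options
job1_args=(${job1_args[@]} "-D" "mongo.job.input.format=com.mongodb.hadoop.MongoInputFormat")
# ... the remaining -D pairs exactly as in the script above ...
$HADOOP_HOME/bin/hadoop "${job1_args[@]}" "$1"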
I can run the basic map/reduce examples on my machine, but this problem has bothered me for days...
Edit:
I can run this example with the following steps (a concrete sketch of the commands follows the list):
1. Compile Devices.java, DeviceMapper.java, DeviceReducer.java, and SensorDataGenerator.java to .class files; the command is javac -classpath [library files] -d [folder] Devices.java DeviceMapper.java DeviceReducer.java SensorDataGenerator.java
2. Package the .class files into a .jar; the command is jar -cvf [jar file name] -C [path] .
3. Run it with Hadoop; the command is hadoop jar [jar file name] [class name]
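For concreteness, here is roughly what those three steps look like; every path, classpath entry, and jar name below is a placeholder for illustration, not the exact value from my machine:

# 1. Compile the example sources; point the classpath at your Hadoop,
#    mongo-hadoop-core, and mongo-java-driver jars (paths are hypothetical)
javac -classpath "$HADOOP_HOME/share/hadoop/common/*:$HADOOP_HOME/share/hadoop/mapreduce/*:mongo-hadoop-core.jar:mongo-java-driver.jar" \
    -d classes Devices.java DeviceMapper.java DeviceReducer.java SensorDataGenerator.java
# 2. Package the compiled classes; the trailing "." tells jar to include
#    everything under the classes directory
jar -cvf sensors-example.jar -C classes .
# 3. Submit the job with an explicit main class
hadoop jar sensors-example.jar com.mongodb.hadoop.examples.sensors.Devices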
But I still don't know why I can't run run_job.sh successfully.
Devices.java is the main Java file in this example:
import java.net.UnknownHostException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.util.ToolRunner;

import com.mongodb.hadoop.MongoConfig;
import com.mongodb.hadoop.MongoInputFormat;
import com.mongodb.hadoop.MongoOutputFormat;
import com.mongodb.hadoop.io.BSONWritable;
import com.mongodb.hadoop.util.MongoTool;

public class Devices extends MongoTool {
    public Devices() throws UnknownHostException {
        Configuration conf = new Configuration();
        MongoConfig config = new MongoConfig(conf);
        setConf(conf);

        // Read from demo.devices, write the aggregate to demo.logs_aggregate
        config.setInputFormat(MongoInputFormat.class);
        config.setInputURI("mongodb://localhost:27017/demo.devices");
        config.setOutputFormat(MongoOutputFormat.class);
        config.setOutputURI("mongodb://localhost:27017/demo.logs_aggregate");

        config.setMapper(DeviceMapper.class);
        config.setReducer(DeviceReducer.class);
        config.setMapperOutputKey(Text.class);
        config.setMapperOutputValue(Text.class);
        config.setOutputKey(IntWritable.class);
        config.setOutputValue(BSONWritable.class);

        // Regenerate the sensor test data before each run
        new SensorDataGenerator().run();
    }

    public static void main(final String[] pArgs) throws Exception {
        System.exit(ToolRunner.run(new Devices(), pArgs));
    }
}
1 answer:
Run it with Gradle. These bash scripts are a bit outdated and should be removed:
./gradlew sensorData
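If that task name does not exist in your checkout (task names can differ between versions of the repository), listing the project's tasks should reveal the right one:

# Show all Gradle tasks and filter for the sensors example
./gradlew tasks --all | grep -i sensor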