I am trying to copy a collection from MongoDB into Hadoop using the mongo-hadoop connector. The code is as follows:

package hdfs;

import java.io.*;

import org.apache.commons.logging.*;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.output.*;
import org.bson.*;

import com.mongodb.hadoop.*;
import com.mongodb.hadoop.util.*;

public class ImportWeblogsFromMongo {

    private static final Log log = LogFactory.getLog(ImportWeblogsFromMongo.class);

    // Map-only job: each BSON document read from MongoDB is flattened into a
    // tab-separated line keyed by its md5 field.
    public static class ReadWeblogsFromMongo extends Mapper<Object, BSONObject, Text, Text> {

        @Override
        public void map(Object key, BSONObject value, Context context)
                throws IOException, InterruptedException {
            System.out.println("Key: " + key);
            System.out.println("Value: " + value);

            String md5 = value.get("md5").toString();
            String url = value.get("url").toString();
            String date = value.get("date").toString();
            String time = value.get("time").toString();
            String ip = value.get("ip").toString();

            // Join the value fields with tabs; TextOutputFormat inserts the
            // tab between key and value itself, so no leading tab is needed.
            String output = url + "\t" + date + "\t" + time + "\t" + ip;
            context.write(new Text(md5), new Text(output));
        }
    }

    public static void main(String[] args) throws Exception {
        final Configuration conf = new Configuration();
        // Read from the "fish" collection of the "clusterdb" database.
        MongoConfigUtil.setInputURI(conf, "mongodb://localhost:27017/clusterdb.fish");
        MongoConfigUtil.setCreateInputSplits(conf, false);
        System.out.println("Configuration: " + conf);

        @SuppressWarnings("deprecation")
        final Job job = new Job(conf, "Mongo Import");
        Path out = new Path("/home/mongo_import");
        FileOutputFormat.setOutputPath(job, out);
        job.setJarByClass(ImportWeblogsFromMongo.class);
        job.setMapperClass(ReadWeblogsFromMongo.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setInputFormatClass(MongoInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setNumReduceTasks(0); // map-only job, no reducers

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
1. I exported it as importmongo.jar.
2. I tried to run the command: hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo

But I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/hadoop/util/MongoConfigUtil
at hdfs.ImportWeblogsFromMongo.main(ImportWeblogsFromMongo.java:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.util.MongoConfigUtil
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
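From the stack trace, the mongo-hadoop classes were available at compile time but are missing from the classpath when hadoop jar runs. A common remedy is to point HADOOP_CLASSPATH at the connector and MongoDB driver jars before launching the job. The sketch below assumes the jars sit in /home/yass/lib; the locations and version numbers are illustrative assumptions, not taken from the post:

# Sketch only: jar paths and versions are assumptions.
export HADOOP_CLASSPATH=/home/yass/lib/mongo-hadoop-core-2.0.2.jar:/home/yass/lib/mongo-java-driver-3.4.2.jar
hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo

Bundling both jars into a single fat jar (for example with the Maven Shade plugin) is an alternative that also makes the classes available to the map tasks, not just the client.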
Note: clusterdb is the database name, fish is its collection, and hdfs.ImportWeblogsFromMongo is the package.class.

Any suggestions?
1 Answer
I did not solve it with that approach. Instead I used mongodump and copied the dumped files into HDFS; the lines below may help someone get the job done.
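The answer's original command lines are not shown here, so what follows is only a minimal sketch of that mongodump-to-HDFS workaround. The database and collection names come from the question; every path is an assumed placeholder:

# Sketch only: /tmp/dump and the HDFS target path are assumptions.
mongodump --db clusterdb --collection fish --out /tmp/dump
hdfs dfs -mkdir -p /user/yass/mongo_import
hdfs dfs -put /tmp/dump/clusterdb/fish.bson /user/yass/mongo_import/

mongodump writes MongoDB's binary .bson format; the mongo-hadoop project also provides a BSONFileInputFormat for processing such dumps directly in MapReduce.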