I'm not very familiar with Cascading/Hadoop and am trying to run a simple example in local mode (i.e. in memory). The example just copies a file:
import java.util.Properties;
import cascading.flow.Flow;
import cascading.flow.FlowConnector;
import cascading.flow.FlowDef;
import cascading.flow.local.LocalFlowConnector;
import cascading.pipe.Pipe;
import cascading.property.AppProps;
import cascading.scheme.hadoop.TextLine;
import cascading.tap.Tap;
import cascading.tap.hadoop.Hfs;
public class CascadingTest {
    public static void main(String[] args) {
        Properties properties = new Properties();
        AppProps.setApplicationJarClass( properties, CascadingTest.class );
        FlowConnector flowConnector = new LocalFlowConnector();

        // create the source tap
        Tap inTap = new Hfs( new TextLine(), "D:\\git_workspace\\Impatient\\part1\\data\\rain.txt" );

        // create the sink tap
        Tap outTap = new Hfs( new TextLine(), "D:\\git_workspace\\Impatient\\part1\\data\\out.txt" );

        // specify a pipe to connect the taps
        Pipe copyPipe = new Pipe( "copy" );

        // connect the taps, pipes, etc., into a flow
        FlowDef flowDef = FlowDef.flowDef()
            .addSource( copyPipe, inTap )
            .addTailSink( copyPipe, outTap );

        // run the flow
        Flow flow = flowConnector.connect( flowDef );
        flow.complete();
    }
}
Here is the error I get:
09-25-12 11:30:38,114 INFO - AppProps - using app.id: 9C82C76AC667FDAA2F6969A0DF3949C6
Exception in thread "main" cascading.flow.planner.PlannerException: could not build flow from assembly: [java.util.Properties cannot be cast to org.apache.hadoop.mapred.JobConf]
at cascading.flow.planner.FlowPlanner.handleExceptionDuringPlanning(FlowPlanner.java:515)
at cascading.flow.local.planner.LocalPlanner.buildFlow(LocalPlanner.java:84)
at cascading.flow.FlowConnector.connect(FlowConnector.java:454)
at com.x.y.CascadingTest.main(CascadingTest.java:37)
Caused by: java.lang.ClassCastException: java.util.Properties cannot be cast to org.apache.hadoop.mapred.JobConf
at cascading.tap.hadoop.Hfs.sourceConfInit(Hfs.java:78)
at cascading.flow.local.LocalFlowStep.initTaps(LocalFlowStep.java:77)
at cascading.flow.local.LocalFlowStep.getInitializedConfig(LocalFlowStep.java:56)
at cascading.flow.local.LocalFlowStep.createFlowStepJob(LocalFlowStep.java:135)
at cascading.flow.local.LocalFlowStep.createFlowStepJob(LocalFlowStep.java:38)
at cascading.flow.planner.BaseFlowStep.getFlowStepJob(BaseFlowStep.java:588)
at cascading.flow.BaseFlow.initializeNewJobsMap(BaseFlow.java:1162)
at cascading.flow.BaseFlow.initialize(BaseFlow.java:184)
at cascading.flow.local.planner.LocalPlanner.buildFlow(LocalPlanner.java:78)
... 2 more
3 Answers
Answer 1
Welcome to Cascading --
I just answered this on the cascading-user mailing list, but in short: the problem is a mix of local-mode and Hadoop-mode classes. This code uses a LocalFlowConnector but then uses Hfs taps.
When I switch back to the classes used in the "Impatient" tutorial, it runs correctly: https://gist.github.com/3784194
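For comparison, here is a minimal sketch of a consistent Hadoop-mode pairing (not a copy of the gist): the Hfs taps stay, but the connector becomes a HadoopFlowConnector. This assumes Cascading 2.x; the class name CascadingHadoopCopy is my own, and running Hadoop mode against Windows paths like these may need additional Hadoop setup on the machine.

import java.util.Properties;

import cascading.flow.Flow;
import cascading.flow.FlowDef;
import cascading.flow.hadoop.HadoopFlowConnector;
import cascading.pipe.Pipe;
import cascading.property.AppProps;
import cascading.scheme.hadoop.TextLine;
import cascading.tap.SinkMode;
import cascading.tap.Tap;
import cascading.tap.hadoop.Hfs;

public class CascadingHadoopCopy {
    public static void main( String[] args ) {
        Properties properties = new Properties();
        AppProps.setApplicationJarClass( properties, CascadingHadoopCopy.class );

        // Hadoop-mode connector paired with Hadoop-mode scheme and taps
        HadoopFlowConnector flowConnector = new HadoopFlowConnector( properties );

        // Hfs taps are configured through a Hadoop JobConf, which this connector supplies
        Tap inTap = new Hfs( new TextLine(), "D:\\git_workspace\\Impatient\\part1\\data\\rain.txt" );
        Tap outTap = new Hfs( new TextLine(), "D:\\git_workspace\\Impatient\\part1\\data\\out.txt", SinkMode.REPLACE );

        // a pass-through pipe simply copies source tuples to the sink
        Pipe copyPipe = new Pipe( "copy" );

        FlowDef flowDef = FlowDef.flowDef()
            .setName( "copy-hadoop" )
            .addSource( copyPipe, inTap )
            .addTailSink( copyPipe, outTap );

        Flow flow = flowConnector.connect( flowDef );
        flow.complete();
    }
}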
Answer 2
To elaborate a bit: you cannot mix local and Hadoop classes in Cascading, because they assume different and incompatible environments. In your case, you are trying to build a local flow with Hadoop taps; the Hadoop taps expect a Hadoop JobConf rather than the Properties object used to configure local taps. You can fix this by using cascading.tap.local.FileTap instead of cascading.tap.hadoop.Hfs.
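A minimal sketch of the all-local version, assuming Cascading 2.x (the class name CascadingLocalCopy is mine; note that the scheme also switches to cascading.scheme.local.TextLine so that no Hadoop classes remain):

import cascading.flow.Flow;
import cascading.flow.FlowDef;
import cascading.flow.local.LocalFlowConnector;
import cascading.pipe.Pipe;
import cascading.scheme.local.TextLine;
import cascading.tap.SinkMode;
import cascading.tap.Tap;
import cascading.tap.local.FileTap;

public class CascadingLocalCopy {
    public static void main( String[] args ) {
        // local-mode connector paired with local-mode scheme and taps
        LocalFlowConnector flowConnector = new LocalFlowConnector();

        // FileTap reads and writes plain files directly, no Hadoop involved
        Tap inTap = new FileTap( new TextLine(), "D:\\git_workspace\\Impatient\\part1\\data\\rain.txt" );
        Tap outTap = new FileTap( new TextLine(), "D:\\git_workspace\\Impatient\\part1\\data\\out.txt", SinkMode.REPLACE );

        // a pass-through pipe simply copies source tuples to the sink
        Pipe copyPipe = new Pipe( "copy" );

        FlowDef flowDef = FlowDef.flowDef()
            .setName( "copy-local" )
            .addSource( copyPipe, inTap )
            .addTailSink( copyPipe, outTap );

        Flow flow = flowConnector.connect( flowDef );
        flow.complete();
    }
}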
Answer 3
Yes, you need to use the Lfs (local file system) tap instead of the Hfs (Hadoop file system) tap.
You can also test your code in local mode / from Eclipse with JUnit test cases (using the Cascading unittest jar).
http://www.cascading.org/2012/08/07/cascading-for-the-impatient-part-6/
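Note that cascading.tap.hadoop.Lfs is still a Hadoop-mode tap (it reads and writes the local file system through the Hadoop APIs), so it pairs with a HadoopFlowConnector rather than a LocalFlowConnector. Swapping it into the Hadoop-mode sketch shown earlier is only a change of tap class, roughly:

// requires: import cascading.tap.hadoop.Lfs;
// drop-in replacements for the Hfs taps in the Hadoop-mode sketch above
Tap inTap = new Lfs( new TextLine(), "D:\\git_workspace\\Impatient\\part1\\data\\rain.txt" );
Tap outTap = new Lfs( new TextLine(), "D:\\git_workspace\\Impatient\\part1\\data\\out.txt", SinkMode.REPLACE );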