I am submitting a Spark application from code using SparkLauncher.
import java.io.File;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

// Ask spark-submit to print the final launch command to the logs.
Map<String, String> env = new HashMap<String, String>();
env.put("SPARK_PRINT_LAUNCH_COMMAND", "1");
System.out.println("Creating SparkLauncher");
SparkLauncher launcher = new SparkLauncher(env);
launcher.setSparkHome(sparkHome);
launcher.setAppResource(appResource);
launcher.setMaster(sparkMaster);
launcher.setMainClass(mainClass);
launcher.setAppName("TestFromJersey")
        .setVerbose(true)
        .setConf("spark.cores.max", "20")
        .setConf("spark.executor.memory", "30G")
        .setConf("spark.executor.extraJavaOptions", "-XX:+UseG1GC -XX:+PrintFlagsFinal -XX:+PrintReferenceGC -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintAdaptiveSizePolicy -XX:+UnlockDiagnosticVMOptions -XX:+G1SummarizeConcMark -XX:ConcGCThreads=13 -XX:NewRatio=1 -XX:+PrintTenuringDistribution");
// Ship every jar in the directory with the application.
for (File f : new File("/home/user/jars").listFiles()) {
    launcher.addJar(f.getAbsolutePath());
}
try {
    System.out.println("Launching Spark Job from SparkLauncher");
    launcher.addAppArgs("--jobName myJobName",
            "--time " + System.currentTimeMillis(),
            "--authUser admin",
            "--savePage true");
    launcher.startApplication(new SparkAppHandle.Listener() {
        @Override
        public void stateChanged(SparkAppHandle h) {
            System.out.println("App State: " + h.getState());
        }

        @Override
        public void infoChanged(SparkAppHandle h) {
        }
    });
} catch (IOException e) {
    System.out.println("error in launching Spark Application");
    e.printStackTrace();
}
Now, the job does not run at all and the listener callbacks are never invoked; I believe this is due to an error in the job configuration. In the output log I can see the following error:
org.apache.commons.cli.UnrecognizedOptionException: Unrecognized option: -jobName myJobName
So, it seems to me that the launcher is not separating the option names from their values; the sketch below shows what I suspect my main class actually receives.
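For illustration only (hypothetical values mirroring the addAppArgs call above), each string I pass seems to arrive as a single argv element:

String[] args = {
        "--jobName myJobName",   // flag and value glued into one token
        "--time 1501234567890",  // example timestamp
        "--authUser admin",
        "--savePage true"
};
// A commons-cli parser that defines a separate "jobName" option cannot match
// the combined token, hence: Unrecognized option: -jobName myJobName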
Any suggestions? Is the format I am using to add the arguments correct?
1 Answer
OK, after some trial and error, the correct way to specify the arguments is to pass each flag and its value as separate elements to addAppArgs. A minimal sketch, reusing the argument names from the question:
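launcher.addAppArgs("--jobName", "myJobName",
        "--time", String.valueOf(System.currentTimeMillis()),
        "--authUser", "admin",
        "--savePage", "true");

Each element of addAppArgs(String...) is forwarded to the application's main method as its own argv entry, so gluing a flag and its value into one string makes commons-cli see a single unknown token, which is exactly the UnrecognizedOptionException shown above.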