Usage and code examples of the org.apache.spark.util.Utils.isWindows() method

x33g5p2x · reposted 2022-02-01

This article collects Java code examples for the org.apache.spark.util.Utils.isWindows() method and shows how it is used in practice. The examples are drawn from selected open-source projects on platforms such as GitHub, Stack Overflow, and Maven, and should serve as useful references. Details of the Utils.isWindows() method:
Package path: org.apache.spark.util.Utils
Class name: Utils
Method name: isWindows

About Utils.isWindows

Utils.isWindows() reports whether the JVM is running on a Windows operating system; Spark uses it to guard OS-specific behavior in code and tests.
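In recent Spark versions, Utils.isWindows effectively delegates to Apache commons-lang3's SystemUtils.IS_OS_WINDOWS, which inspects the os.name system property. The check can be sketched with the standard library alone (the class name IsWindowsSketch is illustrative, not part of Spark):

```java
public class IsWindowsSketch {

    // Minimal stand-in for Utils.isWindows: the os.name system property
    // starts with "Windows" on all Windows JVMs (e.g. "Windows 10",
    // "Windows Server 2019").
    static boolean isWindows() {
        String os = System.getProperty("os.name", "");
        return os.startsWith("Windows");
    }

    public static void main(String[] args) {
        System.out.println("isWindows=" + isWindows());
    }
}
```

The tests below use this predicate (via JUnit's assumeTrue) to skip themselves on Windows rather than fail.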

Code examples

Example source: org.apache.spark/spark-core

@Test
public void testChildProcLauncher() throws Exception {
 // This test fails on Windows because executor processes cannot be
 // launched due to the command-line length limitation. See SPARK-18718.
 assumeTrue(!Utils.isWindows());
 SparkSubmitOptionParser opts = new SparkSubmitOptionParser();
 Map<String, String> env = new HashMap<>();
 env.put("SPARK_PRINT_LAUNCH_COMMAND", "1");
 launcher
  .setMaster("local")
  .setAppResource(SparkLauncher.NO_RESOURCE)
  .addSparkArg(opts.CONF,
   String.format("%s=-Dfoo=ShouldBeOverriddenBelow", SparkLauncher.DRIVER_EXTRA_JAVA_OPTIONS))
  .setConf(SparkLauncher.DRIVER_EXTRA_JAVA_OPTIONS,
   "-Dfoo=bar -Dtest.appender=console")
  .setConf(SparkLauncher.DRIVER_EXTRA_CLASSPATH, System.getProperty("java.class.path"))
  .addSparkArg(opts.CLASS, "ShouldBeOverriddenBelow")
  .setMainClass(SparkLauncherTestApp.class.getName())
  .redirectError()
  .addAppArgs("proc");
 final Process app = launcher.launch();
 new OutputRedirector(app.getInputStream(), getClass().getName() + ".child", TF);
 assertEquals(0, app.waitFor());
}

Example source: org.apache.spark/spark-core_2.11

(The code in this example is identical to the spark-core example above.)

Example source: org.apache.spark/spark-core_2.10

@Test
public void testChildProcLauncher() throws Exception {
 // This test fails on Windows because executor processes cannot be
 // launched due to the command-line length limitation. See SPARK-18718.
 assumeTrue(!Utils.isWindows());
 SparkSubmitOptionParser opts = new SparkSubmitOptionParser();
 Map<String, String> env = new HashMap<>();
 env.put("SPARK_PRINT_LAUNCH_COMMAND", "1");
 launcher
  .setMaster("local")
  .setAppResource(SparkLauncher.NO_RESOURCE)
  .addSparkArg(opts.CONF,
   String.format("%s=-Dfoo=ShouldBeOverriddenBelow", SparkLauncher.DRIVER_EXTRA_JAVA_OPTIONS))
  .setConf(SparkLauncher.DRIVER_EXTRA_JAVA_OPTIONS,
   "-Dfoo=bar -Dtest.appender=childproc")
  .setConf(SparkLauncher.DRIVER_EXTRA_CLASSPATH, System.getProperty("java.class.path"))
  .addSparkArg(opts.CLASS, "ShouldBeOverriddenBelow")
  .setMainClass(SparkLauncherTestApp.class.getName())
  .addAppArgs("proc");
 final Process app = launcher.launch();
 new OutputRedirector(app.getInputStream(), TF);
 new OutputRedirector(app.getErrorStream(), TF);
 assertEquals(0, app.waitFor());
}
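All of the examples above gate the test with JUnit's assumeTrue(!Utils.isWindows()): when the assumption does not hold, the test is reported as skipped rather than failed. The skip semantics can be sketched with the standard library (the names AssumeSketch and AssumptionViolated are illustrative, not JUnit's actual classes):

```java
public class AssumeSketch {

    // Hypothetical analogue of JUnit's AssumptionViolatedException; a real
    // test runner catches this and marks the test as skipped, not failed.
    static class AssumptionViolated extends RuntimeException {}

    // Hypothetical assumeTrue: aborts the test early when the precondition
    // does not hold, without counting it as a failure.
    static void assumeTrue(boolean condition) {
        if (!condition) throw new AssumptionViolated();
    }

    public static void main(String[] args) {
        try {
            // Same guard as in the Spark tests: skip on Windows.
            assumeTrue(!System.getProperty("os.name", "").startsWith("Windows"));
            System.out.println("test body ran");
        } catch (AssumptionViolated e) {
            System.out.println("test skipped");
        }
    }
}
```

This explains why the launcher tests above pass on Windows CI: they never execute their bodies there, sidestepping the SPARK-18718 path-length problem.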
