I am new to Spark and I want to run an application on this framework in Java. I tried the following code:
import java.util.Properties;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.from_unixtime;
import static org.apache.spark.sql.functions.unix_timestamp;

public class Alert_Arret {

    private static final SparkSession sparkSession = SparkSession.builder()
            .master("local[*]")
            .appName("Stateful Streaming Example")
            .config("spark.sql.warehouse.dir", "file:////C:/Users/sgulati/spark-warehouse")
            .getOrCreate();

    // JDBC connection properties for the MySQL source
    public static Properties Connection_db() {
        Properties connectionProperties = new Properties();
        connectionProperties.put("user", "xxxxx");
        connectionProperties.put("password", "xxxxx");
        connectionProperties.put("driver", "com.mysql.jdbc.Driver");
        connectionProperties.put("url", "xxxxxxxxxxxxxxxxx");
        return connectionProperties;
    }

    public static void GetData() {
        boolean checked = true; // unused in this snippet
        String dtable = "alerte_prog";
        String dtable2 = "last_tracking_tdays";
        // Read both tables over JDBC
        Dataset<Row> df_a_prog = sparkSession.read().jdbc("jdbc:mysql://host:port/database", dtable, Connection_db());
        // df_a_prog.printSchema();
        Dataset<Row> track = sparkSession.read().jdbc("jdbc:mysql://host:port/database", dtable2, Connection_db());
        if (df_a_prog.select("heureDebut") != null && df_a_prog.select("heureFin") != null) {
            // Reformat tracking_time as "HH:mm" and show the result
            track.withColumn("tracking_hour/minute",
                    from_unixtime(unix_timestamp(col("tracking_time")), "HH:mm")).show();
        }
    }

    public static void main(String[] args) {
        Connection_db();
        GetData();
    }
}
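For reference, the same read can also be expressed through DataFrameReader options instead of a Properties object. This is only a minimal sketch; the url, user, password and table name are placeholders, not my real values:

// Minimal sketch: the same MySQL read expressed with reader options.
// The url, user, password and table name below are placeholders.
Dataset<Row> alerts = sparkSession.read()
        .format("jdbc")
        .option("url", "jdbc:mysql://host:port/database")
        .option("dbtable", "alerte_prog")
        .option("user", "xxxxx")
        .option("password", "xxxxx")
        .option("driver", "com.mysql.jdbc.Driver")
        .load();
alerts.printSchema();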
When I run the Alert_Arret class above, nothing is displayed and I get the following output:
0/05/11 14:00:31 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped
# A fatal error has been detected by the Java Runtime Environment:
# EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x000000006c40022a,pid=3376, tid=0x0000000000002e84
I am using IntelliJ IDEA and Spark version 3.0.0.
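To show what the withColumn line is meant to produce, here is the same formatting step applied to a small in-memory Dataset. The rows are made-up sample values; only the tracking_time column name comes from my tables:

// Sketch: format a timestamp string column as "HH:mm" on made-up local data,
// the same transformation applied to the `track` Dataset above.
Dataset<Row> sample = sparkSession
        .createDataset(java.util.Arrays.asList("2020-05-11 08:30:00", "2020-05-11 17:45:10"),
                org.apache.spark.sql.Encoders.STRING())
        .toDF("tracking_time");
sample.withColumn("tracking_hour/minute",
        from_unixtime(unix_timestamp(col("tracking_time")), "HH:mm"))
        .show();
// Should show a tracking_hour/minute column with values like "08:30" and "17:45".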