I get exit code 1 when I run the Spark code from my IDE (installed on a Mac) and submit the job to the Cloudera VM (5.13). I have tried suggestions from several blogs but could not resolve the issue. The same code runs fine in local mode.
Can someone help me fix the configuration?
Error:
20/05/11 14:00:50 INFO Client: Application report for application_1589208198746_0018 (state: FAILED)
20/05/11 14:00:50 INFO Client:
client token: N/A
diagnostics: Application application_1589208198746_0018 failed 2 times due to AM Container for appattempt_1589208198746_0018_000002 exited with exitCode: 1
For more detailed output, check application tracking page:http://quickstart.cloudera:8088/proxy/application_1589208198746_0018/Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1589208198746_0018_02_000001
Exit code: 1
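The exit code alone does not show the root cause; that is usually in the AM container's stderr. Assuming the standard YARN CLI is available on the quickstart VM (and log aggregation is enabled), the full container logs for this attempt can be pulled with

yarn logs -applicationId application_1589208198746_0018

or reached through the tracking URL shown in the diagnostics above.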
Code used:
package sparkDataframe

import org.apache.spark.sql.SparkSession
import java.util.Properties

object clouderaSpark extends App {

  val spark = SparkSession.builder
    .master("yarn")
    .config("spark.hadoop.fs.defaultFS", "hdfs://112.162.1.127:8020")
    .config("spark.hadoop.yarn.resourcemanager.address", "hdfs://112.162.1.127:8032")
    .config("spark.yarn.jars", "hdfs://112.162.1.127:8020/user/sanjeebpanda/jars/*.jar")
    .config("spark.hadoop.yarn.application.classpath", "/etc/alternatives/hadoop-conf/*,/usr/lib/hadoop/lib/*,/usr/lib/hadoop/*,/usr/lib/hadoop-hdfs/*,/usr/lib/hadoop-hdfs/lib/*,/usr/lib/hadoop-mapreduce/*,/usr/lib/hadoop-mapreduce/lib/*,/usr/lib/hadoop-yarn/*,/usr/lib/hadoop-yarn/lib/*")
    .appName("Spark Word Count")
    .getOrCreate()

  println("great to start")
}
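For reference, below is a minimal sketch of an adjusted builder. It reuses the VM IP and HDFS paths from the code above; the object name clouderaSparkExample, the spark.driver.host value, and the spark.range check are illustrative additions, not part of the original code. The one substantive change is that yarn.resourcemanager.address is given as host:port (8032) rather than an hdfs:// URI, since the hdfs scheme belongs to fs.defaultFS; whether that alone fixes the AM exit code 1 depends on what the container logs show.

package sparkDataframe

import org.apache.spark.sql.SparkSession

object clouderaSparkExample extends App {

  val spark = SparkSession.builder
    .master("yarn")
    .appName("Spark Word Count")
    // HDFS NameNode (fs.defaultFS keeps the hdfs:// scheme)
    .config("spark.hadoop.fs.defaultFS", "hdfs://112.162.1.127:8020")
    // ResourceManager is addressed as host:port, without a filesystem scheme
    .config("spark.hadoop.yarn.resourcemanager.address", "112.162.1.127:8032")
    // Spark jars pre-staged in HDFS so YARN containers can localize them
    .config("spark.yarn.jars", "hdfs://112.162.1.127:8020/user/sanjeebpanda/jars/*.jar")
    // Classpath of the CDH quickstart VM, same as in the original code
    .config("spark.hadoop.yarn.application.classpath",
      "/etc/alternatives/hadoop-conf/*,/usr/lib/hadoop/lib/*,/usr/lib/hadoop/*," +
        "/usr/lib/hadoop-hdfs/*,/usr/lib/hadoop-hdfs/lib/*," +
        "/usr/lib/hadoop-mapreduce/*,/usr/lib/hadoop-mapreduce/lib/*," +
        "/usr/lib/hadoop-yarn/*,/usr/lib/hadoop-yarn/lib/*")
    // Hypothetical: the Mac's address as seen from the VM, so the AM and
    // executors can connect back to the driver running in the IDE (client mode)
    .config("spark.driver.host", "192.168.56.1")
    .getOrCreate()

  // Run a trivial job so the connection to YARN is actually exercised
  println(spark.range(10).count())

  spark.stop()
}

If the application still fails with this kind of setup, the yarn logs output mentioned above should contain the exact stack trace from the AM.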