Does anyone know what makes Spark take so long to load its shell? The spark-shell command takes around 6 minutes to start for me, which I assume is not normal. I am running this on a Hadoop cluster made up of 4 Raspberry Pi 4 boards with 4 GB of RAM each. Here is the spark-shell startup output:
Java HotSpot(TM) Server VM warning: You have loaded library /opt/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
2020-07-10 09:05:07,903 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2020-07-10 09:05:47,663 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
Spark context Web UI available at http://pi1:4040
Spark context available as 'sc' (master = yarn, app id = application_1594337770867_0003).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 3.0.0
/_/
Using Scala version 2.12.10 (Java HotSpot(TM) Server VM, Java 1.8.0_251)
Type in expressions to have them evaluated.
Type :help for more information.
Any advice or help would be greatly appreciated!
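One hint in the log above is the warning "Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME" — in that fallback mode, Spark re-uploads all of its jars to HDFS on every launch, which is slow on Raspberry Pi hardware. A minimal sketch of the documented workaround (the HDFS path `/spark/jars` and archive name `spark-libs.jar` are my own placeholder choices, not anything from the log):

```shell
# Package the Spark jars once (uncompressed, hence the 0 flag) ...
jar cv0f spark-libs.jar -C $SPARK_HOME/jars/ .

# ... and put the archive on HDFS so YARN can cache it across launches.
hdfs dfs -mkdir -p /spark/jars
hdfs dfs -put spark-libs.jar /spark/jars/

# Then point Spark at it in $SPARK_HOME/conf/spark-defaults.conf:
#   spark.yarn.archive  hdfs:///spark/jars/spark-libs.jar
```

This is only a sketch of one likely contributor to the startup time, assuming the cluster's slow I/O makes the per-launch upload expensive; it does not rule out other causes.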