Suppressing PySpark warnings before creating a session

gcuhipw9 · asked 2024-01-06 · in Spark

When working locally on my machine, I am unable to suppress the PySpark warnings. I get the following warnings and would like to silence them:

  WARN Utils: Your hostname, [HOSTNAME] resolves to a loopback address: 127.0.1.1; using 192.168.26.41 instead (on interface [INTERFACE])
  WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
  Setting default log level to "WARN".
  To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
  WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Here is everything I have tried to suppress the warnings; none of it works:

  from pyspark.sql import SparkSession

  method_number = 1  # 2, 3

  def get_spark_object():
      return SparkSession.builder.getOrCreate()

  if method_number == 1:
      # Redirect stdout while the session is created
      import sys
      save_stdout = sys.stdout
      sys.stdout = open('trash', 'w')
      spark = get_spark_object()
      sys.stdout = save_stdout
  elif method_number == 2:
      # Silence Python-level warnings
      import warnings
      warnings.filterwarnings('ignore')
      warnings.simplefilter('ignore')
      spark = get_spark_object()
  elif method_number == 3:
      # Third-party "shutup" package
      import shutup
      shutup.please()
      spark = get_spark_object()

tez616oj1#

Here is an alternative approach that can help suppress these specific warnings.

  from pyspark.sql import SparkSession
  import os

  # Bind to loopback explicitly so the "resolves to a loopback address" WARN goes away
  os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"
  # When PYSPARK_SUBMIT_ARGS is set, it must end with "pyspark-shell"
  os.environ["PYSPARK_SUBMIT_ARGS"] = "--conf spark.ui.showConsoleProgress=false pyspark-shell"

  # Create Spark session
  spark = SparkSession.builder.appName("yourAppName").getOrCreate()

This way you don't need to redirect output or filter warnings; instead, you configure Spark so that it avoids emitting certain warnings in the first place.
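Environment variables alone will not remove the NativeCodeLoader or log-level startup lines, though. The route Spark's own documentation describes is a `log4j2.properties` file in Spark's `conf` directory (Spark 3.3+; older versions use `log4j.properties`). A minimal fragment, assuming a standard `$SPARK_HOME` layout:

```
# $SPARK_HOME/conf/log4j2.properties
# (start from the log4j2.properties.template shipped with Spark)
rootLogger.level = error
rootLogger.appenderRef.stdout.ref = console
```

This raises the JVM's root log level before the session starts, which is the only way to catch warnings emitted during startup itself.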
