Unable to set up pyspark in Jupyter Notebook

inb24sb2 · Posted 2021-05-27 in Spark

I have successfully installed Python and Anaconda, but I ran into some problems while configuring pyspark:

  Python 3.7.6 (default, Jan 8 2020, 20:23:39)
  [MSC v.1916 64 bit (AMD64)] :: Anaconda, Inc. on win32
  Type "help", "copyright", "credits" or "license" for more information.
  20/05/24 13:33:48 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
  Setting default log level to "WARN".
  To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
  Welcome to
        ____              __
       / __/__  ___ _____/ /__
      _\ \/ _ \/ _ `/ __/ '_/
     /__ / .__/\_,_/_/ /_/\_\   version 3.0.0-preview2
        /_/
  Using Python version 3.7.6 (default, Jan 8 2020 20:23:39)
  SparkSession available as 'spark'.
  >>> 20/05/24 13:34:05 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped.

Can anyone help me with this, or point me to a blog on configuring pyspark for Jupyter Notebook?


q8l4jmvw1#

You can try setting the following environment variables on Windows:

PYSPARK_DRIVER_PYTHON_OPTS set to notebook
PYSPARK_DRIVER_PYTHON set to jupyter

Once these environment variables are set, typing pyspark in cmd will directly open a Jupyter Notebook with pyspark already configured.
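A minimal sketch of setting these variables from a Windows Command Prompt. `set` applies only to the current session, while `setx` persists the value for future sessions (note that `setx` does not affect the already-open window):

```shell
:: Apply to the current cmd session only
set PYSPARK_DRIVER_PYTHON=jupyter
set PYSPARK_DRIVER_PYTHON_OPTS=notebook

:: Persist for future sessions (takes effect in newly opened windows)
setx PYSPARK_DRIVER_PYTHON jupyter
setx PYSPARK_DRIVER_PYTHON_OPTS notebook

:: Launch pyspark; it should now start Jupyter Notebook instead of the plain shell
pyspark
```

Alternatively, these can be set once via System Properties → Environment Variables, so they apply to every new cmd window without rerunning the commands.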
