How to add an sbt dependency for PySpark and Flume integration

dw1jzc5e · posted 2021-06-04 in Flume

I have tried many times, but I keep running into this problem. Can someone help me add the sbt dependency for PySpark and Flume integration? Below is my command and its output.

  spark-submit --packages 'org.apache.spark:spark-streaming-flume-assembly_2.12:2.4.5' spark_flume.py
  Ivy Default Cache set to: /home/hduser/.ivy2/cache
  The jars for the packages stored in: /home/hduser/.ivy2/jars
  :: loading settings :: url = jar:file:/usr/local/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
  org.apache.spark#spark-streaming-flume-assembly_2.12 added as a dependency
  :: resolving dependencies :: org.apache.spark#spark-submit-parent-ab867e8f-f121-4402-a63c-942bac3932c1;1.0
          confs: [default]
          found org.apache.spark#spark-streaming-flume-assembly_2.12;2.4.5 in central
          found org.spark-project.spark#unused;1.0.0 in central
  :: resolution report :: resolve 812ms :: artifacts dl 15ms
          :: modules in use:
          org.apache.spark#spark-streaming-flume-assembly_2.12;2.4.5 from central in [default]
          org.spark-project.spark#unused;1.0.0 from central in [default]
          ---------------------------------------------------------------------
          |                  |            modules            ||   artifacts   |
          |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
          ---------------------------------------------------------------------
          |      default     |   2   |   0   |   0   |   0   ||   2   |   0   |
          ---------------------------------------------------------------------
  :: retrieving :: org.apache.spark#spark-submit-parent-ab867e8f-f121-4402-a63c-942bac3932c1
          confs: [default]
          0 artifacts copied, 2 already retrieved (0kB/18ms)
  20/05/15 15:35:18 WARN Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 192.168.19.137 instead (on interface ens33)
  20/05/15 15:35:18 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
  20/05/15 15:35:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  File "/home/hduser/pyspark_data1/spark_stream1/spark_flume.py", line 6
      artifactID=spark-streaming-flume_2.12
                 ^
  SyntaxError: invalid syntax
  log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).
  log4j:WARN Please initialize the log4j system properly.
  log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
  [hduser@localhost spark_stream1]$
qzlgjiam 1#

This is a SyntaxError in your spark_flume.py file, at line 6, caused by:

  artifactID=spark-streaming-flume_2.12

I believe you need spark-streaming-flume_2.12 as a string, i.e. "spark-streaming-flume_2.12". A bare, unquoted value like that is not valid Python, which is why the interpreter fails before Spark ever runs.
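
For reference, here is a minimal sketch of what spark_flume.py could look like on the Spark 2.4.x line with the spark-streaming-flume-assembly package you already pass via --packages. The host, port, and app name below are placeholder assumptions, not values from your setup:

  # Minimal sketch, assuming Spark 2.4.x and the
  # spark-streaming-flume-assembly_2.12:2.4.5 package supplied via --packages.
  from pyspark import SparkContext
  from pyspark.streaming import StreamingContext
  from pyspark.streaming.flume import FlumeUtils

  # Artifact coordinates belong on the spark-submit command line, not in the
  # script; if you do keep one in the script, it must be a quoted string:
  artifactID = "spark-streaming-flume_2.12"

  sc = SparkContext(appName="SparkFlumeIntegration")  # app name is a placeholder
  ssc = StreamingContext(sc, 10)  # 10-second batch interval

  # Receiver-based stream: Spark listens on this host/port and the Flume
  # avro sink pushes events to it (host/port are assumptions for your agent)
  stream = FlumeUtils.createStream(ssc, "localhost", 9999)
  stream.map(lambda event: event[1]).pprint()  # each event is (headers, body)

  ssc.start()
  ssc.awaitTermination()

You would then submit it exactly as you already do: spark-submit --packages 'org.apache.spark:spark-streaming-flume-assembly_2.12:2.4.5' spark_flume.py. Note that the Flume connector was removed in Spark 3.x, so this approach only works on Spark 2.x.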
