spark sbt error fetching artifact https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/2.6.5/hadoop-hdfs-2.6.5.jar: wrong checksum

m1m5dgzv · posted 2021-05-29 in Spark

When I created my first Spark/Scala project in IntelliJ, my sbt setup worked fine. The build.sbt was:

    name := "sample1"
    version := "0.1"

    // running in cluster
    //scalaVersion := "2.11.0"
    //val sparkVersion = "2.1.1"

    // running locally
    scalaVersion := "2.11.12"
    val sparkVersion = "2.3.2"

    // dependencies when running in cluster using spark-submit
    /*
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
    )
    */

    // dependencies when running locally
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion,
      "org.apache.spark" %% "spark-sql" % sparkVersion,
      "org.apache.spark" %% "spark-mllib" % sparkVersion,
      "org.apache.spark" %% "spark-streaming" % sparkVersion
    )

But when I create a new project with a new sbt setup, I get an error. I tried changing versions, but that did not help. The new project's build.sbt is:

    name := "ranger_policy"
    version := "0.1"
    scalaVersion := "2.11.11"
    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.11" % "2.2.0",
      "org.apache.spark" % "spark-sql_2.11" % "2.2.0",
      "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
    )

The error is:

    [error] stack trace is suppressed; run 'last update' for the full output
    [error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
    [error] (update) lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
    [error] https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/2.6.5/hadoop-hdfs-2.6.5.jar: wrong checksum: C:\Users\vidushi.jaiswal\AppData\Local\Coursier\cache\v1\https\repo1.maven.org\maven2\org\apache\hadoop\hadoop-hdfs\2.6.5\hadoop-hdfs-2.6.5.jar (expected SHA-1 8bd0f95e29b9ba7960b4239a7f3706b37b183652 in C:\Users\vidushi.jaiswal\AppData\Local\Coursier\cache\v1\https\repo1.maven.org\maven2\org\apache\hadoop\hadoop-hdfs\2.6.5\.hadoop-hdfs-2.6.5.jar__sha1, got 495045c7fe5110559fa09d7f2381cd3483189a7)
    [error] (ssExtractDependencies) lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
    [error] https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/2.6.5/hadoop-hdfs-2.6.5.jar: wrong checksum: C:\Users\vidushi.jaiswal\AppData\Local\Coursier\cache\v1\https\repo1.maven.org\maven2\org\apache\hadoop\hadoop-hdfs\2.6.5\hadoop-hdfs-2.6.5.jar (expected SHA-1 8bd0f95e29b9ba7960b4239a7f3706b37b183652 in C:\Users\vidushi.jaiswal\AppData\Local\Coursier\cache\v1\https\repo1.maven.org\maven2\org\apache\hadoop\hadoop-hdfs\2.6.5\.hadoop-hdfs-2.6.5.jar__sha1, got 495045c7fe5110559fa09d7f2381cd3483189a7)
    [error] Total time: 247 s (04:07), completed Jun 7, 2020 5:35:48 PM
    [info] shutting down sbt server
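(Editor's note: separately from the checksum failure, the failing build.sbt hard-codes Scala suffixes and mixes binary versions — `spark-core_2.11` and `spark-sql_2.11` alongside `spark-mllib_2.10` — which cannot coexist on one classpath. A sketch of a consistent configuration, assuming Spark 2.2.0 is wanted throughout and using `%%` so sbt appends the Scala suffix itself:)

```scala
// Hypothetical consistent build.sbt: one Scala binary version (2.11)
// and one Spark version for every module; %% appends "_2.11" automatically.
name := "ranger_policy"
version := "0.1"
scalaVersion := "2.11.11"

val sparkVersion = "2.2.0"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % sparkVersion,
  "org.apache.spark" %% "spark-sql"   % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion
)
```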

fae0ux8s #1

I hit the same problem when importing the spark-sql package through sbt. There were multiple spark-sql folders, i.e. spark-sql_2.12 and spark-sql_2.11. When I deleted them and refreshed build.sbt, the problem was resolved.
Under the path C:\Users\vidushi.jaiswal\AppData\Local\Coursier\cache\v1\https\repo1.maven.org\maven2\org\apache\hadoop\hadoop-hdfs
there can be multiple folders for different versions of the same package. Delete them all and refresh build.sbt, and all dependencies will be downloaded fresh.
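This works because coursier stores each downloaded jar next to a hidden `.<jar>__sha1` file recording the checksum it expects; when the actual file no longer matches (as in the log above), resolution keeps failing until the bad copy is removed and re-fetched. A minimal sketch of that check from a shell, assuming the Windows cache path from the error log accessed through a Unix-style shell such as Git Bash (adjust the path for your machine):

```shell
# Compare the cached jar's SHA-1 against the checksum coursier recorded,
# and purge the directory if they disagree so sbt re-downloads the artifact.
CACHE="$HOME/AppData/Local/Coursier/cache/v1/https/repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/2.6.5"
JAR="$CACHE/hadoop-hdfs-2.6.5.jar"

expected=$(cat "$CACHE/.hadoop-hdfs-2.6.5.jar__sha1")
actual=$(sha1sum "$JAR" | awk '{print $1}')

if [ "$actual" != "$expected" ]; then
  echo "stale or corrupt download: removing cached copy"
  rm -rf "$CACHE"
fi
```

After deleting, a `reload` / refresh of build.sbt in sbt or IntelliJ triggers a clean fetch.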
