sbt cannot resolve dependencies

siv3szwd posted on 2021-06-03 in Hadoop

I am running sbt behind a proxy (configured in sbt's conf file), and it fails to resolve some org.apache.hadoop#hadoop-mapreduce-client* dependencies:

[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-app;2.3.0: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-core;2.3.0: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.3.0: not found
sbt.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-app;2.3.0: not found
unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-core;2.3.0: not found
unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.3.0: not found
    at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:278)
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:175)
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:157)
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
    at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:128)
    at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:56)
    at sbt.IvySbt$$anon$4.call(Ivy.scala:64)
    at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
    at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
    at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
    at xsbt.boot.Using$.withResource(Using.scala:10)
    at xsbt.boot.Using$.apply(Using.scala:9)
    at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
    at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
    at xsbt.boot.Locks$.apply0(Locks.scala:31)
    at xsbt.boot.Locks$.apply(Locks.scala:28)
    at sbt.IvySbt.withDefaultLogger(Ivy.scala:64)
    at sbt.IvySbt.withIvy(Ivy.scala:123)
    at sbt.IvySbt.withIvy(Ivy.scala:120)
    at sbt.IvySbt$Module.withModule(Ivy.scala:151)
    at sbt.IvyActions$.updateEither(IvyActions.scala:157)
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1318)
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1315)
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1345)
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1343)
    at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1348)
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1342)
    at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
    at sbt.Classpaths$.cachedUpdate(Defaults.scala:1360)
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1300)
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1275)
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
    at sbt.std.Transform$$anon$4.work(System.scala:63)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.Execute.work(Execute.scala:235)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[error] sbt.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-app;2.3.0: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-core;2.3.0: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.3.0: not found
[error] Use 'last' for the full log.
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384M; support was removed in 8.0

Here is my build.sbt file:

name := "SparkSandbox"
version := "0.1"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.1"
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.3.0"
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.5"

I have already run sbt clean and sbt update, as suggested here. My repositories file is set up as described there.
However, when I try to compile, it still gets stuck because it cannot resolve the dependencies. What am I missing?
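
For reference (this sketch is mine, not from the original post): the sbt launcher reads repository definitions from ~/.sbt/repositories, so a proxied setup typically looks something like the following, where the mirror name and URL are placeholders:

[repositories]
  local
  corp-mirror: http://repo.example.com/nexus/content/groups/public/
  maven-central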

vjhs03f7 1#

Add the Scala version suffix to the end of the artifactId in the dependency: "org.apache.hadoop" % "hadoop-client_2.11" % "2.3.0"
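
A side note on that _2.11 suffix (my addition, not part of the original answer): sbt's %% operator appends the Scala binary version to the artifact name automatically, which only makes sense for libraries that are actually cross-published per Scala version:

// With scalaVersion := "2.11.7", these two lines resolve the same artifact (scalatest_2.11)
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.5"
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.5"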

bn31dyow 2#

sbt respects the usual environment variables for HTTP proxy settings:

export JAVA_OPTS="$JAVA_OPTS -Dhttp.proxyHost=yourserver -Dhttp.proxyPort=8080 -Dhttp.proxyUser=username -Dhttp.proxyPassword=password"
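
If the repositories are served over HTTPS, the matching https.* system properties may be needed as well; the following simply mirrors the pattern above (server, port, and credentials are placeholders):

export JAVA_OPTS="$JAVA_OPTS -Dhttps.proxyHost=yourserver -Dhttps.proxyPort=8080 -Dhttps.proxyUser=username -Dhttps.proxyPassword=password"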

(Assuming Unix (Linux/OSX etc.), of course. On Windows you can set the same environment variable in the usual Windows way (%JAVA_OPTS%).)
Credit: How to use sbt from behind proxy?

jgwigjjp 3#

I was able to solve this issue. Please stop all the daemons and start them again. Below are the steps I followed in the quickstart VM.
1. sudo service hadoop-master stop
2. sudo service hadoop-master start
3. hadoop dfsadmin -safemode leave
Now start Spark with the spark-shell command.
Then run your Spark program.
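
To confirm the NameNode really left safe mode before launching spark-shell, a quick check (not in the original answer):

hadoop dfsadmin -safemode get

This should print "Safe mode is OFF".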
