Spark: Cannot assign requested address

wljmcqd8  posted 2021-07-12 in Spark

This question already has answers here:

How to fix "Cannot assign requested address: Service 'sparkDriver' failed after 16 retries" when running Spark code (6 answers)

Closed last month.
When I run the test example ./bin/run-example SparkPi 10, I get the error below.

Edit: the problem was that I had switched to WiFi instead of Ethernet, which changed the localhost address. @mck's pointer to the earlier solution was helpful.

Solution: add SPARK_LOCAL_IP to the load-spark-env.sh file located in the spark/bin directory: export SPARK_LOCAL_IP="127.0.0.1" (a sketch of this fix follows the log below).

The error I get:

    WARNING: An illegal reflective access operation has occurred
    WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/home/d/spark/jars/spark-unsafe_2.12-3.1.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
    WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
    WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
    WARNING: All illegal access operations will be denied in a future release
    2021-03-09 15:37:39,164 INFO spark.SparkContext: Running Spark version 3.1.1
    2021-03-09 15:37:39,214 INFO resource.ResourceUtils: ==============================================================
    2021-03-09 15:37:39,215 INFO resource.ResourceUtils: No custom resources configured for spark.driver.
    2021-03-09 15:37:39,215 INFO resource.ResourceUtils: ==============================================================
    2021-03-09 15:37:39,216 INFO spark.SparkContext: Submitted application: Spark Pi
    2021-03-09 15:37:39,240 INFO resource.ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
    2021-03-09 15:37:39,257 INFO resource.ResourceProfile: Limiting resource is cpus at 1 tasks per executor
    2021-03-09 15:37:39,259 INFO resource.ResourceProfileManager: Added ResourceProfile id: 0
    2021-03-09 15:37:39,335 INFO spark.SecurityManager: Changing view acls to: d
    2021-03-09 15:37:39,335 INFO spark.SecurityManager: Changing modify acls to: d
    2021-03-09 15:37:39,335 INFO spark.SecurityManager: Changing view acls groups to: 
    2021-03-09 15:37:39,335 INFO spark.SecurityManager: Changing modify acls groups to: 
    2021-03-09 15:37:39,335 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(d); groups with view permissions: Set(); users  with modify permissions: Set(d); groups with modify permissions: Set()
    2021-03-09 15:37:39,545 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,557 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,572 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,585 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,597 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,608 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,612 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,641 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,646 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,650 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,654 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,658 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,663 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,673 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,676 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,682 WARN util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    2021-03-09 15:37:39,705 ERROR spark.SparkContext: Error initializing SparkContext.
    java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
        at java.base/sun.nio.ch.Net.bind0(Native Method)
        at java.base/sun.nio.ch.Net.bind(Net.java:455)
        at java.base/sun.nio.ch.Net.bind(Net.java:447)
        at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:227)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:550)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:248)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:834)
    2021-03-09 15:37:39,723 INFO spark.SparkContext: Successfully stopped SparkContext
    Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
        at java.base/sun.nio.ch.Net.bind0(Native Method)
        at java.base/sun.nio.ch.Net.bind(Net.java:455)
        at java.base/sun.nio.ch.Net.bind(Net.java:447)
        at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:227)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:550)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:248)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:834)
    2021-03-09 15:37:39,730 INFO util.ShutdownHookManager: Shutdown hook called
    2021-03-09 15:37:39,731 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-b53dc8d9-adc8-454b-83f5-bd2826004dee
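
For reference, a minimal sketch of the fix from the edit above, assuming a standard Spark layout where load-spark-env.sh sits under $SPARK_HOME/bin (conf/spark-env.sh is the more conventional place for this kind of setting):

    # load-spark-env.sh (or conf/spark-env.sh): force the driver to bind to
    # the loopback address, so that a hostname change (e.g. switching from
    # Ethernet to WiFi) cannot break the 'sparkDriver' service bind.
    export SPARK_LOCAL_IP="127.0.0.1"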

wmtdaxz3 1#

Solution: add SPARK_LOCAL_IP to the load-spark-env.sh file located in the spark/bin directory: export SPARK_LOCAL_IP="127.0.0.1".
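
Note that the exception message itself suggests an alternative: set spark.driver.bindAddress explicitly. A hedged one-liner, assuming run-example forwards the usual spark-submit options:

    # Bind the driver to loopback for this run only, with no file edits
    ./bin/run-example --conf spark.driver.bindAddress=127.0.0.1 SparkPi 10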
