Cannot start Spark on macOS: spark-shell won't launch, and Py4JJavaError

Problem description

I recently downloaded Spark 3.0.1 (pre-built for Apache Hadoop 3.2 and later) on my Mac.

I ran into a Py4JJavaError while testing a simple example; the full notebook traceback is reproduced below.

So I tried to run spark-shell from the /bin folder, but it would not start either.

I am currently using Java 15.0.1 and Python 3.8.

Error output when launching spark-shell:

jisujung@Jisuui-MacBookPro ~ % /Users/jisujung/Downloads/spark-2.4.7-bin-hadoop2.7/bin/spark-shell ; exit;
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/Users/jisujung/Downloads/spark-2.4.7-bin-hadoop2.7/jars/spark-unsafe_2.11-2.4.7.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
21/01/24 16:32:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
21/01/24 16:32:32 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    (the same WARN line is repeated 16 times in the log)
21/01/24 16:32:32 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at java.base/sun.nio.ch.Net.bind0(Native Method)
    at java.base/sun.nio.ch.Net.bind(Net.java:550)
    at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:249)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:550)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:248)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:832)
21/01/24 16:32:32 ERROR Main: Failed to initialize Spark session.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at java.base/sun.nio.ch.Net.bind0(Native Method)
    at java.base/sun.nio.ch.Net.bind(Net.java:550)
    at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:249)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:550)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:248)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:832)

[process finished]

Error message in the Jupyter notebook:

Py4JJavaError                             Traceback (most recent call last)
<ipython-input-13-1d1e12ca7331> in <module>
      2 MAX_MEMORY = "32g"
      3 spark = (
----> 4     SparkSession.builder.appName("Test Shell")
      5     .config("spark.driver.memory",MAX_MEMORY)
      6     .getOrCreate()

/opt/anaconda3/lib/python3.8/site-packages/pyspark/sql/session.py in getOrCreate(self)
    184                             sparkConf.set(key, value)
    185                         # This SparkContext may be an existing one.
--> 186                         sc = SparkContext.getOrCreate(sparkConf)
    187                     # Do not update `SparkConf` for existing `SparkContext`, as it's shared
    188                     # by all sessions.

/opt/anaconda3/lib/python3.8/site-packages/pyspark/context.py in getOrCreate(cls, conf)
    374         with SparkContext._lock:
    375             if SparkContext._active_spark_context is None:
--> 376                 SparkContext(conf=conf or SparkConf())
    377             return SparkContext._active_spark_context
    378 

/opt/anaconda3/lib/python3.8/site-packages/pyspark/context.py in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    133         SparkContext._ensure_initialized(self,gateway=gateway,conf=conf)
    134         try:
--> 135             self._do_init(master,
    136                           conf, profiler_cls)
    137         except:

/opt/anaconda3/lib/python3.8/site-packages/pyspark/context.py in _do_init(self, profiler_cls)
    196 
    197         # Create the Java SparkContext through Py4J
--> 198         self._jsc = jsc or self._initialize_context(self._conf._jconf)
    199         # Reset the SparkConf to the one actually used by the SparkContext in JVM.
    200         self._conf = SparkConf(_jconf=self._jsc.sc().conf())

/opt/anaconda3/lib/python3.8/site-packages/pyspark/context.py in _initialize_context(self,jconf)
    313         Initialize SparkContext in function to allow subclass specific initialization
    314         """
--> 315         return self._jvm.JavaSparkContext(jconf)
    316 
    317     @classmethod

/opt/anaconda3/lib/python3.8/site-packages/py4j/java_gateway.py in __call__(self,*args)
   1566 
   1567         answer = self._gateway_client.send_command(command)
-> 1568         return_value = get_return_value(
   1569             answer,self._gateway_client,None,self._fqn)
   1570 

/opt/anaconda3/lib/python3.8/site-packages/py4j/protocol.py in get_return_value(answer,gateway_client,target_id,name)
    324             value = OUTPUT_CONVERTER[type](answer[2:],gateway_client)
    325             if answer[1] == REFERENCE_TYPE:
--> 326                 raise Py4JJavaError(
    327                     "An error occurred while calling {0}{1}{2}.\n".
    328                     format(target_id,".",name),value)

Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at java.base/sun.nio.ch.Net.bind0(Native Method)
    at java.base/sun.nio.ch.Net.bind(Net.java:550)
    at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:249)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:550)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:248)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:832)

Solution

No confirmed fix for this problem has been posted yet.
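That said, the error text contains its own hint: "Consider explicitly setting the appropriate binding address ... (for example spark.driver.bindAddress)". Below is a minimal, untested sketch of that workaround, reusing the MAX_MEMORY value and app name from the notebook snippet in the question; the session-building lines are shown as comments so the sketch stands on its own without a Spark installation (and note the real PySpark method is getOrCreate, with a capital O).

```python
import os

# Setting SPARK_LOCAL_IP before the JVM is launched is a widely used macOS
# workaround for the "Can't assign requested address" bind failure.
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"

# Driver options mirroring the hint in the error message; MAX_MEMORY and the
# app name come from the notebook snippet above.
MAX_MEMORY = "32g"
driver_conf = {
    "spark.driver.memory": MAX_MEMORY,
    "spark.driver.bindAddress": "127.0.0.1",
    "spark.driver.host": "127.0.0.1",
}

# With pyspark installed, the session would then be created like this:
#
#   from pyspark.sql import SparkSession
#   builder = SparkSession.builder.appName("Test Shell")
#   for key, value in driver_conf.items():
#       builder = builder.config(key, value)
#   spark = builder.getOrCreate()
```

For spark-shell, the shell-level equivalent is exporting SPARK_LOCAL_IP=127.0.0.1 before launching. Separately, note that the shell path points at a Spark 2.4.7 build while the question mentions 3.0.1; Spark 2.4.x officially targets Java 8 (and 3.0.x Java 8/11), so Java 15.0.1 is outside the supported range in either case and is worth ruling out as a second cause.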

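On macOS, this particular BindException is also frequently caused by the machine's hostname not resolving to any address (for example, after renaming the Mac or joining a new network). A small diagnostic sketch to check for that condition:

```python
import socket

# If the local hostname does not resolve, Spark's driver cannot bind to it.
# A commonly reported workaround is adding a line such as
# "127.0.0.1   <your-hostname>" to /etc/hosts.
host = socket.gethostname()
try:
    addr = socket.gethostbyname(host)
    result = f"{host} resolves to {addr}"
except socket.gaierror:
    result = f"{host} does not resolve; this would explain the bind failure"
print(result)
```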
