Cannot run spark-shell from the macOS Terminal

Problem description

I am trying to run spark-shell in the macOS Terminal with ./Documents/spark/spark-3.0.0-bin-hadoop2.7/bin/spark-shell, and it starts, but I would like to be able to launch it by simply typing spark-shell.

I watched a 4-minute video that shows how to set this up, but it doesn't work for me.

I don't fully understand how ~/.bash_profile works, but this is what mine looks like:

# added by Anaconda3 5.3.1 installer
# >>> conda init >>>
# !! Contents within this block are managed by 'conda init' !!

__conda_setup="$(CONDA_REPORT_ERRORS=false '/Users/ajay/anaconda3/bin/conda' shell.bash hook 2> /dev/null)"
if [ $? -eq 0 ]; then
    \eval "$__conda_setup"
else
    if [ -f "/Users/ajay/anaconda3/etc/profile.d/conda.sh" ]; then
        . "/Users/ajay/anaconda3/etc/profile.d/conda.sh"
        CONDA_CHANGEPS1=false conda activate base
    else
        \export PATH="/Users/ajay/anaconda3/bin:$PATH"
    fi
fi
unset __conda_setup
# <<< conda init <<<
export SPARK_HOME=/Users/ajay/Documents/spark/spark-3.0.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
PATH="/usr/local/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin"

$PATH gives /usr/local/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin

How do I need to change ~/.bash_profile to make spark-shell work?
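
For context, here is a minimal sketch (not from the original post) of how successive PATH lines in ~/.bash_profile interact, using the same directories as the profile above. The shell reads the file top to bottom, and a plain assignment replaces the whole variable, discarding anything an earlier export appended; note that the $PATH printed above contains no Spark entry, which matches this behaviour.

export SPARK_HOME=/Users/ajay/Documents/spark/spark-3.0.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin    # appends Spark's bin directory to PATH
# A later line that sets PATH from scratch drops the Spark entry again:
PATH="/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin"
# Including the existing value in the new assignment keeps it instead:
PATH="/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:$PATH"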

Edit

This is the message I get when I run ./Documents/spark/spark-3.0.0-bin-hadoop2.7/bin/spark-shell:

20/08/27 16:51:16 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.0.2:4040
Spark context available as 'sc' (master = local[*], app id = local-1598527288778).
Spark session available as 'spark'.

When I run just spark-shell, it shows: -bash: spark-shell: command not found
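
As a sanity check (a sketch, assuming ~/.bash_profile was edited after the Terminal session started), the profile can be re-read in the current shell and the command lookup verified with standard tools:

source ~/.bash_profile    # re-read the profile in the current session
echo $PATH                # the Spark bin directory should appear here
which spark-shell         # should print the full path to the spark-shell script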
