Setting
PATH="$HOME/bin:$HOME/.local/bin:$PATH:/home/username/Installs/Spark/bin"
would let you run the executable scripts such as spark-shell, spark-submit, pyspark, etc. without needing to give the full path to them.
Besides setting PATH, you would also need to set
SPARK_HOME=/home/username/Installs/Spark
which is used internally when you start a Spark cluster or when you use spark-submit.
If you are setting the variables in the .bashrc file, you need the export keyword too, as in
export SPARK_HOME=/home/username/Installs/Spark
and if you don't want to reboot Ubuntu to test that it worked, type
. ~/.bashrc
on the command line (on a default Ubuntu setup, . ~/.profile also works, since .profile sources .bashrc), then try your Spark command.
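Putting the pieces together, the additions to ~/.bashrc could look like the sketch below. The install path /home/username/Installs/Spark is the illustrative one used above; adjust it to wherever Spark is actually unpacked on your machine. The quick check at the end confirms the variables took effect in the current shell.

```shell
# Illustrative ~/.bashrc additions -- the install path is an example, not canonical.
export SPARK_HOME="$HOME/Installs/Spark"
export PATH="$HOME/bin:$HOME/.local/bin:$PATH:$SPARK_HOME/bin"

# Quick sanity check (run after ". ~/.bashrc"): is the Spark bin dir on PATH?
echo "$SPARK_HOME"
echo "$PATH" | grep -q "$SPARK_HOME/bin" && echo "Spark bin directory is on PATH"
```

Note that $SPARK_HOME/bin is appended after the existing $PATH, so your own scripts in ~/bin and ~/.local/bin still take precedence over any same-named Spark script.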