Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
pyspark - Oozie - Unable to run Spark-Submit on remote server though shell action

When I log in to my edge node and run the command below, my application is submitted and completes successfully.

spark-submit --master yarn mydir/myscript.py

But I'm trying to run this through an Oozie shell action.

Here's the xml:

<workflow-app xmlns="uri:oozie:workflow:0.5" name="my_wf">
  <start to="sparksub"/>
  <action name="sparksub">
    <shell xmlns="uri:oozie:shell-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>mapred.job.queue.name</name>
          <value>${queueName}</value>
        </property>
      </configuration>
      <exec>rem.sh</exec>
      <file>${nameNode}/hdfs_path/rem.sh</file>
      <file>${nameNode}/hdfs_path/id_pvt</file>
    </shell>
    <ok to="end"/>
    <error to="failure-email"/>
  </action>
  <!-- failure-email action omitted -->
  <end name="end"/>
</workflow-app>
id_pvt is the SSH private key used to connect to the server, and rem.sh contains:

set -e
ssh -i id_pvt -o StrictHostKeyChecking=no my_user@my_node "spark-submit --master yarn mydir/myscript.py"
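One likely failure mode (an assumption, not confirmed by the logs) is that ssh is blocking on an interactive prompt: Oozie localizes `<file>` entries into the container's working directory with permissions that make ssh reject the private key and fall back to asking for a password, which hangs forever in a non-interactive container. A hardened sketch of rem.sh (my_user, my_node, and the paths are the placeholders from the question):

```shell
#!/bin/bash
set -euo pipefail

# ssh refuses private keys that are group/world-readable, so tighten
# the permissions on the localized key before using it.
chmod 600 id_pvt

# BatchMode=yes makes ssh fail immediately instead of prompting for a
# password (a prompt would hang forever in a non-interactive container);
# ConnectTimeout bounds a stalled TCP connect.
ssh -i id_pvt \
    -o StrictHostKeyChecking=no \
    -o BatchMode=yes \
    -o ConnectTimeout=10 \
    my_user@my_node "spark-submit --master yarn mydir/myscript.py"
```

With BatchMode enabled, a key or auth problem surfaces as an immediate non-zero exit (which `set -e` propagates to the action) instead of an indefinite hang.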

This is not working. In the Oozie logs, all I see is:

Heart Beat
Heart Beat
...

There's no error; it just keeps repeating like this indefinitely.
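The stdout/stderr of rem.sh end up in the shell action's launcher container logs, not in the Oozie UI message, so pulling the YARN logs for the launcher job usually shows where the script is stuck. The application id below is a hypothetical placeholder:

```shell
# Find the launcher's application id in the Oozie web console or via
# `yarn application -list`, then dump its aggregated logs.
# application_1234567890123_0001 is a placeholder, not a real id.
yarn logs -applicationId application_1234567890123_0001
```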

Please help me understand what I'm doing wrong and how to make this work. Thanks.

question from:https://stackoverflow.com/questions/65829146/oozie-unable-to-run-spark-submit-on-remote-server-though-shell-action


1 Answer

Waiting for answers.
