Stop a Bash script if Hive fails

Date: 2015-03-16 17:08:22

Tags: shell hadoop hive

I have a bash script that loops through a folder and processes all the *.hql files. Sometimes one of the hive scripts fails (syntax, resource constraints, etc.), and instead of the script failing, it continues on to the next .hql file.

Is there any way I can stop bash from processing the rest? Below is my sample bash:

j=0
for i in `ls ${layer}/*.hql`; do
    echo "Processing $i ..."
    # Run hive in the background so several files are processed in parallel
    hive ${hiveconf_all} -hiveconf DATE=${date} -f ${i} &
    # Throttle: after ~6 background jobs, wait for them before continuing
    if [ $j -le 5 ]; then
        j=$(( j+1 ))
    else
        wait
        j=0
    fi
done

3 Answers:

Answer 0: (score: 2)

I would check the exit status of the previous command and call the exit command to break out of the loop:

 (( $? != 0 )) && exit 1

Insert the line above after the hive command and that should do it.
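Note that in the loop as posted, hive is launched in the background with &, so $? would only reflect whether the job was started, not whether hive succeeded. A minimal sketch of this answer's approach, assuming hive runs in the foreground:

for i in ${layer}/*.hql; do
    echo "Processing $i ..."
    # Run hive in the foreground so $? is hive's own exit status
    hive ${hiveconf_all} -hiveconf DATE=${date} -f ${i}
    # Abort the remaining .hql files on the first failure
    (( $? != 0 )) && exit 1
done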

Answer 1: (score: 0)

Add

set -e

to the top of your script.
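A minimal sketch of this approach. Note that set -e only aborts on foreground failures, so this assumes the & is dropped from the hive command (a failing background job does not trigger set -e):

#!/bin/bash
set -e  # Exit immediately when any foreground command returns non-zero

for i in ${layer}/*.hql; do
    echo "Processing $i ..."
    # Without '&', a hive failure here terminates the whole script
    hive ${hiveconf_all} -hiveconf DATE=${date} -f ${i}
done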

Answer 2: (score: 0)

Use this template to run parallel processes and wait for them to complete. Add your date, layer, hiveconf_all and other variables:

#!/bin/bash
set -e
set -o pipefail  # Propagate hive's failure through the pipe to tee

# Run parallel processes and write their logs
log_dir=/tmp/my_script_logs
mkdir -p "${log_dir}"
for i in ${layer}/*.hql; do
    echo "Processing $i ..."
    # Run hive in parallel and redirect to the log file
    hive ${hiveconf_all} -hiveconf DATE=${date} -f ${i} 2>&1 | tee "${log_dir}/$(basename "$i").log" &
done

# Now wait for all processes to complete
FAILED=0

for job in `jobs -p`
do
    echo "job=$job"
    wait $job || let "FAILED+=1"
done

if [ "$FAILED" != "0" ]; then
    echo "Execution FAILED!  ($FAILED)"
    # Do something here, log or send message, etc
    exit 1
fi

# All processes completed successfully!
# Do something here
echo "Done successfully"

You will then be able to check each process's log individually.
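For example, since the hive CLI normally prints an error line starting with "FAILED:" when a script fails, a quick grep over the log directory from the template above shows which files to look at:

# List the logs that contain a Hive error line
grep -l 'FAILED:' /tmp/my_script_logs/*.log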