
How can a Hadoop job kill itself?


Is there any way for a Hadoop job to kill itself, or to send a signal to kill it? I read configuration settings from the JobConf, and if the user has specified wrong settings I need to kill the job or raise an error, but the map/reduce configure() method does not allow throwing a checked exception.

public void configure(JobConf job) {
    System.out.println("Inside config start processing");
    try {
        String strFileName = job.get("hadoop.rules");
        LoadFile(strFileName);
    } catch (Exception e) {
        e.printStackTrace();
        // Here I need to write code to kill the job
    }
}


In the configure() method, just throw a RuntimeException.
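Applied to the snippet from the question, that looks roughly like this (a minimal sketch; LoadFile and the hadoop.rules key come from the question):

public void configure(JobConf job) {
    try {
        String strFileName = job.get("hadoop.rules");
        LoadFile(strFileName);
    } catch (Exception e) {
        // An unchecked exception fails the task attempt; once the retry
        // limit is exhausted, the framework fails the whole job.
        throw new RuntimeException("Bad hadoop.rules setting: " + e.getMessage(), e);
    }
}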

Better yet, if possible, perform your validation before the job is run.
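For example, a driver could check the setting on the client and refuse to submit (a hypothetical sketch; RulesDriver is an assumed class name, and hadoop.rules is the key from the question):

import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class RulesDriver {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(RulesDriver.class);
        // Validate before submission so a bad value never reaches the cluster.
        String rulesFile = conf.get("hadoop.rules");
        if (rulesFile == null || rulesFile.isEmpty()) {
            System.err.println("hadoop.rules is not set; refusing to submit");
            System.exit(1);  // fail fast, before any task is scheduled
        }
        // ... set mapper/reducer classes and input/output paths here ...
        JobClient.runJob(conf);
    }
}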


Alternatively, save the failure state in a boolean field (e.g. kill) during configure(), check that flag at the start of the map step, and throw an IOException there.
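A sketch of that approach against the old mapred API (the mapper's key/value types and the LoadFile helper are assumptions based on the question):

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class RulesMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, Text> {

    private boolean kill = false;  // set in configure(), checked in map()

    @Override
    public void configure(JobConf job) {
        try {
            LoadFile(job.get("hadoop.rules"));
        } catch (Exception e) {
            // configure() cannot throw a checked exception, so record the
            // failure and act on it when map() runs.
            kill = true;
        }
    }

    public void map(LongWritable key, Text value,
                    OutputCollector<Text, Text> output, Reporter reporter)
            throws IOException {
        if (kill) {
            // map() may throw IOException; repeated task failures make the
            // framework fail the job.
            throw new IOException("Invalid hadoop.rules setting");
        }
        // ... normal record processing ...
    }

    private void LoadFile(String fileName) throws Exception {
        // ... load and parse the rules file (from the question) ...
    }
}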

