How can I run a system command in Perl asynchronously?

I currently have a Perl script that runs an external command on the system, gathers the output, and performs some action based on what was returned. Right now, here is how I run this (where $cmd is a string with the command setup):

@output = `$cmd`;

I'd like to change this so if the command hangs and does not return a value after so much time then I kill the command. How would I go about running this asynchronously?


There are a LOT of ways to do this:

  • You can do this with a fork (perldoc -f fork)
  • or using threads (perldoc threads). Both of these make passing the returned information back to the main program difficult.
  • On systems that support it, you can set an alarm (perldoc -f alarm) and then clean up in the signal handler.
  • You can use an event loop like POE or Coro.
  • Instead of backticks, you can use open(), or open2/open3 (cf. IPC::Open2, IPC::Open3), to start a program while getting its STDOUT/STDERR via a file handle, and then run non-blocking read operations on it (perldoc -f select, and probably google "perl nonblocking read").
  • As a more powerful variant of the openX() functions, check out IPC::Run/IPC::Cmd (there's a small sketch of that right after this list).
  • Probably tons I can't think of in the middle of the night.
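
For example, here is a minimal sketch of the IPC::Run route with a built-in timeout (untested; /bin/sleep 2 and the 1-second limit are just placeholders for your command):

use strict;
use warnings;
use IPC::Run qw(run timeout);

# Hypothetical command and limit, purely for illustration.
my @cmd = ('/bin/sleep', '2');
my ($in, $out, $err) = ('', '', '');

# timeout() makes run() throw an exception if the command is still
# running after 1 second; eval catches it so we can react.
my $ok = eval { run \@cmd, \$in, \$out, \$err, timeout(1) };
if ($@) {
    warn "command timed out or could not be run: $@";
}
elsif (!$ok) {
    warn "command exited non-zero\n";
}
else {
    print $out;
}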


If you really just need to put a timeout on a given system call, that is a much simpler problem than asynchronous programming.

All you need is alarm() inside of an eval() block.

Here is a sample code block that puts these into a subroutine that you could drop into your code. The example calls sleep, so it isn't exciting in terms of output, but it does show you the timeout functionality you were interested in. The output of running it is:

/bin/sleep 2 failure: timeout at ./time-out line 15.

$ cat time-out
#!/usr/bin/perl

use warnings;
use strict;
my $timeout = 1;
my @cmd = qw(/bin/sleep 2);
my $response = timeout_command($timeout, @cmd);
print "$response\n" if (defined $response);

sub timeout_command {
        my $timeout = (shift);
        my @command = @_;
        undef $@;
        my $return  = eval {
                local($SIG{ALRM}) = sub {die "timeout";};
                alarm($timeout);
                my $response;
                open(CMD, '-|', @command) || die "couldn't run @command: $!\n";
                while(<CMD>) {
                        $response .= $_;
                }
                close(CMD) || die "Couldn't close execution of @command: $!\n";
                $response;
        };
        alarm(0);
        if ($@) {
                warn "@cmd failure: $@\n";
        }
        return $return;
}


If your external program doesn't take any input, look for the following words in the perlipc manpage:

Here's a safe backtick or pipe open for read:

Use the example code and guard it with an alarm (which is also explained in perlipc).
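
The pattern that section describes looks roughly like the sketch below when combined with an alarm (untested; /bin/sleep 2 and the 5-second timeout are just placeholders for your command):

use strict;
use warnings;

my @command = ('/bin/sleep', '2');   # placeholder for your real command
my $timeout = 5;
my @output;

eval {
    local $SIG{ALRM} = sub { die "timeout\n" };
    alarm($timeout);

    # Forking open from perlipc: returns the child's pid in the parent,
    # 0 in the child, undef on failure.
    my $pid = open(my $kid, '-|') // die "can't fork: $!\n";
    if ($pid) {                      # parent: read whatever the child prints
        @output = <$kid>;
        close($kid);
    }
    else {                           # child: run the command without a shell
        exec(@command);
        warn "can't exec @command: $!\n";
        exit 1;                      # exit() is not trapped by the eval above
    }
};
alarm(0);
warn "command failed or timed out: $@" if $@;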


I wrote the code below to run rsync on 20 directories simultaneously (in parallel, instead of sequentially, which would have required me to wait hours for it to complete):


use threads;

for my $user ( keys %users ) {
  my $host = $users{$user};
  async {
    system <<~ "SHELL";
      ssh $host \\
        rsync_user $user
    SHELL
  }
}

$ pgrep -lf rsync | wc -l
      20

Not sure if it's the best or even a good solution, but I was glad that it worked for my use case.

With this you get mixed output on screen (which I ignored anyway), but it does its job successfully.


The threads pragma exports the (very useful) async function by default.

rsync_user is my Perl script that wraps the rsync command with the options and the source and target directories set.

Ran on FreeBSD 13.1 with Perl 5.32.1
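
If you'd rather have the script wait for all twenty rsync jobs to finish instead of exiting while they are still running, one variation (a sketch only, assuming the same %users hash as above) is to keep the thread objects that async returns and join them:

use threads;

my @workers;
for my $user ( keys %users ) {
  my $host = $users{$user};
  my $thr = async {
    system <<~ "SHELL";
      ssh $host \\
        rsync_user $user
    SHELL
  };
  push @workers, $thr;
}

# Wait for every background rsync to finish before the script exits.
$_->join for @workers;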
