I'm using Thread::Pool::Simple to create a few worker threads. Each worker thread does some stuff, including a call to chdir followed by the execution of an external Perl script (from the jbrowse genome browser, if it matters). I use capturex to call the external script and die on its failure.
I discovered that when I use more than one thread, things start to get messy. After some research, it seems that the current directory of some threads is not the correct one. Perhaps chdir propagates between threads (i.e. isn't thread-safe)? Or perhaps it's something with capturex?
So, how can I safely set the working directory for each thread?
** UPDATE **
Following the suggestions to change directory while executing, I'd like to ask how exactly I should pass these two commands to capturex.
Currently I have:
my @args = ( "bin/flatfile-to-json.pl", "--gff=$gff_file", "--tracklabel=$track_label", "--key=$key", @optional_args );
capturex( [0], @args );
How do I add another command to @args? Will capturex continue to die on errors from either command?
I think that you can solve your "how do I chdir in the child before running the command" problem pretty easily by abandoning IPC::System::Simple as not the right tool for the job.
Instead of doing
my $output = capturex($cmd, @args);
do something like:
use autodie qw(open close);

my $pid = open my $fh, '-|';
unless ($pid) {                        # this is the child
    chdir($wherever) or exit 255;      # bail out rather than run in the wrong directory
    exec($cmd, @args) or exit 255;
}
my $output = do { local $/; <$fh> };
# If the child exited with an error or couldn't be run, the exception will
# be raised here (via autodie; feel free to replace it with your own handling)
close($fh);
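To connect this back to the capturex call in the question, the variables above might be filled in roughly like this (the jbrowse directory path is a placeholder of mine):

my $wherever = '/path/to/jbrowse';         # wherever your jbrowse checkout lives
my $cmd      = 'bin/flatfile-to-json.pl';
my @args     = ( "--gff=$gff_file", "--tracklabel=$track_label",
                 "--key=$key", @optional_args );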
If you were getting a list of lines instead of scalar output from capturex, the only thing that needs to change is the second-to-last line (to my @output = <$fh>;).
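Spelled out, the list-of-lines variant of the same pattern (same assumptions about $wherever, $cmd and @args as above) would be:

my $pid = open my $fh, '-|';
unless ($pid) {                        # this is the child
    chdir($wherever) or exit 255;
    exec($cmd, @args) or exit 255;
}
my @output = <$fh>;                    # one element per line of child output
close($fh);                            # autodie still raises if the child failed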
More info on forking-open is in perldoc perlipc.
The good thing about this, in preference to capture("chdir wherever ; $cmd @args"), is that it doesn't give the shell a chance to do bad things to your @args.
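To illustrate (the track key with a space in it is a made-up example): when everything is glued into one shell string, the shell re-splits and re-interprets the arguments, while the list form of exec passes each element to the child's @ARGV untouched.

my $gff_file = 'tracks/genes.gff3';    # hypothetical value
my @args = ( "--gff=$gff_file", "--key=My Favourite Track" );

# Single shell string: the shell splits "--key=My Favourite Track" into
# three words, and any quotes, globs or semicolons in interpolated values
# get interpreted as well:
#   capture("chdir wherever ; bin/flatfile-to-json.pl @args");

# List form (as in the fork/exec above): each element reaches the child
# exactly as written, with no shell involved:
#   exec('bin/flatfile-to-json.pl', @args);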
Updated code (doesn't capture output)
my $pid = fork;
die "Couldn't fork: $!" unless defined $pid;
unless ($pid) {                        # this is the child
    chdir($wherever) or exit 255;
    open STDOUT, ">/dev/null";         # optional: silence subprocess output
    open STDERR, ">/dev/null";         # even more optional
    exec($cmd, @args) or exit 255;
}
waitpid $pid, 0;                       # reap this specific child, not just any child of the process
die "Child error $?" if $?;
I don't think "current working directory" is a per-thread property; I'd expect it to be a property of the process.
It's not clear exactly why you need to use chdir at all, though. Can you not launch the external script with the new process's working directory set appropriately instead? That sounds like a more feasible approach.