Most examples of in-place editing are one-liners that iterate through a file or files, reading and printing one line at a time.
I can't find any examples of reading an entire file into an array, modifying the array as needed, and then printing the array while using the $^I variable to do an in-place edit. When I try to read the entire file from the diamond operator, edit the contents, and print the entire contents, I find that the print goes to STDOUT instead of ARGVOUT, and that ARGVOUT is closed. I can open the same file for output and then print to it, but I'm not sure I understand why that is necessary. Here is an example:
#!/usr/bin/perl
use strict;
use warnings;
use 5.010;
my $filename = 'test.txt';
push @ARGV, $filename;
$^I = ".bk";
my @file = <>; #Read all records into array
chomp @file;
push @file, qw(add a few more lines);
print join "\n", @file; #This prints to STDOUT, and ARGVOUT is closed. Why?
Running the above makes a backup of the test.txt file as expected, but leaves the edited test.txt empty, printing the edited contents to STDOUT instead.
See perlrun.
When the -i switch has been invoked, perl starts the program using ARGVOUT as the default file handle instead of STDOUT. If there are multiple input files, then every time the <>, <ARGV>, or readline(ARGV) operation finishes with one of the input files, it closes ARGVOUT and reopens it to write to the next output file name. Once all the input from <> is exhausted (when there are no more files to process), perl closes ARGVOUT and restores STDOUT as the default file handle again. Or as perlrun says:
#!/usr/bin/perl -pi.orig
s/foo/bar/;
is equivalent to
#!/usr/bin/perl
$extension = '.orig';
LINE: while (<>) {
    if ($ARGV ne $oldargv) {
        if ($extension !~ /\*/) {
            $backup = $ARGV . $extension;
        }
        else {
            ($backup = $extension) =~ s/\*/$ARGV/g;
        }
        rename($ARGV, $backup);
        open(ARGVOUT, ">$ARGV");
        select(ARGVOUT);
        $oldargv = $ARGV;
    }
    s/foo/bar/;
}
continue {
    print;  # this prints to original filename
}
select(STDOUT);
Once you say my @file = <> and consume all the input, Perl closes the filehandle to the backup files and starts directing output to STDOUT again.
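This can be reproduced with a minimal sketch (demo.txt is just an illustrative filename): after the list-context slurp, select() reports STDOUT as the default handle again, and the original file has been truncated.

```perl
#!/usr/bin/perl
# Sketch of the behaviour described above: slurp <> under $^I,
# then check which handle is selected. 'demo.txt' is a demo name.
use strict;
use warnings;

my $filename = 'demo.txt';
open my $fh, '>', $filename or die $!;   # set up a demo input file
print $fh "one\ntwo\n";
close $fh;

@ARGV = ($filename);
$^I   = '.bk';

my @file = <>;                # slurp everything; ARGVOUT gets closed
my $default = select();       # which handle does print use now?
print STDERR "default handle: $default\n";   # main::STDOUT
# demo.txt is now empty (truncated, nothing was printed to ARGVOUT)
# and the original contents live in demo.txt.bk
```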
The workaround, I think, is to call <> in scalar context and check eof(ARGV) after each line. When eof(ARGV) is true, you have read the last line of that file, and you get one chance to print before you call <> again:
my @file = ();
while (<>) {
    push @file, $_;
    if (eof(ARGV)) {
        # done reading current file
        my @processed_file = do_something_with(@file);
        # last chance to print before ARGVOUT gets reset
        print @processed_file;
        @file = ();
    }
}
my @file = <>; #Read all records into array
is bad. Now you're done slurping all the records, *ARGV is closed, and the $^I replacement doesn't have anything to work on.
my @file;
while (<>) {
    push @file, $_;
}
continue {
    if (eof ARGV) {
        chomp @file;
        push @file, qw(add a few more lines);
        print join "\n", @file;
        @file = ();
    }
}
This reads the file(s) one line at a time and, at the end of each file (before it is closed), performs the manipulation.
undef $/;
while (<>) {
    my @file = split /\n/, $_, -1;
    push @file, qw(add a few more lines);
    print join "\n", @file;
}
This reads entire files at a time as single records.
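A variant of the same record-at-a-time idea, sketched below with $/ localized so slurp mode does not leak into later reads, and with the default split (no -1 limit) so the file's trailing newline does not leave a blank line before the appended text. The helper name append_lines is mine, not from the original answer.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# append_lines is an illustrative helper, not from the original answer.
sub append_lines {
    my ($text, @extra) = @_;
    # default split drops trailing empty fields, so the file's final
    # newline does not become a blank line before the appended text
    my @file = split /\n/, $text;
    push @file, @extra;
    return join("\n", @file) . "\n";
}

if (@ARGV) {
    local $/;          # slurp mode, confined to this block
    while (<>) {
        print append_lines($_, qw(add a few more lines));
    }
}
```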
Tie::File can also be used to edit a file in-place. It does not leave a backup copy of the original file, however.
use warnings;
use strict;
use Tie::File;
my $filename = 'test.txt';
tie my @lines, 'Tie::File', $filename or die $!;
push @lines, qw(add a few more lines);
untie @lines;
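If you do want a backup with Tie::File, one sketch is to copy the file first with the core File::Copy module before tying it (the filenames here are demo names, and the example writes its own test.txt so it is self-contained):

```perl
#!/usr/bin/perl
# Sketch: keep a backup when using Tie::File by copying the file first.
use strict;
use warnings;
use File::Copy qw(copy);   # core module
use Tie::File;

my $filename = 'test.txt';

# set up a demo file so the example is self-contained
open my $fh, '>', $filename or die $!;
print $fh "first line\n";
close $fh;

# make the backup that Tie::File itself won't
copy($filename, "$filename.bk") or die "backup failed: $!";

tie my @lines, 'Tie::File', $filename or die "tie failed: $!";
push @lines, qw(add a few more lines);
untie @lines;
```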
Perl's in-place editing is much simpler than any of the answers:
sub edit_in_place
{
    my $file = shift;
    my $code = shift;

    {
        local @ARGV = ($file);
        local $^I = '';
        while (<>) {
            &$code;
        }
    }
}
my $file = 'test.txt';   # the file to edit
edit_in_place $file, sub {
    s/search/replace/;
    print;
};
If you want to create a backup, then change local $^I = ''; to local $^I = '.bak';
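The same helper also handles the original question's append-at-the-end task if the callback checks eof(ARGV); a self-contained sketch (the helper is repeated here, and test.txt is a demo file the script creates itself):

```perl
#!/usr/bin/perl
# Sketch: using the edit_in_place helper from above to append lines,
# detecting the last line of the file with eof(ARGV).
use strict;
use warnings;

sub edit_in_place {
    my ($file, $code) = @_;
    local @ARGV = ($file);
    local $^I   = '';        # no backup; use '.bak' to keep one
    while (<>) {
        $code->();
    }
}

my $filename = 'test.txt';
open my $fh, '>', $filename or die $!;   # demo file
print $fh "line 1\nline 2\n";
close $fh;

edit_in_place $filename, sub {
    print;                               # copy the current line through
    print map {"$_\n"} qw(add a few more lines)
        if eof(ARGV);                    # last line: append the new ones
};
```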