My Perl script has some weird behaviour that I don't understand. I'm processing a large structure, stored as an array of hashes, which grows while it is being processed. The problem is that the structure takes at most about 8 MB when I store it on disk, but while it is being processed it uses about 130 MB of RAM. Why is there such a big difference?
The main flow of processing looks like:
while (...)
{
    my %new_el = %{ Storable::dclone \%some_el };
    # ...
    # change a few things in %new_el
    # ...
    push @$elements_ref, \%new_el;
}
You are making more copies of the data than you need to. Try working with hashrefs rather than dereferencing, as much as possible:
while (...)
{
    my $new_el = Storable::dclone \%some_el;
    # ...
    # change a few things in $new_el
    # ...
    push @$elements_ref, $new_el;
}
Even better would be to not clone the entire hash -- perhaps you can get away with altering it in-place?
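If the parts of the hash you change are all top-level keys, another option that falls short of full in-place mutation is a shallow copy: copy the top-level keys but let nested references stay shared, so the large nested data exists only once in memory. This is a sketch, not your actual code; the `id` and `payload` keys are made-up placeholders, and it is only safe if you never modify the shared nested data:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $elements_ref = [];
my %some_el = (
    id      => 0,
    payload => { large => 'shared data' },   # nested data we never modify
);

for my $i (1 .. 3) {
    # Shallow copy: top-level keys are duplicated, but nested references
    # (like $some_el{payload}) are shared rather than deep-cloned.
    my $new_el = { %some_el };
    $new_el->{id} = $i;                      # change only what differs
    push @$elements_ref, $new_el;
}

# Every element points at the same payload hash: one copy in memory,
# unlike Storable::dclone, which would duplicate it on every iteration.
print scalar(@$elements_ref), "\n";
print +($elements_ref->[0]{payload} == $elements_ref->[1]{payload})
    ? "payload shared\n" : "payload copied\n";
```

Comparing two references with `==` checks whether they point at the same underlying structure, which is how the last line confirms the payload is shared rather than copied.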