[Israel.pm] Memory problem - Loading big files

Yossi Itzkovich Yossi.Itzkovich at ecitele.com
Thu Jun 19 04:14:39 PDT 2008


The problem is not the file itself.  You end up with @probes holding as many elements as the file has lines, and each element is a reference to yet another array.  Those per-line arrays are where the memory goes...
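One way to shrink that overhead (a sketch, not from the thread: the sample row and the on-demand split are assumptions): keep each row as a single tab-joined scalar and split it only when you actually need the fields. Every Perl scalar carries its own header, and every arrayref adds an AV plus a reference on top, so 2.1M rows of (1 arrayref + 8 scalars) cost far more than 2.1M plain strings.

```perl
use strict;
use warnings;

my @probes;

# Stand-in for reading the real file; in a real run this would be
# while (my $line = <>) { ... } over the 250MB data file.
my @sample = (
    join("\t", "LONG-TEXT-FIELD", 11111, 222222, 3333333,
               44444444, 5555555, 6666666, 7777777),
);
for my $line (@sample) {
    chomp $line;
    push @probes, $line;     # one flat scalar per row, no arrayref
}

# Split a row back into its fields only when that row is needed:
my @fields = split /\t/, $probes[0];
print scalar(@fields), " fields\n";   # prints "8 fields"
```

The trade-off is CPU for memory: splitting on demand re-parses a row each time it is touched, which is usually fine when most rows are read once or not at all.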


-----Original Message-----
From: perl-bounces at perl.org.il [mailto:perl-bounces at perl.org.il] On Behalf Of Assaf Gordon
Sent: Tuesday, June 17, 2008 2:36 AM
To: perl at perl.org.il
Subject: [Israel.pm] Memory problem - Loading big files

Hello all,

I'm having problems loading big files into memory - maybe you could help
me solve them.

My data file is a big (~250MB) text file, with eight tab-separated
fields. I want to load the entire file into a list.

I've narrowed the code down to this:
use strict;
use warnings;
use Data::Dumper;
use Devel::Size qw(size total_size);

my @probes;
while (<>) {
        my @fields = split(/\s+/);
        push @probes, \@fields;
}

print "size = ", size(\@probes), "\n";
print "total size = ", total_size(\@probes), "\n";
print "data size = ", total_size(\@probes) - size(\@probes), "\n";
print Dumper(\@probes), "\n";
(Can't get any simpler than that, right?)

But when I run the program, the perl process consumes 2.5GB of memory,
prints "out of memory" and stops.

I know that perl isn't the most efficient memory consumer, but surely
there's a way to do it...

If you care to test it yourselves, here's a simple script that creates a
dummy text file, similar to my own data file:
foreach (1..2100000) { print join("\t", "LONG-TEXT-FIELD", 11111,
222222, 3333333, 44444444, 5555555, 6666666, 7777777), "\n"; }

Thanks in advance for your help!

Perl mailing list
Perl at perl.org.il
