[Israel.pm] (no subject)
jkrets20 at student.scad.edu
Mon Aug 2 12:35:07 PDT 2004
For compact code, write two subroutines: one for your first level of
directories, and a second for all of the directories below the first.
The second subroutine is recursive, calling itself whenever it finds a
directory, and each call exits only after it has gone through its whole
directory.
If you are looking through a gig of text, I can see this taking a few
minutes. Keep the output file open and write to it from each subroutine.
The only thing that should be left in memory is the depth of the directory
structure; as long as your tree is not 1000 levels deep, it should not be
a problem. Just crontab it for an off-peak hour.
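An off-peak crontab entry might look roughly like this (the script name,
its location, and the 03:30 run time are just placeholders for your setup):

```
# min hour dom mon dow  command -- run daily at 03:30
30 3 * * * perl /home/user/combine.pl /path/to/rootDirectory
```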
I have done something similar to this at work to snatch files out of
subdirectories. I forgot about some tar backups I had buried in the
directory and ended up with some huge binary chunks that I tried to parse,
so just remember the -T file tester.
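The two-subroutine idea above, with the -T test folded in, could be
sketched roughly as follows. The walk subroutine name, the my<dirname>.txt
output naming, and the command-line root argument are my assumptions, not
tested production code:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Walk one parent directory recursively, appending every text file
# it finds to the single, already-open output handle.  Only the
# recursion depth stays in memory, so a big tree is not a problem.
sub walk {
    my ($dir, $out) = @_;
    opendir(my $dh, $dir) or die "Cannot open $dir: $!";
    my @entries = grep { $_ ne '.' && $_ ne '..' } readdir($dh);
    closedir($dh);
    for my $entry (@entries) {
        my $path = "$dir/$entry";
        if (-d $path) {
            walk($path, $out);          # recurse into the subdirectory
        }
        elsif (-f $path && -T $path) {  # -T skips binary chunks (tar backups etc.)
            open(my $in, '<', $path) or die "Cannot read $path: $!";
            print {$out} $_ while <$in>;
            close($in);
        }
    }
}

# First level: one output file per parent directory under the root.
my $root = shift @ARGV;
if (defined $root) {
    opendir(my $dh, $root) or die "Cannot open $root: $!";
    my @parents = grep { -d "$root/$_" && $_ ne '.' && $_ ne '..' } readdir($dh);
    closedir($dh);
    for my $parent (@parents) {
        open(my $out, '>', "$root/my$parent.txt") or die "Cannot write: $!";
        walk("$root/$parent", $out);
        close($out);
    }
}
```

Run it as `perl combine.pl /path/to/rootDirectory`; each first-level
directory gets one combined output file, and the handle stays open for
the whole recursion so nothing large is held in memory.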
----- Original Message -----
From: "Georges EL OJAIMI" <g-ojaimi at cyberia.net.lb>
To: <perl at perl.org.il>
Sent: August 02, 2004 12.47
Subject: [Israel.pm] (no subject)
> I am not receiving my own posts!
> This was my previous post; did anybody receive it? I posted it twice!
> I have a big folder with deeply nested subfolders and fragmented files.
> I want to walk these subfolders all the way down and append the
> content of the files into one file per parent directory.
> # Original Structure
> # Desired Output
> myFile1.txt (containing File1.1.txt until File1.n.txt)
> myFile2.txt (containing File2.1.txt until File2.n.txt)
> Is there a fast way (without heavy memory use) to do it, given that the
> size of the rootDirectory might reach 1 GB and the process is
> repeated each day?
> Best Regards,
> Perl mailing list
> Perl at perl.org.il