[Israel.pm] Handling huge data-structures?
yuval at windax.com
Sun Aug 29 09:21:20 PDT 2004
SQLite through CDBI sounds like fun (if slow).
But doesn't SQLite's DB stay in memory all the time?
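[For reference: SQLite normally keeps the database in a single on-disk file and only pages data through a bounded cache; the whole DB lives in RAM only when you open the special `:memory:` database. A minimal sketch via DBI/DBD::SQLite (the filename `big.db` is just an example):]

```perl
use strict;
use warnings;
use DBI;

# On-disk database: data stays in the file, not in RAM.
my $dbh = DBI->connect('dbi:SQLite:dbname=big.db', '', '',
                       { RaiseError => 1 });

# By contrast, this one would live entirely in memory:
# my $mem = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
#                        { RaiseError => 1 });

$dbh->disconnect;
```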
DB_File might work. I read about it a year ago; I'll refresh my memory
and let you know :)
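[A minimal sketch of what DB_File gives you: a hash tied to a Berkeley DB file on disk, so values go through the file rather than sitting in RAM. The filename `data.db` is just an example.]

```perl
use strict;
use warnings;
use Fcntl;      # for O_RDWR, O_CREAT
use DB_File;

my %hash;
tie %hash, 'DB_File', 'data.db', O_RDWR | O_CREAT, 0644, $DB_HASH
    or die "Cannot tie data.db: $!";

$hash{answer} = 42;          # stored in the file, not kept in RAM
print $hash{answer}, "\n";   # read back through the tie

untie %hash;
```

One caveat: DB_File values are flat strings, so nested data structures would need a serializing layer such as MLDBM on top.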
But still, I thought the bioinformatics guys would work their magic to
allow me to use a "normal" hash that is not in memory (or at least not
all of it).
Gaal Yahas said:
> On Sun, Aug 29, 2004 at 06:30:02PM +0300, Offer Kaye wrote:
>> > I have a <1GB file that I want to keep in a hash, and work with that
>> > hash like a normal hash, just keep it out of my memory (on the HDD,
>> > What solutions exist (except for buying more RAM), and what are their
>> An array-based rather than hash-based solution, but good for large data
>> sets, and it works pretty much as you described above:
> If by "works" you mean only "has the same semantics as", then yes; but
> anything that's going to insert or delete -- or indeed, change the
> length of an existing record -- is going to be very expensive.
> Yuval, if I were you I'd do some research on DB_File and maybe SQLite
> via Class::DBI. Or just chuck the semantic equivalence requirement and
> bite the SQL bullet: sufficiently different problems tolerate different
> solutions.
> Gaal Yahas <gaal at forum2.org>
> Perl mailing list
> Perl at perl.org.il
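[A sketch of what "biting the SQL bullet" might look like with plain DBI and DBD::SQLite: one key/value table standing in for the hash, with lookups going to disk instead of RAM. Table and file names are invented for illustration.]

```perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:SQLite:dbname=huge.db', '', '',
                       { RaiseError => 1, AutoCommit => 1 });

# One key/value table stands in for the hash.
$dbh->do('CREATE TABLE IF NOT EXISTS kv (k TEXT PRIMARY KEY, v TEXT)');

# "Store": the equivalent of $hash{$k} = $v
my $ins = $dbh->prepare('INSERT OR REPLACE INTO kv (k, v) VALUES (?, ?)');
$ins->execute('name', 'yuval');

# "Fetch": the equivalent of $hash{$k}
my ($v) = $dbh->selectrow_array('SELECT v FROM kv WHERE k = ?',
                                undef, 'name');
print "$v\n";

$dbh->disconnect;
```

Unlike the tied-hash approach, this drops the hash semantics entirely, but inserts, deletes, and variable-length values all stay cheap.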