[Bioperl-l] malloc errors while using Bio::SeqIO?

Rutger Vos rvos at interchange.ubc.ca
Sat Aug 9 07:36:20 EDT 2008


Hi Dave,

thanks for the reply. The memory usage is in fact much worse than just
44 MB: I'm actually looping over all 36 such archives (the GenBank
primates), and on my MacBook it steadily increases to over 1 GB of
memory. Calling $reader->flush and $writer->flush after each seq seemed
to help somewhat, at least to the extent that I made it through that
one file, but the last time I tried I didn't get much further: the
whole terminal process died shortly afterwards instead. I seem to
vaguely recall that even if perl free()'s memory, that doesn't
necessarily mean the memory is returned to the OS for the runtime of
the program, depending on the OS and perl version. What OS are you on?
I'm running perl 5.8.6 on OS X 10.4.11 (Intel).
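
For the record, the relevant part of my loop looks roughly like this
(file names are simplified placeholders here, and the flush calls are
just the workaround I mentioned, not something I'm sure is the right
fix):

    use strict;
    use warnings;
    use Bio::SeqIO;

    # loop over the GenBank primate division files (names simplified)
    for my $file ( glob('gbpri*.seq') ) {
        my $reader = Bio::SeqIO->new( -file => $file,        -format => 'genbank' );
        my $writer = Bio::SeqIO->new( -file => ">$file.out", -format => 'genbank' );
        while ( my $seq = $reader->next_seq ) {
            $writer->write_seq($seq);

            # workaround: flush the underlying filehandles after each record
            $reader->flush;
            $writer->flush;
        }
        $reader->close;
        $writer->close;
    }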

Rutger

On Sat, Aug 9, 2008 at 4:04 AM, Dave Messina <David.Messina at sbc.su.se> wrote:
> Hi Rutger,
> I ran your script on the same genbank file and, while I did not run out of
> memory, I did see what appears to be a memory leak. Even when I manually
> undef'd the reader and writer objects every 1000 records, memory usage
> continued to grow.
>
> I can't quite figure out what's going on, though.
> If I run a different program using SeqIO (the simple sequence converter from
> the SeqIO HOWTO) on the same input file, I don't see this same runaway
> growth.
> Also, the problem seems a lot worse on perl 5.10 than on 5.8.8; on 5.8.8 the
> sequence converter holds steady at about 12MB of real memory, whereas on
> 5.10 it grows, albeit slowly, for as long as the program is executing. When
> I killed it about 20% of the way through the file, it was up to about 44 MB
> of real memory.
> Anyone else have a chance to look at this?
>
> Dave
>
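
(For anyone else looking at this thread: the HOWTO converter Dave
mentions is, as far as I understand it, essentially the standard
read/write loop below; the file names here are just placeholders.)

    use strict;
    use warnings;
    use Bio::SeqIO;

    # minimal format converter along the lines of the SeqIO HOWTO example
    my $in  = Bio::SeqIO->new( -file => 'input.gb',      -format => 'genbank' );
    my $out = Bio::SeqIO->new( -file => '>output.fasta', -format => 'fasta' );

    while ( my $seq = $in->next_seq ) {
        $out->write_seq($seq);
    }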



-- 
Dr. Rutger A. Vos
Department of Zoology
University of British Columbia
http://www.nexml.org
http://rutgervos.blogspot.com


