[Biopython] Writing large Fastq

Jurgens de Bruin debruinjj at gmail.com
Thu Aug 14 07:03:54 UTC 2014


Hi All,

I would appreciate any help with the following. I have a script that filters reads from FASTQ files, which are 1.5 GB in size. I want to write the filtered reads to a new FASTQ file, and this is where I seem to have a bug, as the writing of the file never finishes. I left the script running for 4 hours with no result, so I stopped it. This is currently what I have:

from Bio import SeqIO

fastq_parser = SeqIO.parse(ls_file, ls_filetype)
wanted = (rec for rec in fastq_parser if rec.description in ll_llist)
ls_filename = "%s_filtered.fastq" % ls_file.split(".")[0]
handle = open(ls_filename, "w")
SeqIO.write(wanted, handle, "fastq")
handle.close()
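One thing I suspect is the membership test: if ll_llist is a plain Python list, `rec.description in ll_llist` scans the whole list for every record, so a large FASTQ against a large ID list can look like a hang. Converting the list to a set once makes each lookup O(1). A minimal sketch of that idea (using a stand-in record type so it runs without Biopython; with SeqIO the generator would be passed to SeqIO.write as above):

```python
from collections import namedtuple

# Stand-in for Bio.SeqRecord, just for illustration; only .description matters here.
Rec = namedtuple("Rec", ["description", "seq"])

def filter_records(records, keep):
    """Yield only records whose description is in `keep`."""
    keep = set(keep)  # convert once: set lookup is O(1), list lookup is O(n)
    for rec in records:
        if rec.description in keep:
            yield rec

records = [Rec("read1 ok", "ACGT"), Rec("read2 skip", "GGGG"), Rec("read3 ok", "TTTT")]
kept = list(filter_records(records, ["read1 ok", "read3 ok"]))
```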

Thanks in advance
-- 
Regards/Groete/Mit freundlichen Grüßen/recuerdos/meilleures salutations/
distinti saluti/siong/duì yú/привет

Jurgens de Bruin
