[Bioperl-pipeline] multiple pipelines

jeremyp at sgx3.bmb.uga.edu jeremyp at sgx3.bmb.uga.edu
Mon Jun 30 15:35:59 EDT 2003


Hi,

I just wanted to check on something I haven't really tried yet. We're
planning on running multiple pipelines at the same time. So, for example,
we might have 5 different databases and potentially all 5 could be in use
at the same time. I was wondering whether this might cause any problems:
the pipelines share the same NFSTMP_DIR, and there seem to be potential
concurrency problems (specifically, the PipelineManager appears to use the
same names for the generated executable scripts: 1.pbs, 2.pbs, ...). Does
this work out?

One other note: with our setup, reading/writing from/to an NFS directory
during a BLAST analysis is very I/O-bound. I altered the Blast runnable to
include a very simple system for doing the actual run in a specific
directory (in particular, on a specific local disk/filesystem). So, when we
run the blast file pipeline now, the input file is copied to a directory
local to the node on which a given analysis is running, the output is
generated there as well, and then copied back to the NFS-mounted directory
the analysis was started in. Having the database itself on an NFS-mounted
directory does seem to be OK. I don't know if anyone else has seen anything
similar (our CPU usage was fairly low when running purely off an
NFS-mounted disk).
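For what it's worth, the copy-run-copy-back pattern I mean is roughly the
following sketch. The directory names and the placeholder analysis step are
hypothetical examples (the real runnable invokes blastall), not the actual
code I added to the Blast runnable:

```shell
#!/bin/sh
# Sketch of running an analysis on node-local disk and copying the
# result back to the NFS directory the job was started in.
# NFS_WORKDIR and LOCAL_SCRATCH are stand-in paths for illustration.
NFS_WORKDIR=$(mktemp -d)    # stands in for the NFS-mounted work dir
LOCAL_SCRATCH=$(mktemp -d)  # stands in for a node-local scratch dir
printf '>seq1\nMKVLAA\n' > "$NFS_WORKDIR/input.fa"

# 1. Copy the input file from NFS to the node-local directory.
cp "$NFS_WORKDIR/input.fa" "$LOCAL_SCRATCH/"
cd "$LOCAL_SCRATCH"

# 2. Run the analysis locally, so the output is written to local disk.
#    The real pipeline would run something like:
#      blastall -p blastp -d /path/to/db -i input.fa -o output.blast
grep '^>' input.fa > output.blast  # placeholder so the sketch runs

# 3. Copy the result back to the NFS-mounted directory and clean up.
cp "$LOCAL_SCRATCH/output.blast" "$NFS_WORKDIR/"
cd /
rm -rf "$LOCAL_SCRATCH"
```

With this arrangement only the initial copy, the database reads, and the
final copy touch NFS; all intermediate I/O stays on the local disk.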

Jeremy
