HPC has installed a variety of phylogenetic software packages. I usually run the parallel versions of RAxML (mpiraxml), MrBayes (mb_p), and Garli (pargarli) using the MOAB scheduler. Whenever I'm running a single-threaded version of these packages, I typically run it using Condor. PAUP and Phylip are available only as single-threaded applications, so I run those with Condor as well. You can find examples of both job submission mechanisms (MOAB and Condor) on the HPC web site under Software, and there are video tutorials for each mechanism on the HPC web site under Tutorials.
HPC has adopted a new directory structure to better support all the software packages that we install. Until the web documents are updated to reflect these changes, please send a message to firstname.lastname@example.org and one of the HPC staff members will send you the new path information for MrBayes and any other software that had to be recompiled.
I am having trouble submitting PAUP jobs to MOAB. I am not certain where PAUP is located (the correct directory and path), and my analysis is getting hung up at the submit file. Below is a copy of my submit file:
The application MrBayes will likely use as much memory as is available on the system that Condor assigns it. MrBayes often uses a lot of memory, especially for long-running jobs, so this can be a problem. To be sure that your job is paired with a system that has enough memory, I suggest using a requirements statement. For example:
Requirements = Memory >= 2000
will only let your jobs run on systems with 2GB of memory or more.
You can find out how much memory is available on Condor systems by typing the command:
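The command itself was omitted from this note; on a standard Condor pool it is likely condor_status, whose default output lists memory in the seventh column. Below is a hedged sketch of filtering that output for machines with at least 2 GB; the sample data and file path are made up for illustration, since I cannot query a real pool here.

```shell
# Hypothetical sample of condor_status-style output; on a real pool you
# would pipe the actual command instead of reading this sample file.
# Columns: Name OpSys Arch State Activity LoadAv Mem(MB)
cat > /tmp/pool_sample.txt <<'EOF'
slot1@node01 LINUX INTEL Unclaimed Idle 0.000 2048
slot1@node02 LINUX INTEL Claimed Busy 1.000 1024
EOF

# Print the machine name and memory for hosts with >= 2000 MB.
awk '$7 >= 2000 {print $1, $7}' /tmp/pool_sample.txt
```

This prints only slot1@node01 from the sample, since it is the only host meeting the 2000 MB threshold.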
The seventh column of this command's output is the memory available per job. I would also suggest making your requirements statement more explicit by including the OS and the CPU architecture. For example:
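A sketch of what such a combined statement might look like, assuming a Linux pool on Intel hardware (the OpSys and Arch values here are illustrative; check your own pool's attributes before using them):

Requirements = (Memory >= 2000) && (OpSys == "LINUX") && (Arch == "INTEL")

Combining the clauses this way ensures the job is matched only with machines that satisfy all three conditions at once.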
I am trying to run an MPI MrBayes job; however, I am receiving an error message (in my *.output file) saying that MrBayes has run out of memory: "The program ran out of memory while trying to allocate the conditional likelihoods for the chain. Please try allocating more memory to the program, or running the data on a computer with more memory. Error in command 'Mcmc'"
Any ideas on how to allocate more memory to MrBayes?
Below is my submit file, if it helps diagnose the problem:

Universe = Vanilla
Executable = /usr/common/i686-linux/bin/condor_mb
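One plausible direction, following the requirements approach described earlier in this document, is to add a memory clause so Condor only matches the job with machines that have enough RAM. A hedged sketch of such a submit file is below; the threshold, OS, and architecture values are illustrative, and the rest of the original submit file was not shown, so only the first two lines are taken from it:

Universe = Vanilla
Executable = /usr/common/i686-linux/bin/condor_mb
Requirements = (Memory >= 2000) && (OpSys == "LINUX") && (Arch == "INTEL")

Note that a requirements statement does not give MrBayes more memory on a given machine; it only steers the job toward machines that already have enough.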