AMBER Archive (2008)
Subject: Re: AMBER: nmode/nab entropy calculations memory issues
From: Andreas Svrcek-Seiler (svrci_at_tbi.univie.ac.at)
Date: Tue Sep 16 2008 - 10:50:11 CDT
Hi,
> I have a question relating to entropy calculations using nmode/nab. I am
> trying to calculate entropy of a protein which has around 9000 atoms. I
> encountered memory problems both with nmode and nab.
> Any estimate of how much memory I might need to run this kind of system ?
Your system has 27000 coordinates, so the Hessian has 27000**2 = 729 million
entries at 8 bytes each, i.e. about 5.8 billion bytes (the Hessian is
symmetric, and exploiting that can roughly halve the memory - I'm not sure
about the implementation details).
Anyway: for 6370 atoms, I see 5.9 GB of memory being used (so this
wouldn't work within a reasonable time on a 4 GB machine).
For 9515 atoms I see 13 GB of memory in use, so you'd want at least a
16 GB machine for this.
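For what it's worth, here is a rough back-of-the-envelope sketch of that
estimate in Python (the function name and the 8-bytes-per-double assumption
are mine, not from nmode/nab; as the observed numbers above show, the real
runs allocate work arrays on top of the bare Hessian, so treat this as a
lower bound):

    # Rough Hessian memory estimate for a normal-mode calculation.
    # Assumes double precision (8 bytes per entry); actual nmode/nab
    # usage is higher because of diagonalization work arrays.
    def hessian_memory_gb(n_atoms, bytes_per_entry=8):
        n = 3 * n_atoms                               # 3N Cartesian coordinates
        full = n * n * bytes_per_entry                # full 3N x 3N matrix
        packed = n * (n + 1) // 2 * bytes_per_entry   # upper triangle only
        return full / 1e9, packed / 1e9

    print(hessian_memory_gb(9000))   # ~5.8 GB full, ~2.9 GB packed
    print(hessian_memory_gb(9515))   # ~6.5 GB full, ~3.3 GB packed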
...good luck
Andreas
-----------------------------------------------------------------------
The AMBER Mail Reflector
To post, send mail to amber_at_scripps.edu
To unsubscribe, send "unsubscribe amber" (in the *body* of the email)
to majordomo_at_scripps.edu