AMBER Archive (2008)
Subject: Re: RE: RE: AMBER: memory usage
From: Ye Mei (ymei_at_itcc.nju.edu.cn)
Date: Mon Apr 28 2008 - 09:06:20 CDT
Dear Ross,
I ran the tests you suggested; the memory requirements are listed below:
ACE-ALA-NME gas phase    4091 MB
ACE-ALA-NME igb=0        4090 MB
bench.jac                 104 MB
bench.jac.ips           96528 (96 KB or 96 MB? I am not sure.)
bench.jac.pmemd          7985 MB
pmemd requires even more memory, so I have attached the config.h for pmemd to this mail; maybe it can help.
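For reference, a minimal sketch of the kind of ntb=0 input you suggested for
the small-system tests is given below; the step count, cutoff, and
temperature-control settings are illustrative assumptions, not necessarily
the exact files behind the numbers above:

  short ntb=0 test of ACE-ALA-NME (illustrative sketch)
   &cntrl
     imin=0, ntb=0, igb=1,
     cut=999.0,
     ntc=2, ntf=2, dt=0.002, nstlim=1000,
     ntpr=100, ntwx=0,
     ntt=3, gamma_ln=1.0, tempi=300.0, temp0=300.0,
   /

(ntb=0 turns off periodic boundaries, igb=1 selects Generalized Born, and
igb=0 gives the plain gas-phase run; cut=999.0 effectively removes the
nonbonded cutoff.)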
Best regards,
Ye Mei
ymei_at_itcc.nju.edu.cn
Institute of Theoretical and Computational Chemistry
Key Laboratory of Mesoscopic Chemistry of MOE
School of Chemistry and Chemical Engineering
Nanjing University
Nanjing 210093
P.R.China
2008-04-28
======= 2008-04-28 00:52:09 Ross Walker wrote=======
>Hi Ye,
>
>What you are seeing seems very strange. I am copying this to Roberto
>Gomperts at SGI, who is the expert on Amber on Altix machines. In the
>meantime, can you try running a Generalized Born job instead? Just set up
>something simple like ACE-ALA-NME in the gas phase and then run it with
>ntb=0 and igb=1. This takes a different route through the code, so if it
>uses a more reasonable amount of memory then we know the problem is related
>to the Ewald/PME code. It is also possible that the code that calculates
>the memory required for such a small system is miscalculating and
>requesting a huge allocation. Can you also try a bigger system, such as the
>JAC benchmark, and see how that does?
>
>All the best
>Ross
>
>-----Original Message-----
>From: owner-amber_at_scripps.edu [mailto:owner-amber_at_scripps.edu] On Behalf Of
>Ye Mei
>Sent: Sunday, April 27, 2008 7:53 AM
>To: amber
>Subject: Re: RE: AMBER: memory usage
>
>Dear Ross,
>
>Thank you very much for your reply.
>I have attached some files that might be helpful for solving this problem.
>As you guessed, I am using Amber 9. In the attached files, please find the
>config.h used for the compilation. Whether I compile Amber 9 statically or
>dynamically, the results are the same.
>Other information related to the compilation:
>Intel Fortran Compiler 10.1.008
>Intel Math kernel library 10.0.1.014
>SGI native mpi
>
>What I am running is a tiny job, i.e. an ethanol molecule in a TIP3P water
>box. The related files are also attached.
>The performance is OK.
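>A minimal sketch of what such a periodic input might look like (the
>specific values below are illustrative assumptions, not necessarily the
>attached mdin):
>
>  NPT MD of ethanol in a TIP3P water box (illustrative sketch)
>   &cntrl
>     imin=0, ntb=2, ntp=1,
>     cut=8.0, ntc=2, ntf=2, dt=0.002,
>     nstlim=50000, ntpr=500, ntwx=500,
>     ntt=3, gamma_ln=1.0, temp0=300.0,
>   /
>
>(ntb=2 with ntp=1 runs constant-pressure, periodic PME dynamics; cut=8.0 is
>the direct-space nonbonded cutoff.)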
>The administrator told me that this SGI Altix 4700 uses memory, rather than
>hard disk, as swap space in order to accelerate swapping. I have no idea
>whether this has anything to do with the large virtual memory requirement.
>When sander is running, little memory is left for other jobs, and some CPUs
>cannot get enough memory to run their jobs, so the administrator asked me
>to check my Amber 9 installation.
>
>Best regards,
>
>Ye Mei
>ymei_at_itcc.nju.edu.cn
>Institute of Theoretical and Computational Chemistry
>Key Laboratory of Mesoscopic Chemistry of MOE
>School of Chemistry and Chemical Engineering
>Nanjing University
>Nanjing 210093
>P.R.China
>2008-04-27
>
>
>======= 2008-04-27 14:10:22 Ross Walker wrote=======
>
>>Hi Ye,
>>
>>>10513 chem 25 0 15.7g 161m 57m R 100 0.2 2028:17 sander.MPI
>>
>>This doesn't look like much memory to me; 0.2% seems perfectly reasonable.
>>The 15.7 GB of virtual memory seems a bit strange, though. Are you sure
>>this is not some strange problem with the reporting of memory usage? Why
>>did the administrator ask you to check things? Is it causing problems? The
>>jobs are all running at 100% CPU usage, so it isn't clear that the machine
>>is swapping; more likely there is 15.7 GB that got swapped out immediately.
>>
>>This is AMBER 9, I assume? If so, then this is weird for a 4000-atom run,
>>although I suppose if you were using Generalized Born it might use a lot,
>>but not that much. Can you post details of the actual calculation you are
>>running?
>>
>>Of course, if this is an earlier version of sander, which used static
>>memory allocation, then it is possible it was compiled with a "huge"
>>sizes.h file that requests 16 GB of static memory. In that case, even if
>>you run a tiny job it still allocates 16 GB, and the memory manager then
>>swaps out the 15.7 GB or so that is not being used. This doesn't fit with
>>the executable being called sander.MPI, though...
>>
>>Thus I am a little confused. It will probably be easier to see your input
>>script etc. and the first part of the output. Also, what is the
>>performance like: very slow, or in line with what you'd expect?
>>
>>All the best
>>Ross
>>
>>
>>-----Original Message-----
>>From: owner-amber_at_scripps.edu [mailto:owner-amber_at_scripps.edu] On Behalf Of
>>Ye Mei
>>Sent: Saturday, April 26, 2008 6:58 AM
>>To: amber mailing list
>>Subject: AMBER: memory usage
>>
>>Dear amber users,
>>
>>I am quite confused about the memory usage of sander.MPI running on an SGI
>>Altix. As far as I know, sander does not use much memory, especially for a
>>tiny system with fewer than 4000 atoms. That is true on a PC, but it is
>>not the case on an SGI Altix 4700 running SuSE Linux Enterprise Server 10
>>with patch level 1. The top command shows that each process occupies more
>>than 15 GB of virtual memory, as follows:
>> PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
>>10513 chem 25 0 15.7g 161m 57m R 100 0.2 2028:17 sander.MPI
>>10514 chem 25 0 15.7g 120m 14m R 100 0.2 2028:06 sander.MPI
>>10515 chem 25 0 15.7g 161m 57m R 100 0.2 2028:07 sander.MPI
>>10516 chem 25 0 15.7g 160m 56m R 100 0.2 2028:19 sander.MPI
>>10517 chem 25 0 15.7g 160m 57m R 100 0.2 2028:07 sander.MPI
>>10518 chem 25 0 15.7g 158m 55m R 100 0.2 2028:19 sander.MPI
>>10519 chem 25 0 15.7g 160m 57m R 100 0.2 2028:19 sander.MPI
>>10520 chem 25 0 15.7g 159m 56m R 100 0.2 2028:07 sander.MPI
>>
>>The server administrator asked me to check my jobs, but I have no idea how
>>this could happen. Does anyone know how to solve this problem?
>>BTW, jobs are managed by PBS Pro.
>>
>>
>>
>>Best regards,
>>
>>Ye Mei
>>ymei_at_itcc.nju.edu.cn
>>Institute of Theoretical and Computational Chemistry
>>Key Laboratory of Mesoscopic Chemistry of MOE
>>School of Chemistry and Chemical Engineering
>>Nanjing University
>>Nanjing 210093
>>P.R.China
>>2008-04-26
>>
>
>= = = = = = = = = = = = = = = = = = = =
= = = = = = = = = = = = = = = = = = = =
- application/octet-stream attachment: config.h
-----------------------------------------------------------------------
The AMBER Mail Reflector
To post, send mail to amber_at_scripps.edu
To unsubscribe, send "unsubscribe amber" to majordomo_at_scripps.edu