AMBER Archive (2009)

Subject: Re: [AMBER] Polarizable simulation of the slab

From: Robert Duke (
Date: Fri Nov 27 2009 - 10:48:07 CST

PMEMD should be more robust - it does not depend on uniform density to size
the pairlist (the first guess assumes uniform density, but if that guess is
too small, it resizes the list instead of crashing on you). So I don't know
if you have sander-dependent requirements here, but pmemd at least can
handle the density differences. I do see a reference to "polarizable" here,
though, and the only "polarizable" that pmemd does is amoeba...
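The resizing behavior Bob describes (size the list from a uniform-density guess, then grow it on overflow rather than abort) can be sketched as follows. This is an illustrative sketch only, not pmemd's actual Fortran code; the function name and brute-force O(N^2) distance loop are my own simplifications:

```python
import math

def build_pairlist(coords, box, cutoff):
    """Build a nonbonded pairlist whose initial capacity assumes uniform
    density, doubling the storage instead of failing when the guess is
    too small (sketch of the behavior described above, not pmemd code)."""
    n = len(coords)
    density = n / (box[0] * box[1] * box[2])
    # First guess: expected neighbors per atom for a uniform density.
    capacity = max(16, int(n * density * (4.0 / 3.0) * math.pi * cutoff ** 3))
    pairs = [None] * capacity
    count = 0
    cut2 = cutoff * cutoff
    for i in range(n):
        for j in range(i + 1, n):
            d2 = sum((coords[i][k] - coords[j][k]) ** 2 for k in range(3))
            if d2 <= cut2:
                if count == len(pairs):
                    # Density was locally higher than the uniform guess:
                    # resize the list instead of crashing.
                    pairs.extend([None] * len(pairs))
                pairs[count] = (i, j)
                count += 1
    return pairs[:count]
```

For a slab system the local density in the water region far exceeds the global average, so a code that allocates from the uniform guess and cannot resize fails exactly as described for sander below.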
- Bob Duke
----- Original Message -----
From: "case" <>
To: "AMBER Mailing List" <>
Sent: Friday, November 27, 2009 11:10 AM
Subject: Re: [AMBER] Polarizable simulation of the slab

> On Fri, Nov 27, 2009, Jan Heyda wrote:
>> I started from an equilibrated NPT simulation, which gave a box with
>> dimensions of approximately 32A x 32A x 32A.
>> The system was then shifted by 75A in the z-coordinate (by awk) and the
>> z-PBC changed to 150A, so the final system size was 32A x 32A x 150A,
>> and all atoms were in the z-region 59A-91A.
>> The system was thus inhomogeneous overall, but in fact it is 60A of
>> vacuum, 32A of water, and 60A of vacuum. This should be stable during
>> simulation.
> The nonbonded list code in sander assumes that the density of atoms in
> all regions will be approximately homogeneous. It uses this assumption to
> approximately divide work (and memory for data structures) among
> processors in multi-CPU simulations. So, the code will indeed work on a
> single CPU, but will find allocation failures in multiple-CPU runs.
> I don't know the code well enough to estimate how hard it would be to fix
> this. As it stands, what you want to do (parallel runs of polarizable
> potential in a system where part is in vacuum) is outside the capabilities
> of the code.
> ...regards...dac
> _______________________________________________
> AMBER mailing list