AMBER Archive (2009)

Subject: Re: [AMBER] ptraj clustering with sieve- unexpected Bus Error

From: David Watson
Date: Thu Jun 25 2009 - 21:51:11 CDT

On Jun 25, 2009, at 9:36 PM, Rachel Rice wrote:

> I set ulimit -n unlimited. There are 2 files in my working
> directory, the
> two test data sets had 200 and 941 snapshots in them, respectively.

That's strange; when I try to use "ulimit -n unlimited" in bash on
Mac OS X I get the message "Invalid argument".
Are you using tcsh or some other shell?

Perhaps it has something to do with the number of user processes for
the shell as well.
Take a look with "ulimit -a" to see all of the pertinent settings.
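For example, a quick check along those lines (just a sketch for a POSIX shell; the 1024 figure below is only illustrative) might look like:

```shell
#!/bin/sh
# Compare the number of files in the working directory against the
# per-process open-file limit reported by the shell.
nfiles=$(ls | wc -l)
limit=$(ulimit -n)
echo "files: $nfiles  open-file limit: $limit"

# If the file count meets or exceeds the limit, raise the limit for
# this shell session before rerunning ptraj.
if [ "$limit" != "unlimited" ] && [ "$nfiles" -ge "$limit" ]; then
    ulimit -n 1024
fi
```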

> A ulimit problem doesn't explain
> why it works with my "workaround" command (changing "all pdb" to the
> default setting).
> At least to my understanding.
> I cannot give up on sieve, I have a total dataset of microseconds of
> MD.
> Hopefully we can figure it out.
> Thanks!
> Rachel
> On Thu, Jun 25, 2009 at 9:28 PM, David Watson <>
> wrote:
>> Did you >try< ulimit -n ?
>> How many files are in your working directory when it bombs?
>> Try typing
>> ls | wc -l #That's a pipe after ls, not another l
>> and then
>> ulimit -n
>> If you see that the number of files is close to or greater than what
>> is reported by the "ulimit -n" command, then you actually are running
>> into the problem I mentioned.
>> The response returned by the "ulimit" command without arguments says
>> "unlimited" on Mac OS X and doesn't have anything to do with reality.
>> You must set the number of open files per process to something
>> greater using something like "ulimit -n 1024".
>> I went through this exact problem with the sieve command, and apart
>> from using the "ulimit" command the only other option was to give up
>> on using sieve.
>> Well, that's my take on it anyway, and I could be totally off base.
>> Perhaps there is a bug where file descriptors aren't being properly
>> disposed of during the sieve routine, but that's a question for the
>> powers that be.
>> On Jun 25, 2009, at 7:54 PM, Rachel Rice wrote:
>>> I get the bus error even with ulimit set to "unlimited". The error
>>> does not happen when the program is trying to write output files- it
>>> appears to occur when it attempts to read in data for the second set
>>> of clustering.
>>> On Thu, Jun 25, 2009 at 1:53 PM, David Watson <>
>>> wrote:
>>>> Rachel,
>>>> Check the following message from the archive:
>>>> I hope that helps.

AMBER mailing list