AMBER Archive (2007)

Subject: RE: AMBER: FW: MPI error message

From: Gustavo Seabra (gustavo.seabra_at_gmail.com)
Date: Tue Jul 24 2007 - 14:32:52 CDT


Two things:
 
1. You really should be using sander.MPI for parallel calculations.
2. Do you think you could post your inpcrd and prmtop files for us to test?
 
Gustavo.
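
[Editor's note] Both tracebacks below point at a failed numeric read from the prmtop file (forrtl error 64, raised from getcor_), so a quick sanity check of the file can save a round trip. The sketch below assumes the standard Amber-style prmtop layout (%VERSION / %FLAG / %FORMAT sections, with I/E/F formats holding numeric data); the helper name and the demo data are hypothetical, not from this thread:

```python
# Scan prmtop-style lines for data that fails numeric parsing.
# Assumes %FORMAT(...) specs with I (integer) or E/F (float) hold numbers,
# while specs like %FORMAT(20a4) hold text and are skipped.

def find_bad_numeric_lines(lines):
    """Return (line_number, line) pairs whose tokens fail numeric parsing."""
    bad = []
    numeric = False
    for n, line in enumerate(lines, start=1):
        if line.startswith("%"):
            if line.startswith("%FORMAT"):
                spec = line[line.find("(") + 1 : line.find(")")]
                numeric = any(c in spec.upper() for c in "IEF")
            else:  # %VERSION, %FLAG, %COMMENT reset the state
                numeric = False
            continue
        if numeric:
            for tok in line.split():
                try:
                    float(tok)
                except ValueError:
                    bad.append((n, line.rstrip()))
                    break
    return bad

# Tiny synthetic example with one corrupted CHARGE entry:
demo = [
    "%VERSION  VERSION_STAMP = V0001.000\n",
    "%FLAG POINTERS\n",
    "%FORMAT(10I8)\n",
    "     682     12      5\n",
    "%FLAG CHARGE\n",
    "%FORMAT(5E16.8)\n",
    "  1.23400000E-01  ???garbage???\n",
]
print(find_bad_numeric_lines(demo))  # -> [(7, '  1.23400000E-01  ???garbage???')]
```

Running this over the real bundlewat.prmtop would point at the first unreadable field, if any, which is exactly where a Fortran "input conversion error" on that unit would trip.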

  _____

From: owner-amber_at_scripps.edu [mailto:owner-amber_at_scripps.edu] On Behalf Of
Taryn Hartley
Sent: Tuesday, July 24, 2007 3:00 PM
To: amber_at_scripps.edu
Subject: AMBER: FW: MPI error message

Although I have yet to determine why the first attempt failed, I re-made the
.inpcrd and .prmtop files in xLeap and attempted to run the job again. This
time I received the error message below. Any thoughts?

set_SCR: using existing PBS job directory /scratch/batch/205048
MPI: On host co-compute2, Program /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI, Rank 0, Process 7718 called MPI_Abort(<communicator>, 1)

MPI: --------stack traceback-------
Internal Error: Can't read/write file "/dev/mmtimer", (errno = 22)
Internal Error: Can't read/write file "/dev/sgi_fetchop", (errno = 22)
MPI: Intel(R) Debugger for Itanium(R)-based Applications, Version 9.0-20, Build 20060218
MPI: Reading symbolic information from /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI...done
MPI: Attached to process id 7718 ....
MPI: stopped at [0xa000000000010641]
MPI: >0 0xa000000000010641
MPI: #1 0x2000000005873a80 in __waitpid(...) in /lib/tls/libc.so.6.1
MPI: #2 0x20000000000e4170 in MPI_SGI_stacktraceback(...) in /usr/lib/libmpi.so
MPI: #3 0x20000000001208d0 in PMPI_Abort(...) in /usr/lib/libmpi.so
MPI: #4 0x20000000001bdda0 in mpi_abort__(...) in /usr/lib/libmpi.so
MPI: #5 0x40000000003641f0 in mexit_(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI
MPI: #6 0x4000000000211510 in getcor_(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI
MPI: #7 0x40000000001a5440 in sander_(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI
MPI: #8 0x4000000000194bc0 in MAIN__(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI
MPI: #9 0x4000000000004840 in main(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI
MPI: #10 0x2000000005791c50 in __libc_start_main(...) in /lib/tls/libc.so.6.1
MPI: #11 0x4000000000004580 in _start(...) in /usr/apps/chemistry/amber/amber9/amber9/exe/sander.MPI

MPI: -----stack traceback ends-----
MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
MPI: aborting job

  _____

From: taryn_hartley_at_hotmail.com
To: amber_at_scripps.edu
Subject: MPI error message
Date: Mon, 23 Jul 2007 11:39:27 -0600

I am using sander from Amber9 to run an MD simulation. My PBS script invoked
AMBERHOME/exe/sander, and I received this error message:
MPI: co-compute1: 0x34b5000042df8b72: forrtl: severe (64): input conversion error, unit 8, file /u/ac/thartley/test/bundlewat.prmtop
MPI: co-compute1: 0x34b5000042df8b72: Image        PC                Routine   Line      Source
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000C91390  Unknown   Unknown   Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000C8C560  Unknown   Unknown   Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000C34250  Unknown   Unknown   Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000B882E0  Unknown   Unknown   Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000B87810  Unknown   Unknown   Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000BCB6B0  Unknown   Unknown   Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       40000000001B13C0  Unknown   Unknown   Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       400000000018B850  Unknown   Unknown   Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       40000000001874F0  Unknown   Unknown   Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000003D40  Unknown   Unknown   Unknown
MPI: co-compute1: 0x34b5000042df8b72: libc.so.6.1  2000000001D0DC50  Unknown   Unknown   Unknown
MPI: co-compute1: 0x34b5000042df8b72: sander       4000000000003A80  Unknown   Unknown   Unknown
MPI: could not run executable (case #4)

It was suggested that I add .MPI after sander (AMBERHOME/exe/sander.MPI),
which I did, and my error message now reads as follows:
set_SCR: using existing PBS job directory /scratch/batch/204612
forrtl: severe (64): input conversion error, unit 8, file /u/ac/thartley/test/bundlewat.prmtop
Image        PC                Routine   Line      Source
sander.MPI 40000000007F3CD0 Unknown Unknown Unknown
sander.MPI 40000000007EEEA0 Unknown Unknown Unknown
sander.MPI 4000000000796B90 Unknown Unknown Unknown
sander.MPI 40000000006EEEE0 Unknown Unknown Unknown
sander.MPI 40000000006EE410 Unknown Unknown Unknown
sander.MPI 4000000000731DF0 Unknown Unknown Unknown
sander.MPI 40000000001CAA00 Unknown Unknown Unknown
sander.MPI 40000000001A4650 Unknown Unknown Unknown
sander.MPI 4000000000194BC0 Unknown Unknown Unknown
sander.MPI 4000000000004840 Unknown Unknown Unknown
libc.so.6.1 2000000005791C50 Unknown Unknown Unknown
sander.MPI 4000000000004580 Unknown Unknown Unknown
MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
MPI: aborting job

Is the problem my .prmtop file, as the second line indicates? The PBS scripts
in the tutorials (which I ran successfully before attempting my own project)
do not use .MPI, and those scripts are what I am using as my guide. Help?
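
[Editor's note] The tutorials' serial invocation and the parallel one differ only in the executable name and the MPI launcher in front of it. A minimal PBS fragment illustrating the difference (node counts, process count, launcher flags, and input file names here are illustrative, not taken from the original script):

```shell
#PBS -l nodes=2:ppn=4
#PBS -l walltime=01:00:00

cd $PBS_O_WORKDIR

# Serial run, as in the tutorials:
#   $AMBERHOME/exe/sander -O -i md.in -p bundlewat.prmtop -c bundlewat.inpcrd -o md.out
# Parallel run -- note the .MPI suffix and the MPI launcher in front:
mpirun -np 8 $AMBERHOME/exe/sander.MPI -O -i md.in \
    -p bundlewat.prmtop -c bundlewat.inpcrd -o md.out -r md.rst
```

Launching plain sander under mpirun starts independent serial copies rather than one parallel job, which is why the .MPI build is needed for parallel runs.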

-Taryn


-----------------------------------------------------------------------
The AMBER Mail Reflector
To post, send mail to amber_at_scripps.edu
To unsubscribe, send "unsubscribe amber" to majordomo_at_scripps.edu