AMBER Archive (2009)

Subject: [AMBER] sander.MPI parallel run problem

From: Dechang Li (li.dc06_at_gmail.com)
Date: Mon Apr 06 2009 - 02:44:12 CDT


Dear all,

    I ran a parallel simulation with sander.MPI on a cluster.
The command I used was:

        mpirun -np 4 -machinefile myhosts ...

        I requested 4 CPUs for the parallel run, but in the end there
were 7 sander.MPI processes running on the node:

 9279 user1 25 0 37928 6088 1592 R 101 0.1 11:03.15 /hpexport2/home/user1/amber9/exe/sander.MPI cn27 56237 -p4amslave -p4yourname cn27
 9281 user1 25 0 38864 6128 1592 R 101 0.1 11:03.07 /hpexport2/home/user1/amber9/exe/sander.MPI cn27 56237 -p4amslave -p4yourname cn27
 9253 user1 16 0 37928 6088 1592 R 99 0.1 10:45.95 /hpexport2/home/user1/amber9/exe/sander.MPI cn27 56237 -p4amslave -p4yourname cn27
 9280 user1 25 0 37928 6068 1592 R 99 0.1 11:03.23 /hpexport2/home/user1/amber9/exe/sander.MPI cn27 56237 -p4amslave -p4yourname cn27
 9207 user1 16 0 38864 6128 1592 R 97 0.1 10:44.25 /hpexport2/home/user1/amber9/exe/sander.MPI cn27 56237 -p4amslave -p4yourname cn27
 9230 user1 16 0 37928 6068 1592 R 97 0.1 10:41.93 /hpexport2/home/user1/amber9/exe/sander.MPI cn27 56237 -p4amslave -p4yourname cn27
 9202 user1 16 0 35872 4848 2184 R 91 0.1 10:09.37 /hpexport2/home/user1/amber9/exe/sander.MPI -O -i /hpexport2/home/user1/abc/water/
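        For what it's worth, six of the seven processes carry the
-p4amslave flag, which suggests the MPICH-1 "p4" device: with that
device, mpirun typically starts helper/listener processes alongside
the actual MPI ranks, so the OS can show more sander.MPI entries
than -np requested. A small sketch (on an abridged copy of the
listing above; file name and fields are illustrative, not from an
actual run) that separates the p4 helpers from the main job:

```shell
# Save an abridged version of the top output above to a file.
cat > proclist.txt <<'EOF'
9279 sander.MPI cn27 56237 -p4amslave -p4yourname cn27
9281 sander.MPI cn27 56237 -p4amslave -p4yourname cn27
9253 sander.MPI cn27 56237 -p4amslave -p4yourname cn27
9280 sander.MPI cn27 56237 -p4amslave -p4yourname cn27
9207 sander.MPI cn27 56237 -p4amslave -p4yourname cn27
9230 sander.MPI cn27 56237 -p4amslave -p4yourname cn27
9202 sander.MPI -O -i /hpexport2/home/user1/abc/water/
EOF

# Count all sander.MPI entries, then just the p4 slave/helper ones.
echo "total:     $(grep -c 'sander\.MPI' proclist.txt)"
echo "p4amslave: $(grep -c 'p4amslave' proclist.txt)"
```

This prints 7 total entries, of which 6 are -p4amslave helpers,
matching the listing above.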

        What is the problem?

Best regards,
2009-4-6

=========================================
Dechang Li, Ph.D Candidate
Department of Engineering Mechanics
Tsinghua University
Beijing 100084
P.R. China

Tel: +86-10-62773574(O)
Email: lidc02 at mails.tsinghua.edu.cn
=========================================

  

_______________________________________________
AMBER mailing list
AMBER_at_ambermd.org
http://lists.ambermd.org/mailman/listinfo/amber