AMBER Archive (2007)
Subject: Re: AMBER: Fwd: Amber9 parallel compilation openmpi issues
From: Mark Williamson (Mark.Williamson_at_imperial.ac.uk)
David A. Case wrote:
> It was after Amber9 was released that the openMPI folks changed their header files.
Hi David,
In this case, I'm using openmpi-1.2.3. I don't generally use openmpi; the copy here was built from:
http://www.open-mpi.org/software/ompi/v1.2/downloads/openmpi-1.2.3-1.src.rpm
My base system is FC5 with all updates applied, and AMBER 9 with patches applied.
> Can you check on what is inside your $MPI_HOME/include/mpif.h file?
/usr/include/mpif.h does have an
include 'mpif-common.h'
line in it.
If I do the following:
cd $AMBERHOME/src
and then preprocess the sander sources by hand, evb_init.f does have the region:
#if defined(MPI)
correctly expanded after passing it via cpp, but the resulting _evb_init.f still has the
include 'mpif-common.h'
line in it.
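That line is a Fortran INCLUDE statement rather than a cpp directive, so cpp never touches it. Roughly what happens is this (a simplified sketch, not the literal contents of evb_init.f):

  #if defined(MPI)
  #include "mpif.h"
  #endif

cpp resolves the #include and pastes /usr/include/mpif.h into _evb_init.f, but the pasted text itself still contains

        include 'mpif-common.h'

which is only resolved later, by the Fortran compiler, using the compiler's own include search path.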
Interestingly, my version of ifort:
Intel(R) Fortran Compiler for 32-bit applications, Version 9.0
seems to somehow "automagically" resolve this:
cpp -traditional -I/usr/include -P -DMPI evb_vars.f > _evb_vars.f
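A quick way to confirm what is going on is to look for the nested include in the preprocessed file (just a grep, nothing sander-specific):

  grep "mpif-common" _evb_vars.f

If that line is still present, it is the Fortran compiler, not cpp, that has to locate mpif-common.h when _evb_vars.f is compiled.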
If this is repeated with gfortran (gcc-gfortran-4.1.1-51.fc5), there is a problem:
cd $AMBERHOME/src
cpp -traditional -I/usr/include -P -DMPI -xassembler-with-cpp
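The symptom can be reproduced by hand in the directory where _evb_vars.f was generated (a sketch; unrelated errors about missing .mod files may also appear, since the file is being compiled out of context):

  gfortran -c _evb_vars.f
  gfortran -c -I/usr/include _evb_vars.f

The first command complains that it cannot find mpif-common.h; once -I/usr/include is on the command line, that particular complaint goes away.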
As an aside, this can be resolved by setting
FC= gfortran -I/usr/include
in config.h
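With that in place, a clean rebuild picks up the flag (assuming the usual Amber 9 parallel target):

  cd $AMBERHOME/src
  make clean
  make parallel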
So, I agree with everything you've said. It seems like ifort has some built-in way of locating the nested mpif-common.h include that gfortran does not.
regards,
Mark