[Wien] Compilation problem with mpi

Dima Vingurt dimavingurt at gmail.com
Wed Feb 11 11:07:04 CET 2009


Dear Wien2k users

I want to compile WIEN2k 8.3 for parallel execution on a cluster running
Red Hat 4, 64 bit.
I have the following libraries:
----
ls /opt/intel/mkl/9.1.021/lib/em64t
libfftf_intel.a  libmkl_gfortran.a   libmkl_mc.so       libmkl_vml_p4n.so
libguide.a       libmkl_gfortran.so  libmkl_p4n.so      libvml.so
libguide.so      libmkl_ias.so       libmkl.so          mkl95_blas.mod
libmkl_blas95.a  libmkl_lapack95.a   libmkl_solver.a    mkl95_lapack.mod
libmkl_def.so    libmkl_lapack.a     libmkl_vml_def.so  mkl95_precision.mod
libmkl_em64t.a   libmkl_lapack.so    libmkl_vml_mc.so
----
ls  /usr/lib64 | grep mpi

libcmpiCppImpl.so.1
libcmpiOSBase_CommonFsvol.so
libcmpiOSBase_CommonFsvol.so.0
libcmpiOSBase_CommonFsvol.so.0.0.0
libcmpiOSBase_Common.so
libcmpiOSBase_Common.so.0
libcmpiOSBase_Common.so.0.0.0
liblamf77mpi.a
liblamf77mpi.la
liblamf77mpi.so
liblamf77mpi.so.0
liblamf77mpi.so.0.0.0
liblammpi++.a
liblammpi++.la
liblammpio.a
liblammpi++.so
liblammpi++.so.0
liblammpi++.so.0.0.0
libmpi.a
libmpiblacs.a
libmpiblacsCinit.a
libmpiblacsCinit.so
libmpiblacsCinit.so.1
libmpiblacsCinit.so.1.0.0
libmpiblacsF77init.a
libmpiblacsF77init.so
libmpiblacsF77init.so.1
libmpiblacsF77init.so.1.0.0
libmpiblacs.so
libmpiblacs.so.1
libmpiblacs.so.1.0.0
libmpi.la
libmpi.so
libmpi.so.0
libmpi.so.0.0.0
lib-org-eclipse-jdt-core-compiler-2.1.3.so
lib-org-eclipse-jdt-core-compiler.so
lib-org-eclipse-jdt-internal-compiler-2.1.3.so
lib-org-eclipse-jdt-internal-compiler.so
libpegcompiler.so.1
----
ls /usr/lib64 | grep lapack
liblapack.a
liblapack.so
liblapack.so.3
liblapack.so.3.0
liblapack.so.3.0.3
libscalapack.so.
libscalapack.so.1
libscalapack.so.1.0.0
-----
ls /usr/lib64 | grep blas
libblas.a
libblas.so
libblas.so.3
libblas.so.3.0
libblas.so.3.0.3
libgslcblas.a
libgslcblas.la
libgslcblas.so
libgslcblas.so.0
libgslcblas.so.0.0.0
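
As far as I understand, "-lscalapack" makes the linker look for an unversioned
libscalapack.so (or a libscalapack.a) in the search path, and in the lapack
listing above I am not sure that file really exists (the "libscalapack.so."
entry may just be cut off). So one check I would run, only a guess on my side:
----
ls -l /usr/lib64/libscalapack.so /usr/lib64/libscalapack.a
----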

My options are:
------------------
   **********************************
   *  Configure parallel execution  *
   **********************************
Shared Memory Architecture? (y/n):n
Remote shell (default is ssh) =ssh
 Your compiler: ifort
   Do you have MPI and Scalapack installed and intend to run
   finegrained parallel? (This is usefull only for BIG cases)!
   (y/n) y
Current settings:
     RP  RP_LIB(SCALAPACK+PBLAS): -L/usr/lib64 -L/storage/intel/mkl/9.1.021/lib/em64t -lmpi -lmkl_lapack -lscalapack -lmpiblacs -lmpiblacs -lgslcblas
     FP  FPOPT(par.comp.options): -FR -mp1 -w -prec_div -pc80 -pad -ip -DINTEL_VML -traceback
     MP  MPIRUN commando        : mpirun -np _NP_ -machinefile _HOSTS_ _EXEC_

--------------------------------------------
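One more thing I am not sure about: RP_LIB above contains
-L/storage/intel/mkl/9.1.021/lib/em64t, while the MKL listing at the top of
this mail is from /opt/intel/mkl/9.1.021/lib/em64t. I do not know whether
/storage is just another mount of the same directory, so I would also check
that this path exists:
----
ls /storage/intel/mkl/9.1.021/lib/em64t
----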
With these settings I get the following error:
ifort -o lapw0_mpi cputim.o modules.o reallocate.o ainv.o am05v2.o b88.o blyp.o
brj.o charg2.o charg3.o charge.o chfac.o chslv.o corgga.o corpbe_tpss.o
cub_xc_back.o corlsd.o drho.o dfxhpbe.o dfxtpss.o dylm.o efg.o energy.o epot1.o
eramps.o errclr.o errflg.o ev92.o ev92ex.o exch.o exch17.o exrevpbe.o fithi.o
fxhpbe.o fx_tpss.o gbass.o gcor.o gea.o geaex.o getfft.o getff1.o gpoint.o
grans.o gtfnam.o hcth.o ifflim.o kcis.o lapw0.o latgen.o multfc.o multsu.o
outerr.o pbea.o pbe1.o pbe2.o pbesol.o poissn.o potfac.o pwxad4.o pwxad5.o
qranf.o readstruct.o rean0.o rean1.o rean3.o rean4.o rhopw.o rotate.o rotdef.o
rpbe.o setff0.o setff1.o setfft.o setff2.o seval.o sevald.o sevaldd.o sevali.o
sevalin.o sicpbe.o sicpbe_tpss.o sogga.o sphbes.o spline.o srolyl.o stern.o
sumfac.o suml.o th1.o th2.o vpw91.o vresp.o vs98.o vxc15.o vxc16.o vxc17.o
vxc24.o vxc26.o vxclm2.o vxcpw2.o vxi35.o vxi35a.o wc05.o workf1.o xcener.o
xcpot.o xcpot1.o xcpot3.o ykav.o ylm.o zfft3d.o gpointm.o
-FR -mp1 -w -prec_div -pc80 -pad -ip -DINTEL_VML -traceback
-L/opt/intel/mkl/9.1.021/lib/em64t -lpthread
-L/opt/intel/mkl/9.1.021/lib/em64t -lmkl_lapack -lmkl_em64t -lguide -lvml -pthread
-L/usr/lib64 -L/opt/intel/mkl/9.1.021/lib/em64t -lmpi -lmkl_lapack -lscalapack
-lmpiblacs -lmpiblacs -lgslcblas
ld: cannot find -lscalapack
make[1]: *** [lapw0_mpi] Error 1
------------------------------------------
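One idea I had, but I am not sure it is the right way (the file names are just
taken from the /usr/lib64 listing above): if the unversioned libscalapack.so is
really missing, ld cannot resolve -lscalapack even though libscalapack.so.1 is
there. So either the administrator could add a symlink, or I could put the full
file name into RP_LIB instead of -lscalapack:
----
# as root, only if /usr/lib64/libscalapack.so does not exist (my assumption):
ln -s libscalapack.so.1 /usr/lib64/libscalapack.so
# or, without root: in RP_LIB replace "-lscalapack" by /usr/lib64/libscalapack.so.1
----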
What should I do? (The compilation without MPI was successful.)
Thank you in advance.

Dima Vingurt.

