<div dir="ltr"><div>Dear Wien2k users,</div><div><br></div><div>I have successfully installed WIEN2k_21 on the HPC cluster (Cray XC40) of my institute (<a href="http://www.serc.iisc.ac.in/supercomputer/for-traditional-hpc-simulations-sahasrat/">http://www.serc.iisc.ac.in/supercomputer/for-traditional-hpc-simulations-sahasrat/</a>). While running parallel calculations there, I get the "lapw0_mpi" error shown below, but when I submit the same job with the "-p" switch removed, it completes without errors.</div><div><br></div><div>============</div><div>/home/proj/21/isuch/soft/cray/wien2k/lapw0_mpi: error while loading shared libraries: libmpi_usempif08.so.40: cannot open shared object file: No such file or directory<br>[1]    Exit 127                      mpirun -np 48 /home/proj/21/isuch/soft/cray/wien2k/lapw0_mpi lapw0.def >> .time00<br>cat: No match.<br>grep: *scf1*: No such file or directory<br>grep: lapw2*.error: No such file or directory<br></div><div><br></div><div>============</div><div><br></div><div>I did not find such a library in my MPI_ROOT/lib path, nor anywhere else on the cluster. Searching the internet suggests that it belongs to OpenMPI and may need to be installed separately (<a href="https://pkgs.org/download/libmpi_usempif08.so.40(openmpi-i386)">https://pkgs.org/download/libmpi_usempif08.so.40(openmpi-i386)</a>). Kindly suggest what I should do: am I missing something when linking the available library paths, or should I go ahead with a new installation of that library?
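
For reference, this is roughly how I searched for the library (a sketch only; /bin/ls is used as a stand-in binary so the commands run anywhere, and the search prefixes are guesses — on the cluster the binary would be lapw0_mpi):

```shell
# Stand-in for the real executable (on the cluster: .../wien2k/lapw0_mpi)
BIN=/bin/ls
# List the shared libraries the binary needs and flag any that are missing:
ldd "$BIN" | grep "not found" || echo "no missing libraries for $BIN"
# Search some likely prefixes for the OpenMPI Fortran library named in the error:
find /opt /usr/lib* -name 'libmpi_usempif08*' 2>/dev/null || true
# If it were found, its directory would go on the runtime search path, e.g.:
#   export LD_LIBRARY_PATH=/path/to/openmpi/lib:$LD_LIBRARY_PATH
```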
</div><div><br></div><div>Your suggestions would be really helpful to me.</div><div><br></div><div>The WIEN2k_OPTIONS used are given below:</div><div><br></div><div>==========</div><div>current:FOPT:-O -FR -mp1 -w -prec_div -pc80 -pad -ip -DINTEL_VML -traceback -assume buffered_io -I$(MKLROOT)/include -I/opt/intel/compilers_and_libraries_2019.5.281/linux/mpi/intel64/include<br>current:FPOPT:-O -FR -mp1 -w -prec_div -pc80 -pad -ip -DINTEL_VML -traceback -assume buffered_io -I$(MKLROOT)/include -I/opt/intel/compilers_and_libraries_2019.5.281/linux/mpi/intel64/include<br>current:OMP_SWITCH:-qopenmp<br>current:OMP_SWITCHP:-qopenmp<br>current:LDFLAGS:$(FOPT) -L$(MKLROOT)/lib/$(MKL_TARGET_ARCH) -lpthread -lm -ldl -liomp5<br>current:DPARALLEL:'-DParallel'<br>current:R_LIBS:-lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core<br>current:RP_LIBS:$(R_LIBS)<br>current:FFTWROOT:/opt/cray/pe/fftw/3.3.8.3/x86_64/<br>current:FFTW_VERSION:FFTW3<br>current:FFTW_LIB:lib<br>current:FFTW_LIBNAME:fftw3<br>current:LIBXCROOT:/home/proj/21/isuch/soft/cray/libxc/<br>current:LIBXC_FORTRAN:xcf03<br>current:LIBXC_LIBNAME:xc<br>current:LIBXC_LIBDNAME:lib<br>current:SCALAPACKROOT:$(MKLROOT)/lib/<br>current:SCALAPACK_LIBNAME:libmkl_scalapack_lp64<br>current:BLACSROOT:$(MKLROOT)/lib/<br>current:BLACS_LIBNAME:libmkl_blacs_intelmpi_lp64<br>current:ELPAROOT:<br>current:ELPA_VERSION:<br>current:ELPA_LIB:<br>current:ELPA_LIBNAME:<br>current:MPIRUN:mpirun -np _NP_ -machinefile _HOSTS_ _EXEC_<br>current:CORES_PER_NODE:24<br>current:MKL_TARGET_ARCH:intel64<br>==========</div><div><br></div><div>Thanks,</div><br clear="all"><div><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr">Venkatesh<div>Postdoctoral Fellow, </div><div>Department of Instrumentation and Applied Physics </div><div>IISc Bangalore, India</div></div></div></div></div>