[Wien] Bugfix for MPI-parallel lapw1 with NO linked ELPA

Ruh, Thomas thomas.ruh at tuwien.ac.at
Sat Jun 29 09:48:30 CEST 2019


Dear WIEN2k Users!


Thanks to Luigi Maduro, who sent a bug report to the mailing list [1], we uncovered a bug in the MPI-parallel version of lapw1 that occurs ONLY if you have compiled the program without ELPA. Both the sequential code and the parallel code WITH ELPA are unaffected by this bug.


If you do not have ELPA linked and run lapw1 MPI-parallel, the ELPA keyword in the case.in1(c) file will be read erroneously, which leads to a crash of lapw1(c)_mpi.


We strongly suggest that you install the latest ELPA (https://elpa.mpcdf.mpg.de/) and recompile, since you can expect a speedup of 200-300% compared to ScaLAPACK.


If you cannot do so, there are two possible workarounds for this bug:


1) Use the enclosed modules.F file: copy it to your $WIENROOT/SRC_lapw1 directory, recompile the MPI versions (by typing "make rp" and "make cp"), and copy the newly compiled executables (lapw1_mpi and lapw1c_mpi) to the $WIENROOT directory. This fixes the bug permanently.


2) Replace the ELPA keyword in the case.in1(c) file with SCALA. However, this only fixes the current calculation, i.e. you have to repeat it for every case in which you want to run lapw1 MPI-parallel.
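For workaround 2), the keyword swap can be scripted with sed. The sketch below is illustrative only: the sample line written to case.in1 is a stand-in (not the real in1 format), and you would point the substitution at your actual case directory instead.

```shell
# Illustrative sketch of workaround 2): swap the ELPA keyword for SCALA
# in a case.in1 file. The file contents below are a made-up stand-in;
# in practice, run the sed line on your real case.in1(c).
printf 'ELPA 100 (example placeholder line)\n' > case.in1

# Replace the keyword in place (GNU sed; on other systems use sed -i '').
sed -i 's/ELPA/SCALA/' case.in1

cat case.in1
```

Remember that this must be repeated for each case directory, whereas recompiling with the fixed modules.F (workaround 1) solves the problem once and for all.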


Best regards,

Thomas Ruh


[1] https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg18772.html
-------------- next part --------------
A non-text attachment was scrubbed...
Name: modules.F
Type: application/octet-stream
Size: 59253 bytes
Desc: modules.F
URL: <http://zeus.theochem.tuwien.ac.at/pipermail/wien/attachments/20190629/ae410cf8/attachment-0001.obj>

