[Wien] MPI error

Laurence Marks laurence.marks at gmail.com
Thu May 6 19:14:19 CEST 2021


Peter beat me to the response -- please do as he says and move forward step
by step, posting the output of any individual step that fails.

On Thu, May 6, 2021 at 10:38 AM Peter Blaha <pblaha at theochem.tuwien.ac.at>
wrote:

> Once the BLACS problem has been fixed, the next step is to run lapw0 in
> sequential and parallel mode.
>
> Add to your job script:
>
> x lapw0
>
> and check the case.output0 and case.scf0 files (copy them to a different
> name), as well as the message from the queuing system.
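>
> A minimal sketch of that first step (with "case" standing for your actual
> case name; the _seq copies are just a suggested naming for later comparison):
>
> x lapw0
> cp case.output0 case.output0_seq
> cp case.scf0    case.scf0_seq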
>
> Then add:
>
> mpirun -np 4 $WIENROOT/lapw0_mpi lapw0.def
>
> and check the messages and compare the results with the previous
> sequential run.
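>
> For example (a sketch; it assumes the sequential copies made above and that
> lapw0.def was generated by the earlier x lapw0):
>
> mpirun -np 4 $WIENROOT/lapw0_mpi lapw0.def
> diff case.output0 case.output0_seq
> diff case.scf0    case.scf0_seq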
>
> And finally:
> create a .machines file with:
> lapw0:localhost:4
>
> and execute
> x lapw0 -p
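>
> Put together, this last step could look like (sketch only; the final diff
> again compares against the sequential copy suggested above):
>
> echo "lapw0:localhost:4" > .machines
> x lapw0 -p
> diff case.scf0 case.scf0_seq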
>
> -------------
> The same thing can be done with lapw1.
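>
> A possible lapw1 analogue (a sketch only; the 1:localhost:4 line for an MPI
> lapw1 run is an assumption here, so check the .machines syntax in the
> user's guide):
>
> x lapw1
> mpirun -np 4 $WIENROOT/lapw1_mpi lapw1.def
> echo "1:localhost:4" >> .machines
> x lapw1 -p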
>
>
> --
Professor Laurence Marks
Department of Materials Science and Engineering
Northwestern University
www.numis.northwestern.edu
"Research is to see what everybody else has seen, and to think what nobody
else has thought" Albert Szent-Györgyi