<div dir="ltr"><div>Dear Wien2k Community,</div><div><br></div><div> Just adding two points:</div><div><br></div><div>1) Dear Prof. Fecher, in my case, restarting the machine did not solve the problem and all the executables were compiled with the same Intel oneAPI version.</div><div><br></div><div>2) Dear Prof. Ondračka, in my case, I used the lapwso serial version. In fact, it was a very simple system, named, InP in the zinc blende symmetry.</div><div><br></div><div> If someone needs/wants more information, please let me know.</div><div> All the best,</div><div> Luis<br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">Em qua., 18 de ago. de 2021 às 04:32, Pavel Ondračka <<a href="mailto:pavel.ondracka@email.cz">pavel.ondracka@email.cz</a>> escreveu:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Dear Luis,<br>
<br>
one very easy thing to try could be to set the environment variable<br>
OMP_STACKSIZE to something large like "1g", i.e., "export<br>
OMP_STACKSIZE=1g" before run_lapw. A small OpenMP stack size has caused<br>
issues for us previously, so it could be the case here as well. The only<br>
explicit OpenMP loop in hsocalc.F allocates all its private variables on<br>
the stack, and a few of them are arrays, so it is plausible that this is<br>
the cause; see the sketch below.<br>
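A minimal sketch of the workaround (bash syntax assumed; the thread count is only a placeholder, use "setenv" under csh/tcsh):<br>
<br>
export OMP_STACKSIZE=1g    # enlarge the per-thread stack that holds the OpenMP private arrays<br>
export OMP_NUM_THREADS=4   # placeholder thread count<br>
run_lapw -NI -so           # restart the SCF cycle with spin-orbit coupling<br>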
<br>
To Prof. Blaha:<br>
from a very brief visual inspection of the OpenMP code in lapwso, I<br>
believe there could be another small issue with the combined MPI +<br>
OpenMP version. At lines hsocalc.F:159 and hsocalc.F:160 the variables<br>
ibf_local and ibi_local should probably be private. This should not be<br>
the cause of the problems reported here, though, as it would only<br>
affect lapwso_mpi. The rest seems OK (at first glance).<br>
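To illustrate what making them private means, here is a generic, self-contained sketch (not the actual hsocalc.F loop; the loop body is made up) in which per-iteration work variables are declared PRIVATE so the threads do not race on a single shared copy:<br>
<br>
program omp_private_demo<br>
  implicit none<br>
  integer :: i, ibf_local, ibi_local<br>
  integer, parameter :: n = 8<br>
!$OMP PARALLEL DO PRIVATE(ibf_local, ibi_local)<br>
  do i = 1, n<br>
     ! without PRIVATE these would be shared and overwritten by other threads<br>
     ibf_local = 2*i - 1<br>
     ibi_local = 2*i<br>
     write(*,*) i, ibf_local, ibi_local<br>
  end do<br>
!$OMP END PARALLEL DO<br>
end program omp_private_demo<br>
<br>
Compiled with OpenMP enabled (e.g. ifort -qopenmp) and run with OMP_NUM_THREADS greater than 1, each iteration keeps its own pair of indices.<br>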
<br>
Best regards<br>
Pavel<br>
<br>
On Tue, 2021-08-17 at 18:18 -0300, Luis Ogando wrote:<br>
> Dear Wien2k Community,<br>
> Greetings!<br>
> This message is only to inform you that I also had a segmentation<br>
> fault with lapwso and Wien2k-21.<br>
> It was a very strange case. After a converged SCF cycle with mBJ<br>
> and SO, I could not run "run_lapw -NI -so ...". In this case, I<br>
> always got the following error after lapwso:<br>
> <br>
> forrtl: severe (174): SIGSEGV, segmentation fault occurred<br>
> Image              PC                Routine            Line     Source<br>
> lapwso             000000000046A0EA  Unknown            Unknown  Unknown<br>
> libpthread-2.28.s  00001530B217B730  Unknown            Unknown  Unknown<br>
> libiomp5.so        00001530B1D132FB  Unknown            Unknown  Unknown<br>
> libiomp5.so        00001530B1D13049  Unknown            Unknown  Unknown<br>
> libiomp5.so        00001530B1D14B59  Unknown            Unknown  Unknown<br>
> libiomp5.so        00001530B1D161E8  Unknown            Unknown  Unknown<br>
> libiomp5.so        00001530B1D0C926  Unknown            Unknown  Unknown<br>
> lapwso             000000000049CA86  Unknown            Unknown  Unknown<br>
> lapwso             000000000040D77F  hmsout_mp_finit_h  119      modules.F<br>
> lapwso             000000000042B94E  MAIN__             622      lapwso.F<br>
> lapwso             0000000000404D22  Unknown            Unknown  Unknown<br>
> libc-2.28.so       00001530A3E3609B  __libc_start_main  Unknown  Unknown<br>
> lapwso             0000000000404C2A  Unknown            Unknown  Unknown<br>
> 0.167u 0.051s 0:00.10 210.0% 0+0k 0+1976io 0pf+0w<br>
> error: command /home/ogando/Wien/Wien21/lapwso lapwso.def failed<br>
> <br>
> The solution was to change OMP_NUM_THREADS from 4 to 1.<br>
> I checked and it also worked with OMP_NUM_THREADS equal to 2 but<br>
> not 3.<br>
> If someone is interested in the compilation options or any other<br>
> information, please ask.<br>
> All the best,<br>
> Luis<br>
> <br>
> <br>
> <br>
> On Thu, 10 Jun 2021 at 08:17, Fecher, Gerhard<br>
> <<a href="mailto:fecher@uni-mainz.de" target="_blank">fecher@uni-mainz.de</a>> wrote:<br>
> > Dear all,<br>
> > while running a -so calculation I hit a segmentation fault in<br>
> > lapwso (see below) with the latest version, Wien2k 21.1, that does<br>
> > NOT appear in 19.2 (it appeared for two different systems in fresh<br>
> > directories).<br>
> > <br>
> > Did anyone experience the same, or did I miss a report and am not<br>
> > up to date?<br>
> > <br>
> > I used the same settings (mostly default values) and the same<br>
> > compilers and options (Intel oneAPI 2021.2.0 and Parallel Studio XE<br>
> > 2017.4.056) for both versions, 21.1 and 19.2.<br>
> > <br>
> > forrtl: severe (174): SIGSEGV, segmentation fault occurred<br>
> > Image              PC                Routine   Line     Source<br>
> > lapwso             000000000046CE0A  Unknown   Unknown  Unknown<br>
> > libpthread-2.22.s  00002AFBCC6DAB10  Unknown   Unknown  Unknown<br>
> > libiomp5.so        00002AFBCCF2C8E8  Unknown   Unknown  Unknown<br>
> > lapwso             000000000049F7A6  Unknown   Unknown  Unknown<br>
> > lapwso             0000000000421E9E  hmsec_    926      hmsec.F<br>
> > <br>
> > Line 926 is: deallocate(meigve)<br>
> > (if this is indeed the correct line at all in 21.1; I have seen<br>
> > that hmsec.F is different in 19.2).<br>
> > <br>
> > Thanks for any suggestions that help<br>
> > <br>
> > Gerhard<br>
> > <br>
> > DEEP THOUGHT in D. Adams; Hitchhikers Guide to the Galaxy:<br>
> > "I think the problem, to be quite honest with you,<br>
> > is that you have never actually known what the question is."<br>
> > <br>
> > ====================================<br>
> > Dr. Gerhard H. Fecher<br>
> > Institut of Physics<br>
> > Johannes Gutenberg - University<br>
> > 55099 Mainz<br>
<br>
<br>
_______________________________________________<br>
Wien mailing list<br>
<a href="mailto:Wien@zeus.theochem.tuwien.ac.at" target="_blank">Wien@zeus.theochem.tuwien.ac.at</a><br>
<a href="http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien" rel="noreferrer" target="_blank">http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien</a><br>
SEARCH the MAILING-LIST at: <a href="http://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/index.html" rel="noreferrer" target="_blank">http://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/index.html</a><br>
</blockquote></div>