<div dir="ltr">The command "ls *vsp*" returns only the files "TiC.vspdn_st" and "TiC.vsp_st", so it would appear that the file is not created at all when using the -p switch to runsp_lapw.<div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, 22 Jul 2019 at 16:29, <<a href="mailto:tran@theochem.tuwien.ac.at">tran@theochem.tuwien.ac.at</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Is the file TiC.vspup emtpy?<br>
<br>
On Monday 2019-07-22 17:24, Ricardo Moreira wrote:<br>
<br>
>Date: Mon, 22 Jul 2019 17:24:42<br>
>From: Ricardo Moreira <<a href="mailto:ricardopachecomoreira@gmail.com" target="_blank">ricardopachecomoreira@gmail.com</a>><br>
>Reply-To: A Mailing list for WIEN2k users <<a href="mailto:wien@zeus.theochem.tuwien.ac.at" target="_blank">wien@zeus.theochem.tuwien.ac.at</a>><br>
>To: A Mailing list for WIEN2k users <<a href="mailto:wien@zeus.theochem.tuwien.ac.at" target="_blank">wien@zeus.theochem.tuwien.ac.at</a>><br>
>Subject: Re: [Wien] Parallel run problems with version 19.1<br>
><br>
>Hi and thanks for the reply,<br>
>Regarding serial calculations: yes, everything runs properly in both the non-spin-polarized and spin-polarized cases you described. As<br>
>for parallel, it fails in both cases, with the error I indicated in my previous email.<br>
><br>
>Best Regards,<br>
>Ricardo Moreira<br>
><br>
>On Mon, 22 Jul 2019 at 16:09, <<a href="mailto:tran@theochem.tuwien.ac.at" target="_blank">tran@theochem.tuwien.ac.at</a>> wrote:<br>
> Hi,<br>
><br>
> What you should never do is mix spin-polarized and<br>
> non-spin-polarized calculations in the same directory.<br>
><br>
> Since your explanations about spin-polarized/non-spin-polarized are a<br>
> bit confusing, the question is:<br>
><br>
> Does the calculation run properly (in parallel and serial) if everything<br>
> (init_lapw and run_lapw) in a directory is done from the beginning in<br>
> non-spin-polarized? Same question with spin-polarized.<br>
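> (As a minimal sketch of such a from-scratch test, assuming the case is called TiC, that a valid .machines file is present for the -p runs, and that each directory name matches its case name; the paths and directory names below are only illustrative:)<br>
> <br>
> # non-spin-polarized case in its own directory<br>
> mkdir TiC ; cp /path/to/TiC.struct TiC/ ; cd TiC<br>
> init_lapw<br>
> run_lapw       # serial<br>
> run_lapw -p    # parallel, reads .machines<br>
> <br>
> # spin-polarized case in a separate directory (choose spin-polarized when init_lapw asks)<br>
> mkdir ../TiC_sp ; cp /path/to/TiC.struct ../TiC_sp/TiC_sp.struct ; cd ../TiC_sp<br>
> init_lapw<br>
> runsp_lapw     # serial<br>
> runsp_lapw -p  # parallel, reads .machines<br>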
><br>
> F. Tran<br>
><br>
> On Monday 2019-07-22 16:37, Ricardo Moreira wrote:<br>
><br>
> >Date: Mon, 22 Jul 2019 16:37:30<br>
> >From: Ricardo Moreira <<a href="mailto:ricardopachecomoreira@gmail.com" target="_blank">ricardopachecomoreira@gmail.com</a>><br>
> >Reply-To: A Mailing list for WIEN2k users <<a href="mailto:wien@zeus.theochem.tuwien.ac.at" target="_blank">wien@zeus.theochem.tuwien.ac.at</a>><br>
> >To: <a href="mailto:wien@zeus.theochem.tuwien.ac.at" target="_blank">wien@zeus.theochem.tuwien.ac.at</a><br>
> >Subject: [Wien] Parallel run problems with version 19.1<br>
> ><br>
> >Dear Wien2k users,<br>
> >I am running Wien2k on a computer cluster under Scientific Linux release 7.4, compiled with the GNU compilers version 7.2.3 and OpenMPI.<br>
> >Since changing from version 18.2 to 19.1 I've been unable to run Wien2k in parallel (neither MPI nor simple k-point parallelism<br>
> >seems to work), with calculations aborting with the following message:<br>
> ><br>
> > start (Mon Jul 22 14:49:31 WEST 2019) with lapw0 (40/99 to go)<br>
> ><br>
> > cycle 1 (Mon Jul 22 14:49:31 WEST 2019) (40/99 to go)<br>
> ><br>
> >> lapw0 -p (14:49:31) starting parallel lapw0 at Mon Jul 22 14:49:31 WEST 2019<br>
> >-------- .machine0 : 8 processors<br>
> >0.058u 0.160s 0:03.50 6.0% 0+0k 48+344io 5pf+0w<br>
> >> lapw1 -up -p (14:49:35) starting parallel lapw1 at Mon Jul 22 14:49:35 WEST 2019<br>
> >-> starting parallel LAPW1 jobs at Mon Jul 22 14:49:35 WEST 2019<br>
> >running LAPW1 in parallel mode (using .machines)<br>
> >2 number_of_parallel_jobs<br>
> > ava01 ava01 ava01 ava01(8) ava21 ava21 ava21 ava21(8) Summary of lapw1para:<br>
> > ava01 k=8 user=0 wallclock=0<br>
> > ava21 k=16 user=0 wallclock=0<br>
> >** LAPW1 crashed!<br>
> >0.164u 0.306s 0:03.82 12.0% 0+0k 112+648io 1pf+0w<br>
> >error: command /homes/fc-up201202493/WIEN2k_19.1/lapw1para -up uplapw1.def failed<br>
> ><br>
> >> stop error<br>
> ><br>
> >Inspecting the error files I find that the error printed to uplapw1.error is:<br>
> ><br>
> >** Error in Parallel LAPW1<br>
> >** LAPW1 STOPPED at Mon Jul 22 14:49:39 WEST 2019<br>
> >** check ERROR FILES!<br>
> > 'INILPW' - can't open unit: 18<br>
> > 'INILPW' - filename: TiC.vspup<br>
> > 'INILPW' - status: old form: formatted<br>
> > 'LAPW1' - INILPW aborted unsuccessfully.<br>
> > 'INILPW' - can't open unit: 18<br>
> > 'INILPW' - filename: TiC.vspup<br>
> > 'INILPW' - status: old form: formatted<br>
> > 'LAPW1' - INILPW aborted unsuccessfully.<br>
> ><br>
> >Since this error message is often attributed, in previous posts to the mailing list, to running init_lapw for a non-spin-polarized<br>
> >case and then using runsp_lapw, I should clarify that the same failure also occurs when attempting a purely non-spin-polarized run;<br>
> >in that case the error message reports TiC.vsp instead of TiC.vspup.<br>
> >I should also point out, as it may be related to this issue, that serial runs show a similar problem: after completing my first<br>
> >calculation in a folder, if I start with a spin-polarized case and then run init_lapw again for a non-spin-polarized case and<br>
> >attempt run_lapw, I get the same "can't open unit: 18" errors as before (this also happens if I first run a non-spin-polarized<br>
> >calculation and then attempt a spin-polarized one in the same folder). The workaround I found was to start in a new folder, but<br>
> >since the error message also involves TiC.vsp/vspup I thought it worth mentioning.<br>
> >Lastly, I should mention that I deleted the line "15,'$file.tmp$updn', 'scratch','unformatted',0" from x_lapw, because I had<br>
> >previously hit an error in lapw2, reported elsewhere on the mailing list, which Professor Blaha indicated was solved by deleting<br>
> >that line (and indeed it was). Whether or not this could be related to the issues I'm having now, I have no idea, so I felt it<br>
> >right to point it out.<br>
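> >(For reference, one way to keep a backup of the script and to check whether that line is present or already removed, assuming $WIENROOT points to the WIEN2k installation directory, could be:)<br>
> ><br>
> >cp $WIENROOT/x_lapw $WIENROOT/x_lapw.orig   # keep an untouched copy of the script<br>
> >grep -n "file.tmp" $WIENROOT/x_lapw         # show the unit-15 scratch-file line and its line number, if still present<br>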
> >Thanks in advance for any assistance that might be provided.<br>
> ><br>
> >Best Regards,<br>
> >Ricardo Moreira<br>
> ><br>
><br>
>_______________________________________________<br>
Wien mailing list<br>
<a href="mailto:Wien@zeus.theochem.tuwien.ac.at" target="_blank">Wien@zeus.theochem.tuwien.ac.at</a><br>
<a href="http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien" rel="noreferrer" target="_blank">http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien</a><br>
SEARCH the MAILING-LIST at: <a href="http://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/index.html" rel="noreferrer" target="_blank">http://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/index.html</a><br>
</blockquote></div>