[Wien] Error in parallel LAPW2
Fecher, Gerhard
fecher at uni-mainz.de
Fri Mar 26 23:44:48 CET 2021
I am wondering why the problem occurs only at the largest volume.
If the problem appears only at +10% volume, what about the other volumes, for example +9% or +11%?
Is the charge density from dstart, or extrapolated from the +5% volume? In the latter case, was that calculation correctly converged, or already faulty?
As Laurence suspects, is the structure ok at all?
Ciao
Gerhard
DEEP THOUGHT in D. Adams; Hitchhikers Guide to the Galaxy:
"I think the problem, to be quite honest with you,
is that you have never actually known what the question is."
====================================
Dr. Gerhard H. Fecher
Institute of Physics
Johannes Gutenberg - University
55099 Mainz
________________________________________
From: Wien [wien-bounces at zeus.theochem.tuwien.ac.at] on behalf of Laurence Marks [laurence.marks at gmail.com]
Sent: Friday, 26 March 2021 16:30
To: A Mailing list for WIEN2k users
Subject: Re: [Wien] Error in parallel LAPW2
I took your file and initialized it, admittedly with a slightly different mixer. I had no problems, so it is hard to guess what your problem is. You need to look in *dayfile*, the output of the run (e.g. email if it was a remote job) or similar. Also look in the relevant case.output* for something wrong.
Several additional comments:
1) Based upon a quick BVS analysis, your unit cell should be around 38.644993 17.954890 32.663086 (-17 relative to your +10). I think using -10 to +10 is too large, and you should start with a more sensible estimate such as the BVS.
2) The number of k-points should roughly scale as 1/Volume. If I assume 1000 for a small 4x4x4 Angstrom cell then for your cell it should be about 16.
3) Why do you only have P-1? Is this based upon some refinement with low symmetry (that may be wrong) or what? I expect that the structure has higher symmetry. I have seen (too) many people on this list recently using P1 or P-1 cells, which are in most cases going to be wrong.
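Point 2 above can be checked with a quick back-of-the-envelope calculation (my own sketch, not WIEN2k output; the cell dimensions are the BVS estimate from point 1, in bohr, and the bohr-to-angstrom factor is the standard constant):

```python
# Back-of-the-envelope for point 2: the k-point count scales roughly as
# 1/V. Reference: ~1000 k-points for a 4x4x4 Angstrom cell (V = 64 A^3).
BOHR_TO_ANG = 0.52917721

a, b, c = 38.644993, 17.954890, 32.663086      # bohr (BVS estimate, point 1)
volume = a * b * c * BOHR_TO_ANG ** 3          # cell volume in A^3

ref_kpoints, ref_volume = 1000.0, 4.0 ** 3     # 1000 k-points per 64 A^3
n_k = ref_kpoints * ref_volume / volume

print(round(n_k))  # -> 19, the same ballpark as the "about 16" quoted above
```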
On Fri, Mar 26, 2021 at 1:39 AM Anupriya Nyayban <mamaniphy at gmail.com> wrote:
Dear Prof. Blaha,
Previously, I followed these steps:
deleted the case.struct file
copied the struct file for +10 as case.struct
x dstart
run_lapw -I -fc 10.0 -p
And I got the message "forrtl: severe (67): input statement requires too much data, unit 10, file/case/./case.vector_1" in the first cycle.
Now, I have created a new case directory and saved +10.struct as case.struct. Initialization has been done with RKmax = 7.0 and Kmesh = 150. The same message appears right at the start when "run_lapw -p -fc 10.0" is executed.
Here, the struct file for +10 is attached below.
On Thu, 25 Mar 2021 at 12:34, Anupriya Nyayban <mamaniphy at gmail.com> wrote:
Dear Prof. Blaha,
Thank you very much for the help!!
First, I had activated both min and run_lapw in optimize.job to find the energy of the relaxed structure, and I realize that serious mistake now.
Second, yes, the calculation crashes in the first cycle for +10.
Third, I have run x dstart, run_lapw -I -fc 10.0 -p for +10 and found the following message at the first cycle:
"forrtl: severe (67): input statement requires too much data, unit 10, file/case/./case.vector_1".
May I run the volume optimization with a smaller RKmax value to avoid the "too much data" error, and later run the scf at the optimized lattice parameters with converged RKmax and Kmesh?
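The effect of a smaller RKmax can be estimated without running anything: at fixed smallest muffin-tin radius the LAPW basis size grows roughly as RKmax**3, so matrix and vector-file sizes shrink by about the cube of the ratio. A rough sketch (my own estimate, not WIEN2k code):

```python
# Rough scaling sketch: basis size (and hence vector-file size) grows
# approximately as RKmax**3 at fixed smallest muffin-tin radius, so
# reducing RKmax shrinks the files by about (new / old)**3.
def size_ratio(rkmax_new, rkmax_old=7.0):
    return (rkmax_new / rkmax_old) ** 3

for rkmax in (7.0, 6.5, 6.0):
    print(rkmax, round(size_ratio(rkmax), 2))   # 1.0, 0.8, 0.63
```

So dropping RKmax from 7.0 to 6.0 cuts the vector files to roughly 63% of their size, at some cost in accuracy.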
On Wed, 24 Mar 2021 at 17:42, Anupriya Nyayban <mamaniphy at gmail.com> wrote:
Dear experts and users,
In addition to the above information, I want to mention that the commands used in the optimize.job script are "min -I -j "run_lapw -I -fc 1.0 -i 40 -p"" and "run_lapw -p -ec 0.0001". RKmax and the k-mesh are set to 7.0 and 150, respectively. The energy versus volume graph (fitted to the Murnaghan equation of state) looks very different from the usual one. I have no idea why lapw2 crashes ("error in parallel lapw2" is shown in lapw2.error) for the +10% change in volume. I need your valuable suggestions to proceed with the calculation.
On Fri, 19 Mar 2021 at 00:39, Anupriya Nyayban <mamaniphy at gmail.com> wrote:
Dear experts and users,
I was running a volume optimization in parallel (with 8 cores) for an orthorhombic 2x2x1 supercell containing 80 atoms, on an HPC system (processor: dual-socket Intel Skylake, 18 cores per socket; RAM: 96 GB ECC DDR4 2133 MHz in balanced configuration; operating system: CentOS 7.3; compiler: intel 2018.5.274). The changes in volume were set to -10, -5, 0, 5, 10 (in %). I could find an error only in lapw2.error, which states "error in parallel lapw2". The scf calculations completed for the volume changes of -10, -5, 0, and 5%.
Looking forward to your suggestions.
If you need any additional information please let me know.
Thank you in advance.
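For reference, the -10% to +10% volume steps above correspond to isotropic lattice-constant scale factors of (1 + x/100)**(1/3). A minimal illustration (my own sketch, not WIEN2k code):

```python
# Illustration (not WIEN2k code): an x percent isotropic volume change
# scales each lattice constant by (1 + x/100)**(1/3).
def scale_factor(vol_change_percent):
    return (1.0 + vol_change_percent / 100.0) ** (1.0 / 3.0)

for x in (-10, -5, 0, 5, 10):                  # the volume steps used here
    print(f"{x:+d}%  scale = {scale_factor(x):.5f}")   # +10% -> ~1.03228
```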
--
With regards
Anupriya Nyayban
Ph.D. Scholar
Department of Physics
NIT Silchar
_______________________________________________
Wien mailing list
Wien at zeus.theochem.tuwien.ac.at
http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien
SEARCH the MAILING-LIST at: http://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/index.html
--
Professor Laurence Marks
Department of Materials Science and Engineering
Northwestern University
www.numis.northwestern.edu
"Research is to see what everybody else has seen, and to think what nobody else has thought" Albert Szent-Györgyi