[Wien] a parallel error of lapw0 with MBJLDA potential (updated)

Peter Blaha pblaha at theochem.tuwien.ac.at
Fri Jun 11 15:37:46 CEST 2010


We will have to make a more detailed analysis.
Apparently the problem occurs only near the nucleus for very heavy elements,
where tau or g2rho take values of +/- 10**15 because of diverging subterms.
It is probably completely uncritical for the gap.

A temporary fix can be made in brj.f: just before the DO WHILE loop,
add the following lines:

          tauw = 0.125d0*grho*grho*2.d0/rho            ! von Weizsaecker bound
          if(tau.lt.tauw)  tau = tauw
          D = TAU - 0.25D0*GRHO**2D0/RHO
          Q = (1D0/6D0)*(G2RHO - 2D0*0.8D0*D)
          if(tau.eq.tauw .and. q.lt.-1.d9)  q = -1.d9  ! one may experiment with the value of q

    10    DO WHILE (DABS(F) .GE. TOL)

We will check whether q has some physical bound which could be used as a better estimate.
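The clamping logic above can be sketched in plain Python to show what the patch does numerically. This is an illustration only, not WIEN2k code: the function name clamp_brj_inputs and the q_floor parameter are invented for the sketch, while the variable names and constants mirror the Fortran snippet.

```python
def clamp_brj_inputs(rho, grho, g2rho, tau, q_floor=-1.0e9):
    """Sketch of the temporary brj.f fix: enforce the von Weizsaecker
    lower bound on tau, then floor q where tau hit that bound."""
    # von Weizsaecker kinetic-energy density (factor 2.d0 as in the patch)
    tauw = 0.125 * grho * grho * 2.0 / rho
    if tau < tauw:                 # tau must not fall below the bound
        tau = tauw
    d = tau - 0.25 * grho**2 / rho
    q = (1.0 / 6.0) * (g2rho - 2.0 * 0.8 * d)
    if tau == tauw and q < q_floor:  # cap the divergence near the nucleus
        q = q_floor
    return tau, q
```

With a strongly negative g2rho (mimicking the diverging subterms near a heavy nucleus), q is capped at the floor value instead of running off to -10**15, which is what lets the subsequent Newton iteration converge.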


wanxiang feng wrote:
> It seems that there is an endless loop in "brj.f"
> 
> ===================================================
>    10    DO WHILE (DABS(F) .GE. TOL)
>                 .....
>                 ......
>          ENDDO
> 
>          IF (X .LT. 0D0) THEN
>               ....
>               .....
>          ENDIF
> ===================================================
> 
> In our own tests, another case with the same structure
> (test2.struct) does not run into this situation.
> This problem is somewhat delicate, and we would ask for your help.
> Note: we perform the spin-polarized calculation plus spin-orbit
> interaction for these cases, "runsp_lapw -so -p".
> 
> 
> Thanks
> 
> feng
> 
> 
> 
> 2010/6/11 wanxiang feng <fengwanxiang at gmail.com>:
>> Thanks for your timely reply!
>>
>> I know that lapw0_mpi parallelization will not speed up a small system
>> like GaAs; it was just a test case before we calculate some larger
>> systems.
>>
>> Now the code handles the lapw0 parallelization of GaAs correctly, but
>> another problem arises when we calculate larger systems (3 or 8
>> inequivalent atoms in the primitive cell)!
>>
>> The calculation cannot proceed normally at the second call of lapw0,
>> whether or not lapw0 runs in parallel.
>>
>> The job does not stop, and lapw0 (or lapw0_mpi) runs without any
>> error information, but it does not finish even after a very long
>> time.
>>
>> ======== case.dayfile
>> ===============================================================
>>
>>    start       (Fri Jun 11 00:08:00 CST 2010) with lapw0 (1/99 to go)
>>
>>    cycle 1     (Fri Jun 11 00:08:00 CST 2010)  (1/99 to go)
>>
>>>   lapw0 -grr -p       (00:08:00) starting parallel lapw0 at Fri Jun 11 00:08:00 CST 2010
>> -------- .machine0 : 16 processors
>> 0.824u 0.444s 0:10.82 11.6%     0+0k 0+0io 0pf+0w
>>>   lapw0 -p    (00:08:11) starting parallel lapw0 at Fri Jun 11 00:08:11 CST 2010
>> -------- .machine0 : 16 processors
>>
>> =====================================================================================
>>
>> It seems that the code cannot handle systems containing more than
>> two inequivalent atoms. We suspect there are still some bugs in
>> lapw0 related to the MBJLDA potential.
>>
>> The attachment can be used as a test example.
>>
>>
>> Thanks,
>>
>> Feng.
>>
>>
>>
>> 2010/6/10 Peter Blaha <pblaha at theochem.tuwien.ac.at>:
>>> Thanks for the report. I could verify the problem with the mpi-parallel
>>> version for mBJ, and a corrected version is on the web for download.
>>>
>>> HOWEVER: Please be aware that   lapw0_mpi  parallelizes (mainly) over the
>>> atoms. Thus for GaAs I do not expect any speedup from using more than 2
>>> processors.
>>>
>>> Furthermore: Do NOT blindly use "parallel" calculations. For these small
>>> systems a sequential calculation (maybe with OMP_NUM_THREADS set to 2) might
>>> be FASTER than an 8-fold or higher parallel calculation (parallel overhead,
>>> disk I/O, "summary" steps, slower memory access, ...).
>>> Always compare the "real timings" of lapw0/1/2 in the dayfiles of a
>>> sequential and a parallel calculation.
>>>
>>
>> ------------------------------------------------------------------------
>>
>> _______________________________________________
>> Wien mailing list
>> Wien at zeus.theochem.tuwien.ac.at
>> http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien

-- 

                                       P.Blaha
--------------------------------------------------------------------------
Peter BLAHA, Inst.f. Materials Chemistry, TU Vienna, A-1060 Vienna
Phone: +43-1-58801-15671             FAX: +43-1-58801-15698
Email: blaha at theochem.tuwien.ac.at    WWW: http://info.tuwien.ac.at/theochem/
--------------------------------------------------------------------------

