[Wien] Multicore on i7 iMac

Peter Blaha pblaha at theochem.tuwien.ac.at
Sun Jul 21 09:40:19 CEST 2013


Yes, this is just a different setting of the same space group (#62): Pbnm
and Pnma are related by a permutation of the axes, so the crystal itself is
unchanged.
In any case, almost every experimental work gives some bond distances, and
you should compare them carefully with yours.


On 20.07.2013 10:14, pluto at physics.ucdavis.edu wrote:
> Dear Prof. Blaha,
>
> Thank you for the rapid advice.
>
> I copied the file into case.indmc and it works! I tried SOC on different
> atoms, and it converges well. No additional split atomic positions
> appeared after initso.
>
> Some articles say that the space group of GTO is 62_Pbnm. After I type in
> the 4 atomic positions, WIEN2k adds the other positions automatically.
> However, sgroup then finds the space group 62_Pnma. I believe this does
> not change the actual crystal and refers only to a permutation of the axes?
>
> Yes, I used -orb; it seems ORB comes in only from the second iteration:
>
> LAPW0 END
>   LAPW1 END
>   LAPW1 END
> LAPWSO END
>   LAPW2 END
>   LAPW2 END
> LAPWDM END
>   CORE  END
>   CORE  END
>   MIXER END
> in cycle 2    ETEST: 0   CTEST: 0
>   LAPW0 END
>   ORB   END
>   ORB   END
>   LAPW1 END
>   LAPW1 END
> LAPWSO END
>   LAPW2 END
>   LAPW2 END
> LAPWDM END
>   CORE  END
>   CORE  END
>   MIXER END
> in cycle 3  ...
>
> Regards,
> Lukasz
>
>
>
> I suppose you used   -orb  ???
>
> initso told you that you have to deal with indmc/inorb yourself.
>
> I suppose you have a case.indm  ??
> SO needs a case.indmc.
>
> cp case.indm case.indmc
>
> However, when SO has split your atoms, you may need to edit the indmc
> (and inorb) files and add the corresponding lines for any split
> positions.
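>
> For orientation, a minimal case.indmc treating the d electrons (L=2) of
> two such atoms might look like this (a sketch only; the atom indices and
> L values are case-specific, see the UG):
>
>    -9.       Emin cutoff energy
>     2        number of atoms for which the density matrix is calculated
>     1  1  2  index of atom, number of L's, L=2 (d electrons)
>     2  1  2  index of atom, number of L's, L=2 (d electrons)
>     0  0     r-index, (l,s) index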
>
> On 19.07.2013 11:31, pluto at physics.ucdavis.edu wrote:
>> Dear WIEN2k experts,
>>
>> My GdTiO3 bulk calculation (space group 62) works well on the iMac with
>> GGA+U. However, when I try to add SOC, there is an error after the first
>> iteration:
>>
>> lplucin at iff1276:GTO_GGAU_SOC % more STDOUT
>>    LAPW0 END
>>    LAPW1 END
>>    LAPW1 END
>>    LAPW1 END
>>    LAPW1 END
>> LAPWSO END
>> LAPWSO END
>> LAPW2 - FERMI; weighs written
>>    LAPW2 END
>>    LAPW2 END
>>    SUMPARA END
>> LAPW2 - FERMI; weighs written
>>    LAPW2 END
>>    LAPW2 END
>>    SUMPARA END
>>
>>>     stop error: the required input file GTO_GGAU_SOC.indmc for the next
>> step could not be found
>>
>> Could you advise?
>>
>> Regards,
>> Lukasz
>>
>>
>>> Dear Prof. Blaha,
>>>
>>> Thank you for your comment, it helps.
>>>
>>> My slab was indeed wrong in not having inversion symmetry. I have now
>>> constructed the slab with inversion symmetry (it was found automatically
>>> by nn and sgroup), and the calculation is running. Actually, you already
>>> explained that to me in 2008 (see below)... so I'm a bit embarrassed.
>>>
>>> The calculation is for an Fe slab. The UG says that it is sufficient to
>>> converge the SCF without spin-orbit, then initialize spin-orbit and do
>>> one single SCF iteration with it. Or should I converge the SCF again
>>> after including spin-orbit? In any case, once I get the slab running
>>> properly, I plan to test and compare the results.
>>>
>>> Regards,
>>> Lukasz
>>>
>>>
>>>
>>>
>>> -------- Original Message --------
>>> Subject:     Re: [Wien] Fe slab
>>> Date:     Fri, 04 Jul 2008 14:56:26 +0200
>>> From:     Peter Blaha <pblaha at theochem.tuwien.ac.at>
>>> Reply-To:     A Mailing list for WIEN2k users
>>> <wien at zeus.theochem.tuwien.ac.at>
>>> To:     A Mailing list for WIEN2k users <wien at zeus.theochem.tuwien.ac.at>
>>>
>>>
>>> Why would you use such a struct file?
>>>
>>> a) With limited experience, start with small models, 5 or 7 layers only.
>>> b) I don't know how this struct file was created, but a 15-layer Fe(001)
>>> slab can certainly have inversion symmetry, and I'm pretty sure that
>>> WIEN should be able to find the proper symmetry (sgroup) when you allow
>>> for it. Remove ALL numbering of the atoms (Fe1,2,3,...) and run the
>>> initialization. sgroup (or nn in most cases) should always group 2 atoms
>>> together (making them equivalent, except the central one). sgroup should
>>> also shift the atoms along z.
>>> Your calculations will be 4 times faster when you have inversion symmetry!
>>> c) When going for thicker slabs, you should also improve your vacuum. It
>>> does not make sense to go to a thick slab but still have surface-surface
>>> interactions through the vacuum.
>>> d) k-mesh: why would one use a 2-fold k-mesh in the z-direction? With
>>> recent WIEN2k versions you can specify a 21x21x1 mesh for kgen directly.
>>> e) Some cases may need more than 40 iterations. As long as it does not
>>> diverge, just continue.
>>> f) Possibly TEMP with some broadening (0.005) may help convergence (see
>>> the sketch below). However, in particular with magnetic systems, make
>>> sure that the broadening does not influence your magnetism, and recheck
>>> with a smaller broadening.
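>>>
>>> For reference, this switch sits in case.in2; the broadened run might use
>>> a line like (a sketch; check the UG for the exact layout):
>>>
>>>    TEMP   0.005      (GAUSS,ROOT,TEMP,TETRA,ALL)
>>>
>>> in place of the default TETRA line.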
>>>
>>> From this message alone, one cannot say anything, except perhaps that
>>> the calculations seem to have diverged (large CTEST).
>>>
>>> But when starting from a converged calculation, this is very unusual.
>>>
>>> PS: In a "slab" calculation it is almost always possible to set up the
>>> slab such that it has inversion symmetry, and thus you would have
>>> case.in1 instead of case.in1c.
>>> Inversion for slabs a) is much faster and b) avoids spurious dipoles
>>> from the two different surfaces.
>>>
>>> On 04/17/2013 11:03 PM, pluto at physics.ucdavis.edu wrote:
>>>> Dear Prof. Blaha and WIEN2k experts,
>>>>
>>>> I have 4 physical cores (Intel(R) Core(TM) i7-3770 CPU @ 3.40GHz). It
>>>> seems that with my compilation, using HT and filling all 8 threads makes
>>>> a particular calculation just a bit faster than the settings you
>>>> suggested, but with HT the CPU gets hotter (the fan is on more often),
>>>> so it makes no sense. I will use the settings you recommended.
>>>>
>>>> I now have an error for the slab, spin-polarized but without spin-orbit
>>>> (see the part of the STDOUT file below). I tried to look through old
>>>> emails from this group but could not quickly find a solution. The same
>>>> slab has converged before in non-parallel mode, spin-polarized and with
>>>> spin-orbit. I use a cutoff of 8 Ry and:
>>>>
>>>> K-VECTORS FROM UNIT:4  -11.0       5.5   933   emin/emax/nband
>>>>
>>>> in case.in1c.
>>>>
>>>> Regards,
>>>> Lukasz
>>>>
>>>>
>>>> ...
>>>>     CORE  END
>>>>     CORE  END
>>>>     MIXER END
>>>> in cycle 22    ETEST: .3816818050000000   CTEST: 1.9705727
>>>>     LAPW0 END
>>>>     LAPW1 END
>>>>     LAPW1 END
>>>>     LAPW1 END
>>>>     LAPW1 END
>>>> LAPW2 - FERMI; weighs written
>>>>     LAPW2 END
>>>>     LAPW2 END
>>>>     SUMPARA END
>>>> LAPW2 - FERMI; weighs written
>>>>     LAPW2 END
>>>>     LAPW2 END
>>>>     SUMPARA END
>>>>     CORE  END
>>>>     CORE  END
>>>>     MIXER END
>>>> in cycle 23    ETEST: .2942263650000000   CTEST: 2.3441252
>>>>     LAPW0 END
>>>>     LAPW1 END
>>>>     LAPW1 END
>>>>     LAPW1 END
>>>>     LAPW1 END
>>>> FERMI - Error
>>>> cp: .in.tmp: No such file or directory
>>>>
>>>>>      stop error
>>>>
>>>>
>>>>
>>>>
>>>> How many "real" cores do you have? Most likely only 4 (the 8 comes from
>>>> hyperthreading, but for numerically intensive applications one should
>>>> probably not use hyperthreading).
>>>>
>>>> So the "best" performance can probably be reached either by:
>>>>
>>>> OMP_NUM_THREADS=2   and 2 lines in .machines or
>>>> OMP_NUM_THREADS=1   and 4 lines in .machines
>>>>
>>>> (it may even depend on the number of k-points in the specific case ..)
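>>>>
>>>> For the second option, the .machines file for a single 4-core machine
>>>> would simply contain four k-parallel lines (a sketch):
>>>>
>>>> 1:localhost
>>>> 1:localhost
>>>> 1:localhost
>>>> 1:localhost
>>>>
>>>> together with OMP_NUM_THREADS=1 set in the shell.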
>>>>
>>>>
>>>>
>>>> On 04/16/2013 02:49 PM, pluto at physics.ucdavis.edu wrote:
>>>>> Dear Prof. Blaha,
>>>>>
>>>>> Thank you for the answer. In the meantime I have realized this mistake.
>>>>>
>>>>> I have now all 8 threads practically fully utilized (HT Intel i7 in
>>>>> iMac)
>>>>> for lapw1 and lapw2. It reduced the iteration from approx. 7.2min to
>>>>> 5.5min (compared to utilizing 4 threads only with OMP_NUM_THREADS).
>>>>>
>>>>> I think it solves my problems for now. Again thank you for your support
>>>>> and rapid answers.
>>>>>
>>>>> Regards,
>>>>> Lukasz
>>>>>
>>>>>
>>>>>
>>>>>> Your   .machines file is wrong. It contains more than one hostname per
>>>>>> line (or has a localhost:2)
>>>>>>
>>>>>> With the proper .machines file, mpirun is not needed:
>>>>>>      > bash: mpirun: command not found
>>>>>>
>>>>>> --------------------  .machines --------------
>>>>>> 1:localhost
>>>>>> 1:localhost
>>>>>>
>>>>>> This file will split the klist into two parts and run two lapw1 jobs
>>>>>> simultaneously.
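>>>>>>
>>>>>> The k-point parallel run is then started with the -p switch, e.g.
>>>>>> run_lapw -p (or runsp_lapw -p for a spin-polarized case).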
>>>>>>
>>>>>> On 04/16/2013 11:48 AM, pluto at physics.ucdavis.edu wrote:
>>>>>>> Hello Prof. Blaha, Prof. Marks,
>>>>>>>
>>>>>>> ssh localhost works now without login!!
>>>>>>>
>>>>>>> I have more errors now when trying to run parallel mode, see below.
>>>>>>>
>>>>>>> In the UG there are sections 5.5.1 (k-point parallelization) and
>>>>>>> 5.5.2 (MPI parallelization). I understand these two modes are
>>>>>>> separate, and I would like to focus on k-point parallelization for
>>>>>>> now. I am not sure why there is an error regarding mpirun. My
>>>>>>> parallel_options file is now:
>>>>>>>
>>>>>>> setenv USE_REMOTE 0
>>>>>>> setenv MPI_REMOTE "1"
>>>>>>> setenv WIEN_GRANULARITY 1
>>>>>>>
>>>>>>> But with other options I have the same error.
>>>>>>>
>>>>>>> I would appreciate a hint if there is something obvious I am doing
>>>>>>> wrong. In any case, I will continue to work on the issue with my IT
>>>>>>> department here.
>>>>>>>
>>>>>>> Regards,
>>>>>>> Lukasz
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>       LAPW0 END
>>>>>>> bash: mpirun: command not found
>>>>>>>
>>>>>>> real    0m0.001s
>>>>>>> user    0m0.000s
>>>>>>> sys    0m0.000s
>>>>>>> Mo-bulk-so.scf1_1: No such file or directory.
>>>>>>>        ERROR IN OPENING UNIT:           9
>>>>>>>              FILENAME:
>>>>>>>       ./Mo-bulk-so.vector_1
>>>>>>>          STATUS: old          FORM:unformatted
>>>>>>> OPEN FAILED
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> -------- Original Message --------
>>>>>>> Subject:     Re: [Wien] Multicore on i7 iMac
>>>>>>> Date:     Mon, 15 Apr 2013 08:49:39 -0500
>>>>>>> From:     Laurence Marks <L-marks at northwestern.edu>
>>>>>>> Reply-To:     A Mailing list for WIEN2k users
>>>>>>> <wien at zeus.theochem.tuwien.ac.at>
>>>>>>> To:     A Mailing list for WIEN2k users
>>> <wien at zeus.theochem.tuwien.ac.at>
>>>>>>>
>>>>>>> You may also be able to turn off USE_REMOTE and MPI_REMOTE (set both
>>>>>>> to 0) and/or use something other than ssh to launch processes.
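>>>>>>>
>>>>>>> For a single machine that would correspond to a parallel_options of
>>>>>>> (a sketch, keeping the granularity line from above):
>>>>>>>
>>>>>>> setenv USE_REMOTE 0
>>>>>>> setenv MPI_REMOTE 0
>>>>>>> setenv WIEN_GRANULARITY 1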
>>>>>>>
>>>>>>> On Mon, Apr 15, 2013 at 8:33 AM, Peter Blaha
>>>>>>> <pblaha at theochem.tuwien.ac.at> wrote:
>>>>>>>> Try it again. I think it asks this disturbing question only once!
>>>>>>>>
>>>>>>>> otherwise:   you must be able to do:
>>>>>>>>
>>>>>>>> ssh localhost
>>>>>>>>
>>>>>>>> and login without any other response.
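>>>>>>>>
>>>>>>>> One common way to enable such a passwordless login (a sketch;
>>>>>>>> details depend on the machine's ssh configuration):
>>>>>>>>
>>>>>>>> ssh-keygen -t rsa                                # empty passphrase
>>>>>>>> cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
>>>>>>>> ssh localhost                                    # should log in without a prompt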
>>>>>>>>
>>>>>>>>> The authenticity of host 'localhost (::1)' can't be established.
>>>>>>>>> RSA key fingerprint is
>>>>>>>>> 50:c3:da:fa:0c:35:c5:aa:d1:b4:c1:52:a1:18:08:c2.
>>>>>>>>> Are you sure you want to continue connecting (yes/no)? yes
>>>>>>>>>
>>>>>>>>> ^C
>>>>>>>>>
>>>>>>>>> -------- Original Message --------
>>>>>>>>> Subject:      Re: [Wien] Multicore on i7 iMac
>>>>>>>>> Date:         Mon, 15 Apr 2013 08:16:11 +0200
>>>>>>>>> From:         Peter Blaha <pblaha at theochem.tuwien.ac.at>
>>>>>>>>> Reply-To:     A Mailing list for WIEN2k users
>>>>>>> <wien at zeus.theochem.tuwien.ac.at>
>>>>>>>>> To:   A Mailing list for WIEN2k users
>>>>>>>>> <wien at zeus.theochem.tuwien.ac.at>
>>>>>>>>>
>>>>>>>>> As you could see from your "top" command, only 1 core is used.
>>>>>>>>>
>>>>>>>>> The "simplest" thing is to set:
>>>>>>>>>
>>>>>>>>> export OMP_NUM_THREADS=2 (or 4)
>>>>>>>>> (a commented line is already in your .bashrc after "userconfig_lapw")
>>>>>>>>>
>>>>>>>>> This will use 2 (4) cores for the parts of WIEN2k that use the
>>>>>>>>> mkl library.
>>>>>>>>> ---------------------------
>>>>>>>>> Next is the k-parallel mode (see the UG for a description), where
>>>>>>>>> you can use all your cores.
>>>>>>>>>
>>>>>>>>> We also have an mpi-parallel mode, but I would not recommend it
>>>>>>>>> for a single Mac, unless you have a problem with just one k-point.
>>>>>>>>>
>>>>>>>>> Please also note the recent posting on the mailing list about a
>>>>>>>>> recommended compiler option for a Mac (-heap-arrays); otherwise
>>>>>>>>> you cannot run WIEN2k on larger systems.
>>>>>>>>>
>>>>>>>>> Peter Blaha
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> From: pluto at physics.ucdavis.edu
>>>>>>>>> Date: 04/14/2013 07:48 PM
>>>>>>>>> To: "A Mailing list for WIEN2k users"
>>>>>>>>> <wien at zeus.theochem.tuwien.ac.at>
>>>>>>>>>
>>>>>>>>> Hello WIEN2k experts,
>>>>>>>>>
>>>>>>>>> I have a very simple question.
>>>>>>>>>
>>>>>>>>> Do I need to edit the .machines file for the multicore operation of
>>>>>>>>> the
>>>>>>>>> Intel i7 Quad Core CPU?
>>>>>>>>>
>>>>>>>>> My IT department (in FZ Juelich, Germany) has helped me compile
>>>>>>>>> WIEN2k on an iMac with an i7 CPU. It works very nicely; calculating
>>>>>>>>> a 15-layer slab is no problem. However, I have a feeling that
>>>>>>>>> everything is done on a single core, which is a real waste of time.
>>>>>>>>> I attach a screenshot of the "top" program in the terminal, with
>>>>>>>>> lapw1c computing 100 k-points for a band structure.
>>>>>>>>>
>>>>>>>>> Regards,
>>>>>>>>> Lukasz
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Professor Laurence Marks
>>>>>>> Department of Materials Science and Engineering
>>>>>>> Northwestern University
>>>>>>> www.numis.northwestern.edu    1-847-491-3996
>>>>>>> "Research is to see what everybody else has seen, and to think what
>>>>>>> nobody else has thought" - Albert Szent-Gyorgi
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>>
>

-- 
-----------------------------------------
Peter Blaha
Inst. Materials Chemistry, TU Vienna
Getreidemarkt 9, A-1060 Vienna, Austria
Tel: +43-1-5880115671
Fax: +43-1-5880115698
email: pblaha at theochem.tuwien.ac.at
-----------------------------------------

