[Wien] MPI

Luis Ogando lcodacal at gmail.com
Fri Feb 22 14:46:29 CET 2013


" N.B., make sure to use the right blacs version when linking, this changes
with the different flavors of mpi. I often forget to do this. "

   Me too !!  :)
   Thank you again ! I will keep your valuable advice in mind !
   All the best,
                   Luis Ogando





2013/2/22 Laurence Marks <L-marks at northwestern.edu>

> Please be aware that the "-x LD_LIBRARY_PATH -x PATH" may be critical.
>
> To explain (for others if needed), openmpi does not by default propagate
> environment variables, unless they have changed this recently. The
> authors of the code argue that this is for security reasons, and I believe
> openmpi has been designed to work with PBS and similar accounting systems
> common on large clusters.
>
> The "-x" propogates environmental variables, here the search path and
> information on shared libraries. On some systems you might need more.
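>
> For illustration, in WIEN2k these flags end up on the WIEN_MPIRUN line in
> $WIENROOT/parallel_options. A typical Open MPI setting (a sketch only; the
> exact line depends on your siteconfig answers) would be
>
> setenv WIEN_MPIRUN "mpirun -x LD_LIBRARY_PATH -x PATH -np _NP_ -machinefile _HOSTS_ _EXEC_"
>
> where _NP_, _HOSTS_ and _EXEC_ are placeholders that the parallel scripts
> substitute at run time.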
>
> Unfortunately there is no robust way to propagate the resource limits set
> by ulimit with openmpi. This has been patched by including a software
> setting for the relevant parameters, which seems to work well except
> sometimes for Macs. In principle it could crash on some systems if the
> sysadmin has set some restrictions.
>
> N.B., make sure to use the right BLACS version when linking; this changes
> with the different flavors of MPI. I often forget to do this.
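>
> For example, with Intel MKL the ScaLAPACK/BLACS part of the link line
> typically needs -lmkl_blacs_openmpi_lp64 for Open MPI but
> -lmkl_blacs_intelmpi_lp64 for Intel MPI and MPICH2-derived MPIs (a sketch;
> the exact library names depend on your MKL version, so check the MKL link
> line advisor and adjust the RP_LIBS setting in siteconfig accordingly).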
>
> ---------------------------
>
> Professor Laurence Marks
> Department of Materials Science and Engineering
> Northwestern University
> www.numis.northwestern.edu 1-847-491-3996
>
> "Research is to see what everybody else has seen, and to think what nobody
> else has thought"
> Albert Szent-Gyorgi
>
> On Feb 22, 2013 7:01 AM, "Luis Ogando" <lcodacal at gmail.com> wrote:
>
>>  Dear Prof. Blaha, Prof. Marks and Wien2k community,
>>
>>     I noticed that the "siteconfig_lapw" defines MPI_REMOTE as
>>
>>  setenv MPI_REMOTE "1"
>>
>>  even when one answers 0 to the corresponding question. I had previously
>> changed it to 0, but I believe that I recompiled something after that and
>> the value "1" was set again.
>>    I am doing another test to check if this is the origin of my problems
>> with openmpi.
>>    Unfortunately, I am not proficient in shell scripts, so I cannot
>> pinpoint the exact problem with the MPI_REMOTE setting (even if there is a
>> real one).
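>>    (A quick way to see what is actually in effect, assuming a standard
>> installation, is
>>
>> grep -E "USE_REMOTE|MPI_REMOTE" $WIENROOT/parallel_options
>>
>> which should print the two setenv lines with their current values.)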
>>    Thank you one more time for the help,
>>                                     Luis Ogando
>>
>>
>>
>>
>>
>> 2013/2/20 Peter Blaha <pblaha at theochem.tuwien.ac.at>
>>
>>> On an SMP machine make sure you have in $WIENROOT/parallel_options
>>>
>>> setenv USE_REMOTE 0
>>> setenv MPI_REMOTE 0
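>>>
>>> For reference, on a shared-memory machine the whole file might look
>>> roughly like this (a sketch; the WIEN_MPIRUN line depends on the MPI
>>> flavor chosen in siteconfig):
>>>
>>> setenv USE_REMOTE 0
>>> setenv MPI_REMOTE 0
>>> setenv WIEN_GRANULARITY 1
>>> setenv WIEN_MPIRUN "mpirun -np _NP_ -machinefile _HOSTS_ _EXEC_"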
>>>
>>> On 20.02.2013 at 17:45, Luis Ogando wrote:
>>>
>>>> Dear Prof. Marks,
>>>>
>>>>     Thank you very much for your prompt answer.
>>>>     I am using openmpi, but I believe that I am facing some of the
>>>> tricky issues you mentioned. I work on an SMP machine and the calculation
>>>> starts fine. After some tens of iterations, MPI suddenly asks for a
>>>> password and everything goes down the drain.
>>>>     I am using Open MPI 1.6. Do you recommend an older version ?
>>>>     All the best,
>>>>                    Luis Ogando
>>>>
>>>>
>>>>
>>>>
>>>> 2013/2/20 Laurence Marks <L-marks at northwestern.edu>
>>>>
>>>>
>>>>     One that works.
>>>>
>>>>     Some versions of openmpi have problems, although that is probably the
>>>>     best option for the future. There are some tricky issues with openmpi
>>>>     related to how your flavor of ssh works; there is no standard, and
>>>>     some do not propagate kill commands, which means that they can leave
>>>>     orphans.
>>>>
>>>>     An alternative is mvapich. In a benchmark that I did a few months
>>>>     ago, the Intel MPI was much better for AVX instructions, but that may
>>>>     have changed.
>>>>
>>>>     Openmpi is easy to compile; mvapich can be a little trickier.
>>>>
>>>>     N.B., if you have fast connections, e.g. infiniband, they are more
>>>>     than fast enough and I have never seen this as rate limiting with
>>>>     Wien2k. With ethernet it matters.
>>>>
>>>>     On Wed, Feb 20, 2013 at 10:29 AM, Luis Ogando <lcodacal at gmail.com
>>>>      <mailto:lcodacal at gmail.com>> wrote:
>>>>      > Dear Wien2k community,
>>>>      >
>>>>      >    Is there any recommended flavor and version of an MPI compiler
>>>>      > to use with " Intel(R) Fortran Intel(R) 64 Compiler XE for
>>>>      > applications running on Intel(R) 64, Version 12.0.3.174
>>>>      > Build 20110309 " ?
>>>>      >    All the best,
>>>>      >                       Luis Ogando
>>>>
>>>>
>>>>
>>>>     --
>>>>     Professor Laurence Marks
>>>>     Department of Materials Science and Engineering
>>>>     Northwestern University
>>>>     www.numis.northwestern.edu
>>>>     1-847-491-3996
>>>>
>>>>     "Research is to see what everybody else has seen, and to think what
>>>>     nobody else has thought"
>>>>     Albert Szent-Gyorgi
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>> --
>>> Peter Blaha
>>> Inst.Materials Chemistry
>>> TU Vienna
>>> Getreidemarkt 9
>>> A-1060 Vienna
>>> Austria
>>> +43-1-5880115671
>>>
>>
>>
> _______________________________________________
> Wien mailing list
> Wien at zeus.theochem.tuwien.ac.at
> http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien
>
>