[Wien] scratch folder in k-point parallel calculation

Yundi Quan quanyundi at gmail.com
Fri Jun 28 05:09:31 CEST 2013


Hi, Oleg,
Thanks for your advice.


Yundi


On Thu, Jun 27, 2013 at 8:08 PM, Yundi Quan <yquan at ucdavis.edu> wrote:

>
>
> ---------- Forwarded message ----------
> From: Oleg Rubel <orubel at lakeheadu.ca>
> Date: Thu, Jun 27, 2013 at 7:46 PM
> Subject: Re: [Wien] scratch folder in k-point parallel calculation
> To: wien at zeus.theochem.tuwien.ac.at
>
>
> If you are talking about a mixed (MPI & k-parallel) job, there are
> suggestions in the FAQs: http://www.wien2k.at/reg_user/faq/pbs.html
>
> I use SGE scheduler and the following parallel_options file:
>
> setenv USE_REMOTE 0        # no ssh for the k-parallel part
> setenv MPI_REMOTE 0        # start mpirun from the local node
> setenv WIEN_GRANULARITY 1
> setenv WIEN_MPIRUN "mpiexec -machinefile _HOSTS_ -n _NP_ _EXEC_"
>
> With this setup it works with a node-local SCRATCH directory. I can share
> a submission script for the mixed MPI + k-parallel job, if needed.
>
> Oleg
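
[Editor's note: a minimal sketch of such an SGE submission script, assuming
the parallel_options settings quoted above. The parallel environment name
("mpi"), the slot count, the 8 cores per node, and the /scratch path are
all hypothetical placeholders; adapt them to your cluster.]

```csh
#!/bin/csh -f
#$ -cwd
#$ -pe mpi 16

# Keep vector files on the node-local scratch disk.
setenv SCRATCH /scratch/$USER
mkdir -p $SCRATCH

# Build .machines from the hosts SGE assigned to this job:
# one k-point group per node, each group running MPI over that
# node's local cores ($PE_HOSTFILE has one "host slots ..." line per host).
echo 'granularity:1' > .machines
foreach host (`awk '{print $1}' $PE_HOSTFILE`)
    echo "1:${host}:8" >> .machines
end

run_lapw -p
```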
>
>
>
> On 13-06-27 4:46 PM, Yundi Quan wrote:
>
>> I forgot to add that after lapw2, all the output files should be
>> written to a shared folder (not to the scratch folder of each node) so
>> that sumpara -p can access all of them. In the version that I'm using
>> (11.0), it seems that lapw1 is not executed locally.
>>
>>
>> Yundi
>>
>>
>> On Thu, Jun 27, 2013 at 1:37 PM, Yundi Quan <yquan at ucdavis.edu
>> <mailto:yquan at ucdavis.edu>> wrote:
>>
>>
>>
>>     ---------- Forwarded message ----------
>>     From: Stefaan Cottenier <Stefaan.Cottenier at ugent.be>
>>     Date: Thu, Jun 27, 2013 at 1:33 PM
>>     Subject: Re: [Wien] scratch folder in k-point parallel calculation
>>     To: A Mailing list for WIEN2k users <wien at zeus.theochem.tuwien.ac.at>
>>
>>
>>
>>         I'm working on a cluster with many nodes. Each node has its own
>>         /scratch folder which cannot be accessed by the other nodes. My
>>         own data folder is accessible to all nodes. When the WIEN2k
>>         scratch folder is set to './', everything works fine except that
>>         all the data reads/writes go to my own data folder, which slows
>>         down the system. When the WIEN2k scratch folder is set to
>>         '/scratch', then scratch files such as case.vector_ created on
>>         one node are not visible to the others. This would not be a
>>         problem for lapw1 -p and lapw2 -p if each node stuck to certain
>>         k-points and looked only for the scratch files related to those
>>         k-points. Is there a way of configuring WIEN2k to work in this
>>         manner?
>>
>>
>>     The k-point parallelization scheme works exactly in this way (mpi
>>     parallelization is different).
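
[Editor's note: to illustrate, a minimal .machines file for pure k-point
parallelization might look like the sketch below; the host names node01..node03
are hypothetical. Each "1:host" line defines one k-point group on one node,
and that node's lapw1/lapw2 read and write their vector files in its own
local scratch directory.]

```
# .machines -- k-point parallelization over three nodes
1:node01
1:node02
1:node03
granularity:1
extrafine:1
```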
>>
>>     Stefaan
>>
>>
>>     _______________________________________________
>>     Wien mailing list
>>     Wien at zeus.theochem.tuwien.ac.at
>>     http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien
>>     SEARCH the MAILING-LIST at:
>>     http://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/index.html
>>
>>
>>
>>
>>
>>
>
>