I forgot to add that after lapw2, all the output files should be written to a shared folder (not to the scratch folder of each node) so that sumpara -p can access all of them. In the version I'm using (11.0), it seems that lapw1 is not executed locally.
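In case it is useful, here is a minimal sketch of the setup I have in mind; the paths (/scratch/$USER, $HOME/work/case) are placeholders, and I am assuming a bash job script, so adapt them to your cluster:

    export SCRATCH=/scratch/$USER    # per-node local disk; parallel lapw1 writes case.vector_* here
    cd $HOME/work/case               # the case directory itself stays on the shared filesystem,
                                     # so sumpara -p can reach every node's lapw2 output
    run_lapw -p                      # k-point parallel SCF cycle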
Yundi

On Thu, Jun 27, 2013 at 1:37 PM, Yundi Quan <yquan@ucdavis.edu> wrote:
---------- Forwarded message ----------
From: Stefaan Cottenier <Stefaan.Cottenier@ugent.be>
Date: Thu, Jun 27, 2013 at 1:33 PM
Subject: Re: [Wien] scratch folder in k-point parallel calculation
To: A Mailing list for WIEN2k users <wien@zeus.theochem.tuwien.ac.at>

> I'm working on a cluster with many nodes. Each node has its own /scratch
> folder, which cannot be accessed by the other nodes. My own data folder is
> accessible to all nodes. When the WIEN2k scratch folder is set to './',
> everything works fine, except that all data reads and writes are done in my
> own data folder, which slows down the system. When the WIEN2k scratch
> folder is set to '/scratch', scratch files such as case.vector_
> created on one node are not visible to the other nodes. This would not be
> a problem for lapw1 -p and lapw2 -p if each node sticks to certain
> k-points and only looks for the scratch files related to those k-points.
> Is there a way of configuring WIEN2k to work in this manner?

The k-point parallelization scheme works exactly in this way (MPI parallelization is different).
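
For concreteness, a minimal sketch of a .machines file for such a k-point parallel run; the hostnames node1 to node3 are placeholders, and each '1:host' line assigns a share of the k-points to that machine, which then only touches its own vector files in its local $SCRATCH:

    # create a k-point-parallel-only .machines file (no MPI lines)
    cat > .machines << 'EOF'
    1:node1
    1:node2
    1:node3
    granularity:1
    EOF

With this, lapw1 -p and lapw2 -p split the k-point list over the three machines according to these lines.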

Stefaan

_______________________________________________
Wien mailing list
Wien@zeus.theochem.tuwien.ac.at
http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien
SEARCH the MAILING-LIST at: http://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/index.html