<div dir="ltr">It is getting complicated when you do both MPI + k-point parallelization. In large calculations there is usually less k-points. Will it be possible to test MPI with the local scratch without k-point parallelization (i.e., k-point run sequentially)? This will help to mediate problems mentioned by Michael.<div>

Oleg
<br><br><div class="gmail_quote">On Thu, Feb 13, 2014 at 11:15 AM, Michael Sluydts <span dir="ltr"><<a href="mailto:Michael.Sluydts@ugent.be" target="_blank">Michael.Sluydts@ugent.be</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">
<div text="#000000" bgcolor="#FFFFFF">
Hello César,

To perform parallel calculations you do need a directory that is shared between all nodes. From what you describe, /home appears to be such shared storage.

What it is intended for is, of course, not known to us. If it is shared, there is no direct reason it cannot work for WIEN2k; however, the problems that might occur are:
- the connection to the shared storage is too slow, if it is not meant for transferring files back and forth during a calculation
- there is not enough space, if it is only meant for temporary storage and user settings
- /home is only meant for the login environment and not for heavy user I/O
A quick way to check the first two points is sketched below.
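
A rough way to check this from one of the compute nodes; the paths and the 1 GB test size are only illustrative:

df -hT /home                      # filesystem type (nfs, lustre, ...) and free space
mount | grep ' /home '            # how /home is mounted

# crude write-speed test with a 1 GB file
dd if=/dev/zero of=/home/$USER/wien_ddtest bs=1M count=1024 conv=fdatasync
rm /home/$USER/wien_ddtest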

Maybe contact the person who set up this cluster and ask them what they recommend.

Regards,

Michael Sluydts

César de la Fuente wrote on 13/02/2014 17:09:

Hi,
I am doing some tests on the Memento cluster of the University of Zaragoza, on the TiC system with 100 k-points, using 4 nodes with 64 CPUs per node. It is a system that does not share RAM or hard disks between nodes during calculations. Initially the parallel computation with Wien2k stopped in the first cycle because of a file system problem: the variable $SCRATCH pointed to the local hard disks of each node used in the parallel computation. Fortunately I was able to finish the calculation by redirecting $SCRATCH to the /home directory, which is shared by all nodes. The calculation finished fine and the result is correct, but I think that something is wrong: Wien2k does not seem to have been designed for a parallel calculation that uses /home as SCRATCH.
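
The workaround was essentially just repointing the variable in the job script, along these lines (the exact directory names are only an illustration):

# before: node-local disk (failed on this cluster)
#   export SCRATCH=/scratch/$USER
# workaround: a directory on the shared /home
export SCRATCH=/home/$USER/wien_scratch
mkdir -p $SCRATCH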

Is it correct to use the /home directory as SCRATCH in Wien2k? Can this cause problems in the OS of the Memento cluster, or in future Wien2k calculations? In fact, I am having other problems with Wien2k on other systems, but I am not sure whether they are caused by SCRATCH pointing to the /home directory or not.
Thank you for your attention; any comments are appreciated.
Sincerely,
Dr. César de la Fuente.
Depto. de Física de la Materia Condensada.
Edificio Torres-Quevedo
EINA-Universidad de Zaragoza.
C/María de Luna 3, 50018-Zaragoza (SPAIN).

_______________________________________________
Wien mailing list
Wien@zeus.theochem.tuwien.ac.at
http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien
SEARCH the MAILING-LIST at: http://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/index.html