[Wien] machines file

Khuong P. Ong ongpk at ihpc.a-star.edu.sg
Fri Apr 8 12:01:15 CEST 2005


  Dear Prof. Peter Blaha and Kevin Jorissen,

  First of all, I would like to thank you for your help.

Regarding what Prof. Peter Blaha mentioned:

 > Do you have an MPI and ScaLAPACK installed?

Yes, we do.

 > If yes, check again your compile.msg in e.g. SRC_lapw0

We checked, and no errors were reported in the compile.msg file.

For Kevin Jorissen:

 >0/ Please provide us with the following information:
 >* your MPI software

We use MPICH 1.2.5.

 >* your fortran compiler

Compilers: pathf90 (Fortran) and pathcc (C)

 > and compilation settings

-O -freeform
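
(Spelled out, the corresponding entries that siteconfig wrote for us should
look roughly like the sketch below; the variable names are from memory, so
the exact form may differ on other installations:

    current:FC:pathf90          # Fortran compiler (sketch, not verbatim)
    current:CC:pathcc           # C compiler
    current:FOPT:-O -freeform   # the options given above
)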

 >* the libraries you are using

ATLAS: libatlas, libblas

libscalapack (from netlib.org)


 >2/ First you should check that the MPI commands are passed on correctly
 >by looking at the parallel scripts lapw1para etc. (The reason I say this:
 >using your second machines file, the program should not be looking for
 >executables lapw1, but for executables lapw1_mpi!! Adding the -x switch
 >on the first line of lapw1para gives you extra information.)
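
(For concreteness: assuming lapw1para is a csh script, as it is in our
installation, the suggested change would be

    # original first line of lapw1para:
    #!/bin/csh -f
    # modified so that csh echoes every command after substitution:
    #!/bin/csh -fx

With -x, the script prints each command as it runs, so one can see exactly
which executable - lapw1 or lapw1_mpi - it tries to start.)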


We are trying to fix this. Could you explain this in more detail? Why does 
lapw0_mpi run but not lapw1_mpi? Thanks.
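
(While we try this, a basic check independent of WIEN2k is whether mpich can
start any executable across the nodes at all; the host file name below is
made up for the example:

    # hosts.test contains one node name per line, e.g. opto024, opto025
    mpirun -np 4 -machinefile hosts.test /bin/hostname

If this prints the node names, the MPI layer itself works, and the problem
must be in how the parallel scripts build the lapw1_mpi command line.)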

 >granularity:1
 >18:opto024
 >18:opto025
 >36:opto030

This one cannot run: lapw1 crashes in the first cycle.
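
(If we read the user's guide correctly, one way to request lapw1_mpi
explicitly is to give the number of MPI processes after a second colon,
e.g.

    granularity:1
    1:opto024:2
    1:opto025:2
    1:opto030:4

This syntax is from memory; please correct us if it is wrong.)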

  We did take note of what Prof. Blaha mentioned about the efficiency of MPI 
on dual nodes. The issue is not whether it speeds up the calculation or saves 
memory: when we run one WIEN2k job, it occupies almost all (> 99%) of the CPU 
on one node. If someone else runs WIEN2k at the same time, the whole system 
slows down and no one else can use the host :( because, in this case, two or 
three nodes are occupied at more than 99% CPU. This is why we would like to 
run WIEN2k in parallel mode.

We have noticed that we can run WIEN2k in parallel mode on localhost, i.e.,

 >granularity:1
 >18:localhost
 >18:localhost
 >36:localhost

Everything is perfect with this .machines file (very fast, saves a lot of 
time), but then the system is tied up by a single user and no one else can 
use it :).

We are still looking for help on this. Many thanks in advance.

Have a nice weekend.

Regards,
Khuong