<div dir="ltr">Dear W2K, <div><br></div><div style>On an AIX 560 server with 16 processors, I have been running the SCF cycle for a NiO supercell (2x2x2) in serial as well as in MPI parallel mode (one k-point). The serial version runs fine. When running in parallel, the following error appears:</div>
<div style><br></div><div style><div>STOP LAPW2 - FERMI; weighs written</div><div>"errclr.f", line 64: 1525-014 The I/O operation on unit 99 cannot be completed because an errno value of 2 (A file or directory in the path name does not exist.) was received while opening the file. The program will stop.</div>
<div><br></div><div style>A similar error, which does not stop the program, is the following:</div><div style><br></div><div style><div><div>STOP LAPW0 END</div><div>"inilpw.f", line 233: 1525-142 The CLOSE statement on unit 200 cannot be completed because an errno value of 2 (A file or directory in the path name does not exist.) was received while closing the file. The program will stop.</div>
<div>STOP LAPW1 END</div></div><div><br></div><div> </div><div style>The second error (unit 200) always appears, while the first (unit 99) appears only with more than two processors (4, 8, or 16). Running the SCF cycle in serial took ~6.5 minutes; in parallel with two processors, ~9.5 minutes. The problem occurs whether MPI_REMOTE/USE_REMOTE is set to 0 or 1.</div>
<div style><br></div><div style><br></div></div><div style>My compile options:</div><div style><br></div><div style><div>FC = xlf90</div><div>MPF = mpxlf90</div><div>CC = xlc -q64</div><div>FOPT = -O5 -qarch=pwr6 -q64 -qextname=flush:w2k_catch_signal</div>
<div>FPOPT = -O5 -qarch=pwr6 -q64 -qfree=f90 -qextname=flush:w2k_catch_signal:fftw_mpi_execute_dft</div><div>#DParallel = '-WF,-DParallel'</div><div>FGEN = $(PARALLEL)</div><div>LDFLAGS = -L /lapack-3.4.2/ -L /usr/lpp/ppe.poe/lib/ -L /usr/local/lib -I /usr/include -q64 -bnoquiet</div>
<div>R_LIBS = -llapack -lessl -lfftw3 -lm -lfftw3_essl_64 </div><div>RP_LIBS = $(R_LIBS) -lpessl -lmpi -lfftw3_mpi </div><div><br></div><div style>WIEN_MPI_RUN='poe _EXEC_ -procs _NP_'</div><div style><br></div>
<div style>.machines and host.list attached.</div><div style><br></div><div style>As always, any advice on this matter would be great, </div><div style><br></div><div style>Oliver Albertini</div></div></div></div>