[Wien] System configuration

Indranil mal indranil.mal at gmail.com
Tue May 28 19:46:15 CEST 2019


Thank you for the kind response.
After following all the instructions you gave, I installed WIEN2k with the
Intel parallel compilers. After compiling I got:

Compile time errors (if any) were:
SRC_lapw0/compile.msg:make[1]: *** [lapw0_mpi] Error 1
SRC_lapw0/compile.msg:make: *** [para] Error 2
SRC_nlvdw/compile.msg:make[1]: *** [nlvdw_mpi] Error 1
SRC_nlvdw/compile.msg:make: *** [para] Error 2
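
The actual compiler messages behind these make errors are recorded in the
compile.msg files; a minimal way to pull out the relevant lines (assuming the
standard WIEN2k layout under $WIENROOT) is:

cd $WIENROOT
grep -B2 -A4 -i "error" SRC_lapw0/compile.msg SRC_nlvdw/compile.msg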

On Tue, May 28, 2019 at 11:14 PM Indranil mal <indranil.mal at gmail.com>
wrote:

> Thank you for kind response
> After following all the instructions given by you I have installed WIEN2k
> with Intel parallel compiler. After compiling I got
>
> Compile time errors (if any) were:
> SRC_lapw0/compile.msg:make[1]: *** [lapw0_mpi] Error 1
> SRC_lapw0/compile.msg:make: *** [para] Error 2
> SRC_nlvdw/compile.msg:make[1]: *** [nlvdw_mpi] Error 1
> SRC_nlvdw/compile.msg:make: *** [para] Error 2
>
>
>
> On Tue, May 28, 2019 at 11:48 AM Gavin Abo <gsabo at crimson.ua.edu> wrote:
>
>> Continuing from post [1], I did a parallel mpi compile of WIEN2k 18.2
>> with fftw 3.3.8 (without ELPA), where -gcc-sys had to be added to CFLAGS
>> [2], and siteconfig completed with no compile errors, as shown below.
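>>
>> (This assumes the Intel compiler, MKL, and Intel MPI environments are
>> already set up in the shell. A typical way to do that with Parallel Studio
>> 2019, where the exact path is an assumption about the install location, is
>>
>> source /opt/intel/compilers_and_libraries_2019.4.243/linux/bin/compilervars.sh intel64
>>
>> which should put icc, ifort, and, with the cluster edition, mpiicc and
>> mpiifort on the PATH.)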
>>
>> username at computername:~$ cd ~
>> username at computername:~$ wget http://www.fftw.org/fftw-3.3.8.tar.gz
>> ...
>> username at computername:~$ tar xvf fftw-3.3.8.tar.gz
>> ...
>> username at computername:~$ mv fftw-3.3.8 fftw3
>> username at computername:~$ cd fftw3
>> username at computername:~/fftw3$ ./configure FCC=ifort CC=icc MPICC=mpiicc
>> CFLAGS="-gcc-sys" --enable-mpi --prefix=$HOME/fftw3
>> ...
>> username at computername:~/fftw3$ make
>> ...
>> username at computername:~/fftw3$ make install
>> ...
>> username at computername:~/fftw3$ ls -l ~/fftw3/include ~/fftw3/lib
>> /home/username/fftw3/include:
>> total 220
>> -rw-r--r-- 1 username username  2447 May 27 22:57 fftw3.f
>> -rw-r--r-- 1 username username 54596 May 27 22:57 fftw3.f03
>> -rw-r--r-- 1 username username 31394 May 27 22:57 fftw3.h
>> -rw-r--r-- 1 username username 26983 May 27 22:57 fftw3l.f03
>> -rw-r--r-- 1 username username 18678 May 27 22:57 fftw3l-mpi.f03
>> -rw-r--r-- 1 username username 36969 May 27 22:57 fftw3-mpi.f03
>> -rw-r--r-- 1 username username  9624 May 27 22:57 fftw3-mpi.h
>> -rw-r--r-- 1 username username 25682 May 27 22:57 fftw3q.f03
>>
>> /home/username/fftw3/lib:
>> total 2108
>> drwxr-xr-x 3 username username    4096 May 27 22:57 cmake
>> -rw-r--r-- 1 username username 1933432 May 27 22:57 libfftw3.a
>> -rwxr-xr-x 1 username username     893 May 27 22:57 libfftw3.la
>> -rw-r--r-- 1 username username  201232 May 27 22:57 libfftw3_mpi.a
>> -rwxr-xr-x 1 username username     939 May 27 22:57 libfftw3_mpi.la
>> drwxr-xr-x 2 username username    4096 May 27 22:57 pkgconfig
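>>
>> (A quick sanity check, just as a sketch, that the MPI interface made it
>> into the static library is to look for the fftw_mpi_init symbol:
>>
>> username at computername:~/fftw3$ nm lib/libfftw3_mpi.a | grep fftw_mpi_init
>> ...
>> )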
>> username at computername:~/fftw3$ cd ~/WIEN2k
>> username at computername:~/WIEN2k$ ./siteconfig
>> ...
>>    Selection: P
>> ...
>>    Shared Memory Architecture? (y/N):N
>>    Do you know/need a command to bind your jobs to specific nodes?
>>    (like taskset -c). Enter N / your_specific_command: N
>> ...
>>    Set MPI_REMOTE to  0 / 1: 1
>> ...
>>       Remote shell (default is ssh) = ssh
>> ...
>>       Remote copy (default is scp) = scp
>> ...
>> Do you have MPI, ScaLAPACK, ELPA, or FFTW installed and intend to run
>>     finegrained parallel?
>>
>>     This is useful only for BIG cases (50 atoms and more / unit cell)
>>     and your HARDWARE has at least 16 cores (or is a cluster with
>> Infiniband)
>>     You need to KNOW details about your installed MPI, ELPA, and FFTW )
>>
>>     (y/N) y
>> ...
>>     Your compiler: mpiifort
>> ...
>>   Do you want to use a present ScaLAPACK installation? (Y,n): Y
>> ...
>>   Do you want to use the MKL version of ScaLAPACK? (Y,n):Y
>> ...
>> Do you use Intel MPI? (Y,n):Y
>> ...
>>    Your SCALAPACK_LIBS are:    -lmkl_scalapack_lp64
>> -lmkl_blacs_intelmpi_lp64
>>
>>    These options derive from your chosen settings:
>>
>>    SCALAPACKROOT:
>> /opt/intel/compilers_and_libraries_2019.4.243/linux/mkl/lib/
>>    SCALAPACK_LIBNAME:   mkl_scalapack_lp64
>>    BLACSROOT: /opt/intel/compilers_and_libraries_2019.4.243/linux/mkl/lib/
>>    BLACS_LIBNAME:       mkl_blacs_intelmpi_lp64
>>    MKL_TARGET_ARCH:     intel64
>> Is this correct? (Y,n): Y
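>>
>> (If in doubt, the two MKL libraries named above can be checked on disk
>> under SCALAPACKROOT plus the MKL_TARGET_ARCH subdirectory, e.g.:
>>
>> ls /opt/intel/compilers_and_libraries_2019.4.243/linux/mkl/lib/intel64 | grep -E 'scalapack_lp64|blacs_intelmpi_lp64'
>> )
>>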
>>   Do you want to use a present FFTW installation? (Y,n): Y
>> To abort the FFTW setup enter 'x' at any point!
>>   Do you want to automatically search for FFTW installations? (Y,n):
>> Y
>>   Please specify a comma separated list of directories to search! (If no
>> list is entered, /usr/local and /opt will be searched as default):
>> /home/username/fftw3
>>   Finding the required fftw2/3 mpi-files in /home/username/fftw3 ....
>>
>> /home/username/fftw3/lib/libfftw3_mpi.a
>> /home/username/fftw3/mpi/.libs/libfftw3_mpi.a
>> could not find fftw ....
>> Your present FFTW choice is:
>> Please specify whether you want to use FFTW3 (default) or FFTW2 (FFTW3 /
>> FFTW2): FFTW3
>>
>> Present FFTW root directory is:
>> Do you want to use a FFTW version from the list above? (Y,n):
>> Y
>> Please enter the line number of the chosen version!
>> 1
>>
>> The present target architecture of your FFTW library is: lib
>> Please specify the target achitecture of your FFTW library (e.g. lib64)
>> or accept present choice (enter):
>>
>> The present name of your FFTW library: fftw3
>> Please specify the name of your FFTW library or accept present choice
>> (enter):
>>
>>
>>    Your FFTW_OPT are:   -DFFTW3 -I/home/username/fftw3/include
>>    Your FFTW_LIBS are:  -L/home/username/fftw3/lib -lfftw3
>>    Your FFTW_PLIBS are: -lfftw3_mpi
>>
>>    These options derive from your chosen Settings:
>>
>>    FFTWROOT:            /home/username/fftw3/
>>    FFTW_VERSION:        FFTW3
>>    FFTW_LIB:            lib
>>    FFTW_LIBNAME:        fftw3
>> Is this correct? (Y,n): Y
>>
>>   Do you want to use ELPA? (y,N):
>> N
>>
>>     Please specify your parallel compiler options or accept the
>> recommendations (Enter - default)!:
>>
>>     Please specify your MPIRUN command or accept the recommendations
>> (Enter - default)!:
>>
>> ...
>>
>>     Current settings:
>>
>>           Parallel compiler      : mpiifort
>>           SCALAPACK_LIBS         : -lmkl_scalapack_lp64
>> -lmkl_blacs_intelmpi_lp64
>>           FFTW_OPT               : -DFFTW3 -I/home/username/fftw3/include
>>           FFTW_LIBS              : -L/home/username/fftw3/lib -lfftw3
>>           FFTW_PLIBS             : -lfftw3_mpi
>>           ELPA_OPT               :
>>           ELPA_LIBS              :
>>           FPOPT(par.comp.options): -O1 -FR -mp1 -w -prec_div -pc80 -pad
>> -ip -DINTEL_VML -traceback -assume buffered_io -I$(MKLROOT)/include
>>           MPIRUN command         : mpirun -np _NP_ -machinefile _HOSTS_
>> _EXEC_
>>
>>
>>       S   Accept, Save, and Quit
>>       R   Restart Configuration
>>       Q   Quit and abandon changes
>>
>>     Please accept and save these settings, restart the configuration, or
>> abandon
>>     your changes.
>>     If you want to change anything later on you can redo this whole
>> configuration
>>     process or you can change single items in "Compiling Options".
>> Selection: S
>> ...
>>    Selection: R
>> ...
>>    Selection: A
>> ...
>> Compile time errors (if any) were:
>>
>> ...
>>    Selection: Q
>> ...
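>> (For later reference, in a standard 18.2 install the settings saved by
>> siteconfig land in WIEN2k_OPTIONS and parallel_options in the WIEN2k
>> directory and can be inspected there, e.g.:
>>
>> username at computername:~/WIEN2k$ cat WIEN2k_OPTIONS parallel_options
>> ...
>> )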
>> username at computername:~/WIEN2k$ ls -l *_mpi
>> -rwxr-xr-x 1 username username  1249480 May 27 23:51 dstart_mpi
>> -rwxr-xr-x 1 username username  3639968 May 27 23:51 hfc_mpi
>> -rwxr-xr-x 1 username username  3631648 May 27 23:51 hf_mpi
>> -rwxr-xr-x 1 username username 14045712 May 27 23:52 lapw0_mpi
>> -rwxr-xr-x 1 username username  1825512 May 27 23:52 lapw1c_mpi
>> -rwxr-xr-x 1 username username  1816920 May 27 23:52 lapw1_mpi
>> -rwxr-xr-x 1 username username  3081184 May 27 23:52 lapw2c_mpi
>> -rwxr-xr-x 1 username username  3081104 May 27 23:52 lapw2_mpi
>> -rw-r--r-- 1 username username  1533560 May 27 23:52 lapwso_mpi
>> -rwxr-xr-x 1 username username  2613072 May 27 23:52 nlvdw_mpi
>> -rwxr-xr-x 1 username username  2279792 May 27 23:52 nmrc_mpi
>> -rwxr-xr-x 1 username username  2279896 May 27 23:52 nmr_mpi
>>
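>> A first mpi test of these binaries needs a .machines file in the case
>> directory. A minimal sketch for a single 4-core node (hostname and core
>> count are assumptions to adapt):
>>
>> granularity:1
>> 1:localhost:4
>> lapw0: localhost:4
>>
>> With that in place, run_lapw -p uses the MPIRUN template shown above, with
>> _NP_, _HOSTS_, and _EXEC_ replaced by the process count, a generated
>> machine file, and the mpi executable with its input.
>>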
>> [1]
>> https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg18663.html
>> [2] https://software.intel.com/en-us/forums/intel-c-compiler/topic/804830
>>
>