
Re: MPIRUN errors

Posted: Thu Jul 07, 2016 3:53 am
by eliephys78
Dear Samuel,

It was indeed the pseudopotential that was causing the problem. I tried a PZ (LDA) pseudopotential and there are no mpirun errors anymore. However, the code stopped with an error:

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine wannierize (1):
inconsistent nscf and elph k-grids
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

stopping ...

I do not understand why. I have nk1=12, nk2=12, nk3=12 and nq1=6, nq2=6, nq3=6 (which give the 28 q points).

Re: MPIRUN errors

Posted: Thu Jul 07, 2016 6:02 am
by carla.verdi
Hi,

From your input I see you have nk1=12, nk2=12, nk3=12. This requires you to run the nscf calculation beforehand with a 12x12x12 uniform mesh in the input. If this is not the case you get an error.
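For instance, the relevant pieces of the two inputs could look roughly like this (the point list, the weights and the kmesh.pl script are only illustrative; adapt them to your own files). In the nscf input for pw.x, the full 12x12x12 uniform grid is listed explicitly, e.g. generated with the kmesh.pl utility distributed with Wannier90:

K_POINTS crystal
1728
0.00000000  0.00000000  0.00000000  5.787037e-04
0.00000000  0.00000000  0.08333333  5.787037e-04
(... all 1728 points ...)

and in epw.in the coarse grids must then match:

&inputepw
  nk1 = 12, nk2 = 12, nk3 = 12   ! same as the nscf mesh
  nq1 = 6,  nq2 = 6,  nq3 = 6    ! same as the ph.x q-grid
/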

Best
Carla

Re: MPIRUN errors

Posted: Thu Jul 07, 2016 2:54 pm
by The Pauli Principle
I don't think you can have a q-grid that is finer than your k-grid, since you would then get k+q points for which you do not have the electronic wavefunction. So you should always choose nk1 = i*nq1, where i is an integer.
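A quick way to see it with numbers: with nk1=12 and nq1=6, every q component is a multiple of 1/6 = 2/12, so k+q always falls back onto the 12x12x12 grid; with, say, nq1=5, q=1/5 added to k=0 gives 0.2, which is not a multiple of 1/12, so the wavefunction at that k+q was never computed in the nscf run.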

Re: MPIRUN errors

Posted: Sun Jul 10, 2016 7:49 pm
by eliephys78
Thanks Carla,

That mistake was an obvious one, sorry; I was just in a hurry. Everything is running smoothly again, but the code has been "frozen" at the following stage since yesterday:

Using uniform q-mesh: 20 20 20
Size of q point mesh for interpolation: 8000
Using uniform k-mesh: 20 20 20
Size of k point mesh for interpolation: 16000
Max number of k points per pool: 500

Fermi energy coarse grid = -1.184660 eV

===================================================================

Fermi energy is read from the input file: Ef = -1.184700 eV

===================================================================

ibndmin = 6 ebndmin = -0.183
ibndmax = 7 ebndmax = -0.089


Number of ep-matrix elements per pool : 9000 ~= 70.31 Kb (@ 8 bytes/ DP)

Is it because of the huge number of points??

The numbers of processors and pools are as follows:

Parallel version (MPI & OpenMP), running on 512 processor cores
Number of MPI processes: 32
Threads/MPI process: 16
K-points division: npool = 32
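For completeness, a launch line matching these numbers looks roughly like this (executable path, input and output names are placeholders):

export OMP_NUM_THREADS=16
mpiexec -np 32 epw.x -npool 32 -in epw.in > epw.out

i.e. 32 MPI processes with 16 OpenMP threads each (512 cores), and one k-point pool per MPI process.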

Re: MPIRUN errors

Posted: Mon Jul 11, 2016 3:55 pm
by sponce
Dear eliephys78,

It's probably not frozen; it is just computing without showing any progress.

This has been improved for the new version (due in September), where progress will be shown.

If the calculation runs with a smaller grid and you do not run into memory issues (you can usually check that by logging in on the node, if you have access), then it is just a matter of letting it compute.
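If you can ssh into the compute node (the node name below is just a placeholder), something like

ssh node0123
free -h          # total/used memory on the node
top -u $USER     # memory of each epw.x process

will tell you whether memory is the limiting factor.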

Best,

Samuel

Re: MPIRUN errors

Posted: Thu Jul 14, 2016 6:50 am
by Danansmith
Kind of late, but thank you! Nice post, I like it. Thanks for sharing this information.



Re: MPIRUN errors

Posted: Tue Jul 19, 2016 4:31 pm
by eliephys78
Thank you all for your help and support

Re: MPIRUN errors

Posted: Mon Nov 28, 2022 1:43 pm
by Kunsa
Waiting for input...
Waiting for input...

IMPORTANT: XC functional enforced from input :
Exchange-correlation= PZ
( 1 1 0 0 0 0 0)
Any further DFT definition will be discarded
Please, verify this is what you really want


G-vector sticks info
--------------------
sticks: dense smooth PW G-vecs: dense smooth PW
Sum 379 379 163 6657 6657 1907

Using Slab Decomposition


%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine read_rhog (1):
error reading file ./MgB2.save/charge-density
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

stopping ...
This is the error that occurred when I tried to repeat the EPW example provided on GitHub. The EPW version installed on the batch system is EPW 5.4.1. The charge-density file is already in the directory but is not read when running the epw.x job. How do I fix this error? Any help is appreciated.
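Two quick sanity checks for this kind of error (the file names scf.in and nscf.in are placeholders for the actual inputs): make sure the file really exists under the outdir/prefix that epw.x sees, and that prefix and outdir are identical in all the inputs:

ls -l ./MgB2.save/charge-density*       # charge-density.dat (or .hdf5) should be listed
grep -iE 'prefix|outdir' scf.in nscf.in epw.in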

Re: MPIRUN errors

Posted: Tue Nov 29, 2022 5:54 am
by Kunsa
Thanks, dear all! I figured out the trick by myself; here is how, in case it is helpful to someone.
1. The error occurred when I ran the program as
mpiexec -np 32 pw.x < epw.in > epw.out
(note that this calls pw.x, not epw.x).
2. My problem was solved when I used
mpiexec /home/kunsa24/q-e-qe-7.0/bin/epw.x -nk 16 -in epw.in > epw.out

Thanks. Someone may find it helpful.
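A related note, in case it helps: it usually pays to call all the executables from the same QE build, and EPW normally wants the number of pools equal to the number of MPI tasks, e.g. (apart from the epw.x path above, the names here are placeholders):

mpiexec -np 32 /home/kunsa24/q-e-qe-7.0/bin/pw.x -in nscf.in > nscf.out
mpiexec -np 16 /home/kunsa24/q-e-qe-7.0/bin/epw.x -nk 16 -in epw.in > epw.out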