Wannier90 calculation related issue
Posted: Wed May 30, 2018 4:18 pm
Dear Experts,
I got the following error while running pw2wannier90 as part of a Wannier90 calculation:
Program PW2WANNIER v.6.2 (svn rev. 14038) starts on 30May2018 at 21:38: 2
This program is part of the open-source Quantum ESPRESSO suite
for quantum simulation of materials; please cite
"P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
"P. Giannozzi et al., J. Phys.:Condens. Matter 29 465901 (2017);
URL http://www.quantum-espresso.org",
in publications or presentations arising from this work. More details at
http://www.quantum-espresso.org/quote
Parallel version (MPI), running on 4 processors
MPI processes distributed on 1 nodes
R & G space division: proc/nbgrp/npool/nimage = 4
Reading nscf_save data
Reading data from directory:
./Graphene.save/
IMPORTANT: XC functional enforced from input :
Exchange-correlation = PZ ( 1 1 0 0 0 0)
Any further DFT definition will be discarded
Please, verify this is what you really want
Parallelization info
--------------------
sticks:   dense  smooth     PW     G-vecs:    dense   smooth      PW
Min         118     118     49               43266    43266   11501
Max         119     119     50               43335    43335   11556
Sum         475     475    199              173243   173243   46137
Spin CASE ( default = unpolarized )
Wannier mode is: standalone
-----------------
*** Reading nnkp
-----------------
Checking info from wannier.nnkp file
Something wrong!
rlatt(i,j) = 0.40474288765015115 at(i,j)= 1.0000000000000000
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine pw2wannier90 (4):
Direct lattice mismatch
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
stopping ...
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
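For context, my understanding is that pw2wannier90 aborts here because the direct lattice it reads back from wannier.nnkp (rlatt) does not agree with the lattice stored in ./Graphene.save/ (at). The sketch below is only an illustration of that kind of consistency check, written in Python; the tolerance eps and all matrix entries other than the reported rlatt(1,1) value are hypothetical, not taken from the actual code or my files.

    # Illustrative sketch of a "direct lattice mismatch" check, comparing the
    # lattice read from wannier.nnkp (rlatt) against the one stored by pw.x (at).
    # Values other than the reported 0.40474... component are hypothetical.
    import numpy as np

    eps = 1.0e-6  # assumed tolerance, for illustration only

    # Direct lattice from wannier.nnkp, rescaled to units of alat (hypothetical)
    rlatt = np.array([[0.40474288765015115, 0.0, 0.0],
                      [0.0,                 1.0, 0.0],
                      [0.0,                 0.0, 1.0]])

    # Direct lattice stored in the .save directory, in units of alat (hypothetical)
    at = np.eye(3)

    # Stop at the first component that differs beyond the tolerance
    mismatch = np.abs(rlatt - at) > eps
    if mismatch.any():
        i, j = np.argwhere(mismatch)[0]
        raise SystemExit(f"Direct lattice mismatch: rlatt({i+1},{j+1}) = {rlatt[i, j]} "
                         f"vs at({i+1},{j+1}) = {at[i, j]}")

If that reading is correct, it suggests the unit cell in the .win file used to generate wannier.nnkp does not match the cell used in my pw.x run, but I am not sure where the inconsistency comes from.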
Please suggest a way to resolve this issue.
Thanks and regards,
Anindya Bose
JRF, IIIT ALLAHABAD