EPW Superconductivity
Posted: Fri Jul 29, 2016 10:45 am
Dear all,
> I am trying to calculate the electron-phonon coupling strength, the
> anisotropic Eliashberg spectral function, and the anisotropic superconducting gap
> using EPW, following the instructions on the sites given below:
>
> http://epw.org.uk/Main/About
>
> http://epw.org.uk/Documentation/Tutorial
>
> I am trying to run EPW on bulk FCC lead. I would be highly obliged if
> anyone could help me find my error. I ran the calculation with the
> following steps:
>
> mpirun -np 4 ../../../../bin/pw.x < scf.in > scf.out
> mpirun -np 4 ../../../../bin/ph.x < ph.in > ph.out
> python pp.py < pp.in
> mpirun -np 4 ../../../../bin/pw.x < scf_epw.in > scf_epw.out
> mpirun -np 4 ../../../../bin/pw.x -npool 4 < nscf_epw.in > nscf_epw.out
> mpirun -np 4 ../../../src/epw.x -npool 4 < epw.in > epw.out
>
> All the calculations completed successfully except the last EPW step:
>
> mpirun -np 4 ../../../src/epw.x -npool 4 < epw.in > epw.out
>
> It shows the following error:
>
> Program EPW v.4.0.0 starts on 28Jul2016 at 10:27:15
>
> This program is part of the open-source Quantum ESPRESSO suite
> for quantum simulation of materials; please cite
> "P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
> URL http://www.quantum-espresso.org",
> in publications or presentations arising from this work. More details
> at
> http://www.quantum-espresso.org/quote
>
> Parallel version (MPI), running on 4 processors
> R & G space division: proc/nbgrp/npool/nimage = 4
>
>
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
> Error in routine epw_readin (1):
> reading input_epw namelist
>
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>
> stopping ...
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
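>
> (As far as I understand, EPW reads its input through a standard Fortran
> namelist read, so "reading input_epw namelist" usually means the namelist
> contains a keyword that the installed EPW version does not recognize, or a
> syntax problem before the closing "/". A minimal sketch of that mechanism
> in generic Fortran, not the actual EPW source, would be:
>
> program namelist_demo
>   implicit none
>   integer :: ios
>   character(len=80) :: prefix
>   real :: fsthick
>   namelist /inputepw/ prefix, fsthick
>   ! demo.in is a hypothetical input file; any keyword in it that is not
>   ! declared in the namelist above makes the read fail with nonzero iostat.
>   open(unit=10, file='demo.in', status='old', action='read')
>   read(10, nml=inputepw, iostat=ios)
>   if (ios /= 0) print *, 'error reading inputepw namelist, iostat = ', ios
>   close(10)
> end program namelist_demo
>
> Without the iostat argument the read aborts at runtime, much like the
> message above.)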
>
>
> My input file is:
>
> &inputepw
> prefix = 'pb'
> amass(1) = 207.2
> outdir = './'
>
> elph = .true.
> kmaps = .false.
> epbwrite = .true.
> epbread = .false.
>
> epf_mem = .true.
> etf_mem = .true.
>
> epwwrite = .true.
> epwread = .false.
>
> nbndsub = 4
> nbndskip = 0
>
> wannierize = .true.
> num_iter = 300
> dis_win_max = 18
> dis_froz_max= 8
> proj(1) = 'Pb:sp3'
>
> iverbosity = 0
>
> elinterp = .true.
> phinterp = .true.
>
> tshuffle2 = .true.
> tphases = .false.
>
> elecselfen = .true.
> phonselfen = .true.
> a2f = .true.
>
> parallel_k = .true.
> parallel_q = .false.
>
> fsthick = 0.5
> eptemp = 0.045
> degaussw = 0.045
> degaussq = 0.05
>
> dvscf_dir = './save'
> filukk = './pb.ukk'
>
> nk1 = 6
> nk2 = 6
> nk3 = 6
>
> nq1 = 6
> nq2 = 6
> nq3 = 6
>
> nqf1 = 4
> nqf2 = 4
> nqf3 = 4
> nkf1 = 30
> nkf2 = 30
> nkf3 = 30
> /
> 16 cartesian
> 0.0000000 0.0000000 0.0000000 0.0909090
> -0.1666667 0.1666667 -0.1666667 0.0909090
> -0.3333333 0.3333333 -0.3333333 0.0909090
> 0.5000000 -0.5000000 0.5000000 0.0909090
> 0.0000000 0.3333333 0.0000000 0.0909090
> -0.1666667 0.5000000 -0.1666667 0.0909090
> 0.6666667 -0.3333333 0.6666667 0.0909090
> 0.5000000 -0.1666667 0.5000000 0.0909090
> 0.3333333 0.0000000 0.3333333 0.0909090
> 0.0000000 0.6666667 0.0000000 0.0909090
> 0.8333333 -0.1666667 0.8333333 0.0909090
> 0.6666667 0.0000000 0.6666667 0.0909090
> 0.0000000 -1.0000000 0.0000000 0.0909090
> 0.6666667 -0.3333333 1.0000000 0.0909090
> 0.5000000 -0.1666667 0.8333333 0.0909090
> -0.3333333 -1.0000000 0.0000000 0.0909090
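>
> (For completeness, the block after the closing "/" is the coarse q-point
> list: a header giving the number of points and the coordinate type, then one
> line per point with three coordinates and a weight. As a generic illustration
> of that layout, not the actual EPW reader, such a block can be parsed with
> list-directed reads along these lines:
>
> program read_qlist
>   implicit none
>   integer :: nqs, iq
>   character(len=20) :: ctype
>   real, allocatable :: xq(:,:), wq(:)
>   ! qlist.txt is a hypothetical file holding the "16 cartesian" block above
>   open(unit=10, file='qlist.txt', status='old', action='read')
>   read(10, *) nqs, ctype
>   allocate(xq(3,nqs), wq(nqs))
>   do iq = 1, nqs
>      read(10, *) xq(1,iq), xq(2,iq), xq(3,iq), wq(iq)
>   end do
>   close(10)
>   print *, 'read ', nqs, ' q-points (', trim(ctype), ')'
> end program read_qlist
>
> Since the error above is raised while reading the namelist itself, the
> problem presumably lies in the namelist portion rather than in this list.)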
>
>
> --
> Yours sincerely
>
> Gargee Bhattacharyya
> PhD student
> Metallurgy Engineering & Materials Science
> IIT Indore
>
> M.Tech (VLSI Design & Microelectronics Technology)
> Department of ETCE
> Jadavpur University
>