EPW error opening wfc file

This section is dedicated to compilation problems.

Moderator: stiwari

mesgd
Posts: 2
Joined: Sun Nov 25, 2018 6:17 pm
Affiliation:

EPW error opening wfc file

Post by mesgd »

Hi,

I am trying to compile the latest version of QE (6.3.0) + EPW (5.0.0) with the Intel compiler + MKL (18.0) and OpenMPI (1.8.8).

Everything is fine for pw.x, ph.x, wannier90, etc. (all successfully pass the tests). However, when I try to run the EPW code in the test-suite, I keep getting this error:

....
-------------------------------------------------------------------
Wannierization on 3 x 3 x 3 electronic grid
-------------------------------------------------------------------

Spin CASE ( default = unpolarized )

Initializing Wannier90


Initial Wannier projections

( 0.00000 0.00000 0.00000) : l = -3 mr = 1
( 0.00000 0.00000 0.00000) : l = -3 mr = 2
( 0.00000 0.00000 0.00000) : l = -3 mr = 3
( 0.00000 0.00000 0.00000) : l = -3 mr = 4

- Number of bands is ( 4)
- Number of total bands is ( 4)
- Number of excluded bands is ( 0)
- Number of wannier functions is ( 4)
- All guiding functions are given

Reading data about k-point neighbours

- All neighbours are found


%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine readwfc (20):
error opening wfc file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

stopping ...

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine readwfc (20):
error opening wfc file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

stopping ...

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine readwfc (20):
error opening wfc file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
....
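
As far as I understand, readwfc fails when it cannot open the wavefunction files that the preceding nscf run writes to the outdir. Below is only a minimal standalone sketch of that kind of open-with-iostat check (not EPW code; the file name and path are hypothetical):

program check_wfc
  implicit none
  integer :: ios, iun
  character(len=256) :: wfcfile
  ! Hypothetical location; the actual name and directory depend on prefix/outdir
  wfcfile = './si.save/wfc1.dat'
  open(newunit=iun, file=trim(wfcfile), form='unformatted', &
       status='old', action='read', iostat=ios)
  if (ios /= 0) then
     write(*,'(a)') 'error opening wfc file: '//trim(wfcfile)
     stop 1
  end if
  close(iun)
  write(*,'(a)') 'wfc file found and opened: '//trim(wfcfile)
end program check_wfc

If a standalone check like this cannot open the files either, they are probably missing from the outdir or not where EPW expects them.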

And this is my make.inc file:

# make.inc. Generated from make.inc.in by configure.

# compilation rules

.SUFFIXES :
.SUFFIXES : .o .c .f .f90

# most fortran compilers can directly preprocess c-like directives: use
# $(MPIF90) $(F90FLAGS) -c $<
# if explicit preprocessing by the C preprocessor is needed, use:
# $(CPP) $(CPPFLAGS) $< -o $*.F90
# $(MPIF90) $(F90FLAGS) -c $*.F90 -o $*.o
# remember the tabulator in the first column !!!

.f90.o:
	$(MPIF90) $(F90FLAGS) -c $<

# .f.o and .c.o: do not modify

.f.o:
	$(F77) $(FFLAGS) -c $<

.c.o:
	$(CC) $(CFLAGS) -c $<



# Top QE directory, useful for locating libraries, linking QE with plugins
# The following syntax should always point to TOPDIR:
TOPDIR = $(dir $(abspath $(filter %make.inc,$(MAKEFILE_LIST))))
# if it doesn't work, uncomment the following line (edit if needed):

# TOPDIR = /home/rcf-proj2/an/yaoyu/bin/QE.6.3.0

# DFLAGS = precompilation options (possible arguments to -D and -U)
# used by the C compiler and preprocessor
# To use libxc (v>=3.0.1), add -D__LIBXC to DFLAGS
# See include/defs.h.README for a list of options and their meaning
# With the exception of IBM xlf, FDFLAGS = $(DFLAGS)
# For IBM xlf, FDFLAGS is the same as DFLAGS with separating commas

# MANUAL_DFLAGS = additional precompilation option(s), if desired
# BEWARE: it does not work for IBM xlf! Manually edit FDFLAGS
MANUAL_DFLAGS =
DFLAGS = -D__INTEL -D__MPI -D__SCALAPACK -D__DFTI
FDFLAGS = $(DFLAGS) $(MANUAL_DFLAGS)

# IFLAGS = how to locate directories with *.h or *.f90 file to be included
# typically -I$(TOPDIR)/include -I/some/other/directory/
# the latter contains e.g. files needed by FFT libraries
# for libxc add -I/path/to/libxc/include/

IFLAGS = -I$(TOPDIR)/include -I$(TOPDIR)/FoX/finclude -I$(TOPDIR)/S3DE/iotk/include/ -I/usr/usc/intel/18.1/compilers_and_libraries_2018.1.163/linux/mkl/include -I/usr/usc/intel/18.1/compilers_and_libraries_2018.1.163/linux/mkl/include/fftw

# MOD_FLAG = flag used by f90 compiler to locate modules

MOD_FLAG = -I

# BASEMOD_FLAGS points to directories containing basic modules,
# while BASEMODS points to the corresponding module libraries
# Each Makefile can add directories to MODFLAGS and libraries to QEMODS

BASEMOD_FLAGS= $(MOD_FLAG)$(TOPDIR)/iotk/src \
$(MOD_FLAG)$(TOPDIR)/Modules \
$(MOD_FLAG)$(TOPDIR)/FFTXlib \
$(MOD_FLAG)$(TOPDIR)/LAXlib \
$(MOD_FLAG)$(TOPDIR)/UtilXlib \
$(MOD_FLAG)$(TOPDIR)/FoX/finclude

# Compilers: fortran-90, fortran-77, C
# If a parallel compilation is desired, MPIF90 should be a fortran-90
# compiler that produces executables for parallel execution using MPI
# (such as for instance mpif90, mpf90, mpxlf90,...);
# otherwise, an ordinary fortran-90 compiler (f90, g95, xlf90, ifort,...)
# If you have a parallel machine but no suitable candidate for MPIF90,
# try to specify the directory containing "mpif.h" in IFLAGS
# and to specify the location of MPI libraries in MPI_LIBS

MPIF90 = mpifort
F90 = ifort
CC = mpicc
F77 = mpifort

# GPU architecture (Kepler: 35, Pascal: 60, Volta: 70 )
GPU_ARCH=

# CUDA runtime (Pascal: 8.0, Volta: 9.0)
CUDA_RUNTIME=

# CUDA F90 Flags
CUDA_F90FLAGS=

# C preprocessor and preprocessing flags - for explicit preprocessing,
# if needed (see the compilation rules above)
# preprocessing flags must include DFLAGS and IFLAGS

CPP = mpicc -E
CPPFLAGS = $(DFLAGS) $(IFLAGS)

# compiler flags: C, F90, F77
# C flags must include DFLAGS and IFLAGS
# F90 flags must include MODFLAGS, IFLAGS, and FDFLAGS with appropriate syntax

CFLAGS = -O3 -g $(DFLAGS) $(IFLAGS)
F90FLAGS = $(FFLAGS) -nomodule -fpp $(FDFLAGS) $(CUDA_F90FLAGS) $(IFLAGS) $(MODFLAGS)
FFLAGS = -O3 -g

# compiler flags without optimization for fortran-77
# the latter is NEEDED to properly compile dlamch.f, used by lapack

FFLAGS_NOOPT = -O0 -assume byterecl -g -traceback

# compiler flag needed by some compilers when the main program is not fortran
# Currently used for Yambo

FFLAGS_NOMAIN =

# Linker, linker-specific flags (if any)
# Typically LD coincides with F90 or MPIF90, LD_LIBS is empty
# for libxc, set LD_LIBS=-L/path/to/libxc/lib/ -lxcf90 -lxc

LD = mpifort
LDFLAGS =
LD_LIBS =

# External Libraries (if any) : blas, lapack, fft, MPI

# If you have nothing better, use the local copy via "--with-netlib" :
# BLAS_LIBS = /your/path/to/espresso/LAPACK/blas.a
# BLAS_LIBS_SWITCH = internal

BLAS_LIBS = ${MKLROOT}/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group ${MKLROOT}/lib/intel64/libmkl_intel_lp64.a ${MKLROOT}/lib/intel64/libmkl_sequential.a ${MKLROOT}/lib/intel64/libmkl_core.a ${MKLROOT}/lib/intel64/libmkl_blacs_openmpi_lp64.a -Wl,--end-group -lpthread -lm -ldl
BLAS_LIBS_SWITCH = external

# If you have nothing better, use the local copy via "--with-netlib" :
# LAPACK_LIBS = /your/path/to/espresso/LAPACK/lapack.a
# LAPACK_LIBS_SWITCH = internal
# For IBM machines with essl (-D__ESSL): load essl BEFORE lapack !
# remember that LAPACK_LIBS precedes BLAS_LIBS in loading order

LAPACK_LIBS = ${MKLROOT}/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group ${MKLROOT}/lib/intel64/libmkl_intel_lp64.a ${MKLROOT}/lib/intel64/libmkl_sequential.a ${MKLROOT}/lib/intel64/libmkl_core.a ${MKLROOT}/lib/intel64/libmkl_blacs_openmpi_lp64.a -Wl,--end-group -lpthread -lm -ldl
LAPACK_LIBS_SWITCH = external

SCALAPACK_LIBS = ${MKLROOT}/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group ${MKLROOT}/lib/intel64/libmkl_intel_lp64.a ${MKLROOT}/lib/intel64/libmkl_sequential.a ${MKLROOT}/lib/intel64/libmkl_core.a ${MKLROOT}/lib/intel64/libmkl_blacs_openmpi_lp64.a -Wl,--end-group -lpthread -lm -ldl

# nothing needed here if the internal copy of FFTW is compiled
# (needs -D__FFTW in DFLAGS)

FFT_LIBS = ${MKLROOT}/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group ${MKLROOT}/lib/intel64/libmkl_intel_lp64.a ${MKLROOT}/lib/intel64/libmkl_sequential.a ${MKLROOT}/lib/intel64/libmkl_core.a ${MKLROOT}/lib/intel64/libmkl_blacs_openmpi_lp64.a -Wl,--end-group -lpthread -lm -ldl
# HDF5
HDF5_LIB =
FOX_LIB = -L$(TOPDIR)/FoX/lib -lFoX_dom -lFoX_sax -lFoX_wxml -lFoX_common\
-lFoX_utils -lFoX_fsys
FOX_FLAGS =
# For parallel execution, the correct path to MPI libraries must
# be specified in MPI_LIBS (except for IBM if you use mpxlf)

MPI_LIBS =

# IBM-specific: MASS libraries, if available and if -D__MASS is defined in FDFLAGS

MASS_LIBS =

# CUDA libraries
CUDA_LIBS=
CUDA_EXTLIBS =

# ar command and flags - for most architectures: AR = ar, ARFLAGS = ruv

AR = ar
ARFLAGS = ruv

# ranlib command. If ranlib is not needed (it isn't in most cases) use
# RANLIB = echo

RANLIB = ranlib

# all internal and external libraries - do not modify

FLIB_TARGETS = all

LIBOBJS = $(TOPDIR)/clib/clib.a $(TOPDIR)/iotk/src/libiotk.a
LIBS = $(CUDA_LIBS) $(SCALAPACK_LIBS) $(LAPACK_LIBS) $(FOX_LIB) $(FFT_LIBS) $(BLAS_LIBS) $(MPI_LIBS) $(MASS_LIBS) $(HDF5_LIB) $(LD_LIBS)

# wget or curl - useful to download from network
WGET = wget -O

# Install directory - not currently used
PREFIX = /usr/local

Any suggestions would be appreciated. Thanks a lot!

Yu Yao
Vahid
Posts: 101
Joined: Fri Apr 08, 2016 11:02 pm
Affiliation:

Re: EPW error opening wfc file

Post by Vahid »

I ran into this error a couple of years ago. I traced it to the Intel compiler, particularly Intel 16 and 17. When I compiled EPW with gcc, the error disappeared.

In addition, Intel 14 seemed to work just fine. It is likely that there are other factors that generate the error you cited.

Cheers,
Vahid

Vahid Askarpour
Department of Physics and Atmospheric Science
Dalhousie University,
Halifax, NS, Canada
sponce
Site Admin
Posts: 616
Joined: Wed Jan 13, 2016 7:25 pm
Affiliation: EPFL

Re: EPW error opening wfc file

Post by sponce »

Hello,

I am currently building a new test farm to test this. I can say that Intel 17 should be working; however, I do not yet know about Intel 18.0.
I will test Intel 18 very soon and add it to the list of supported compilers if it works.

I therefore suggest that you use Intel 17 or lower in the meantime.
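
If it helps, a quick way to confirm which Fortran compiler version your mpifort wrapper actually invokes is to build and run a tiny test program (a minimal sketch, unrelated to QE/EPW itself):

program which_compiler
  use iso_fortran_env, only: compiler_version, compiler_options
  implicit none
  ! Report the compiler version and the flags used to build this program
  print '(a)', compiler_version()
  print '(a)', compiler_options()
end program which_compiler

Compiling this with the same mpifort that built QE/EPW shows which Intel release is actually being used.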

Best wishes,
Samuel
Prof. Samuel Poncé
Chercheur qualifié F.R.S.-FNRS / Professeur UCLouvain
Institute of Condensed Matter and Nanosciences
UCLouvain, Belgium
Web: https://www.samuelponce.com
mesgd
Posts: 2
Joined: Sun Nov 25, 2018 6:17 pm
Affiliation:

Re: EPW error opening wfc file

Post by mesgd »

Hi Vahid and Samuel,

Thanks for your responses and help. I will try to recompile with Intel 17.

Best,
Yu
jqhuang
Posts: 12
Joined: Mon Jan 07, 2019 2:24 pm
Affiliation:

Re: EPW error opening wfc file

Post by jqhuang »

Hi mesgd,
I have encountered the same CRASH while running EPW 5.0. The compiler I use is Intel_compiler/16.0.3 + IMPI/5.1.3.210. Could you give me some advice if you have solved this problem?

Best,
jqhuang