## Question about 2018 Tutorial (Tue.5, exercise 1)

General discussion around the EPW software

Moderator: hlee

alpinnt
Posts: 1
Joined: Mon Aug 08, 2022 6:57 am
Affiliation: University of Tokyo

### Question about 2018 Tutorial (Tue.5, exercise 1)

Dear EPW experts,

I have a question regarding the penultimate and final steps of exercise 1 from the Tuesday.5 session of the 2018 tutorial, linked below:
https://indico.ictp.it/event/8301/sessi ... al/0/0.pdf

After executing the penultimate step (page 5):
/home/nfs3/smr3191/q-e/bin/matdyn.x < matdyn.in.dos > matdyn.out.dos

I obtain a 'lambda' file, as shown on page 7:

Broadening 0.0050 lambda 4.6503 dos(Ef) 7.4402 omega_ln [K] 58.9663
Broadening 0.0100 lambda 2.9129 dos(Ef) 5.1936 omega_ln [K] 54.7458
Broadening 0.0150 lambda 2.3700 dos(Ef) 4.2509 omega_ln [K] 49.0727
Broadening 0.0200 lambda 2.0155 dos(Ef) 3.7926 omega_ln [K] 46.7651
Broadening 0.0250 lambda 1.7924 dos(Ef) 3.5674 omega_ln [K] 46.0507
Broadening 0.0300 lambda 1.6801 dos(Ef) 3.4591 omega_ln [K] 46.0377
Broadening 0.0350 lambda 1.6336 dos(Ef) 3.4121 omega_ln [K] 46.2094
Broadening 0.0400 lambda 1.6184 dos(Ef) 3.3959 omega_ln [K] 46.2935
Broadening 0.0450 lambda 1.6156 dos(Ef) 3.3942 omega_ln [K] 46.2262
Broadening 0.0500 lambda 1.6163 dos(Ef) 3.3990 omega_ln [K] 46.0477

From this file, I can already read off the values of lambda and omega_ln (the second and fourth columns, shown in bold in the slides; see the first line, for example).
I suppose that once convergence is reached, these lines will give very similar values regardless of the broadening.

If so, can we substitute those lambda and omega_ln values directly into the McMillan formula, Eq. (4), and compute Tc by hand?
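In case it helps frame the question, here is how I would do that substitution by hand. This is only a sketch, assuming that Eq. (4) of the tutorial is the Allen-Dynes-modified McMillan formula, and taking mu* = 0.1 as a typical (but arbitrary) value for the Coulomb pseudopotential:

```python
import math

def mcmillan_tc(lam: float, omega_ln_K: float, mu_star: float = 0.1) -> float:
    """Estimate Tc in kelvin from lambda and omega_ln (given in K).

    Assumes the Allen-Dynes form of the McMillan formula:
        Tc = (omega_ln / 1.2) * exp(-1.04 (1 + lam) / (lam - mu* (1 + 0.62 lam)))
    mu_star is a chosen parameter, not an output of matdyn.x.
    """
    exponent = -1.04 * (1.0 + lam) / (lam - mu_star * (1.0 + 0.62 * lam))
    return (omega_ln_K / 1.2) * math.exp(exponent)

# Values from the largest-broadening line of the 'lambda' file above:
print(mcmillan_tc(1.6163, 46.0477))  # roughly 5.6 K with mu* = 0.1
```

Since omega_ln is already reported in kelvin, the result comes out in kelvin directly.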

Is it still necessary to perform the last step on page 6?
/home/nfs3/smr3191/q-e/bin/lambda.x < lambda.in > lambda.out

I thought this last step was only required before QE v4.2, when the omega_ln calculation was not yet included in the matdyn.x execution.
https://www.mail-archive.com/users@list ... 06376.html

If it is still necessary, could you please elaborate?

Thank you.
Best regards,
Alpin