Bug in MD run of VASP5.3

Problems running VASP: crashes, internal errors, "wrong" results.


ppphysics
Newbie
Posts: 2
Joined: Fri Mar 22, 2013 7:27 am

Bug in MD run of VASP5.3

#1 Post by ppphysics » Tue Jan 28, 2014 7:43 am

Hey everyone, I am using VASP 5.3 to run some MD calculations, but I am getting an error.
I compiled VASP 5.3 with ifort and run it with Intel MPI (impi). Regular runs work fine, but the error below appears when I use the Advanced MD techniques module.
Does anyone know how to fix it? I can provide my Makefile if necessary.

Here is my INCAR:

SYSTEM = DIAMOND

STARTPARAMETER FOR THIS RUN:
NWRITE = 1; LPETIM=F WRITE-FLAG & TIMER
ISTART = 0 JOB : 0-NEW 1-CONT 2-SAMECUT

ELECTRONIC RELAXATION 1
ENCUT = 300 EV
NELM = 200
EDIFF = 1E-05 STOPPING-CRITERION FOR ELM
NELMIN = 4
BMIX = 2.00
ISPIN = 1

IONIC RELAXATION
NBLOCK = 1; KBLOCK = 100 INNER BLOCK; OUTER BLOCK
IBRION = 0 IONIC RELAX: 0-MD 1-QUASI-NEW 2-CG
ISIF = 3 CALCULATE STRESS; CELL SHAPE AND VOLUME FREE TO CHANGE
ISYM = 0
LCORR = T HARRIS-CORRECTION TO FORCES
TEBEG = 300
TEEND = 300
SMASS = 0.1 NOSE MASS
POTIM = 2 TIME STEP IN FS
IWAVPR = 12
EDIFFG = 0.1E-4

ELECTRONIC RELAXATION 2
IALGO = 48 ALGORITHM
LDIAG = T SUB-SPACE DIAGONALISATION
LREAL = F REAL-SPACE PROJECTION
WEIMIN = 0

PREC = NORMAL
NBANDS = 256
LWAVE = .FALSE.
LCHARG = .FALSE.
APACO = 10.0 ! DISTANCE FOR P.C.
NSW = 2000
MDALGO=3
LBLUEOUT=.TRUE.
LANGEVIN_GAMMA_L=5

The error looks like the following:

MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xbc1ed40 src=0xbc1ed40 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xaf9ad50, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xaf5ad50, rcounts=0xacdc400, displs=0xacdc450, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xaf9ad50 src=0xaf9ad50 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xc4add50, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xc44dd50, rcounts=0xc1cf440, displs=0xc1cf490, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xc4add50 src=0xc4add50 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xc1fae30, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xc17ae30, rcounts=0xbf0f560, displs=0xbf0f5b0, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xc1fae30 src=0xc1fae30 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xaea3e30, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xae03e30, rcounts=0xab98560, displs=0xab985b0, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xaea3e30 src=0xaea3e30 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xc38ccf0, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xc2cccf0, rcounts=0xc04e440, displs=0xc04e490, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xc38ccf0 src=0xc38ccf0 len=131072
Fatal error in PMPI_Allgatherv: Internal MPI error!, error stack:
PMPI_Allgatherv(1430).....: MPI_Allgatherv(sbuf=0xb712d50, scount=8192, MPI_DOUBLE_COMPLEX, rbuf=0xb632d50, rcounts=0xb3b4400, displs=0xb3b4450, MPI_DOUBLE_COMPLEX, comm=0x84000003) failed
MPIR_Allgatherv_impl(1002):
MPIR_Allgatherv(958)......:
MPIR_Allgatherv_intra(708):
MPIR_Localcopy(381).......: memcpy arguments alias each other, dst=0xb712d50 src=0xb712d50 len=131072

admin
Administrator
Posts: 2921
Joined: Tue Aug 03, 2004 8:18 am
License Nr.: 458

Bug in MD run of VASP5.3

#2 Post by admin » Wed Jan 29, 2014 6:10 pm

This is a bug in your MPI installation, not in VASP.

cchang
Newbie
Posts: 12
Joined: Mon Jan 28, 2013 11:22 pm

Bug in MD run of VASP5.3

#3 Post by cchang » Thu May 01, 2014 6:59 pm

This is not an MPI "bug"; Intel MPI 4.1 onwards enforces the MPI 2.2 standard, which prohibits buffer aliasing. To get around this enforcement, set the environment variable I_MPI_COMPATIBILITY to 3 or 4. I got the same error with VASP 5.3.5 and Intel MPI 4.1.1 and 4.1.3, and verified that this setting resolves it. See https://software.intel.com/en-us/forums/topic/392347 for more details.
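
For anyone wondering why the check fires at all: judging from the addresses in the log above, the send buffer passed to MPI_Allgatherv appears to point into the receive buffer, so the library's local self-copy sees identical source and destination and aborts. The following is only a minimal standalone sketch, not VASP code (the buffer sizes, the chunk size, and the file name aliasing.c are made up for illustration); it shows an MPI_Allgatherv call with that kind of overlap, and the MPI_IN_PLACE form that the MPI 2.2 standard expects instead.

/* aliasing.c -- standalone sketch of the buffer-aliasing issue above.
 * Not taken from the VASP source. Compile with: mpicc aliasing.c -o aliasing */
#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    const int chunk = 4;                       /* elements contributed per rank (arbitrary) */
    double *buf    = malloc(nprocs * chunk * sizeof(double));
    int    *counts = malloc(nprocs * sizeof(int));
    int    *displs = malloc(nprocs * sizeof(int));
    for (int i = 0; i < nprocs; ++i) {
        counts[i] = chunk;
        displs[i] = i * chunk;
    }
    for (int i = 0; i < chunk; ++i)
        buf[rank * chunk + i] = (double) rank; /* each rank fills its own slice */

    /* Non-compliant: the send buffer is a slice of the receive buffer, so the
     * two alias. Older MPI libraries copied this silently; Intel MPI 4.1+
     * aborts with "memcpy arguments alias each other" as in the log above.
     *
     * MPI_Allgatherv(buf + rank * chunk, chunk, MPI_DOUBLE,
     *                buf, counts, displs, MPI_DOUBLE, MPI_COMM_WORLD);
     */

    /* Compliant: MPI_IN_PLACE tells the library that each rank's contribution
     * is already sitting at its displacement in the receive buffer, so no
     * self-copy is attempted. */
    MPI_Allgatherv(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL,
                   buf, counts, displs, MPI_DOUBLE, MPI_COMM_WORLD);

    free(buf); free(counts); free(displs);
    MPI_Finalize();
    return 0;
}

Note that I_MPI_COMPATIBILITY is a runtime setting (put it in the job environment before launching mpirun), so no recompilation of VASP is needed; rewriting the offending calls with MPI_IN_PLACE would be a standard-conforming fix on the source side.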
