Fatal error in MPI_Waitall: Invalid MPI_Request

Asked: 2016-10-24 07:34:13

Tags: fortran runtime-error mpi

My code is below, but when I run it I get this error:

aborting job:
 Fatal error in MPI_Waitall: Invalid MPI_Request, error stack:
 MPI_Wait(171): MPI_Waitall(count=4, req_array=0x000001400E0DE0,status_array = 0x0000000000012FD8C) failed
 MPI_Waitall(96). : Invalid MPI_Request
 MPI_Waitall(96). : Invalid MPI_Request

My code gives the correct answer when I use blocking send and receive, but when I switch to non-blocking send and receive I get this error.

Here is my code:

    integer reqs(4)                    ! request handles for the non-blocking calls
    integer stats(MPI_STATUS_SIZE,4)   ! status array for the WAITALL routine

    call MPI_INIT(ierr)
    call MPI_COMM_RANK(MPI_COMM_WORLD, taskid, ierr)
    call MPI_COMM_SIZE(MPI_COMM_WORLD, numtasks, ierr)

    ! Send data to the left neighbor
    if (taskid > 0) then
       call MPI_ISEND(phi0(1,1), N_z, MPI_DOUBLE_PRECISION, taskid-1, 11, &
                      MPI_COMM_WORLD, reqs(1), ierr)
    end if

    ! Send data to the right neighbor
    if (taskid < numtasks-1) then
       call MPI_ISEND(phi0(1,cols), N_z, MPI_DOUBLE_PRECISION, taskid+1, 10, &
                      MPI_COMM_WORLD, reqs(2), ierr)
    end if

    ! Receive data from the left neighbor
    if (taskid > 0) then
       call MPI_IRECV(phi0(1,0), N_z, MPI_DOUBLE_PRECISION, taskid-1, 10, &
                      MPI_COMM_WORLD, reqs(3), ierr)
    end if

    ! Receive data from the right neighbor
    if (taskid < numtasks-1) then
       call MPI_IRECV(phi0(1,cols+1), N_z, MPI_DOUBLE_PRECISION, taskid+1, 11, &
                      MPI_COMM_WORLD, reqs(4), ierr)
    end if

    call MPI_WAITALL(4, reqs, stats, ierr)
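The error is consistent with the boundary ranks skipping some of the four `if` blocks: on rank 0 the two `taskid > 0` branches never execute, so `reqs(1)` and `reqs(3)` are never assigned and MPI_WAITALL is handed garbage handles. A common remedy is to initialize every entry of `reqs` to `MPI_REQUEST_NULL` before posting; the MPI standard guarantees that null requests passed to MPI_WAITALL complete immediately with an empty status. A minimal sketch of that fix, reusing the variable names from the question (this is a fragment, not a complete program):

```fortran
    ! Initialize all handles so entries a boundary rank never posts are
    ! valid "empty" requests rather than uninitialized memory.
    reqs = MPI_REQUEST_NULL

    if (taskid > 0) then
       call MPI_ISEND(phi0(1,1), N_z, MPI_DOUBLE_PRECISION, taskid-1, 11, &
                      MPI_COMM_WORLD, reqs(1), ierr)
       call MPI_IRECV(phi0(1,0), N_z, MPI_DOUBLE_PRECISION, taskid-1, 10, &
                      MPI_COMM_WORLD, reqs(3), ierr)
    end if

    if (taskid < numtasks-1) then
       call MPI_ISEND(phi0(1,cols), N_z, MPI_DOUBLE_PRECISION, taskid+1, 10, &
                      MPI_COMM_WORLD, reqs(2), ierr)
       call MPI_IRECV(phi0(1,cols+1), N_z, MPI_DOUBLE_PRECISION, taskid+1, 11, &
                      MPI_COMM_WORLD, reqs(4), ierr)
    end if

    ! Null requests are ignored, so waiting on all 4 is safe on every rank.
    call MPI_WAITALL(4, reqs, stats, ierr)
```

An alternative with the same effect is to keep a counter of the requests actually posted and pass that count, with a compacted array, to MPI_WAITALL.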

0 Answers:

No answers yet