Fatal error in MPI_Scatter: Invalid communicator

Time: 2013-11-03 20:56:46

Tags: fortran mpi fortran90

I am trying to use MPI_Scatter to parallelize a DO loop that adds matrices, and I am getting the error below. I have initialized all the MPI variables, calling

  call MPI_INIT( ierr )
  call MPI_COMM_RANK( MPI_COMM_WORLD, myid, ierr )
  call MPI_COMM_SIZE( MPI_COMM_WORLD, numprocs, ierr ) 
at the beginning of the program, and MPI_Finalize(ierr) at the end. However, I still get the following error when the MPI_Scatter is attempted:

Fatal error in MPI_Scatter: Invalid communicator, error stack:
MPI_Scatter(766): MPI_Scatter(sbuf=0x6ab2a0, scount=0, INVALID DATATYPE,         
rbuf=0x7fff39a99398, rcount=0, INVALID DATATYPE, root=0, comm=0x43380000) failed
MPI_Scatter(641): Invalid communicator
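
Note that the stack reports both the datatypes and the communicator handle as invalid, which usually means the values reaching MPI_Scatter are not the library's own constants. In Fortran this typically happens when neither use mpi nor include 'mpif.h' is in effect, leaving MPI_COMM_WORLD and MPI_DOUBLE_PRECISION as undefined, implicitly typed variables. A minimal sketch of the setup described above with the mpi module made explicit (program name and layout are illustrative, not from the post):

  program scatter_question
    use mpi   ! or: include 'mpif.h'; without one of these, MPI_COMM_WORLD
              ! and MPI_DOUBLE_PRECISION are undefined local variables
    implicit none
    integer :: ierr, myid, numprocs

    call MPI_INIT(ierr)
    call MPI_COMM_RANK(MPI_COMM_WORLD, myid, ierr)
    call MPI_COMM_SIZE(MPI_COMM_WORLD, numprocs, ierr)

    ! ... scatter, compute, gather ...

    call MPI_FINALIZE(ierr)
  end program scatter_question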

I use the default communicator MPI_COMM_WORLD everywhere in the code. I am trying to parallelize a matrix update operation by sending parts of the matrix to multiple processes. The input matrix is x and the output matrix is y; x is a REAL*8 matrix with N rows and N columns. I also define

nx = N/numberofprocesses
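
(A side note on the counts: x is N-by-N and Fortran stores it column-major, so a block of nx whole columns holds nx*N REAL*8 elements, while a count of nx moves only nx values per process. The sketches after the scatter and gather code below assume nx*N is what is intended.)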

My MPI_Scatter code is as follows:

if (processid.eq.0) then
   call MPI_Scatter(x, nx, MPI_DOUBLE_PRECISION, MPI_IN_PLACE, nx, MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)
else
   call MPI_Scatter(x, nx, MPI_DOUBLE_PRECISION, x, nx, MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)
end if
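
For comparison, here is a compilable sketch of the same in-place scatter pattern, assuming N is divisible by numprocs and that each process should receive nx whole columns (hence a count of nx*N); the value of N and the initialization of x are illustrative only:

  program scatter_sketch
    use mpi
    implicit none
    integer, parameter :: N = 8        ! illustrative size
    integer :: ierr, myid, numprocs, nx
    real*8  :: x(N,N)

    call MPI_INIT(ierr)
    call MPI_COMM_RANK(MPI_COMM_WORLD, myid, ierr)
    call MPI_COMM_SIZE(MPI_COMM_WORLD, numprocs, ierr)
    nx = N/numprocs

    if (myid.eq.0) x = 1.0d0           ! illustrative data on the root

    ! each process receives nx columns, i.e. nx*N REAL*8 values;
    ! the root keeps its own block where it is via MPI_IN_PLACE
    if (myid.eq.0) then
       call MPI_Scatter(x, nx*N, MPI_DOUBLE_PRECISION, MPI_IN_PLACE, nx*N, &
                        MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)
    else
       call MPI_Scatter(x, nx*N, MPI_DOUBLE_PRECISION, x, nx*N, &
                        MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)
    end if

    call MPI_FINALIZE(ierr)
  end program scatter_sketch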

My gather code is:

if (processid.eq.0) then
   call MPI_Gather(MPI_IN_PLACE, nx, MPI_DOUBLE_PRECISION, y, nx, MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)
else
   call MPI_Gather(y, nx, MPI_DOUBLE_PRECISION, y, nx, MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)
end if
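
Under the same assumptions (nx*N elements per process, y declared like x), a matching sketch of the gather is the mirror image: the root's own block of y is already in place, so it passes MPI_IN_PLACE as its send buffer.

  if (myid.eq.0) then
     call MPI_Gather(MPI_IN_PLACE, nx*N, MPI_DOUBLE_PRECISION, y, nx*N, &
                     MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)
  else
     call MPI_Gather(y, nx*N, MPI_DOUBLE_PRECISION, y, nx*N, &
                     MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)
  end if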

What is causing this error?

0 Answers:

No answers yet.