Parallel processing stopped working with error: object 'mcinteractive' not found

Posted: 2019-04-17 02:18:43

Tags: r parallel-processing rstudio

For a long time I've been successfully running a program which uses parallel processing. A couple of days ago the code stopped working with the following error message:

    "Error in get("mcinteractive", pkg) : object 'mcinteractive' not 
    found

    traceback()
    8: get("mcinteractive", pkg)

    7: .customized_mcparallel({
    result <- mclapply(X, function(...) {
    res <- FUN(...)
    writeBin(1L, progressFifo)
    return(res)
    }, ..., mc.cores = mc.cores, mc.preschedule = mc.preschedule,
    mc.set.seed = mc.set.seed, mc.cleanup = mc.cleanup, 
    mc.allow.recursive = mc.allow.recursive)
    if ("try-error" %in% sapply(result, class)) {
    writeBin(-1L, progressFifo)
    }
    close(progressFifo)
    result
    })

    6: pbmclapply(1:N, FUN = function(i) {
    max_score = max(scores[i, ])
    topLabels = names(scores[i, scores[i, ] >= max_score - 
    fine.tune.thres])
    if (length(topLabels) == 0) {
    return(names(which.max(scores[i, ])))
    }

(I have more traceback if you are interested, but I think it mainly belongs to the "surrounding" code and is not so interesting for the error per se. Tell me if you need it and I'll make an edit!)

I do not know anything about parallel processing, and I haven't been able to understand the issue by digging into the code. From what I've understood, parallel::mcparallel is a function with an argument mcinteractive that can be set to TRUE or FALSE. Earlier I got the tip to decrease the number of cores used in the processing. Before, I used 16 cores without any issues. After the error started occurring I tried setting the number of cores to both 8 and 1, with the same result. If it is some memory problem I guess I'm in the wrong forum, sorrysorrysorry!! But I only experience problems when using RStudio, which is why I'm writing here.

The only other thing I can think of that might be related is that my processing (through RStudio) sometimes gets stuck; the only thing I have found is that the RAM is full and I have to restart the computer, after which the processing works as usual again. However, this does not help with the new error when using parallel computation.
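The only check I have thought of so far (a hypothetical sketch, not part of my original code): since the traceback fails at get("mcinteractive", pkg), I can look directly in the parallel namespace to see whether that internal object exists at all in my installation.

    # Hypothetical check: does the installed parallel namespace contain the
    # internal object 'mcinteractive' that the traceback is looking for?
    exists("mcinteractive", envir = asNamespace("parallel"))

    # For context: my R version and the version of the package that calls it
    R.version.string
    packageVersion("pbmcapply")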

Does anyone recognize this issue and have any lead on what could be the cause? Is it the code, the package, RStudio or my computer? Any other checks I can run? :)

Edit:

Including a short version of the errors I got while searching through the code after changing pbmclapply to mclapply.

> packageVersion("parallel") 
[1] ‘3.4.4’
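(A hedged aside: parallel is a base package that ships with R itself, so its version simply mirrors the R version above and cannot be updated separately.)

    # parallel ships with R, so its reported version tracks the R version
    packageVersion("parallel") == getRversion()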

> labels = parallel::pbmclapply(1:N, FUN = function(i) {
. . .
+   }, mc.cores = numCores)
Error: 'pbmclapply' is not an exported object from 'namespace:parallel'



> labels = pbmcapply::pbmclapply(1:N, FUN = function(i) {
. . .

+   }, mc.cores = numCores)
Error in get("mcinteractive", pkg) : object 'mcinteractive' not found




> labels = parallel::mclapply(1:N, FUN = function(i) {
. . .
+   }, mc.cores = numCores)
Warning message:
In parallel::mclapply(1:N, FUN = function(i) { :
  all scheduled cores encountered errors in user code
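(A hedged side note with a toy example, not my real code: mclapply does not stop when worker code errors; it returns "try-error" objects in the result list, which is what produces this warning, so inspecting those elements, or re-running the same FUN serially with lapply, is how the underlying error can be surfaced.)

    # Toy illustration (on a Unix-alike; mc.cores > 1 is not supported on Windows):
    # worker errors are captured as "try-error" elements instead of stopping.
    res <- parallel::mclapply(1:4, function(i) stop("boom"), mc.cores = 2)
    sapply(res, inherits, what = "try-error")   # TRUE for the failed elements
    res[[1]]                                    # the captured error message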



#inside mclapply

> job.res <- lapply(seq_len(cores), inner.do)
Error in mcfork() : could not find function "mcfork"

#inside inner.do

> f <- parallel::mcfork()
Error: 'mcfork' is not an exported object from 'namespace:parallel'

Edit 2: I came a bit further in my error searching.

I had to add a triple colon before a lot of the parallel functions, meaning that I'm calling internal (unexported) functions(?), which in turn should mean that parallel is no longer part of my search path(?) (see the sketch after the listing below):

parallel:::mcfork()
parallel:::mc.advance.stream()
parallel:::selectChildren()
parallel:::isChild()

# Had to change .check_ncores(cores) to

parallel::detectCores()
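(A sketch of what I mean, assuming this is just how R namespaces work rather than something specific to my setup: attaching a package only exposes its exported objects, and these low-level helpers are internal to parallel, so they are only reachable with ':::'.)

    # Hypothetical illustration: mcfork is internal to parallel, so it is
    # absent from the export list but present in the namespace itself.
    "mcfork" %in% getNamespaceExports("parallel")       # FALSE: not exported
    exists("mcfork", envir = asNamespace("parallel"))   # TRUE: defined internally
    parallel::detectCores()                             # exported, documented helper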

1 answer:

Answer 0 (score: 0):

This problem occurred because pbmclapply has been updated and now only works with R > 3.5; updating R solved my problem.
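A minimal sketch of the fix (with a toy workload standing in for the original labelling code, so treat the call itself as an assumption): after updating R and reinstalling pbmcapply, the progress-bar version runs again.

    # After updating R (>= 3.5), reinstall pbmcapply and retry with a toy call
    install.packages("pbmcapply")
    res <- pbmcapply::pbmclapply(1:100, function(i) i^2, mc.cores = 2)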
